Some Thoughts on the Robots
And the importance of remaining human.
As AI becomes more prevalent in our culture, I think it is very important to consider its impact and influence critically. Like most things in life, it is neither all good nor all bad; however, it does seem to wield a certain kind of power, and one that should not be taken lightly or for granted.
Cautionary tales about the power of technology have been around for years. Some examples are the movies The Matrix, Her, and more recently Pets United (which pleasantly surprised me!). Even the 1962 book A Wrinkle in Time by Madeleine L’Engle, although not specifically about technology, tells of a dark world where people have become robotic, giving over all the difficulties of life to “It” and losing their humanity for the sake of ease and efficiency. It is a magnificent story about the vulnerability of love that can redeem us from our darkest places, and even though L’Engle probably did not have artificial intelligence in mind when she wrote the novel, it seems to hold a timeless message that we would do well to remember in our current circumstances.
I have come across various stories about AI recently, and want to share my impressions and reflections here. I see numerous advertisements for AI programs to help therapists write their notes more efficiently (and of course it is AI that puts these ads on my webpages). There are AI supports for therapy clients - programs that provide a check-in and a kind of coaching for between-session needs. A therapist was recently "fired" by a client who said they preferred the AI therapist (the therapist's impression was that the AI provided unending validation, which certainly feels good, while the therapist was challenging the client on some areas for growth). Jonathan Haidt's book The Anxious Generation, along with his other publications, describes the negative impact certain technologies are having on younger generations. Ezra Klein's podcast episode "We Have to Really Rethink the Purpose of Education" interviews Rebecca Winthrop and discusses the impact AI is having on education. A couple of articles in Futurism discuss humans dating AI companions and, perhaps most alarming, people experiencing "ChatGPT psychosis."
Often, the positive aspects of AI are touted as making certain tasks easier and/or more efficient. This is described as a good thing - helpful, with nothing harmful about it. But perhaps it is neither all good nor all bad. Getting a little help with seemingly mundane tasks can certainly ease some stress and free up time or energy for other things. Yet I wonder whether that should be the first response. If we need so much help getting things done, if we need to get things done so much more efficiently, is it possible that we're simply doing too much? Sometimes the more helpful stress relief is in slowing down, saying no to something, and lessening the load by simply having less to do.
Furthermore, what might we lose in offloading such tasks? Supposedly, we will have more time and energy to think about "more important" things. But could there be something important in making ourselves think through the seemingly mundane and menial things? For example, therapists using AI to help with documentation could have more time available to see more patients or do other kinds of work. But this also opens the door to therapists overworking themselves and burning out (the very thing AI is supposed to help prevent by taking some of the workload off). Maybe having to dedicate time to things like writing notes is a kind of safeguard, keeping a helpful limit on the workload. Of course, it's possible that therapists could use the extra time for family or positive self-care activities, and in that case it would help to prevent burnout. My point is that AI wants to help us overcome our natural limitations, and perhaps we are better off learning to embrace those limitations.
There is another potential loss in offloading these tasks, such as using AI to generate notes: slowing down and engaging the full thought process of reviewing a session to write a note is a way for therapists to think about their work with clients. Quite often when I am writing a note, I have a new insight or question that I can take to the client in the next session to further their therapy. If I used AI to generate my notes, I might well miss out on such insights. By offloading a seemingly simple task like writing a note, I am also offloading a degree of thinking - and a kind of thinking that AI cannot do for me.
The other thing I think about is the impact AI might have on relationships and/or relational capacities. AI provides instant gratification and unending validation of the user’s thoughts or feelings; humans do not. So initially, an AI therapist, for example, might make you feel better than your human therapist. You don’t have to wait for a session and you will never be asked to face something about yourself that feels bad or ugly. While that might sound good to many people, I think the losses involved in this far outweigh the benefits.
The desire for instant gratification is present from birth - I am hungry, feed me now! This is an impulse that we then have to wrestle with greatly in childhood, and throughout our lives. Learning to tolerate not getting what we want right away develops the capacities for patience and perseverance. These are the building blocks for sticking with an unfinished project, doing things like exercise that are good for us but don’t give immediate results, and also wrestling through difficult times in our intimate relationships.
Which leads us to the desire for validation and feeling "good" versus the desire for growth. Yes, we all want to be understood, but I think we want an understanding that goes beyond simply validating our feelings. In the words of David Wallin, we want to "feel felt" (a phrase from his book Attachment in Psychotherapy). We don't just want someone to know how we feel; we want someone to care enough to really get it - to empathize, to feel it with us. A robot cannot do this. A robot will never care about you. Validation may help you feel better in the moment, but if all a robot can do is validate your feelings, it cannot help you actually make the changes that you want or that are beneficial for you. Validation on its own does not lead to growth; you may feel better for a moment, but your problems will not go away. You will not grow new capacities to deal with stress or difficulties. You will not learn how to make changes that lead to a more fulfilling life. For this kind of help and growth, you need a human who is willing to journey with you through the messiness of being human and relating to humans. A human therapist is able to care enough about you to help you see where you get in your own way, and to stick with you through the ups and downs of learning new ways of being and relating. And it doesn't have to be a therapist - any human relationship that has both empathy and the willingness to wrestle through difficulties can be healing. But I think it has to be a human, not a robot. If we turn to AI for relational needs, we greatly risk limiting, if not losing, our capacities for intimacy and love. And as L'Engle so eloquently depicts in A Wrinkle in Time, it is our capacity for love that makes us fully human.
Let us hold on to being human, with all the limitations, difficulties, and pain that involves.