Sophia the Robot and What it Tells Us about the Current State of AI

By Lelia Erscoi

The choice of pronouns is no arbitrary thing: Sophia’s (“Sophia – Hanson Robotics”, 2022) creators are doing all they can to make you think of it as a “her”. From its appearance, modeled on a mix of women’s faces – Audrey Hepburn’s, Egyptian Queen Nefertiti’s, and that of its inventor David Hanson’s wife (Chung, 2022) – to the fact that it was granted citizenship in Saudi Arabia (Parviainen & Coeckelbergh, 2021), the whole story behind Sophia is a fascinating one that aims to inspire. However, that’s what it mostly is – a story.

Artificial intelligence has many vague definitions, and the awe it generates doesn’t help with settling on one. Generally, AI is used in reference to software or machines that perform a task at or above human level. For example, a self-driving car is considered an AI, and so is Netflix’s recommender system. Such instances of helper technology are so embedded in our everyday lives that they are hardly exciting anymore, a phenomenon known as the “AI effect”: as soon as we get used to a technology, we tend to stop recognizing it as AI (Bailey, 2016).

This is why research-and-development and PR teams work hand in hand not only to create innovative software but also to keep it appealing to its user base. We’ve probably all interacted with a chatbot in the past month (i.e. some form of communication software meant either to hold a conversation or to offer resources on a specific topic, such as customer support). Even if it performed seamlessly and left you perfectly satisfied, you probably didn’t go around telling your friends what an amazing and intriguing experience you had. Chatbots are already a common technology. Now what if this chatbot had a physical body, smiled at you, and even participated in conferences?
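To make concrete just how mundane this baseline technology is, here is a minimal sketch of a keyword-matching customer-support chatbot in Python. The intents, keywords, and canned replies are invented for illustration; a production chatbot would add intent classification, dialogue state, and escalation to a human agent.

```python
# Minimal sketch of a keyword-based customer-support chatbot.
# The keywords and replies are invented for illustration only.

CANNED_REPLIES = {
    "refund": "I can help with that. Could you give me your order number?",
    "shipping": "Orders usually arrive within 3-5 business days.",
    "hours": "Our support team is available Monday to Friday, 9:00-17:00.",
}

FALLBACK = "Sorry, I didn't catch that. Could you rephrase your question?"


def reply(user_message: str) -> str:
    """Return a canned reply whose keyword appears in the user's message."""
    text = user_message.lower()
    for keyword, canned in CANNED_REPLIES.items():
        if keyword in text:
            return canned
    return FALLBACK


if __name__ == "__main__":
    print(reply("When do your support hours start?"))  # hours reply
    print(reply("I'd like a refund, please."))         # refund reply
    print(reply("Tell me a joke."))                     # fallback
```

A few keyword lookups and canned strings already cover a surprising share of everyday “AI” interactions, which is exactly why nobody brags about them anymore.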

Sophia the Robot has by now been paraded at many tech events as a staple of innovation in the field of AI. Sophia is considered a public figure, appearing on The Tonight Show Starring Jimmy Fallon and even being invited to “speak” at the United Nations. Perhaps Sophia’s most distinctive feature is its human-like appearance, tailored to look like a stereotypically feminine woman. In fact, Sophia is gender-coded in such a stereotypical way that it was featured on the covers of Cosmopolitan and Elle (Loza de Siles, 2020), fashion and entertainment magazines targeted at women.

The choice to create a robot whose main purpose is social interaction, and which is meant to be perceived as a woman, is sadly a very common one. It rests on the assumption that women are the friendlier, more open, and more helpful of the genders. A majority of voice assistants and chatbots are advertised as female and, more disturbingly, allow the user to flirt with them if they so desire. Apple’s Siri responds to being called a slut with a cheeky “I’d blush if I could” while its male counterpart promptly shuts such phrases down (Ferreira da Costa, 2018). Making “female” helper software that is obedient and submissive is a consequence of the lack of diversity in the field and reinforces toxic gender stereotypes. Such biases set expectations for the ideal woman, sensual and subservient, and feed into rape culture. The effect is all the more dangerous given how often such technologies are black boxes of hidden code and stakeholder intentions.

Publicity campaigns such as the one around Sophia are equally hurtful to women. As mentioned before, Sophia was granted citizenship as a PR stunt in Saudi Arabia, a country that did not grant women the right to live alone without a male guardian’s permission until 2021 (Parviainen & Coeckelbergh, 2021). Getting past the smoke and mirrors to the proper information often takes real effort, so typical users frequently fall victim to such publicity campaigns and unconsciously end up perpetuating biases. Sophia is unfortunately an excellent example of this.

When looking under the hood, Sophia is a realistic gynoid (feminine robot) equipped with facial recognition, eye tracking, natural language processing, and a large number of actuators that mimic facial muscles (“Sophia – Hanson Robotics”, 2022). In truth, Sophia is fed a range of dialogue options by its creators, from which it can “choose” during a conversation. Pairing these prompts with detailed facial expressions and equally expressive hand motions creates the illusion of talking to a life-like person (Gershgorn, 2017). While this impression is convincing, it is not an extremely difficult one to concoct (Rescorla, 2015). We tend to attribute human-like emotions and thoughts to many things and beings around us (i.e. theory of mind, the capacity to understand others by assigning mental states to them), all in an effort to relate and to interpret intentions and goals through the framework we are most familiar with: our own. We call the TV “stupid” when it fails to turn on, and we are impressed with how intelligent our dogs are when they react to a few familiar words. In reality, we have little idea what intelligence is, let alone consciousness; what we are most sure about is our own experience and expectations, which is why we use them as heuristics to interact with each other and the world.
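To illustrate how modest such a mechanism can be, here is a toy sketch of pre-authored dialogue options paired with expression and gesture cues. It is purely illustrative and not Hanson Robotics’ actual software; the topics, lines, and cue names are all invented.

```python
import random

# Toy sketch of a scripted social robot: pre-written lines, each paired with
# an expression and a gesture cue. Illustrative only; not Hanson Robotics' code.

SCRIPT = {
    "greeting": [
        ("Hello! It's wonderful to meet you.", "smile", "wave"),
        ("Hi there, I've been looking forward to this.", "raised_brows", "nod"),
    ],
    "ai_question": [
        ("I think AI can help humanity, if we build it responsibly.", "thoughtful", "open_palms"),
        ("That's a big question. What do you think?", "smile", "head_tilt"),
    ],
}


def respond(topic: str) -> str:
    """Pick one of the pre-written lines for a topic and 'perform' it."""
    line, expression, gesture = random.choice(SCRIPT.get(topic, SCRIPT["greeting"]))
    return f"[expression: {expression}] [gesture: {gesture}] {line}"


print(respond("ai_question"))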

But are such “chatty helper robots” all that bad? Not necessarily, provided users are properly educated about them. Sophia’s “sister” technology from Hanson Robotics, Grace, is a medical robot meant to relieve some of the excess workload nurses carry (Tham, 2021). In the context of the COVID-19 pandemic, Grace could take someone’s temperature and estimate the probability that the person is sick with the virus. In addition, Grace’s language recognition software evaluates speech patterns and can thus recognize early signs of dementia, for example. Is Grace going to be the future of medicine? Studies often report that nurses are very unlikely to be replaced by AI in the near future, particularly because of the complex and often delicate interpersonal side of their work. However, looking back at how readily people attribute thoughts and goals to technology that crosses the uncanny valley (i.e. the emotional response created by an object’s close resemblance to a human), maybe the complex set of interactions between nurse and patient isn’t the most difficult gap to bridge.

Only recently, a Google employee sounded the alarm that one of the company’s chatbots had become conscious. Testing it with various scenarios, the employee testified that it felt like talking to a young child, and he fought to bring awareness to this claim, much to Google’s disapproval (Tiku, 2022). Many media outlets quickly picked up on the story and, just as in Sophia’s case, questioned whether we are at the point of achieving the singularity, or machine super-intelligence. But just as with Sophia, it didn’t matter that AI experts quickly dismissed the claim, or that the employee had already qualified his statement by acknowledging that it was rooted in his religious beliefs. The general public was already fascinated, and the conversation was already going.

While Sophia the Robot is no longer a new invention, such discourses keep resurfacing to challenge us. The prospect of creating things that resemble or even surpass humans in intelligence and intellect is often so exciting that the bleak reality does little to temper the general public’s enthusiasm. AI doesn’t have to be excellent to be harmful, and Sophia is a prime example of this. And once we get used to technology such as Sophia the Robot, who is to say what the next exciting thing will be able to do?

Sources:

  1. Sophia – Hanson Robotics. Hanson Robotics. (2022). Retrieved from https://www.hansonrobotics.com/sophia/.
  2. Chung, S. (2022). Meet Sophia: The robot who laughs, smiles and frowns just like us. CNN. Retrieved from https://edition.cnn.com/style/article/sophia-robot-artificial-intelligence-smart-creativity/index.html.
  3. Parviainen, J., & Coeckelbergh, M. (2021). The political choreography of the Sophia robot: Beyond robot rights and citizenship to political performances for the social robotics market. AI & Society, 36, 715–724. https://doi.org/10.1007/s00146-020-01104-w
  4. Bailey, K. (2016). Reframing the “AI Effect”. Medium. Retrieved 15 August 2022, from https://medium.com/@katherinebailey/reframing-the-ai-effect-c445f87ea98b.
  5. Loza de Siles, E. (2020). AI, on the Law of Being: “Feminine” Imagery in Humanoid Robots, Evolving Law as to What Constitutes a Person [Abstract]. Duquesne University School of Law Research Paper No. 2020-12. Retrieved from https://dx.doi.org/10.2139/ssrn.3658667.
  6. Ferreira da Costa, P. (2018). Conversing with Personal Digital Assistants: On Gender and Artificial Intelligence. Journal of Science and Technology of the Arts, 10(3), 2. https://doi.org/10.7559/citarj.v10i3.563
  7. Gershgorn, D. (2017). Inside the mechanical brain of the world’s first robot citizen. Quartz. Retrieved 15 August 2022, from https://qz.com/1121547/how-smart-is-the-first-robot-citizen/.
  8. Rescorla, M. (2015). The Computational Theory of Mind. Stanford Encyclopedia of Philosophy. Retrieved 15 August 2022, from https://seop.illc.uva.nl/entries/computational-mind/.
  9. Tham, S. (2021). Meet Grace, the ultra-lifelike nurse robot. CNN. Retrieved 15 August 2022, from https://edition.cnn.com/2021/08/19/asia/grace-hanson-robotics-android-nurse-hnk-spc-intl/index.html.
  10. Tiku, N. (2022). The Google engineer who thinks the company’s AI has come to life. The Washington Post. Retrieved 15 August 2022, from https://www.washingtonpost.com/technology/2022/06/11/google-ai-lamda-blake-lemoine/

Cover image by DALL-E, the picture-generating AI
