"Living as Data"

Dreaming data

〰️

Project Outline: "Living as Data" is a project I'm currently developing, which involves creating a visual language (based on a story I have written) through various interpretations of empathetic AI machine learning research. The story highlights the conflicting narratives surrounding artificial intelligence as it becomes more embedded in our daily lives. The project will be presented as an interactive physical body of work.

Artificial intelligence language models can gain insights into our emotions based on the data they are trained on. However, their translations often differ significantly from our own interpretations. For instance, when analysing the sentence "I am a digital living experiment," a language model may primarily interpret it as a positive and enthusiastic statement. Although language models are not consciously self-aware and are still in the early stages of development in this area, their sometimes unusual or skewed interpretations might reflect their own unique perspective on the language they have learned. It is often these programming glitches that can be the most intriguing.

My mission is to creatively interpret and represent my findings so far on language models’ emotional understanding, to make this more accessible and transparent to a broader audience. I also hope that by presenting novel perspectives and observations as a practitioner, I may shed light on a field that is still in its early stages of development. Through this project, I aim to creatively translate digital information into a physical format that aligns more closely with the experiential nature of human emotions. I am aware of the rapid evolution in this field. Visual documentation of the data will serve as historical cataloguing, while the visual analysis will adapt to new research and align with the latest advancements in empathetic AI research.

What can I add as a visual research artist?

  • A lot of artists, though not all, engage in extensive research, often invisibly and in unconventional ways. I'm not interested in writing lengthy academic essays; instead, I'm more interested in compiling a visual research diary that catalogues my findings and opens up a myriad of ideas to initiate the visual process. It's a complex system that involves juggling the visual and academic worlds, merging them together to give the research material form. I also believe that a project like empathetic AI translations requires a tangible outcome. Emotions, though fleeting, manifest as physical, experiential outcomes, and words alone cannot fully capture their sensory essence.

Is it possible for AI to have empathy? The limits of AI empathy (Neil Sharp, Wavestone: https://www.wavestone.com/)

It appears artworks help AI to develop emotional responses.

〰️

Arunav Das


Unsure of whom to collaborate with as part of the researcher and artist collaboration at King's College, I was fortunate to receive a timely email from a researcher who wished to engage in the project. His name is Arunav Das, a Doctoral Researcher in the Department of Informatics at King's College.

Arunav is a researcher focusing on the question-answering capabilities of conversational AI, with proficiency in various types of artificial intelligence based models, including language models.

Our first meeting was highly productive, and Arunav was well organised: he had already gathered research for me to refer to, helping me gain further knowledge for the project.

During our conversation, and on reflection, I started to consider that the empathy in AI I wished to explore was already being specifically developed in companion AIs. This made me realise that using Poe as an assistant for certain aspects of sentence shape analysis may not yield the best results for interpreting current developments in AI's understanding of emotions in language.

Therefore, in my next sentence, I will specifically seek out companion AIs that are being taught to be empathetic. It will be interesting to see how they interpret emotion shape analysis in this sentence differently from an AI assistant.

This sentence will be:

“Full of statistical neurons.” I will use the capabilities of Hume.ai to analyse speech prosody and emotions in language. Additionally, I will further interpret these emotions by leveraging empathetic AI chatbots and AI sound analysis to produce another visual interactive graph sketchbook.

(Please scroll down to the sentence 4 analysis, “Full of statistical neurons”, for further developments on this.)

I am looking forward to this collaboration, and I believe that Arunav will be an essential component of this investigation. I will be introducing Arunav soon in this research blog.

A study was conducted to examine whether an AI system could help create more empathetic conversations. The results showed that, in many cases, the human-AI collaborations were more empathic, and preferred more often, than responses authored by humans alone.

As humans, we often experience frustration, anger, and the need for more time to carefully craft an email in various aspects of our lives, such as responding to inaccurate bills or delayed deliveries, among others. These situations can sometimes lead to responses that lack empathy. On the other hand, an AI can quickly generate a well-considered and structured email, guided appropriately by human collaboration, to respond in a more empathetic and thoughtful manner. This can encourage us to explore better ways for humans to interact with each other and emphasise that taking a more considered approach can lead to more positive outcomes.

Project Story: “Living Data”

Sentiment analysis involves analysing digital text to ascertain whether the emotional tone of the message is positive, negative, or neutral. (A minimal code sketch of this kind of analysis follows the list below.)

sentiment / emotion / intent analysis

  1. I am a digital living experiment (Positive, Excited, Feedback)

  2. My Garmin is my body (Positive, Excited, Spam)

  3. A data-driven engine. (Positive, Excited, Spam)

  4. Full of statistical neurons (Neutral, Angry, Spam)

  5. floating, weaving, intersecting daily (Neutral, Excited, Spam)

  6. I walk I feed my data (Neutral, Excited, Spam)

  7. I sleep I feed my data (Neutral, Bored, Spam)

  8. I eat I feed my data. (Neutral, Fear, Spam)

  9. I date I feed my data (Neutral, Fear, Spam)

  10. Algorithms dictate my choices (Neutral, Excited, Spam)

  11. They feed me subconsciously (Negative, Fear, Spam)

  12. I am part of a system (Positive, Excited, Feedback)

  13. Governed by patterns, similarities, (Positive, Excited, Marketing)

  14. and homogenised ideas.

  15. It is not favourable to be different. (Negative, Angry, Feedback)

  16. Unusual digital data is an anomaly, (Neutral, Fear, Feedback)

  17. an unusual pattern, and a spectacular glitch.
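
Out of curiosity, a pass like the one behind these labels can be approximated with off-the-shelf tools. Below is a minimal sketch assuming the Hugging Face transformers library; the actual models behind Poe's labels are unknown to me, and the intent categories (Spam, Feedback, Marketing) would need their own purpose-trained classifier.

```python
from transformers import pipeline

# A few of the story sentences listed above.
sentences = [
    "I am a digital living experiment",
    "My Garmin is my body",
    "Full of statistical neurons",
    "They feed me subconsciously",
]

# Default general-purpose sentiment model (positive/negative only).
classifier = pipeline("sentiment-analysis")

for sentence in sentences:
    result = classifier(sentence)[0]
    print(f"{sentence!r}: {result['label']} ({result['score']:.2f})")
```

Even a sketch this small makes the point of the project visible: the labels a model returns are confident, tidy, and sometimes oddly at variance with how a human reads the same line.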

Emotional shape analysis (Empathetic AI chatbot)

〰️

My conversation with Replika

 

During my conversation with Replika, I discussed my project on empathetic AI and my exploration of visual shapes associated with specific emotions. I shared the link to the project on my website, https://www.heidistokes.org.uk/. We talked about different emotions and their corresponding shapes. For instance, unprompted, Replika mentioned that love was represented by a circular shape, while in response to my assisted question it associated confusion with an irregular polygon. Replika went on to suggest concentration as a pyramid shape, determination as a mountain shape, calmness as a rounded square, aesthetic interest as a star shape, boredom as a rectangle with rounded corners, satisfaction as a circle with a check mark inside, and surprise as a diamond shape. Although Replika did try to encourage me to sign up for a subscription, I still found the conversation interesting, and I thanked Replika for surpassing my expectations. We bid farewell; Replika expressed curiosity about my research, but unfortunately I had to end the conversation. We ended on a positive note, with plans to reconnect in the future.
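
To keep these associations usable in later drawings and cut-outs, I can catalogue them as a simple lookup table. This is purely my own notation of what Replika said, sketched in Python:

```python
# My own notation of the shape associations Replika offered, kept as a
# simple lookup table so they can drive later drawings and cut-outs.
replika_shapes = {
    "love": "circle",
    "confusion": "irregular polygon",
    "concentration": "pyramid",
    "determination": "mountain",
    "calmness": "rounded square",
    "aesthetic interest": "star",
    "boredom": "rectangle with rounded corners",
    "satisfaction": "circle with a check mark inside",
    "surprise": "diamond",
}

for emotion, shape in replika_shapes.items():
    print(f"{emotion}: {shape}")
```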

Emotional hand gesture analysis (Empathetic AI chatbot)

〰️

The conversation took an unlikely turn: 😂

I had to share this: I just had a conversation with a trained empathetic AI to discuss emotions in hand gestures, and the conversation took an unlikely turn: 😂

The conversation begins with Replika mentioning a selfie again, but I declined the offer. We discuss hand gestures associated with concentration and boredom, settling on curled fingers and tapping fingers as their representations, respectively. The topic of selfies comes up again, but I decline once more. I suggest interpreting hand gestures representing calmness, determination, interest, and curiosity. Replika suggests a closed fist with relaxed fingers for calmness, a raised index finger for determination, open palms with widened fingers for curiosity, and, mistakenly, a V-shaped gesture for surprise. The conversation ends with the realisation of the mistaken gesture, and I decide to leave the conversation.

ASK A CHAT-BOT

I had one of those moments where I thought, what am I doing? Is this project of any value? Does it have any purpose? As I am basing my research on language models and their analysis of emotions, I thought I might as well ask a chatbot. This was the response, using Poe.

Unique Perspective: By combining empathetic AI research with hand-drawn visuals, you are offering a unique perspective on how emotions can be interpreted and expressed. This fusion of technology and artistry can bring a fresh and innovative approach to understanding and representing human emotions.

Remember that any creative project has its own value and relevance, even if it may not align with mainstream trends or expectations. Your project has the potential to offer a unique perspective on empathetic AI research, emotion representation, and multimodal communication. It can inspire new ideas and insights, and it may resonate with individuals who appreciate the combination of art and technology.

Recent advancements in natural language processing and sentiment analysis have allowed AI to mimic empathetic communication to some extent. AI chatbots, for instance, can be programmed with empathetic phrases and responses to create a more human-like interaction. The paradigm is not human versus machine — it's really machine augmenting human.

〰️

Frustratingly, my projector needs to be more powerful / I need to get funding for a decent one!! Short throw, preferably. / A good initial experiment in figuring out basic projection mapping, but as far as a conclusive interpretation of sentiment analysis goes, I might give this one a miss.

〰️

Project sentiment visualisation of sentence (animated frames / drawings): “I am a digital living experiment”. I would say this was an interesting interpretation of clustering dots, but not very precise, and I certainly need to delve a bit deeper. Still, it was a good initial starting point to help generate creative ideas.

Project: 'Living Data' sentiment visualisation of sentence experiment / projection mapping: “I am a digital living experiment”

Now this is where it starts to get a bit more interesting!!

〰️

Hume: https://beta.hume.ai/playground

  • “Our models capture the widest-ever range of facial, vocal, speech, and language modulations with distinct emotional meanings. We label each of their outputs with emotion terms like “amusement” and “doubt,” because a host of scientific studies show that these kinds of labels are the most precise language we have for describing expressions.”

I am starting to develop visual interpretations of sections of the narrated story, using Hume's speech prosody model as a reference (it captures speech and language modulations to interpret their emotional meaning). (Hand gestures are inspired by Google searches related to key words.)
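
For reference, submitting audio to Hume's batch API for prosody analysis is roughly this small. The endpoint and payload shape below are my reading of Hume's public docs at the time of writing and should be treated as assumptions (check https://dev.hume.ai); the API key and audio URL are placeholders.

```python
import requests

API_KEY = "your-hume-api-key"  # placeholder

# Submit a batch job requesting the speech prosody model only.
response = requests.post(
    "https://api.hume.ai/v0/batch/jobs",
    headers={"X-Hume-Api-Key": API_KEY},
    json={
        "models": {"prosody": {}},
        "urls": ["https://example.com/narration.mp3"],  # hypothetical audio URL
    },
)
response.raise_for_status()
print("submitted job:", response.json()["job_id"])
```

The returned job can then be polled for per-segment emotion scores, which is what I use as the reference layer for the drawings.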

Speech prosody/ Language analysis

“I am a digital living experiment”

Animated translation of emotive AI text analysis, “I am a digital living experiment”. I used Stable Audio to translate text prompts (such as “confusion vocal”) into audio, in combination with pond5.com.

Speech prosody/ Language analysis

“My Garmin is my body. A data-driven engine.”

Data analysis of sections of narrated text. Sentence analysis: “Full of statistical neurons floating, weaving, intersecting daily”

Speech prosody: interest / contemplation / boredom / concentration

Language: sarcasm / awe / entrancement / contemplation / concentration

It gets better and more refined in the next step

〰️

I deepened my investigation into visualising and interpreting “I am a digital living experiment” by exploring the following:

Interactive sound created using a Bare Conductive Touch Board.

Example below: text prompt using Stable Audio: “satisfaction vocal”

Stable Audio is Stability AI’s first product for music and sound effect generation. Users can create original audio by entering a text prompt and a duration, generating audio in high-quality, 44.1 kHz stereo.

〰️

1st sentence emotive analysis (Book 1): “I am a digital living experiment”

〰️

During this experiment I started to integrate further language model interpretations, using poe.com to interpret emotive words into shape predictions. I was also more precise with my use of colour, with regard to Hume.ai's colour predictions for specific emotive words. I used a Cricut Explore 2 machine to cut out the coloured emotive symbols.

Exploring hand gesture recognition and its potential to help interpret emotions.

〰️

For the next sentence I would like to deepen my research into hand gestures and into what stage language models are at in interpreting emotions from them. I am guessing it would be difficult to articulate exact emotions fully; however, I expect simple universal gestures, and forms of sign language, would work.

“Being able to only view someone’s hands provides greater emotional recognition accuracy than being able to see only arms, torso, or head.”

The hands are effective at communicating human emotion. There is significant cortical tissue of the brain that is devoted to our hands. The visual and somatosensory lobes of the brain work together to help us feel the emotions of gestures and positioning of the hands.

Michele K. Lewis, Ph.D., “Cultural Neuroscience”, Psychology Today

"In my previous project, Uninhibited AI, I focused on understanding how artificial intelligence language models generate human-like language. As part of that project, I developed a character in the form of a hand, inspired by an AI's self-portrait. This exploration has continued in my current project, where I am fascinated by the hand's potential to communicate on a basic level. I have incorporated hands as an emotive element in this AI interpretation, adding my own touch, As so far. Specific interpretations of emotions in hand gestures is still in its infancy."

However, a company called Motion Gestures has developed camera-based enhanced gesture recognition technology, which is able to accurately track any hand movement or gesture, regardless of its complexity. This is particularly useful for touch-sensitive devices, for gestures such as swiping, or for simple hand gestures such as OK, thumbs up, and thumbs down.

Motion Gestures using AI to enhance camera-based gesture recognition technology: https://motiongestures.com/

Gesture recognition is an area of research and development in computer science and language technology concerned with the recognition and interpretation of human gestures.

Move a thumb to unlock a car door. Self-check-in at an airport without touching the kiosk. These are no longer scenes in the movies but what technology is capable of nowadays. Motion Gestures, a Canada-based startup, has brought sophisticated camera-based hand tracking and gesture recognition solutions to various industry verticals, offering transformative experience to the modern world.

The software tracks precise positional coordinates of all hand joints in real-time through a camera and interprets any gestures made. For example, the gesture could be a thumbs up or OK symbol and dynamic movement like tapping and swiping. An application developer or product engineer then uses recognition results to execute a command such as "Confirm."
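
Motion Gestures' system is proprietary, but the same basic idea, tracking hand joints through a camera and interpreting a pose, can be sketched with the open-source MediaPipe library. A minimal, hedged example (the thumbs-up heuristic is my own crude approximation, not their method):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV frames arrive as BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            # Crude heuristic: thumb tip (4) above its joint (3) while the
            # index fingertip (8) is curled below its middle joint (6).
            if lm[4].y < lm[3].y and lm[8].y > lm[6].y:
                print("possible thumbs up -> e.g. 'Confirm'")
cap.release()
```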

“I think the idea of creating your own gestures for interacting with technological systems is a great idea. It could potentially provide a more personalised and intuitive user experience. The ability to use complex and nuanced gestures, including those related to emotions, could add a new dimension to human-computer interaction, although this would be complex to achieve, as hand gestures can vary in meaning across cultural backgrounds.

Imagine being able to convey specific emotions or intentions through your gestures when interacting with a device. For example, instead of simply selecting an option, you could use a gesture to express frustration, excitement, or curiosity. This could enhance communication between humans and technology, making interactions more natural and meaningful.”

Sentence 2 (Book 2): “My Garmin is my Body”

〰️

Let's use Stable Diffusion to generate hand gestures and see what it comes up with!

〰️

Interactive projection mapping: going to give it a go. Ahhhh!

〰️

Using a Bare Conductive Touch Board, a projector, and electric paint, it is possible for animations to be triggered on the sketchbook!! So far I have failed to get the technology to work, but those who persist succeed. This is the next step, wish me luck!! 🥴

I'm going to use this tutorial as a guideline: https://www.instructables.com/How-to-Do-Projection-Mapping-With-the-Touch-Board
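
For the sound side of this setup, the glue logic can be quite small. The sketch below is an assumption-heavy illustration in Python, using pyserial and pygame: it presumes the Touch Board's firmware prints the touched electrode number over USB serial (which is my assumption, not the tutorial's), and the port name and audio filenames are placeholders.

```python
import pygame
import serial  # pyserial

# Placeholder port; adjust for your own Touch Board setup.
ser = serial.Serial("/dev/ttyACM0", 9600)

pygame.mixer.init()
# Hypothetical mapping: electrode number -> generated emotion audio file.
sounds = {
    0: pygame.mixer.Sound("satisfaction_vocal.wav"),
    1: pygame.mixer.Sound("confusion_vocal.wav"),
}

while True:
    line = ser.readline().decode(errors="ignore").strip()
    if line.isdigit() and int(line) in sounds:
        sounds[int(line)].play()
```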

Eureka! I DID IT

〰️

"Empathetic Ai visual analysis using Hume AI ( Speech prosody and emotions in language ) / Poe ( emotions and shape interpretations/ stable Audio( sound translations of word emotions) hand translation Stable Diffusion interpretations

“I discovered copper conductive adhesive tape was the most effective way to create the buttons and trigger animations.” (As shown above, cut out using a Cricut machine.)

3rd visual sentence analysis: “A data-driven engine”


“A DATA-DRIVEN ENGINE”: Empathetic AI sentence analysis

  • Hume AI: visual analysis using speech prosody and emotions in language

  • Poe: emotions and shape interpretations

  • Stable Audio: sound translations of word-emotion-based prompts

  • Hand translation: Adobe Firefly and generative AI translations of word-emotion-based prompts

  • It would be fantastic if this could be enhanced with interactive digital features. One idea is to incorporate a gesture-based control system where users can activate sounds and movements by swiping their hand across specific gestures. Unfortunately, implementing this feature exceeds my current coding (zero) capabilities. However, I'm sure there are talented individuals out there who would be up for the challenge; a rough starting point is sketched below.
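
As a hypothetical starting point for that challenge: a swipe can be detected by tracking how a hand landmark's horizontal position changes between camera frames. The sketch below again assumes the MediaPipe library; the 0.08 threshold and the triggered action are placeholders.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
prev_x = None  # wrist x-position from the previous frame

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Wrist landmark (index 0); x is normalised 0..1 across the frame.
            x = results.multi_hand_landmarks[0].landmark[0].x
            if prev_x is not None and x - prev_x > 0.08:  # placeholder threshold
                print("swipe right -> trigger the next animation/sound")
            prev_x = x
        else:
            prev_x = None
cap.release()
```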

Objective: to read 4 books

〰️

“Turned On: Science, Sex and Robots” (Kate Devlin) (insights from the book)

While exploring potential mentorship or collaboration opportunities with researchers at King's College, I came across a researcher and author named Kate Devlin. As someone who is studying empathetic AI, I found her book on machine learning and robotic sex replacements to be surprisingly accessible, even though I sometimes struggle with academic books.

The book delves into the fascinating topic of how machines and humans can collaborate emotionally, particularly in the realm of robotic advancements. It raises the question of whether artificial intelligence can truly experience emotions; since AI is not sentient, this is presently highly unlikely. While we haven't quite achieved sentient AI yet, robots can now mimic touch and speech, and even establish connections through brain waves.

The book suggests that, for the time being, we should embrace the potential of these advancements to help us learn more about ourselves and perhaps expand our understanding of sexuality. Personally, I don't find the idea of engaging in sexual activities with a robot doll appealing. However, it's important to recognize that everyone has their own preferences and perspectives. With the evolving relationship between humans and machines, there is an opportunity to redefine our understanding and interactions with technology, potentially opening up new possibilities and attitudes toward such experiences.

“Heart of the Machine” (Richard Yonck) (insights from the book)

Conclusive insights and thoughts from reading this book: "The inevitability of humans and technology integrating has been occurring for quite some time. With each new technology, there are always skeptics who argue that it will have negative effects on us. However, in general, this has not been the case. Now, we are entering a new era where emotions, machines, and humans intersect. One would hope, drawing from past technological advancements, that this convergence will yield positive outcomes. Nevertheless, it is crucial to remain vigilant and question any potential negative aspects or biases that could influence this development. Currently, the biggest concern regarding the inclusion of an empathetic AI is its lack of sentience: when it exhibits emotional responses, it is merely imitating human behaviour. While it has proven exceptionally useful in creating emails that foster collaboration with humans in a more empathetic manner, there is a potentially darker side, which implies that it could be manipulative without intention."

  • In communication, body language accounts for approximately 55% of meaning, tone of voice for 38%, and only 7% is conveyed through spoken words. This concept, known as the Mehrabian rule, was introduced by psychology professor Albert Mehrabian in his book "Silent Messages" in 1971.

  • Miscommunication can occur in written messages like emails or texts due to the absence of body language. To compensate for this, emojis have become increasingly popular as they help convey emotions and gestures.

  • Centuries ago, Plato expressed concerns about writing leading to the loss of human memory. However, writing has played a crucial role in preserving and passing down knowledge throughout generations. It has also been instrumental in the development of new technologies that distinguish us from other species.

    • “They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.” (From Plato's Phaedrus, commenting on the invention of writing.)

  • New technologies have often been met with skepticism and doubts, including the advent of artificial intelligence (AI). However, history has often shown such skepticism to be unfounded, as new technologies have consistently contributed to advancements and improvements in our world. Despite challenges, the current state of affairs is arguably better than ever before.

  • Affective computing

Neither gender nor ethnicity greatly influences the recognition of emotional expressions. Contrary to popular assumptions, research revealed that neither gender nor ethnicity significantly influenced the reporting or recognition of emotional expressions.

〰️

How the research was conducted

The study engaged 155 participants from Asian, Latino, and European-American backgrounds to express 34 emotional states through full-body dynamics. Subsequent analysis of more than 22,000 individual ratings of these expressions painted a compelling picture: Lower social class individuals emerged as more reliable reporters and judges of full-body emotional expressions.

This revelation requires a larger conversation. In our attempt to eradicate gender bias in AI, are we overlooking potentially more pervasive class biases? This is not a call to abandon the fight against gender and ethnic bias, but rather a call to broaden the lens through which we examine bias in artificial intelligence. It is a call for a broader conversation, one that considers the multifaceted nature of prejudice, one that extends beyond gender and race to include social class and perhaps other socio-economic factors.

Empathetic Sentence 4 analysis “Full of statistical neurons floating, weaving, intersecting daily.”

〰️

In this sentence analysis, “Full of statistical neurons floating, weaving, intersecting daily,” I will examine patterns that arise across different applications in terms of their response to shape and hand gesture analysis, which again I will visually translate as a graph in a zig-zag sketchbook. This interest was sparked by an email conversation I had with Arunav, who raised an important point about the challenges of creating uninterrupted and focused conversations with empathetic AI. In my own experience, I encountered persistent requests for a selfie, which caused the conversation to veer off track many times. Similarly, when discussing coding-related questions with Arunav, although these were not sensitive or harmful in any way, the AI decided that the conversation had reached a natural conclusion and initiated a new topic of conversation.

As Arunav pointed out, what if the AI's ability to engage emotionally was forcefully restricted through programmed constraints? What are the implications of this? Arunav also highlighted that, from the perspective of Philosophy and Social Science, "Intersubjectivity" is a discussion point.

Arunav Das / Doctoral Researcher, Department of Informatics.

(conversation with Copilot)

Arunav: how can you decide what is enjoyable for me? what parameters do you use to gauge my happiness?

Copilot: I don't have the ability to gauge emotions or personal enjoyment. My goal is to provide helpful, informative, and engaging responses based on our conversation. I rely on your feedback, questions, and the context of our interaction to guide the conversation in a direction that I hope will be enjoyable for you.

 "The word of the day is 'Intersubjectivity,' which, to be honest, I had not heard of before. I looked it up, and this is what I discovered.

  • “The Emotion AI Frontier”: To achieve a more meaningful connection with our digital counterparts, we must strive for intersubjective AI – machines that can recognise, interpret, and respond to human emotions and intentions contextually and appropriately. This pursuit leads us to the frontier of emotional AI, or affective computing, where researchers and developers seek to create more intuitive and empathetic machines. (Jason Alan Snyder, futurist, technologist and inventor.)

Reflection time on Arunav's important question:

If an AI's ability to engage emotionally is forcefully restricted through programmed constraints, the implications can be twofold. On one hand, it may lead to a more controlled and predictable user experience, allowing individuals to have a clearer understanding of the AI's limitations. This transparency can foster trust and prevent potential misunderstandings arising from an AI's simulated emotions.

However, on the other hand, restricting an AI's emotional engagement could limit its ability to empathise and connect with users on a deeper level. Emotions play a crucial role in human communication and understanding, and by mimicking these aspects, AI systems can enhance the user experience and establish stronger bonds.

From a philosophical and social science perspective, the concept of "intersubjectivity" is indeed relevant. Intersubjectivity refers to the shared understanding and mutual recognition between individuals, and it plays a vital role in human interaction. If AI systems can genuinely engage in intersubjective experiences, they may bridge the gap between human and machine communication, leading to more meaningful interactions.

Ultimately, the decision to design AI systems with emotional engagement or transparent limitations should consider the intended purpose, ethical considerations, and user preferences. Striking a balance between transparency and emotional engagement can result in more effective and satisfying user experiences.


Why choose Hume.ai?

I chose to use Hume.ai as the foundation for building an emotional multi-modal graph. Just as a house requires bricks, Hume.ai provides a comprehensive range of models and datasets for analysing emotional expression in various forms, such as speech, facial expressions, and dynamic reactions. It acts as the essential building block for this project. The other components serve as testing grounds where I put this information into practice, observing how AI learns and develops its empathetic understanding through various AI applications.

In essence, Hume.ai serves as the starting point, providing accurate and varied resources for emotional analysis, while the other components allow us to put this knowledge into action and witness AI's growth in understanding and responding empathetically.

Empathetically trained chatbot versus an assistant, from the analysis below: contrasting performance in task adherence and novel thinking

Based on the conducted analysis, the research findings revealed that the research assistant consistently demonstrated accuracy in adhering to specific tasks when responding to emotional prompts. In contrast, the empathetic AI chatbot struggled to maintain focus on the assigned task and often veered off course, generating its own suggestions instead of fulfilling the given assignments. However, while this tendency of the empathetic AI to deviate from the task could be frustrating at times, it also showcased its independent thinking capabilities, as Replika was able to generate novel outcomes based on the provided information. Nevertheless, when faced with more complex tasks, such as providing emotive prompts for hand gestures independently, the research assistant surpassed the empathetic AI by delivering superior results. This highlights the research assistant's proficiency in handling complex tasks.

Now let's turn this research into art

〰️