This week I am attending the AAAI Symposium on Emotion, Personality and Social Behavior at Stanford. The symposium provides a forum for an interdisciplinary discussion of modeling affect and personality in social behavior. Attendees include researchers studying social computing, virtual reality, game design, robotics, believable agents, affective computing, psychotherapy and the arts. Panels explored successful interactions between artificial beings and humans.
Some of the discussion topics that I am particularly interested in include:
- How can compelling artificial characters be designed?
- How can social interaction between humans and/or artificial beings be facilitated?
- What are the components in the software toolbox?
- What are the emerging standards in affective artificial characters, robots and systems?
- What architectural designs work best?
- What knowledge base designs work best?
Matthias Scheutz, Associate Professor of Informatics at Indiana University, presented a paper entitled "Empirical Investigations into the Believability of Robot Affect". During the discussion afterwards he suggested that people are more critical of animatronic (physical) entities than of characters that are merely rendered on a computer screen. It seems the bar for believable behavior is higher for robots because people don't expect a rendered character to be real in the first place.
Adriana Tapus' presentation, "Socially Assistive Robots: The Link between Personality, Empathy, Physiological Signals and Task Performance", provided me with two pieces of interesting common-sense information:
- A robot that challenges the user during therapy rather than offering praise will be preferred by extroverts;
- A robot that offers nurturing praise rather than challenge-based motivation will be preferred by introverts.
In the discussion that followed Adriana's talk, mention was made of (uncited) research which found that extroverts are more sensitive to robot personalities because they seek outside stimulus, whereas introverts are self-stimulated. It seems logical that when a robot first encounters a person about whom nothing is known regarding their personality, the robot should display mildly extroverted behavior, and then adapt according to social cues.
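The adaptation strategy above could be sketched as a simple control loop. Everything here (the cue labels, the 0.0-1.0 extroversion scale, the update rule and rate) is my own illustrative assumption, not from any system presented at the symposium:

```python
# Illustrative sketch: start a robot at mildly extroverted behavior and
# nudge it along an introversion-extroversion scale as social cues arrive.
# Cue names, scale, and update rule are hypothetical.

def update_extroversion(level, cue, rate=0.1):
    """Move extroversion (0.0 = introverted, 1.0 = extroverted) a fraction
    of the way toward the personality suggested by an observed cue."""
    target = {"seeks_stimulus": 1.0, "withdraws": 0.0}.get(cue, level)
    new_level = level + rate * (target - level)
    return min(1.0, max(0.0, new_level))

# Begin mildly extroverted when nothing is known about the user.
level = 0.6
for cue in ["withdraws", "withdraws", "seeks_stimulus"]:
    level = update_extroversion(level, cue)
```

The exponential-style update keeps the robot's personality stable against a single noisy cue while still converging toward the user's apparent preference over repeated interactions.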
Megan Olsen discussed how adding an emotion framework to game engines improved performance in her talk entitled "Emotions for Strategic Real-time Systems". The emotional framework models fear and frustration in the form of emotional maps, which are overlaid on a physical map of the terrain. Emotions diffuse outward and are picked up by co-operating agents nearby. Emotions have an intensity value which linearly decays over time. She showed an interesting video which illustrated the aggregated emotions of many agents over time. She also showed that emotions can cause groups of synthetic individuals to be more successful in combat. Different game engines responded differently to each emotion: some responded better to the addition of the ability to model fear, while others responded better to the addition of frustration, and NicoWar was able to benefit from the addition of both emotions.
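The emotion-map mechanics as I understood them (deposit, diffusion to neighbors, linear decay) could look something like the following. This is my own reconstruction, not Olsen's code, and the grid layout, neighbor scheme, and rate constants are all assumptions:

```python
# Hypothetical sketch of an emotion map: each grid cell holds a fear
# intensity that spreads a fraction of its value to the four neighboring
# cells each tick, then decays linearly toward zero. Nearby cooperating
# agents would read intensities from their cell.

def step(grid, decay=0.02, spread=0.05):
    """One tick: diffuse to 4-neighbors, then apply linear decay."""
    h, w = len(grid), len(grid[0])
    nxt = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            v = grid[y][x]
            nxt[y][x] += v * (1 - 4 * spread)  # what stays in place
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                if 0 <= y + dy < h and 0 <= x + dx < w:
                    nxt[y + dy][x + dx] += v * spread  # diffused share
    # Linear decay, clamped at zero.
    return [[max(0.0, v - decay) for v in row] for row in nxt]

# An agent deposits fear at the center; neighbors sense it next tick.
fear = [[0.0] * 3 for _ in range(3)]
fear[1][1] = 1.0
fear = step(fear)
```

Overlaying such a map on the terrain means agents need no direct communication channel: the shared grid itself carries the rapid, decaying signal between co-operating agents.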
Dr. Eva Hudlicka of Psychometric Associates spoke on "Emotion Modeling 101". Emotion modeling incorporates emotion expression, recognition, generation, and the effect on agent behavior. One could model feelings, moods, emotions, affective states and personality traits; some people mean all of the above when discussing modeling emotion. Attitudes and preferences can be modeled as well. Interpersonal roles include social coordination and rapid communication of intent; intrapsychic roles include motivation, homeostasis, and adaptive behavior. Emotions are manifested across multiple interacting modalities: somatic / physiological, cognitive / interpretive, behavioral / motivational and experiential / subjective. Unfortunately, the literature on emotional models is inconsistent and terms are unclear. Eva suggested we view emotion models in terms of emotion generation and emotion effects, and that emotion-modeling building blocks be identified. Emotion is generated from stimuli and appraisal by the individual; Eva briefly discussed both topics and the need to standardize the process of mapping stimulus to emotion.
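To make the stimulus-to-emotion mapping concrete, here is a toy appraisal table of my own devising. The two appraisal dimensions and the emotion labels are illustrative assumptions, not a proposed standard and not from Eva's talk:

```python
# Toy appraisal-based emotion generation: a stimulus is appraised along
# two hypothetical dimensions (desirability and expectedness), and the
# combination maps to an emotion label.

def appraise(desirable, expected):
    """Map a crude two-dimensional appraisal of a stimulus to an emotion."""
    if desirable and expected:
        return "contentment"
    if desirable and not expected:
        return "joy"
    if not desirable and expected:
        return "resignation"
    return "distress"
```

Even a table this small shows why standardization matters: two models using different appraisal dimensions, or different labels for the same cell, cannot share building blocks.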
During a coffee break, I chatted with Dr. Antonio Camurri, Associate Professor at the University of Genoa, Italy. It turns out we both share a passion for boats. Dr. Camurri mentioned EyesWeb, an open software platform project that he leads, which enables the development of real-time multimodal distributed interactive applications.
In the wrap-up session, I drew attention to one of the issues mentioned in the preface to the Technical Report that the Symposium was intended to address: "What are the emerging standards in affective artificial characters, robots and systems?" One member of the audience (Rosamaria Barone?) mentioned some standards work for characters (unfortunately, I didn't write it down), and then an awkward silence followed. I believe that standards will be key to artificial personality and emotion moving from the lab to mainstream society.