Project Team
- Dr. Francis Quek
- Dr. Hwaryoung Seo
- Dr. Takashi Yamauchi
- Angela Chan, Lead Researcher
Introduction
Human interpersonal touch plays an important role in our social interaction, expressing affect and fostering intimacy. It is a special conduit through which humans convey love, comfort, care, trust, support, and appreciation. We argue that, unlike other communication channels, touch is immediate in that it carries emotional meaning in ways that words cannot accommodate. Touch is so critical to human existence that a child developing devoid of touch is unimaginable. This begs the question as to why touch is absent as a channel of remote interaction (virtually all systems reported in the literature are used for message passing). Our touch system provides kinesthetic squeezes on the upper arm through a tightening fabric armband that reacts to changes in force applied to an input device. The goal is not to fully mimic a human touch, but to determine what characteristics of mediated touch are necessary to effectively communicate affect.
Fig. 1 and Fig. 2 illustrate our concept of immediacy. We reject the model shown in Fig. 1, where touch is construed as 'message passing' in which messages are explicitly encoded and then decoded via the touch channel. Our model of immediacy, illustrated in Fig. 2, posits that remote touch can convey affect immediately (without symbolic encoding). We hypothesize that remotely conveyed touch can keep its primal role in immediate affective interaction only in the presence of contextualizing symbolic channels.


By addressing the foundational question of how humans may interact at an immediately affective level, we could potentially transform digitally mediated interpersonal interaction. At the practical level, such interaction could provide a critical dimension in helping people maintain connectedness and literally 'stay in touch'. One can imagine, for example, that a traveling parent could increase the vividness of her conversation with a child through contextualized touch, and that loved ones might enhance the affective tone of their communication using such technology.
We are also looking at the application of this technology to health issues relating to dissociation. For example, an individual with autism who has difficulty with direct contact may be able to interact (or a child be calmed) using a variant of remote social touch. Similarly, individuals with anxiety disorders such as post-traumatic stress disorder may benefit from remote physical touch.
Devices and Studies
Based on a careful review of the meaning of touch types and locations (see Wang & Quek, TEI 2010), we determined that an upper arm squeeze provides the broadest range of social interactions and the widest semantic range of meanings that can be communicated. In the course of our research, we developed two remote touch armbands: the first using a shape memory alloy (SMA), and the second using a modified servomotor. The SMA version is silent and has a rapid rise time but a limited ability to deliver force. The servomotor version can deliver greater force but is slower and not as silent. We also developed a pair of input devices that can detect squeezes: the first shaped like an egg, and the second embedded in a smartphone case so that a user can deliver a squeeze while using the phone.
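The publications do not specify the control mapping between the input devices and the armbands; as a minimal sketch (with illustrative, hypothetical thresholds and function names, not the project's actual calibration), a force reading from the squeeze sensor might be normalized into an actuation level, with light smoothing to keep the slower servomotor version from chattering on noisy samples:

```python
def force_to_squeeze(force_n, f_min=0.5, f_max=10.0):
    """Map a force reading (newtons) from the squeeze input device to a
    normalized armband actuation level in [0, 1].
    f_min: below this, no squeeze is delivered (ignores incidental grip).
    f_max: at or above this, the armband tightens fully.
    """
    if force_n <= f_min:
        return 0.0
    if force_n >= f_max:
        return 1.0
    return (force_n - f_min) / (f_max - f_min)

def smooth(levels, alpha=0.3):
    """Exponentially smooth a sequence of actuation levels so that
    sensor noise does not produce jittery tightening commands."""
    out, prev = [], 0.0
    for x in levels:
        prev = alpha * x + (1 - alpha) * prev
        out.append(prev)
    return out
```

In a real driver, the smoothed level would then be converted to a servo angle or an SMA drive current; that hardware layer is omitted here.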
We designed a study in which an actress read an emotional story with known emotional high points (established through a pilot study). We were thus able to present the story reading in both with-touch and without-touch conditions. The story had an emotional twist, beginning sad but turning happy at the end.
Summary of Results
To date, we have conducted two studies using our story presentation design. In the first, we presented the story in the with-touch and without-touch conditions. This study, reported at TEI 2010, shows that remote touch can reinforce the meaning of a symbolic channel, reducing sadness significantly and showing a trend toward reduced general negative mood and increased joviality.
While our first study showed promising results, we could not be sure whether the affective impact came from the touch itself or from the speech alone, with the touch merely drawing attention to the emotion in the speech. In our second study, reported at CHI 2012 (the paper was awarded an Honorable Mention for Best Paper), we added a 'light flash' condition. In the light condition, our research participants were exposed to flashes at exactly the same time points at which touch was administered to the touch-condition participants. A speech-only group listened to the story without either light or touch augmentation. Both the touch and light conditions were divided using a pair of cover stories. In the first, participants were told that the touch or light was produced by the storyteller emoting into a touch-squeeze device to communicate her emotions (yielding the Communicative-Touch (CT) and Communicative-Light (CL) conditions). The second cover story was that the touch and light were produced by a system that detected emotional high points using an EEG-like device, yielding the Measurement-Touch (MT) and Measurement-Light (ML) conditions.
Our results showed that, compared with the speech-only condition, participants in the CT condition felt significantly more connected and took the viewpoint of the storyteller. The CL, ML, and MT conditions all showed some increase in connectedness that did not reach significance. Interestingly, the ML condition ranked second in connectedness, followed by MT and CL respectively. The cross-over effect was significant, suggesting that touch was better in the communicative condition while the light flash was better at conveying a measurement. This confirms our intuition that contextual expectation is a significant factor in the experience of emotive touch. Furthermore, our results show that the flashing-light condition cannot easily replace touch for emotion conveyance.
That ML produced a stronger connectedness result than MT and CL may be explained by the cultural expectation that measurements tend to be communicated by light indicators. While these results did not cross the threshold of significance, they nonetheless replicated all the trends of the TEI 2010 study.
Publications
- Wang, R., Quek, F., Tatar, D., Teh, J.K.S., and Cheok, A.D. 2012. Keep in Touch: Channel, Expectation and Experience. To appear in CHI '12, 30th ACM Conference on Human Factors in Computing Systems. This paper has been awarded an Honorable Mention for Best Paper at CHI.
- Wang, R. and Quek, F. 2010. Touch & talk: contextualizing remote touch for affective interaction. In Proceedings of the Fourth international Conference on Tangible, Embedded, and Embodied interaction (Cambridge, Massachusetts, USA, January 24 – 27, 2010). TEI ’10. ACM, New York, NY, 13-20.
- Wang, R., Quek, F., Teh, J.K.S., Cheok, A., and Lai, S.P. 2010. Design and evaluation of a wearable remote social touch device. In International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI '10). ACM, New York, NY, USA, Article 45, 4 pages.
- For more of Rongrong Wang's publications, please refer to Wang Rongrong's homepage.
Acknowledgement
This research has been partially funded by NSF grant IIS-0551610, CRI: Interfaces for the Embodied Mind.