Introduction

Due to the increase in the number of senior citizens worldwide, achieving high-quality nursing care with technology is an essential challenge for researchers. One possible approach is using care-receiving robots that interact with seniors to encourage caregiving behaviors toward the robots and to increase seniors' daily activities. In this context, robotics researchers have developed different kinds of social robots, including a pet type1 and a baby type2. Such social robots have shown positive effects not only on seniors at senior facilities but also on nursing staff, whose mental loads were reduced3. Social acceptance is required to accelerate the spread of such robots. Recent studies have focused on two essential factors of robots for creating acceptable social robots from the perspective of ordinary people: their emotional expressions and their cuteness.

Concerning the former, several past studies reported that emotional cues increase social acceptance for both social robots and other artificial agents, such as non-humanoid urban robots4 and virtual humans5. Recent human–robot interaction studies have focused on emotional expressions6 using facial aspects7, body movements8, and voice characteristics9. Touch behaviors from robots to people have also been used as emotional cues to enhance the expressed emotions of social robots10,11,12.

Concerning the cuteness of robots, many recent commercial robots have been designed with cute appearances, including LOVOT13, AIBO14, and Paro15. These robots, all developed in Japan, share a common design policy, kawaii, a common Japanese adjective that conveys cute, lovely, or adorable16,17. Konrad Lorenz's baby schema, a well-known factor that induces perceptions of cuteness18,19, has also been adopted for designing such robots. Robotics researchers have investigated elicitors of the feeling of kawaii other than a robot's appearance, including the number of robots20 and their locomotion behaviors21. Recent human–robot interaction studies have focused on robots' touch behaviors toward an object as another elicitor of the feeling of kawaii22.

Interestingly, recent studies suggest that touch behaviors from robots provide positive effects from both perspectives, i.e., emotional expressions and cuteness. In interactions with seniors, past studies reported positive aspects of touch behaviors, such as bonding between nurses and seniors23, enhanced long-term mental health care24, and improved psychiatric care of seniors25. Past studies also investigated the positive effects of touch interactions between robots and seniors, such as the therapeutic effects of interacting with a seal robot26, and explored the benefits of an animal-like robot for dementia care27. These studies focused on social acceptance and nursing care perspectives and reported positive outcomes of touch interaction with robots from various viewpoints.

However, the effects of robot-initiated touch on perceived emotions and the feeling of kawaii have not been investigated in the context of social robots for the elderly; the previous studies above that focused on robot-initiated touch behaviors mainly conducted their experiments with adults. Some previous studies suggested that the perception of emotional expressions and the feeling of kawaii differ between seniors and younger adults28,29,30,31. Visual and auditory features seem to become less effective in the emotional perceptions of seniors. On the other hand, although aging decreases touch sensitivity and acuity, past research reported that seniors described higher subjective pleasantness toward touch stimuli than other groups of adults32,33,34, perhaps indicating that touch behavior might be a useful tool for enhancing a robot's expressed emotions.

Therefore, investigating the effects of robot-initiated touch behaviors on seniors' perceptions of emotional expressions and the feeling of kawaii may help build knowledge for socially acceptable robots for that age cohort. Following the positive effects of touch stimuli for seniors32,33,34 and of touch interaction with robots26,27, we hypothesized that robot-initiated touch interaction has positive effects on perceived emotions and the feeling of kawaii across different age groups. Based on this hypothesis, we made the following predictions:

  • Prediction 1: A baby robot’s touches will increase the valence of the perceived emotion regardless of the user’s age.

  • Prediction 2: A baby robot’s touches will increase the arousal of the perceived emotion regardless of the user’s age.

  • Prediction 3: A baby robot’s touches will increase the feeling of kawaii regardless of the user’s age.

Against this background, we developed a robot that can actively touch users, based on an existing baby robot (Fig. 1) that was specially designed for dementia sufferers. We experimentally investigated the effects of its touch behaviors on perceptions of its emotional expressions and the feeling of kawaii with 48 adult and senior participants (24 people aged 21 to 49 and 24 aged 65 to 79, with identical gender ratios in both age groups). They experienced four different interactions in a 2 × 2 × 2 design (age: adults and seniors; touch: with and without touch; emotion: happy and angry vocal expressions). We evaluated their perceived feelings toward the expressed emotions (valence, arousal) through two questionnaire items and the feeling of kawaii through another item.

Fig. 1: Baby robot in this study. The left figure shows its initial pose, and the right figure shows its arm-closing pose, which enables it to touch a user’s hands.

Results

Perceived feelings toward expressed emotions

We used an affective slider35, which measures valence and arousal values toward the emotions expressed by the robot. We conducted a three-way mixed-measures analysis of variance (ANOVA) with the age, touch, and emotion factors and report the F-values (F), the p-values (p), and the effect sizes (partial eta squared, ηp2).

For valence (Fig. 2, left), prediction 1 was supported: the robot’s touch behaviors significantly increased perceived valence regardless of the expressed emotion (happy or angry) and the participants’ age cohort (adults or seniors). The analysis found significant main effects of the touch factor (F (1, 46) = 12.144, p < 0.001, ηp2 = 0.209), the emotion factor (F (1, 46) = 73.742, p < 0.001, ηp2 = 0.616), and the age factor (F (1, 46) = 33.753, p < 0.001, ηp2 = 0.423), and a significant two-way interaction between the emotion and age factors (F (1, 46) = 23.155, p < 0.001, ηp2 = 0.335). The analysis found no significant two-way interaction between the touch and age factors (F (1, 46) = 2.961, p = 0.092, ηp2 = 0.060) or between the touch and emotion factors (F (1, 46) = 0.015, p = 0.902, ηp2 = 0.001), and no significant three-way interaction (F (1, 46) = 0.070, p = 0.793, ηp2 = 0.002).
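As a quick consistency check, the reported effect sizes can be recomputed directly from each F statistic and its degrees of freedom via ηp2 = F·df1 / (F·df1 + df2); a minimal sketch using the valence statistics above:

```python
def partial_eta_squared(f_value, df_effect, df_error):
    """Partial eta squared recovered from an F statistic:
    eta_p^2 = F * df1 / (F * df1 + df2)."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# Reported valence effects, each with df = (1, 46):
print(round(partial_eta_squared(12.144, 1, 46), 3))  # touch factor   -> 0.209
print(round(partial_eta_squared(73.742, 1, 46), 3))  # emotion factor -> 0.616
print(round(partial_eta_squared(33.753, 1, 46), 3))  # age factor     -> 0.423
```

Each value matches the ηp2 reported in the text for the corresponding factor.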

Fig. 2: Questionnaire results of perceived valence (left) and arousal (right). Graphs show average values and standard errors.

The simple main effects showed significant differences: adults < seniors (p < 0.001) in the angry condition, and angry < happy (p < 0.001) in both the senior and adult groups.

For arousal (Fig. 2, right), prediction 2 was partially supported: touch behaviors were effective only for adults, whereas seniors rated arousal highly regardless of the touch behaviors. Our analysis found significant main effects of the touch factor (F (1, 46) = 7.206, p = 0.010, ηp2 = 0.135), the emotion factor (F (1, 46) = 32.897, p < 0.001, ηp2 = 0.417), and the age factor (F (1, 46) = 9.780, p = 0.003, ηp2 = 0.175), and significant two-way interactions between the touch and age factors (F (1, 46) = 7.139, p = 0.010, ηp2 = 0.134) and between the emotion and age factors (F (1, 46) = 8.894, p = 0.005, ηp2 = 0.162). We found no significant two-way interaction between the touch and emotion factors (F (1, 46) = 1.347, p = 0.252, ηp2 = 0.028) and no significant three-way interaction (F (1, 46) = 1.424, p = 0.293, ηp2 = 0.030).

The simple main effects showed significant differences: adults < seniors (p < 0.001) in the without-touch condition, without-touch < with-touch (p < 0.001) in the adult group, adults < seniors (p < 0.001) in the angry condition, and angry < happy (p < 0.001) in the adult group.

The feeling of kawaii

Following past studies, we used a single questionnaire item (“the degree of the feeling of kawaii,” 0-to-10 response format: 0: “not kawaii at all,” 10: “extremely kawaii”) to measure the feeling of kawaii20. We employed an 11-point response format because a past study argued that 11-point scales closely approximate interval data36. We conducted a three-way mixed-measures ANOVA with the age, touch, and emotion factors.

For the feeling of kawaii (Fig. 3), prediction 3 was partially supported: touch behaviors were effective only for adults, whereas seniors rated the feeling of kawaii highly regardless of the touch behaviors. The analysis found significant main effects of the touch factor (F (1, 46) = 14.496, p < 0.001, ηp2 = 0.240) and the emotion factor (F (1, 46) = 42.894, p < 0.001, ηp2 = 0.483), and a significant two-way interaction between the touch and age factors (F (1, 46) = 8.250, p = 0.006, ηp2 = 0.152). The analysis found no significant main effect of the age factor (F (1, 46) = 2.764, p = 0.103, ηp2 = 0.057), no significant two-way interaction between the emotion and age factors (F (1, 46) = 0.312, p = 0.579, ηp2 = 0.007) or between the touch and emotion factors (F (1, 46) = 0.142, p = 0.708, ηp2 = 0.003), and no significant three-way interaction (F (1, 46) = 0.959, p = 0.333, ηp2 = 0.020). The simple main effects showed significant differences: adults < seniors (p = 0.032) in the without-touch condition and without-touch < with-touch (p < 0.001) in the adult group.

Fig. 3: Questionnaire results of the feeling of kawaii. Graphs show average values and standard errors.

Discussion

This study provides several implications about a robot’s touch effects on perceived emotions and the feeling of kawaii. One interesting phenomenon is the different touch effects between adults and seniors. Similar to past studies, adults reacted positively to the robot’s touch. On the other hand, seniors only showed a significant difference in perceived valence, although past studies reported that age did not play a significant role in human–human touch interaction37,38. One possible reason is a ceiling effect: all the questionnaire results from the senior group were generally higher than those of the adult group, and a past study reported that seniors reported greater overall valence and arousal than young adults toward affective stimuli39. In other words, our experimental setting might have had difficulty detecting the effects of robot-initiated touch behaviors on seniors’ perceived emotions and feeling of kawaii. If so, why did the effects of touch appear in valence (prediction 1) but not in arousal or the feeling of kawaii (predictions 2 and 3)?

One possible explanation is the dedifferentiation of emotional processing in seniors. A past study40 reported that seniors perceive visual stimuli as more positively valenced and less arousing than younger adults do. If the robot-initiated touch behavior acted as a positive stimulus, seniors would perceive higher valence with less arousal than the younger participants, which would enhance the touch effects on perceived valence but suppress them on perceived arousal.

Another past study reported that the feeling of kawaii toward infants clearly increases with the age of the observer41. Given this characteristic, the touch effects on the feeling of kawaii might not have appeared in the senior group because our robot looks like an infant, and seniors felt it was kawaii just by looking at it. Note that even though the positive effects are limited for seniors, increasing perceived valence remains useful for achieving acceptable social robots in the context of effective emotional expressions.

In our study, the questionnaire results under the angry condition were generally higher than the middle score (4) among the seniors, which might indicate that they evaluated the baby robot’s angry expression more positively than the adult participants did. However, this result might have been influenced by the short exposure period of our study. A past study3, which used the original version of the baby robot for two weeks at an elderly nursing home, reported that some participants refused to interact with the robots due to their angry (crying) behavior. A long-term experiment is needed to investigate the effects of a baby robot’s touch on mitigating negative impressions of its angry behaviors.

This study has several limitations regarding the types of robots and their touch behaviors. We used only a single baby robot and a simple touch behavior during its emotional expressions, and we conducted experiments in a single country. Testing different baby robots and touch behaviors (e.g., rubbing or squeezing) in different cultural contexts would provide knowledge about the combined effects of these factors. Moreover, adjusting the touch timing and frequency during speaking is important for providing more natural emotional expressions, similar to other forms of touch interaction during conversations42. Despite these limitations, our experimental results provide knowledge about the relationship between robot-initiated touch and perceived emotional feelings.

Methods

This study and all its procedures were approved by the Advanced Telecommunication Research Review Boards (501-4). All the participants provided written informed consent before joining this study. All the methods were performed in accordance with relevant guidelines and regulations.

Device

Figure 4 shows the robot used in this study, a modified version of HIRO3 called Sawatte Hiro-chan. We added a motor device (RS314MDVF) and a battery to control both its arms for the touch behaviors during the interactions, and we used an M5Stack to control its voices and the motor device. We prepared ten-second voice recordings for both the happy and angry emotions.
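The control code is not specified in this paper; purely as a hypothetical illustration, one with-touch trial could be sequenced by a loop like the following, where `play_voice`, `close_arms`, and `open_arms` are placeholder callbacks standing in for the actual M5Stack-side voice and motor commands:

```python
import time

EMOTION_CLIP_SECONDS = 10  # each emotion voice clip lasts ten seconds
TOUCHES_PER_TRIAL = 3      # the arms close three times per with-touch trial

def run_trial(play_voice, close_arms, open_arms, with_touch,
              clip_seconds=EMOTION_CLIP_SECONDS):
    """Hypothetical trial sequencer: play the emotion clip and, in the
    with-touch condition, space three arm-closing touches evenly over it.
    The three callbacks are placeholders, not the robot's real API."""
    play_voice()  # start the happy or angry recording
    if not with_touch:
        time.sleep(clip_seconds)  # without-touch: arms stay in the initial pose
        return
    interval = clip_seconds / TOUCHES_PER_TRIAL
    for _ in range(TOUCHES_PER_TRIAL):
        close_arms()              # close both arms onto the participant's hands
        time.sleep(interval / 2)
        open_arms()               # return to the initial pose
        time.sleep(interval / 2)
```

This is only a sketch of the timing logic described in the Conditions section, under the assumption that the three touches are spread across the ten-second clip.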

Fig. 4: Hardware components of the developed robot, Sawatte Hiro-chan.

Conditions

This study considered three factors: (1) touch, (2) emotion, and (3) age. The touch and emotion factors were treated as within-subject factors; the age factor was treated as a between-subject factor. Each participant experienced the four conditions combining the two levels of the touch and emotion factors. The order of the experimental conditions was counterbalanced.

In the touch factor, we prepared two levels: with-touch and without-touch. In the with-touch condition, the robot touched the participants’ hands by closing both its arms three times while expressing an emotion; in the without-touch condition, it did not move its arms. In the emotion factor, we prepared two levels: happy and angry, using actual recordings of a baby laughing and crying. In the age factor, we prepared two levels: adults (21–49) and seniors (65–79).
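The counterbalancing of the four within-subject conditions can be implemented with a balanced Latin square, in which each condition appears once in every serial position and immediately precedes every other condition exactly once. A minimal sketch (the condition labels and the assignment scheme are our illustration, not the authors' actual procedure):

```python
from itertools import product

# Four within-subject conditions: touch x emotion.
CONDITIONS = [f"{t}, {e}" for t, e in
              product(("with-touch", "without-touch"), ("happy", "angry"))]

def balanced_latin_square(n):
    """Balanced Latin square for an even number of conditions n."""
    return [[(r + c // 2) % n if c % 2 == 0 else (r - (c + 1) // 2) % n
             for c in range(n)]
            for r in range(n)]

def assign_orders(num_participants):
    """Cycle participants through the counterbalanced condition orders."""
    square = balanced_latin_square(len(CONDITIONS))
    return [[CONDITIONS[i] for i in square[p % len(square)]]
            for p in range(num_participants)]

orders = assign_orders(48)  # four orders, 12 participants each
```

With 48 participants, each of the four orders is used by exactly 12 people.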

Participants

Forty-eight healthy participants joined the experiment: 12 females and 12 males aged 21–49 (mean 34.2) and 12 females and 12 males aged 65–79 (mean 71.6). A power analysis with G*Power43 (a medium effect size of 0.25, a power of 0.95, and an α level of 0.05) indicated a required sample size of 36, so the number of participants in this study was sufficient. All were native Japanese speakers recruited through an agency.

Measurements

We used an affective slider35, which measures valence and arousal values toward the emotions expressed by the robot. This software enables participants to input the valence and arousal values through a GUI; the values range from 0.0 to 1.0.

To measure the feeling of kawaii, we used a single questionnaire item (“the degree of the feeling of kawaii,” 0-to-10 response format: 0: “not kawaii at all,” 10: “extremely kawaii”)20. We employed an 11-point response format because a past study argued that 11-point scales closely approximate interval data36.