
1 Introduction

Embodied cognition theory posits that a person’s environment and physical behavior both influence their cognition and emotions [1,2,3]. This study argues, from an embodied perspective, that people’s emotional connection with space is grounded in their physical structure and their experience of interacting with the environment [4]. Casasanto (2009) concluded from several experiments that an individual’s left–right spatial cognition is based on their own body space; this cognition develops through their interactions with the environment using their left and right hands. Therefore, right-handed and left-handed people form different emotional connections with their surrounding space. For example, a right-handed person tends to associate the right side with the positive and the left side with the negative, whereas the opposite holds true for left-handed people [5,6,7].

During hand-held human–computer interactions, the human body serves as a behavioral carrier and forms the core element influencing the user’s emotional experience. According to embodied cognition theory, users interact with the terminal through body posture and hand movements, and these bodily actions shape cognition. Studies show that in a touch-screen environment, different types of human–computer interactions trigger different emotions in users [8]. However, studies investigating the influence of touch-screen interfaces on users’ emotional cognition usually employ large screens in their experiments, and it remains unclear whether the same results apply to smaller touch-screen devices.

Rizzolatti, Matelli and Pavesi (1983) assert that the body’s three-dimensional space can be divided into two parts with reference to the arms: the space within the reach of the arms is considered peripersonal and the space beyond the reach of the arms is considered distant peripersonal [9]. Li, Watter and Sun (2011) discovered that there is a difference in cognitive processing between peripersonal and distant peripersonal space [10]. The size of a touch-screen might determine the available space for the interactive gestures of users, thus influencing their emotional cognition. This leads to a crucial question: If the experimental device is changed to a small-sized touch-screen, will the interactive gestures still have an influence on users’ emotions? Our study employs small tablet computers to display pictures with emotional valences that serve as emotional stimuli. We aim to explore the connection between users’ cognition of the pictures’ valence and the direction of movement (far-to-near and near-to-far) within peripersonal space. We also investigate the influence of radial human–computer interactions on users’ cognition of emotional valences.

2 Literature Review

2.1 Embodied Cognition and the Body-Specificity Hypothesis

Embodied cognition theory posits that an individual’s cognitive processing is closely related to their environment and that the body and the environment are major elements of the cognitive system [11]. Through several experiments, Casasanto (2009) found that different individuals accumulate different experiences when interacting with their environments, which leads to differences in how they perceive the outer world [12]. Humans use their own body as a reference for their emotional cognition of space, which develops through the interaction of their hands with the surroundings. In daily life, right-handed subjects interact with their surroundings mainly with their right hands. Thus, their right hands tend to command better muscle strength, agility, and balance, ensuring a more fluent experience with the space on the right side. The opposite holds true for left-handed subjects. Consequently, right-handed and left-handed people differ in their emotional cognition of spatial locations [4, 13]. For instance, Wilson and Nisbett (1978) discovered that subjects tended to give higher values to socks placed on the right side of a rack than to those placed on the left side [14]. Natale, Gur and Gur (1983) found that subjects tended to label neutral faces presented on the left side of the screen as “negative” and neutral faces presented on the right side of the screen as “positive” [15]. Casasanto (2009) developed the body-specificity hypothesis (BSH): since right-handed people command greater motor fluency on their right side, this space is often associated with positive valence; they do not command a comparable degree of motor fluency on their left side, so that space is often associated with negative valence [12]. Building on the BSH, studies such as that of de la Vega, Filippis, Lachmair, Dudschig and Kaup (2012) investigated whether the dominant hand’s movements are connected to the processing of positive or negative stimuli [16]. The researchers asked all subjects to choose the part of speech for each word appearing on a screen. Right-handed subjects tended to respond faster when clicking on positive words with their right hands and on negative words with their left hands. Left-handed subjects, in contrast, responded faster when clicking on positive words with their left hands and on negative words with their right hands. In other words, consistent polarity helped speed up the subjects’ emotional cognition process. This result supports the BSH.

In general, subjects tend to associate the more physically fluent, dominant hand with positive emotions and the less physically fluent, non-dominant hand with negative emotions. In line with this thinking, when right-handed subjects use their dominant hand (right hand–positive) or non-dominant hand (left hand–negative) to interact with valence-laden stimuli on a touch-screen, their emotional experience may differ depending on the hand they use.

2.2 Polarity Correspondence Principle

Since humans react to positive stimuli faster than to negative stimuli [17], in the stimulus valence dimension, stimuli with positive valence are coded as [+] polarity, while stimuli with negative valence are coded as [−] polarity. In studies of emotional valence in sagittal space, arm movements towards the body are associated with [+] polarity, while arm movements away from the body are associated with [−] polarity [18, 19]. Lakens (2012) developed the Polarity Coding Correspondence Hypothesis with reference to emotions and spatial cognition [20]. The Polarity Coding Correspondence Hypothesis [21] argues that when a stimulus’s emotional meaning (valence) and its cognitive representation (space) share the same polarity, emotional processing is faster and subjects have a more positive emotional experience. For example, Cacioppo’s study (1983) [22] found that the contraction of arm muscles (extension–away/flexion–toward) influences subjects’ cognition of the emotional valence of stimuli: compared to subjects moving their arms forward (extension, [−] polarity), subjects pulling their arms toward the body (flexion, [+] polarity) gave more positive valence evaluations of Chinese characters, showing that arm movement influences subjects’ emotional cognition.

In studies of emotional valence in sagittal space, positive or negative emotional stimuli (e.g., words or pictures) were displayed on a touch-screen. The subjects interacted with the emotional stimuli by moving their arms forward or backward (i.e., extension or flexion) and evaluated the emotional valences of the stimuli after the interaction [23, 24]. The results showed that swiping pictures toward the body (flexion) or away from the body (extension) both influenced subjects’ valence processing of the pictures, with an increase in valence evaluations for positive pictures and a decrease for negative pictures [25]. In other words, individuals with different body features interact with their physical environments in different ways and experience varied movement cognition. This eventually leads to different psychological representations that influence the emotional experiences and spatial representations of individuals [12]. In a touch-screen environment, different operation gestures for human–computer interaction will therefore have different influences on users’ emotions [8].
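To make the polarity coding scheme concrete, the following sketch (our own illustration, not code from the cited studies) assigns [+]/[−] codes to stimulus valence, responding hand, and sagittal movement direction, and checks whether a given trial’s codes align; the polarity correspondence account predicts faster and more positive responses on aligned trials.

```python
# Illustration only (not from the cited studies): polarity codes for stimulus
# valence, responding hand, and sagittal movement, and whether they align.
POLARITY = {
    "positive": +1, "negative": -1,       # stimulus valence
    "dominant": +1, "non-dominant": -1,   # responding hand
    "toward": +1, "away": -1,             # sagittal movement (flexion/extension)
}

def polarity_match(valence: str, hand: str, movement: str) -> bool:
    """A trial 'matches' when all three dimensions carry the same polarity code."""
    return len({POLARITY[valence], POLARITY[hand], POLARITY[movement]}) == 1

# A positive picture swiped toward the body with the dominant hand matches;
# a positive picture swiped away with the non-dominant hand does not.
print(polarity_match("positive", "dominant", "toward"))     # True
print(polarity_match("positive", "non-dominant", "away"))   # False
```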

2.3 Peripersonal Space

Rizzolatti, Matelli and Pavesi (1983) divide three-dimensional space into two parts: the space that can be reached by the arms is known as peripersonal space, and the space beyond the reach of the arms is known as distant peripersonal space [9]. Rizzolatti and his colleagues (1990, 1998) argue that because peripersonal space can be physically engaged with, individuals fully process the action-related information of stimuli located there; in contrast, distant peripersonal space cannot be physically engaged with, so individuals only need to scan and recognize stimuli rather than process them for action [26, 27]. Costantini, Ambrosini, Tieri, Sinigaglia and Committeri (2010) discovered that operational representations related to stimuli are processed in peripersonal space but not in distant peripersonal space [28]. This result supports Previc’s position (1990, 1998) [29, 30].

Past experiments and studies have mostly used large multimedia screens [23,24,25,31]. Very few studies explore the influence of human–computer interaction on users’ emotions in a small-screen environment. On a smaller touch-screen, users’ interactions always remain within peripersonal space, and most movements involve the hands rather than the arms. Under these circumstances, the hand used, the direction of swiping, and the pictures’ valence might all exert different influences on users’ emotional cognition.

Therefore, this study employs small tablet computers to display pictures with emotional valences that serve as emotional stimuli. We aim to explore the connection between users’ cognition of emotional valences and the direction of movement (far-to-near and near-to-far) of both hands (right and left) within peripersonal space. We also investigate the influence of radial human–computer interaction on users’ cognition of emotional valences. Through these experiments, we hope to analyze the influence of radial human–computer interactions in peripersonal space on users’ cognition of emotional valences when operations carry a stimulus valence.

3 Method

This study used an experimental method in which emotional pictures served as stimulus materials [31]. The participants evaluated the pictures using numbers between 1 (very negative) and 9 (very positive) after swiping them on the iPad either towards or away from their body.

3.1 Participants

A total of 80 right-handed participants (Mage = 20.37, SD = 0.850, 47.5% female) took part in the study in exchange for partial course credit. All participants had normal or corrected-to-normal visual acuity and no physical condition affecting their hand movements. They were randomly divided into four groups.

3.2 Apparatus and Stimuli

Twenty positive pictures (e.g., plants, landscapes or lovely animals) and twenty negative pictures (e.g., trash, mess or disaster) from the International Affective Picture System (IAPS; Lang, Bradley and Cuthbert 2005) were used as stimuli. An ANOVA on the pictures’ mean valence ratings confirmed the difference between the two valence categories (Mpositive = 5.88, SDpositive = 0.16; Mnegative = 3.40, SDnegative = 0.74), F(1,38) = 210.732, p < 0.001. A tablet (iPad Pro 10.5, 17.4 cm × 25.1 cm) was used to display the stimulus pictures at a size of 4.0 cm × 5.0 cm. The brightness, contrast and other display settings of the tablets used by the four groups were set identically.
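As a minimal sketch of this manipulation check (the per-picture values below are simulated from the reported means and SDs rather than taken from the IAPS norms), the one-way ANOVA over the 20 + 20 picture valence ratings could be run as follows:

```python
# Sketch of the stimulus manipulation check; per-picture valence means are
# simulated here from the reported group means/SDs, not the actual IAPS values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
positive = rng.normal(5.88, 0.16, 20)   # 20 positive pictures
negative = rng.normal(3.40, 0.74, 20)   # 20 negative pictures

f, p = stats.f_oneway(positive, negative)   # one-way ANOVA, df = (1, 38)
print(f"F(1,38) = {f:.2f}, p = {p:.4g}")
```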

3.3 Procedure

Participants sat in a natural posture with the tablet placed widthwise on the table in front of them, at a distance of 25 cm from the display screen. The experiment was carried out in two sessions, two days apart. The first session was a pretest, performed to control for differences within and between the experimental groups: participants evaluated the valence of the pictures two days before the formal experiment. The stimulus pictures were displayed in random order at the center of the iPad, with a 9-point Likert scale (1 = very negative, 9 = very positive) below each picture for the valence evaluation. The task was explained to the participants beforehand, and they were asked to evaluate each picture by sight alone, as quickly as possible. The participants’ valence evaluations of the pictures were highly consistent within each valence category, and the internal consistency coefficient fell within the range accepted in psychometrics, so the evaluation results were considered reliable.

The second session was the formal experiment. The 80 participants were randomly assigned to one of four experimental groups. Two groups touched and swiped the pictures on the screen towards or away from their body with their dominant right hand, while the other two groups did so with their non-dominant left hand (see Fig. 1). The pictures were presented in random order with respect to valence category and appeared at either the near or the far side of the touchscreen relative to the participant. After participants logged into the test program, each picture was displayed in the middle of the screen, and participants were asked to touch and move it towards or away from their body, with the assigned hand, to a white square indicating the movement endpoint. After each movement the picture disappeared and then reappeared in the middle of the screen together with a 9-point Likert scale below it. Participants evaluated the picture from 1 (very negative) to 9 (very positive); there was no time limit and they could rest at any time.

Fig. 1. Participants were randomly assigned to four groups. Two groups touched and moved the pictures with their dominant right hand, while the other two groups used their non-dominant left hand. Participants then moved the pictures on the iPad either away from or towards their body. Dashed areas represent the final position of the hand and dotted squares the final position of the pictures.
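A brief sketch of how the counterbalancing and trial randomization described above could be implemented is given below; the group labels, picture names, and the hand × movement grouping are our assumptions based on Fig. 1, not the authors’ code.

```python
# Assumed design (per Fig. 1): four between-subject groups crossing hand x movement,
# 20 participants each; picture order is re-randomized for every participant.
import random

participants = list(range(1, 81))
random.shuffle(participants)

conditions = [("right", "toward"), ("right", "away"),
              ("left", "toward"), ("left", "away")]
groups = {cond: participants[i * 20:(i + 1) * 20]
          for i, cond in enumerate(conditions)}

pictures = [f"pos_{i:02d}" for i in range(1, 21)] + [f"neg_{i:02d}" for i in range(1, 21)]

def trial_list(participant_id: int) -> list[str]:
    """Return a participant-specific random order of the 40 stimulus pictures."""
    rng = random.Random(participant_id)
    order = pictures.copy()
    rng.shuffle(order)
    return order

print(groups[("right", "toward")])   # the 20 IDs assigned to right hand / toward
print(trial_list(1)[:5])             # first five trials for participant 1
```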

4 Data Analyses and Results

This experimental study explores the association between affective valence and hand dominance (i.e., dominant hand–positive; non-dominant hand–negative) and sagittal movement. It further investigates how sagittal hand actions may influence affective experiences, for example in the valence appraisal of the affective objects that have been manipulated. Valence evaluations were analyzed in a 2 (hand: dominant right hand vs. non-dominant left hand) × 2 (movement: forward vs. backward) × 2 (valence category: positive vs. negative) design.
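One way to run the reported 2 × 2 × 2 analysis in Python is sketched below; the paper reports a three-way ANOVA, which we approximate here with a linear mixed model with a random intercept per participant (the file name and column names are assumptions).

```python
# Sketch only: approximate the 2 x 2 x 2 analysis with a mixed model
# (random intercept per participant). File and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ratings.csv")  # hypothetical trial-level data
# expected columns: subject, hand, movement, valence_cat, rating

model = smf.mixedlm("rating ~ hand * movement * valence_cat",
                    data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```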

4.1 Affective Valence Evaluation with Sagittal Movements

A three-way ANOVA was used to analyze the effects of hand, sagittal movement and the pictures’ affective valence on emotional cognition. Hand and sagittal movement were manipulated between subjects, whereas valence category was manipulated within subjects. A summary of the results is shown in Table 1.

Table 1. Effects of hand, movement and affective valence on affective valence appraisals

The results showed a significant main effect of the pictures’ valence category, F(1,38) = 6.41, p = 0.016, ηp² = 0.144, indicating that positive pictures led to more positive evaluations than negative pictures. Sagittal movement also showed a significant main effect, indicating that movements towards the body led to more positive evaluations than movements away from the body.
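For reference, the reported effect size is consistent with the F statistic through the standard identity relating partial eta squared to F and its degrees of freedom:

$$\eta_p^2 \;=\; \frac{F \cdot df_{\text{effect}}}{F \cdot df_{\text{effect}} + df_{\text{error}}} \;=\; \frac{6.41 \times 1}{6.41 \times 1 + 38} \;\approx\; 0.144 .$$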

The two-way interaction between valence category and sagittal movement was significant, F(1,38) = 8.73, p = 0.005, ηp² = 0.187. Given this significant interaction, simple main effect tests were carried out (see Fig. 2). The results indicated that valence category had a significant simple main effect both when moving the pictures away from the body (F(1,76) = 420.90, p < 0.001) and when moving them towards the body (F(1,76) = 602.25, p < 0.001). There was also a significant simple main effect of sagittal movement for positive pictures, indicating that moving positive pictures towards the body led to more positive evaluations than moving them away (F(1,76) = 4.05, p = 0.005, ηp² = 0.051). Unexpectedly, however, we did not find any significant effect of sagittal movement for negative pictures.

Fig. 2. Valence evaluations of pictures after sagittal movements with the dominant right hand and the non-dominant left hand. Vertical bars indicate 95% confidence intervals. ** indicates significant effects at the level of p < 0.01.

The three-way interaction between hand, sagittal movement and valence category was not significant (F(1,38) = 0.45, p = 0.51; see Table 2), indicating that the three factors had no joint effect on the experimental results. However, to gain further insight into which kind of sagittal movement affects affective valence appraisals most, the results regarding the hand used to interact with the pictures are described below.

Table 2. Simple main effect analyses

4.2 Affective Valence Evaluation with Hand

Swiping positive pictures with the dominant right hand resulted in more positive ratings after far-to-near movements than after near-to-far movements, t(38) = 2.56, p = 0.01. In contrast, interacting with negative pictures using the dominant right hand did not result in significantly different evaluations between the two sagittal movements, t(38) = 0.11, p = 0.91. This indicates that when interacting with positive pictures using the dominant hand, the matching codes of the pictures’ valence, the dominant right hand, and the towards-the-body movement reinforced the valence evaluation of the positive pictures. Likewise, swiping positive pictures with the dominant right hand resulted in more positive evaluations than swiping them with the non-dominant left hand, t(37) = 2.60, p = 0.01.

However, sagittal interaction with either positive (t(38) = 1.27, p = 0.21) or negative pictures (t(36.132) = −0.22, p = 0.83) using the non-dominant left hand did not lead to any significant difference in valence evaluations. This suggests that, in this case, the combination of the pictures’ valence codes with the non-dominant left hand and the sagittal movements made no difference to the valence evaluation. Moreover, moving negative pictures with either the dominant right hand (t(37) = 0.69, p = 0.50) or the non-dominant left hand (t(36.714) = 0.49, p = 0.63) resulted in no significant difference in valence evaluations, regardless of whether the pictures were moved away from or towards the participant’s body.
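The pairwise contrasts above could be computed as in the sketch below (the file and column names are assumptions, not the authors’ code); using Welch’s correction (`equal_var=False`) yields non-integer degrees of freedom like those reported for some of the comparisons.

```python
# Sketch of the Sect. 4.2 contrasts; file and column names are assumed.
import pandas as pd
from scipy import stats

df = pd.read_csv("ratings.csv")   # hypothetical trial-level ratings
right_pos = df[(df.hand == "right") & (df.valence_cat == "positive")]

toward = right_pos.loc[right_pos.movement == "toward", "rating"]
away = right_pos.loc[right_pos.movement == "away", "rating"]

t, p = stats.ttest_ind(toward, away)                        # Student's t
t_w, p_w = stats.ttest_ind(toward, away, equal_var=False)   # Welch's t
print(f"t = {t:.2f}, p = {p:.3f};  Welch t = {t_w:.2f}, p = {p_w:.3f}")
```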

5 Conclusion

This study was conducted through experiments and was based on embodied cognition theory. Small-sized tablet computers were chosen as experimental devices. Pictures with emotional valences were used as emotional stimuli to explore the connection between subjects’ cognition of emotional valences and interactions with different hands (left and right) in different directions (far-to-near and near-to-far).

1. In a radial human–computer interaction set in peripersonal space, moving pictures with positive valence closer to the body on the touch-screen has a positive influence on subjects’ cognition. This is consistent with the polarity coding correspondence theory. For pictures with negative valence, however, the movement type has no significant influence on subjects. Furthermore, the data analysis shows that when moving pictures with positive emotional valence in a radial near-to-far direction, subjects using their dominant (right) hands show more positive cognition of emotional valence than subjects using their non-dominant (left) hands. This is consistent with the theory of embodied cognition. Again, for pictures with negative valence, the operation type has no significant influence on subjects. This may be because negative emotions do not constitute a salient dimension here. According to de la Vega, Dudschig, Lachmair and Kaup (2014), the connection between stimuli and matching encodings may only be triggered by salient stimuli [32]. Since negative emotions were not included in the salient dimensions, the polarity encoding consistency used in this experiment did not have a significant influence on subjects.

2. Torres (2018) pointed out that when subjects operate a touch-screen, the hand used for the interaction (dominant/non-dominant), the valence type (positive/negative), and the valence matching of the starting point (left/right) of the touch-screen operation all increase the subjects’ valence evaluations [31]. However, our results indicate that when the stimuli are pictures with positive valence, matching the hand used for the interaction (dominant hand [+]/non-dominant hand [−]), the pictures’ valence type (positive [+]/negative [−]), and the movement direction (towards [+]/away [−]) does not increase the subjects’ valence evaluations of the emotions shown in the pictures. This result does not fully accord with the polarity coding correspondence theory, probably because the experimental space was limited to peripersonal space.

This study adds to research on the application of embodied cognition in human–computer interaction. The results help in understanding how digital information carrying valence in peripersonal space influences users’ emotional cognition. This may help improve users’ emotional experience in touch-screen environments and serve as a reference for improving interface experiences. Moreover, by taking users’ physical location and bodily effects into account, we may be able to design human–computer interaction products that are easier to use, more desirable, and more satisfying to users’ physical and mental needs.