
Scopus EID: 2-s2.0-85101515729


Evaluation of Emotional Meaning of Eyelid Position on a 3D Animatronic Eyes

Jatmiko A.S., Ginalih C.T., Darmakusuma R.

Institut Teknologi Bandung, School of Electrical Engineering and Informatics, Bandung, Indonesia

Abstract

© 2020 IEEE. Emotion is complex in humans, and there are many ways to express it, including eye gaze. Ekman et al. described abstractions of the core features of the human face, which Onchi et al. adopted in a single-eyed 2D avatar that moves only the upper and lower eyelids. Building on that 2D single-eye avatar, this study evaluated how closely human emotions are matched when expressed by a 3D two-eye animatronic model with rigid eyelids, based on a design shared by Will Cogley. A survey evaluated the degree of similarity for the seven emotion types described by Ekman et al. (neutral, happy, surprised, sad, scared, angry, and disgusted) with 40 participants from a variety of backgrounds. The emotion samples displayed had a significant effect on the participants' perceptions, and each sample was also able to convey its emotional meaning well, as shown by a significant interaction between sample and emotion (p = 5.63 × 10⁻¹⁷). Based on these results, participants perceived the eyelids of the physical embodiment and of the virtual agent in nearly the same way, so the two sets of results reinforce each other. Compared with results obtained using Probo, emotions can be expressed better when facial features such as eyebrows, eyelids, and mouth are all present. This research is expected to be a step toward identifying how each facial feature contributes to expressing emotion.

Author keywords

Core features, Degree of similarity, Express emotions, Eye gaze, Facial feature, Human emotion, Human faces, Virtual agent

Indexed keywords

3D model, animatronic eyes, emotion, eye gaze

DOI

https://doi.org/10.1109/ICIDM51048.2020.9339629
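The abstract reports a significant sample × emotion interaction effect, which suggests a two-way analysis of variance over participants' similarity ratings. The sketch below, on synthetic data (not the study's data, and not necessarily the authors' exact procedure), shows how such an interaction F-test and p-value can be computed for a balanced design; the factor sizes and rating scale are illustrative assumptions.

```python
# Hedged sketch: balanced two-way ANOVA (factor A = animatronic sample,
# factor B = intended emotion), the kind of test behind an interaction
# p-value like the one reported in the abstract. All data are synthetic.
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(0)
a_levels, b_levels, n_rep = 3, 7, 10   # e.g. 3 samples x 7 emotions, 10 raters per cell

# Synthetic similarity ratings with a per-cell shift, so sample x emotion
# cell means genuinely differ (an interaction to detect).
data = (rng.normal(3.0, 0.5, (a_levels, b_levels, n_rep))
        + 0.8 * rng.normal(size=(a_levels, b_levels, 1)))

grand = data.mean()
mean_a = data.mean(axis=(1, 2))          # per-sample means
mean_b = data.mean(axis=(0, 2))          # per-emotion means
mean_ab = data.mean(axis=2)              # cell means

# Sums of squares for a balanced design
ss_ab = n_rep * ((mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
ss_err = ((data - mean_ab[:, :, None]) ** 2).sum()

df_ab = (a_levels - 1) * (b_levels - 1)
df_err = a_levels * b_levels * (n_rep - 1)
f_stat = (ss_ab / df_ab) / (ss_err / df_err)
p_value = f_dist.sf(f_stat, df_ab, df_err)  # upper-tail F probability
print(f"interaction F = {f_stat:.2f}, p = {p_value:.3g}")
```

With real survey data, each cell would hold the ratings participants gave to one sample/emotion pairing; a small p-value indicates that how well an emotion is recognized depends on which animatronic sample displays it.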