Current Search: Eye tracking
- Title
- Infants’ sensitivity to gestures by humans and anthropomorphic robots.
- Creator
- Stotler, Jacqueline, Wilcox, Teresa, Florida Atlantic University, Department of Psychology, Charles E. Schmidt College of Science
- Abstract/Description
-
Robotics has advanced to include highly anthropomorphic (human-like) entities. A novel eye-tracking paradigm was developed to assess infants’ sensitivity to communicative gestures by human and robotic informants. Infants from two age groups (5-9 months, n = 25; 10-15 months, n = 9) viewed a robotic or human informant pointing to locations where events would occur during experimental trials. Trials consisted of three phases: gesture, prediction, and event. Duration of looking (ms) to two areas of interest, the target location and the non-target location, was extracted. A series of paired t-tests revealed that only older infants in the human condition looked significantly longer to the target location during the prediction phase (p = .036). Future research is needed to tease apart which components of the robotic hand infants respond to differentially, and whether a robotic hand can be manipulated to increase infants’ sensitivity to social communication gestures that it executes. (A sketch of this kind of paired looking-time comparison follows this record.)
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013724
- Subject Headings
- Robotics, Infants, Eye tracking, Gesture
- Format
- Document (PDF)
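The analysis described above compares looking durations to two areas of interest with paired t-tests. Below is a minimal sketch of that kind of comparison in Python; the file name, column names, and group labels are assumptions for illustration, not the study's actual data or code.

```python
# Illustrative sketch (not the authors' code): paired comparison of looking
# durations (ms) to the target vs. non-target area of interest during the
# prediction phase. The CSV layout, column names, and group labels are assumed.
import pandas as pd
from scipy import stats

looks = pd.read_csv("prediction_phase_looking.csv")  # hypothetical file
older_human = looks[(looks.age_group == "10-15mo") & (looks.condition == "human")]

t, p = stats.ttest_rel(older_human["target_ms"], older_human["nontarget_ms"])
print(f"paired t = {t:.2f}, p = {p:.3f}")  # abstract reports p = .036 for this cell
```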
- Title
- THE ASSESSMENT OF THE ROLE OF MICROSACCADIC EYE MOVEMENTS IN BISTABLE MOTION PERCEPTION.
- Creator
- Romulus, Darwin, Hong, Sang Wook, Florida Atlantic University, Center for Complex Systems and Brain Sciences, Charles E. Schmidt College of Science
- Abstract/Description
-
Even during fixation, the eye is rarely still, as miniature eye movements continue to occur within fixational periods. These miniature movements are referred to as fixational eye movements. Microsaccades are one of the three types of fixational eye movements that have been identified. Microsaccades have been attributed to different visual processes and phenomena such as fixation stability, perceptual fading, and multistable perception. Still, debates about the functional role of microsaccades in vision have persisted, as many findings from earlier microsaccade reports contradict one another, and the polarization in the field caused by these debates led many to believe that microsaccades do not hold a necessary or specialized role in vision. To gain a deeper understanding of microsaccades and their relevance to vision, we set out to assess the role of microsaccades in bistable motion perception in a behavioral/eye-tracking study. Observers participated in an eye-tracking experiment in which they were asked to complete a motion discrimination task while viewing a bistable apparent-motion stimulus. The collected eye-tracking data were then used to train a classification model to predict the directions of illusory motion perceived by observers. We found that small changes in gaze position during fixation, occurring within or outside microsaccadic events, predicted the direction of the motion pattern imposed by the motion stimulus. Our findings suggest that microsaccades and fixational eye movements are correlated with motion perception and that miniature eye movements occurring during fixation may have relevance in vision. (A sketch of velocity-threshold microsaccade detection follows this record.)
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013799
- Subject Headings
- Eye--Movements, Saccadic eye movements, Eye tracking
- Format
- Document (PDF)
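The abstract above centers on detecting microsaccades in gaze recordings. A widely used approach, though not necessarily the one used in this thesis, is the Engbert & Kliegl (2003) velocity-threshold method; the sketch below is a loose Python rendering of that idea, with the sampling rate, threshold multiplier, and minimum duration chosen as plausible defaults rather than the study's actual parameters.

```python
# Sketch of velocity-threshold microsaccade detection in the spirit of
# Engbert & Kliegl (2003). Not the study's pipeline: fs, lam, and min_dur
# are illustrative assumptions.
import numpy as np

def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_dur=6):
    """x, y: 1-D gaze position arrays (deg); fs: sampling rate (Hz)."""
    # Smoothed velocity (deg/s) from a 5-sample moving difference.
    vx = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) * fs / 6.0
    vy = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) * fs / 6.0
    # Robust, median-based velocity thresholds per axis.
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
    # Keep supra-threshold runs lasting at least min_dur samples.
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_dur:
                events.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_dur:
        events.append((start, len(above)))
    return events  # list of (onset, offset) sample indices
```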
- Title
- UNDERSTANDING THE OTHER-RACE EFFECT THROUGH EYE-TRACKING, EXPERIENCE, AND IMPLICIT BIAS.
- Creator
- Soethe, Elizabeth, Anzures, Gizelle, Florida Atlantic University, Department of Psychology, Charles E. Schmidt College of Science
- Abstract/Description
-
Face perception and recognition abilities develop throughout childhood, and differences in viewing own-race and other-race faces have been found in both children (Hu et al., 2014) and adults (Blais et al., 2008). In addition, implicit biases have been found in children as young as six (Baron & Banaji, 2006) and have been found to influence face recognition (Bernstein, Young, & Hugenberg, 2007). The current study aimed to understand how gaze behaviors, implicit biases, and other-race experience contribute to the other-race effect, and how these contributions change with development. Caucasian children’s (5-10 years of age) and young adults’ scanning behaviors were recorded during an old/new recognition task using Asian and Caucasian faces. Participants also completed an Implicit Association Test (IAT) and a race experience questionnaire. Results revealed an own-race bias in both children and adults. Only adults’ IAT scores were significantly different from zero, indicating an implicit bias. Participants made a greater number of eye-to-eye fixations for Caucasian faces than for Asian faces, and eye-to-eye fixations were more frequent in adults during encoding phases. Additionally, nose-looking times increased with age. Central attention to the nose may be indicative of a more holistic viewing strategy implemented by adults and older children. Older children and adults spent longer looking at the mouths of Asian faces during encoding and test, whereas younger children spent longer looking at own-race mouths during recognition. Correlations were also observed between scanning patterns, implicit biases, and experience difference scores. Both social and perceptual factors appear to influence looking behaviors for own- and other-race faces, and these influences continue to change during childhood. (A sketch of one common way to score an old/new recognition task follows this record.)
- Date Issued
- 2020
- PURL
- http://purl.flvc.org/fau/fd/FA00013636
- Subject Headings
- Bias, Discrimination, Eye tracking, Face perception
- Format
- Document (PDF)
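The study above measures recognition of own- and other-race faces with an old/new task. One common way to score such a task, not necessarily the scoring used in this thesis, is the signal-detection measure d′; the sketch below computes it with a log-linear correction, using placeholder counts rather than data from the study.

```python
# Sketch: d' for an old/new recognition task, one common index of an own-race
# recognition advantage. Counts are placeholders, not data from this study.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction keeps rates away from 0 and 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

own_race = d_prime(hits=22, misses=8, false_alarms=6, correct_rejections=24)
other_race = d_prime(hits=17, misses=13, false_alarms=11, correct_rejections=19)
print(f"own-race d' = {own_race:.2f}, other-race d' = {other_race:.2f}")
```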
- Title
- ROBOTIC ARM PERCEPTION: AN EYETRACKING STUDY EXPLORING CAUSAL RELATIONS AND PERCEIVED TRUST.
- Creator
- Merwin, Elizabeth Rose, Wilcox, Teresa, Florida Atlantic University, Department of Psychology, Charles E. Schmidt College of Science
- Abstract/Description
-
Due to the increased integration of robots into industrial, service, and educational settings, it is important to understand how and why individuals interact with robots. The current study aimed to explore the extent to which individuals are receptive to nonverbal communication from a robot compared to a human, and the individual differences and stimulus attributes that are related to trust ratings. A combination of eye-tracking and survey measures was used to collect data, and a robot and a human both performed the same gesture to allow for direct comparison of gaze patterns. Individuals utilized the offered information equivalently across agents. Survey measures indicated that trust ratings significantly differed between agents, and the perceived likability and intelligence of the agent were the strongest predictors of increased trust. (A sketch of a trust-rating regression follows this record.)
- Date Issued
- 2024
- PURL
- http://purl.flvc.org/fau/fd/FA00014471
- Subject Headings
- Eye tracking, Human-robot interaction, Trust
- Format
- Document (PDF)
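The abstract above reports that perceived likability and intelligence best predicted trust. A minimal regression sketch of that kind of analysis is shown below; the file name, column names, and model form are assumptions for illustration, not the study's actual analysis.

```python
# Sketch: regressing trust on perceived likability and intelligence, with agent
# (robot vs. human) as a covariate. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

ratings = pd.read_csv("agent_ratings.csv")  # hypothetical: one row per participant x agent
model = smf.ols("trust ~ likability + intelligence + C(agent)", data=ratings).fit()
print(model.summary())  # coefficients indicate each predictor's contribution to trust
```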
- Title
- Analysis of Eye Response to Video Quality and Structure.
- Creator
- Pappusetty, Deepti, Kalva, Hari, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Real-time eye tracking systems with a human-computer interaction mechanism are being adopted to advance user experience in smart devices and consumer electronic systems. Eye tracking systems measure eye gaze and pupil response non-intrusively. This research presents an analysis of eye pupil and gaze response to video structure and content. The set of experiments for this study involved presenting different video content to subjects and measuring eye response with an eye tracker. Results show that scene cuts and significant changes in video content led to sharp pupil constrictions. User response to videos can provide insights that can improve subjective quality assessment metrics. This research also presents an analysis of the pupil and gaze response to quality changes in videos. The results show pupil constrictions for noticeable changes in perceived quality and higher fixation-to-saccade ratios at lower quality. Using real-time eye tracking systems for video analysis and quality evaluation can open a new class of applications for consumer electronic systems. (A sketch of a fixation-to-saccade ratio computation follows this record.)
- Date Issued
- 2017
- PURL
- http://purl.flvc.org/fau/fd/FA00005940
- Subject Headings
- Dissertations, Academic -- Florida Atlantic University, Eye tracking, Video, Quality (Aesthetics)
- Format
- Document (PDF)
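The abstract above relates lower video quality to higher fixation-to-saccade ratios. The sketch below shows one way to compute that ratio per clip from an eye tracker's event output; the file layout and event labels are assumptions, not the format used in this research.

```python
# Sketch: fixation-to-saccade ratio per video clip from labeled eye events.
# Columns clip_id and event_type ("fixation"/"saccade") are assumed.
import numpy as np
import pandas as pd

events = pd.read_csv("eye_events.csv")  # hypothetical event report
counts = events.groupby(["clip_id", "event_type"]).size().unstack(fill_value=0)
counts["fix_sac_ratio"] = counts["fixation"] / counts["saccade"].replace(0, np.nan)
print(counts[["fixation", "saccade", "fix_sac_ratio"]])
```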
- Title
- Predicting Levels of Learning with Eye Tracking.
- Creator
- Parikh, Saurin Sharad, Kalva, Hari, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
E-Learning is transforming the delivery of education. Today, millions of students take self-paced online courses. However, content and language complexity often hinder comprehension, and this, combined with the lack of immediate help from an instructor, leads to weaker learning outcomes. The ability to predict difficult content in real time enables e-Learning systems to adapt content to students' level of learning. The recent introduction of low-cost eye trackers has opened a new class of applications based on eye response. Eye tracking devices can record eye response to a visual element or concept in real time. The response, and variations in eye response to the same concept over time, may be indicative of levels of learning. In this study, we analyzed reading patterns using an eye tracker and derived 12 eye response features based on psycholinguistics, contextual information processing, anticipatory behavior analysis, recurrence fixation analysis, and pupillary response. We use eye responses to predict the level of learning for a term/concept. One of the main contributions is the spatio-temporal analysis of the eye response on a term/concept to derive relevant first-pass (spatial) and reanalysis (temporal) eye response features. A spatio-temporal model, built using these derived features, analyzes slide images, extracts words (terms), maps the subject's eye response to words, and prepares a term-response map. A parametric baseline classifier, trained with labeled data (term-response maps), classifies a term/concept as novel (positive class) or familiar (negative class) using a majority voting method. Using only first-pass features for prediction, the baseline classifier shows 61% accuracy; adding reanalysis features raises this to 66.92% for predicting difficult terms. However, the proposed features do not respond to learning difficulty in the same way for all subjects, as reading is an individual characteristic. Hence, we developed a non-parametric, feature-weighted linguistics classifier (FWLC), which assigns weights to features based on their relevance. The FWLC achieves a prediction accuracy of 90.54%, an increase of 23.62% over the baseline and 29.54% over the first-pass variant of the baseline. Predicting novel terms as familiar is more costly because content adaptation relies on this information. Hence, our primary goal is to increase the prediction rate of novel terms by minimizing the cost of false predictions. When comparing the performance of the FWLC with other frequently used machine learning classifiers, the FWLC achieves the highest true positive rate (TPR) and the lowest ratio of false negative rate (FNR) to false positive rate (FPR). The strong prediction performance of the proposed spatio-temporal eye response model for predicting levels of learning builds a solid foundation for eye-response-driven adaptive e-Learning. (A sketch of a feature-weighted voting classifier follows this record.)
- Date Issued
- 2017
- PURL
- http://purl.flvc.org/fau/fd/FA00005941
- Subject Headings
- Dissertations, Academic -- Florida Atlantic University, Eye tracking, E-Learning
- Format
- Document (PDF)
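The FWLC described above weights per-feature votes by relevance before a threshold decision. The sketch below is a loose reconstruction of that general idea from the abstract alone; the feature names, thresholds, and weights are placeholders, and this is not the thesis's actual implementation.

```python
# Sketch of a feature-weighted voting classifier: each eye-response feature
# casts a vote for "novel" (1) or "familiar" (0), and votes are weighted by a
# per-feature relevance score. Feature names, thresholds, and weights are
# placeholders, not values from the thesis.
import numpy as np

FEATURES = ["first_pass_duration", "regressions_in", "pupil_dilation"]

def fwlc_predict(term_response, thresholds, weights):
    """term_response: dict of feature -> value for one term. Returns 1 or 0."""
    votes = np.array([1.0 if term_response[f] > thresholds[f] else 0.0
                      for f in FEATURES])
    w = np.array([weights[f] for f in FEATURES])
    score = np.dot(votes, w) / w.sum()  # weighted share of "novel" votes
    return int(score >= 0.5)

thresholds = {"first_pass_duration": 450, "regressions_in": 2, "pupil_dilation": 0.15}
weights = {"first_pass_duration": 0.5, "regressions_in": 0.3, "pupil_dilation": 0.2}
print(fwlc_predict({"first_pass_duration": 620, "regressions_in": 3,
                    "pupil_dilation": 0.1}, thresholds, weights))
```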
- Title
- FACIAL EXPRESSION PROCESSING IN AUTISM SPECTRUM DISORDER AS A FUNCTION OF ALEXITHYMIA: AN EYE MOVEMENT STUDY.
- Creator
- Escobar, Brian, Hong, Sang Wook, Florida Atlantic University, Department of Psychology, Charles E. Schmidt College of Science
- Abstract/Description
-
The perception and interpretation of faces provides individuals with a wealth of knowledge that enables them to navigate their social environments more successfully. Prior research has hypothesized that the decreased facial expression recognition (FER) abilities observed in autism spectrum disorder (ASD) may be better explained by comorbid alexithymia (the alexithymia hypothesis). The present study sought to further examine the alexithymia hypothesis by collecting data from 59 participants and examining FER performance and eye movement patterns for ASD and neurotypical (NT) individuals while controlling for alexithymia severity. Eye-movement-related differences and similarities were examined via eye tracking in conjunction with statistical and machine-learning-based pattern classification analysis. In multiple classification conditions, where the classifier was fed 1,718 scanpath images (at the spatial, spatial-temporal, or spatial-temporal-ordinal level) for high-alexithymic ASD, high-alexithymic NT, low-alexithymic ASD, and low-alexithymic NT groups, we could decode group membership significantly above chance level. Additionally, in the cross-decoding analysis, where the classifier was trained on 1,718 scanpath images from high- and low-alexithymic ASD individuals and tested on high- and low-alexithymic NT individuals, classification accuracy was significantly above chance level when using spatial images of eye movement patterns. Regarding FER performance, the ASD and NT groups performed similarly overall, but at lower intensities of expression ASD individuals performed significantly worse than NT individuals. Together, these findings suggest that there may be eye-movement-related differences between ASD and NT individuals, which may interact with alexithymia traits. (A sketch of cross-validated scanpath decoding follows this record.)
- Date Issued
- 2023
- PURL
- http://purl.flvc.org/fau/fd/FA00014358
- Subject Headings
- Autism Spectrum Disorder, Machine learning, Facial expression, Alexithymia, Eye tracking
- Format
- Document (PDF)
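The study above classifies scanpath images to decode group membership. The sketch below shows a generic version of that kind of pipeline, cross-validated decoding of flattened scanpath images with a linear SVM; the directory layout, image size, and classifier choice are assumptions, not the study's actual analysis.

```python
# Sketch (not the study's pipeline): cross-validated decoding of group
# membership from flattened scanpath images with a linear SVM, compared
# against chance level. Directory layout and image size are assumed.
import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def load_scanpaths(root):
    """Assumes root/<group>/<image>.png; returns flattened images and labels."""
    X, y = [], []
    for label, group_dir in enumerate(sorted(Path(root).iterdir())):
        for img_path in group_dir.glob("*.png"):
            img = Image.open(img_path).convert("L").resize((64, 64))
            X.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
            y.append(label)
    return np.array(X), np.array(y)

X, y = load_scanpaths("scanpaths/")  # hypothetical directory
scores = cross_val_score(LinearSVC(max_iter=5000), X, y, cv=5)
print(f"mean accuracy = {scores.mean():.3f} (chance = {1 / len(np.unique(y)):.3f})")
```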
- Title
- Eye Fixations of the Face Are Modulated by Perception of a Bidirectional Social Interaction.
- Creator
- Kleiman, Michael J., Barenholtz, Elan, Florida Atlantic University, Charles E. Schmidt College of Science, Department of Psychology
- Abstract/Description
-
Eye fixations of the face are normally directed towards either the eyes or the mouth; however, the proportion of gaze to each of these regions depends on context. Previous studies of gaze behavior demonstrate a tendency to stare into a target’s eyes, but no studies have investigated the differences between when participants believe they are engaging in a live interaction and when they knowingly watch a pre-recorded video, a distinction that may matter for studies of memory encoding. This study examined differences in fixation behavior between when participants falsely believed they were engaging in a real-time interaction over the internet (“Real-time stimulus”) and when they knew they were watching a pre-recorded video (“Pre-recorded stimulus”). Results indicated that participants fixated significantly longer on the eyes for the pre-recorded stimulus than for the real-time stimulus, suggesting that previous studies that use pre-recorded videos may lack ecological validity. (A sketch of an eye-region dwell-time summary follows this record.)
- Date Issued
- 2016
- PURL
- http://purl.flvc.org/fau/fd/FA00004701
- Subject Headings
- Eye -- Movements, Eye tracking, Gaze -- Psychological aspects, Nonverbal communication, Optical pattern recognition, Perceptual motor processes, Visual perception
- Format
- Document (PDF)
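The comparison above rests on total fixation time to the eye region in each stimulus condition. The sketch below summarizes that measure from a generic fixation report; the file name, columns, and AOI labels are assumptions, not the study's actual data format.

```python
# Sketch: total dwell time in an "eyes" area of interest, summarized by
# stimulus condition ("real-time" vs. "pre-recorded"). File layout is assumed.
import pandas as pd

fix = pd.read_csv("fixation_report.csv")  # hypothetical: subject, condition, aoi, duration_ms
eyes = (fix[fix.aoi == "eyes"]
        .groupby(["condition", "subject"])["duration_ms"].sum()
        .groupby("condition").agg(["mean", "std"]))
print(eyes)  # mean/SD of total eye-region dwell time per condition
```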