Research

Overview

Currently, I am working under the guidance of Professor Christian Wallraven in the Cognitive Systems Lab at Korea University. My research centers on fine-grained ethnicity recognition—specifically, how Korean observers distinguish between Korean, Chinese, and Japanese faces, a task that is often assumed to be intuitive but is perceptually challenging in practice.

Across a series of studies, I investigate this question from multiple complementary angles: how observers rely on different facial cues (internal features, external features, or their combination), how social trait impressions such as attractiveness and trustworthiness bias categorization decisions, and how human performance compares to that of machine learning models trained on the same task. Together, these projects aim to characterize the perceptual and social factors shaping fine-grained face categorization, and to clarify where human and artificial systems converge or diverge in solving this problem.


Projects

What Do We Look At When We Decide Someone’s Ethnicity?
Behavioral · Face perception

How do Korean observers use internal (eyes, nose) versus external (hair, face outline) facial cues when categorizing Korean, Japanese, and Chinese faces?

Methods: 3AFC categorization · stimulus manipulations · confidence measures

When Humans Struggle and Machines Succeed at Face Categorization
Computational · Human–AI

A follow-up study comparing human categorization patterns with deep learning models, asking where humans and machines converge—and where their error profiles diverge.

Methods: face embeddings · supervised classifiers · cross-validation
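As a rough illustration of the machine side of this comparison, the pipeline can be sketched as a cross-validated classifier over face embeddings. The sketch below uses randomly generated stand-in data and a plain logistic-regression classifier; the actual embeddings, model, and labels in the study are not specified here, so treat every detail (dimensions, classifier choice, fold count) as an assumption.

```python
# Minimal sketch: cross-validated classification of face embeddings.
# NOTE: X and y are random stand-in data, NOT the study's stimuli;
# real embeddings would come from a pretrained face network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 128))        # 300 faces x 128-dim embeddings (hypothetical)
y = rng.integers(0, 3, size=300)       # 3 category labels (e.g., KR / JP / CN)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.3f}")
```

With random labels, mean accuracy hovers near the 1/3 chance level for a 3-way task, which is also the natural baseline against which both human and model performance would be compared.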

Attractive Strangers Look Korean: How Social Traits Bias Ethnicity Perception
Social cognition · Trait bias

Examines how rapid social trait impressions such as attractiveness and trustworthiness bias fine-grained ethnicity categorization, including asymmetric in-group effects.

Methods: trait ratings · mixed-effects analyses
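The mixed-effects approach can be sketched as a model predicting categorization responses from trait ratings, with random intercepts per observer. Everything below is a hypothetical illustration on simulated data (variable names, effect size, and the linear link are all assumptions); a real analysis of binary categorization choices would more likely use a generalized mixed model.

```python
# Illustrative sketch: linear mixed-effects model with a per-subject
# random intercept, fit on simulated data (not study data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_trials = 20, 30
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_trials),
    "attractiveness": rng.normal(size=n_subj * n_trials),  # z-scored trait rating
})
# Simulated continuous DV (e.g., tendency to respond "Korean"),
# with an assumed positive effect of attractiveness.
df["korean_resp"] = 0.3 * df["attractiveness"] + rng.normal(size=len(df))

model = smf.mixedlm("korean_resp ~ attractiveness", df, groups=df["subject"])
result = model.fit()
print(result.params["attractiveness"])  # estimated fixed effect
```

The `groups=` argument gives each observer their own intercept, so the fixed-effect estimate for attractiveness is not confounded with between-subject response biases.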

Reading Emotions When Faces Are Partly Hidden
Eye-tracking · Social vision

Investigates how gaze direction and face masks shape emotion recognition and visual attention to diagnostic facial regions.

Methods: eye-tracking · emotion recognition · ROI analyses

Why Some Robots Feel Uncanny: Audio-Visual Cues in Human–Robot Interaction
Multisensory · HRI

Explores how the naturalness of audio–visual signals influences human–robot interaction and perceptions of the uncanny valley.

Methods: multisensory stimuli · behavioral judgments