Jaewook Lee

Senior at UIUC, Human-Computer Interaction Researcher

Hi! I'm Jaewook (I prefer to go by Jae)! I'm a senior studying computer science and minoring in psychology at the University of Illinois at Urbana-Champaign (UIUC). I'm a human-computer interaction (HCI) researcher who aspires to answer the following question: How can we design more personalized mobile technologies that understand and adapt to us and our surroundings? To answer this question, I build enabling technologies that leverage context (e.g., nearby objects, emotion) and AI to assist us in our day-to-day lives. The enabling technologies I build often use extended reality (XR - VR/AR/MR) or touch interfaces to display results to users for more interactive, in-situ experiences. Ultimately, I envision future mobile devices becoming more people-aware and serving as tools that empower users to interact with data and AI in their everyday lives.

When I'm not conducting research or taking classes, I hang out with friends, play tennis, cook Korean food, and play video games! A recent hobby of mine is creating my own games.

I am actively looking for Ph.D. positions in HCI for Fall 2022.

Download CV


  • 2018-Present

    University of Illinois at Urbana-Champaign

    B.S. in Computer Science, Minor in Psychology

    GPA: 3.95/4.0, Technical GPA: 3.94/4.0

    Dean's List: FA18, SP19, FA19, SP20, FA20, SP21, FA21

    James Scholar: FA19, SP20, FA20, SP21, FA21

Research & Ongoing Work.

Context-Aware Computing

Can we enable mobile devices to extract information (context) from their users or surroundings (e.g., using sensors or computer vision)?

Mobile Computing

Can we equip mobile devices with the necessary technologies (e.g., AI) such that they can process and make sense of contextual data?

XR & Touch

Can we display results of processing contextual data in an interactive and dynamic manner?

XR Remote Study Toolkit

Leading an effort to create an open-source toolkit with the goal of facilitating remote user studies for XR researchers.

Advised by Dr. Eyal Ofek (Microsoft Research).


Conducting a field study using Decipher, a data-driven visualization tool that facilitates understanding of large sets of feedback.

Advised by Prof. Brian P. Bailey (UIUC). Done in collaboration with Dr. Joy Kim (Adobe Research).


Nothing else yet :))


(* denotes equal contribution)

User Preference for Navigation Instructions in Mixed Reality

Jaewook Lee, Fanjie Jin, Younsoo Kim, David Lindlbauer
To Appear at IEEE VR 2022

Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions

Siyou Pei, Alexander Chen, Jaewook Lee, Yang Zhang
To Appear at CHI 2022

ImageExplorer: Multi-Layered Touch Exploration to Encourage Skepticism Towards Imperfect AI-Generated Image Captions

Jaewook Lee, Jaylin Herskovitz, Yi-Hao Peng, Anhong Guo
To Appear at CHI 2022

Challenges and Opportunities for Data-Centric Peer Evaluation Tools for Teamwork

Wenxuan Wendy Shi, Akshaya Jagannadharao, Jaewook Lee, Brian P. Bailey
CSCW 2021


What’s This? A Voice and Touch Multimodal Approach for Ambiguity Resolution in Voice Assistants

Jaewook Lee, Sebastian S. Rodriguez, Raahul Natarrajan, Jacqueline Chen, Harsh Deep, Alex Kirlik
ICMI 2021


Image Explorer: Multi-Layered Touch Exploration to Make Images Accessible

Jaewook Lee, Yi-Hao Peng, Jaylin Herskovitz, Anhong Guo


Explorations of Designing Spatial Classroom Analytics with Virtual Prototyping

JiWoong Jang*, Jaewook Lee*, Vanessa Echeverria, LuEttaMae Lawrence, Vincent Aleven


Developing Virtual Reality Data Kit for Education Researchers

Taehyun Kim, Jaewook Lee, Robb Lindgren, Jina Kang
LSGSC 2020


Measuring Complacency in Humans Interacting with Autonomous Agents in a Multi-Agent System

Sebastian S. Rodriguez, Jacqueline Chen*, Harsh Deep*, Jaewook Lee*, Derrik E. Asher, Erin Zaroukian
SPIE 2020


Achievements & Activities.


The best way to reach me is to send me an email! :))