Jaewook Lee

Senior at UIUC, Human-Computer Interaction Researcher

I am a senior studying computer science and minoring in psychology at the University of Illinois at Urbana-Champaign. I am an HCI researcher with a focus on mobile computing, VR/AR/MR, and context-aware computing. Through this combination of interests, I aim to answer the following question: How can we design technologies such that they adapt and work for us, and not the other way around? To achieve this, I build mobile technologies that leverage sensors, machine learning, computer vision, and augmented reality to collect and understand the data around us. These context-driven implicit (and sometimes explicit) interactions allow mobile devices to become more personalized, so that users can receive assistance when necessary (e.g., at 12pm on a cold day, AR glasses should recommend a hot soup place for lunch).

In my free time, I play tennis, cook Korean food, play video games, and create my own games!

I am actively looking for Ph.D. positions in HCI for Fall 2022.



  • 2018-Present

    University of Illinois at Urbana-Champaign

    B.S. in Computer Science, Minor in Psychology

    GPA: 3.95/4.0, Technical GPA: 3.93/4.0

    Dean's List: FA18, SP19, FA19, SP20, FA20, SP21

    James Scholar: FA19, SP20, FA20, SP21, FA21

Research & Ongoing Work.

Context-Aware Computing

Can we enable mobile devices to extract information (context) from their users or surroundings (e.g., using sensors or computer vision)?

Mobile Computing

Can we equip mobile devices with the necessary technologies (e.g., machine learning) such that they can process and make sense of contextual data?


VR/AR/MR

Can we improve the way mobile devices display results of processing the data by superimposing them onto the real world?

XR Remote Study Toolkit

Leading an effort to create an open-source toolkit with the goal of facilitating remote user studies for XR researchers.

Done in collaboration with Dr. Eyal Ofek at Microsoft Research.

Designing AR Navigation Visualizations

Researching which AR navigation visualizations work best in a given scenario and which contexts pedestrians pay attention to.

Advised by Prof. David Lindlbauer at CMU HCII.


Conducting a field study using Decipher, a data-driven visualization tool that facilitates understanding of large sets of user feedback.

Done in collaboration with Joy Kim at Adobe Research.


(* denotes equal contribution)

Publications (3 papers in submission).

What’s This? A Voice and Touch Multimodal Approach for Ambiguity Resolution in Voice Assistants

Jaewook Lee, Sebastian S. Rodriguez, Raahul Natarrajan, Jacqueline Chen, Harsh Deep, Alex Kirlik
To Appear at ICMI 2021


Image Explorer: Multi-Layered Touch Exploration to Make Images Accessible

Jaewook Lee, Yi-Hao Peng, Jaylin Herskovitz, Anhong Guo
To Appear at ASSETS 2021


Challenges and Opportunities for Data-Centric Peer Evaluation Tools for Teamwork

Wenxuan Wendy Shi, Akshaya Jagannadharao, Jaewook Lee, Brian P. Bailey
To Appear at CSCW 2021


Explorations of Designing Spatial Classroom Analytics with Virtual Prototyping

JiWoong Jang*, Jaewook Lee*, Vanessa Echeverria, LuEttaMae Lawrence, Vincent Aleven


Developing Virtual Reality Data Kit for Education Researchers

Taehyun Kim, Jaewook Lee, Robb Lindgren, Jina Kang
LSGSC 2020


Measuring Complacency in Humans Interacting with Autonomous Agents in a Multi-Agent System

Sebastian S. Rodriguez, Jacqueline Chen*, Harsh Deep*, Jaewook Lee*, Derrik E. Asher, and Erin Zaroukian
SPIE 2020


Achievements & Activities.


The best way to reach me is by email! :)