To achieve this, I am designing, prototyping, and evaluating ubiquitous computing user interfaces with expressive data visualizations and representations. Ideally, interacting with these systems feels “natural” or “intuitive” and creates meaningful human-data experiences. Increasingly, this requires integrating intelligible and explainable forms of data science and artificial intelligence into interactive systems.
Consequently, my research focuses on the intersection of human-data interaction (HDI), human-computer interaction (HCI), interaction design (IxD), information visualization (InfoVis), computer-supported cooperative work (CSCW), and ubiquitous computing (UbiComp).
The overall goal of my research is to understand how to best support individuals or teams during data exploration and sensemaking:
- What kinds of interaction and visualization techniques and data representations do users perceive as “natural” and easy to use?
- Which techniques are easier to learn and cognitively less demanding?
- How can we harness novel post-WIMP (post-“Windows, Icons, Menus, Pointer”) interaction techniques and ubiquitous computing systems, e.g., multi-touch, pen input, gestures, tangible objects, interactive tables and walls, and virtual/augmented reality?
- How can we better support collaboration and navigation in large data sets?
- How can we avoid bias and false interpretations, and convey uncertainty or gaps in data?
- How can we use thoughtful interaction design to map algorithmic and statistical methods to intelligible and explainable representations or metaphors?
On a more theoretical level, I am very interested in how cognitive science and theories of embodied cognition can be applied to human-data interaction. As a result, I have proposed the conceptual framework of Blended Interaction as a cognitive model that can help us design more “natural” interactive systems with a better user experience and more fluid interaction.