Research

We are a human-computer interaction research lab contributing new methods, tools, and technologies for creating novel interactive systems.

We have a history of building and studying next-generation user interfaces, involving touch, gesture, speech, multi-modal, multi-device interactions, and now virtual, augmented, and mixed reality interfaces. When studying these forms of interfaces, our focus is on improving both the designers’ and the end-users’ experience through novel ways of creating and using them.

Latest Research

Over its first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for applying Wizard of Oz and crowdsourcing techniques to gesture design and recognition tasks. Building on the knowledge gained from these systems, the lab has also been developing conceptual frameworks and new ways of thinking about the design of future interfaces, through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], and Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], as well as studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award] and user-driven design principles for gesture representations [CHI’18].

Overview of our externally sponsored research projects:

  • (2018-) Developing methods and tools to make prototyping of augmented reality and virtual reality experiences easier for non-technical designers
  • (2018-) Rethinking the web browser as an augmented reality application delivery platform
  • (2018-) Investigating the use and usefulness of virtual reality visualizations for understanding future city traffic scenarios
  • (2017-) Developing a conceptual framework and implementation of a cross-device augmented reality application platform
  • (2018) Creating augmented reality interfaces to configure and preview future kitchen designs

Previous Research

Here is our Mi2 Lab Summer 2018 Showreel:

Below are examples from our earlier research:

  • XD-AR (EICS’18 paper): an augmented reality platform we are developing for multi-user, multi-device augmented reality experiences that allow users to “edit” the physical world in real time
  • 3D Virtual/Physical Lab: virtual and physical 3D models of our lab environment that we are using to design new forms of natural user interfaces and cross-device interactions for both users in the lab and users participating remotely
  • Surface Pad: initially conceived as a sketching interface that works with new kinds of input devices; we are currently exploring how the interface can be extended to easily import and blend content from the digital and physical worlds. Surface Pad formed the basis of the aforementioned ProtoAR.

During his postdoc at Carnegie Mellon University, the lab’s PI, Michael Nebeling, investigated ways of orchestrating multiple devices and crowds to enable complex information seeking, sensemaking, and productivity tasks on small mobile and wearable devices. He also contributed to the Google IoT project led by CMU. This work led to papers at ACM CHI 2016, including:

  • XDBrowser: a new cross-device web browser that he used to elicit 144 multi-device web page designs for five popular web interfaces, leading to seven cross-device web page design patterns (CHI’16 paper, CHI’16 talk) and semi-automatic cross-device interface generation techniques (CHI’17 paper, CHI’17 talk)
  • Snap-To-It: a mobile app allowing users to opportunistically interact with appliances in multi-device environments simply by taking a picture of them (CHI’16 paper)

Selected Publications

For the full list of publications, please check Google Scholar and DBLP.

  1. What is Mixed Reality? CHI’19
    BEST PAPER HONORABLE MENTION
    M. Speicher, B.D. Hall, M. Nebeling
  2. 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper CHI’19
    M. Nebeling, K. Madier
  3. ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications CHI’18
    M. Nebeling, J. Nebeling, A. Yu, R. Rumble
  4. GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes CHI’18
    M. Speicher, M. Nebeling
  5. User-Driven Design Principles for Gesture Representations CHI’18
    E. McAweeney, H. Zhang, M. Nebeling
  6. XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development EICS’18
    BEST PAPER AWARD
    M. Speicher, B.D. Hall, A. Yu, B. Zhang, H. Zhang, J. Nebeling, M. Nebeling
  7. 360Anywhere: Mobile Ad-hoc Collaboration in Any Environment using 360 Video and Augmented Reality EICS’18
    M. Speicher, J. Cao, A. Yu, H. Zhang, M. Nebeling
  8. XDBrowser 2.0: Semi-Automatic Generation of Cross-Device Interfaces CHI’17
    M. Nebeling