Research

We are a human-computer interaction research lab contributing new methods, tools, and technologies for creating novel interactive systems.

We have a history of building and studying next-generation user interfaces involving touch, gesture, speech, multi-modal, and multi-device interaction, and now virtual, augmented, and mixed reality interfaces. When studying these interfaces, our focus is on improving both the designers’ and the end-users’ experience through novel ways of creating and using them.

We thank our sponsors for supporting the research below.

Current Research Topics

Our research group is currently pursuing three major research themes:

  • Empowering novice XR designers: In a series of new systems, we are exploring different ways of enabling rapid prototyping of new AR/VR experiences without significant training in 3D programming, animation, or modeling.
  • Virtual production: We are fascinated by the power and flexibility of XR technology for creating novel user experiences and are experimenting with ways of simulating and studying user interfaces and interaction techniques without fully developed software and hardware systems.
  • Immersive instruction: As part of our involvement in the U-M-wide XR initiative, we are investigating new techniques and tools for creating instructional materials that make use of XR technology to facilitate training and learning.

Recent Work

We made three contributions to ACM CHI 2020, the premier conference in the human-computer interaction field:

  • Mixed reality user studies: We presented MRAT, the Mixed Reality Analytics Toolkit, designed to support user evaluations on AR/VR devices. We elicited a diverse set of requirements and presented general concepts and techniques for collecting, pre-processing, and visualizing mixed reality user data.
  • Collaborative immersive authoring: We also presented XRDirector, a role-based, multi-user immersive authoring system that adapts roles familiar from filmmaking to coordinate multiple AR/VR designers, and we characterized the issues related to spatial coordination.
  • Key barriers to entry for novice XR creators: Finally, we co-authored an interview study with 21 participants that elaborates on their practices and challenges in using current XR technologies and tools and explores opportunities for better end-user programming support.

UPDATE: The first and last contributions each won a 🏆Best Paper Award (top 1%) at CHI 2020. We are grateful for this recognition of the hard work of students and faculty in the lab.

Earlier Research

Over our first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for using Wizard of Oz and crowdsourcing in gesture design and recognition tasks. Building on the knowledge from these systems, the lab has also been working on conceptual frameworks and new ways of thinking about the design of future interfaces through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], and studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award], as well as user-driven design principles for gesture representations [CHI’18].

We keep an archive of our previous research here.

Publications (since starting the lab at U-M in 2016)

For the full list of publications, please check Google Scholar and DBLP.

  1. XRDirector: A Role-Based Collaborative Immersive Authoring System CHI’20
    M. Nebeling, K. Lewis, Y.-C. Chang, L. Zhu, M. Chung, P. Wang, J. Nebeling
  2. MRAT: The Mixed Reality Analytics Toolkit CHI’20
    🏆BEST PAPER AWARD
    M. Nebeling, M. Speicher, X. Wang, S. Rajaram, B.D. Hall, Z. Xie, A.R.E. Raistrick, M. Aebersold, E.G. Happ, J. Wang, Y. Sun, L. Zhang, L. Ramsier, R. Kulkarni
  3. Creating Augmented and Virtual Reality Applications: Current Practices, Challenges, and Opportunities CHI’20
    🏆BEST PAPER AWARD
    N. Ashtari, A. Bunt, J. McGrenere, M. Nebeling, P.K. Chilana
  4. iGYM: An Interactive Floor Projection System for Inclusive Exergame Environments CHI PLAY’19
    🏆BEST PAPER AWARD
    R. Graf, P. Benawri, A.E. Whitesall, D. Carichner, Z. Li, M. Nebeling, H.S. Kim
  5. What is Mixed Reality? CHI’19
    🏅BEST PAPER HONORABLE MENTION
    M. Speicher, B.D. Hall, M. Nebeling
  6. 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper CHI’19
    M. Nebeling, K. Madier
  7. ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications CHI’18
    M. Nebeling, J. Nebeling, A. Yu, R. Rumble
  8. GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes CHI’18
    M. Speicher, M. Nebeling
  9. User-Driven Design Principles for Gesture Representations CHI’18
    E. McAweeney, H. Zhang, M. Nebeling
  10. XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development EICS’18
    🏆BEST PAPER AWARD
    M. Speicher, B.D. Hall, A. Yu, B. Zhang, H. Zhang, J. Nebeling, M. Nebeling
  11. 360Anywhere: Mobile Ad-hoc Collaboration in Any Environment using 360 Video and Augmented Reality EICS’18
    M. Speicher, J. Cao, A. Yu, H. Zhang, M. Nebeling
  12. XDBrowser 2.0: Semi-Automatic Generation of Cross-Device Interfaces CHI’17
    M. Nebeling