Reflecting on our iGYM project

Roland Graf, associate professor at the University of Michigan Stamps School of Art & Design, and Michael Nebeling, assistant professor at the University of Michigan School of Information, with Bryan Kreps during a “play test” for iGYM. (Photo by Roger Hart, Michigan Photography)

Professor Michael Nebeling was part of the U-M team that created the first version of iGYM, a system designed for school and community-based sport and recreation facilities seeking to provide novel, accessible ways for people with different levels of mobility to play and exercise together. In our current implementation, iGYM uses a simple form of projected augmented reality to produce an interactive gaming floor for inclusive play, a “life-size version of air hockey.”

Funded by U-M’s ESSI program, the project achieved many of the goals envisioned by U-M’s exercise and sport science initiative. It involved the target audience directly and had a large impact on the users who helped us test and improve it: the kids, their friends, and their parents. The project also received a Best Paper Award at CHI PLAY 2019 and wide media coverage in December 2019 (e.g., in U.S. News, ABC News, VRScout, as well as in local newspapers).

Most media articles provide the perspective of Professor Roland Graf, our friend and collaborator and the lead researcher on the project. For Professor Nebeling, it was a very rewarding project to be a part of and something he won’t forget. While it is of course nice to have helped the team win a best paper award in the HCI research discipline, the professional and personal impact on Professor Nebeling was much larger than that.

First, he was able to recruit two UMSI master’s students taking his brand-new AR/VR courses (SI 559 and SI 659), Pallavi Benawri and Amy Whitesall, as research assistants. Both played a significant role in the success of this project!

Second, with most research projects, a researcher is happy when a user study produces great results and demonstrates the effectiveness of the system design. With iGYM, a complex system that is difficult to study given its many design parameters, it was a little different. While the system was received positively overall, our user study revealed some mixed feelings about it. Despite the use of algorithms and manual fine-tuning to adapt the system’s difficulty and balancing model, it remains a challenge to make play both fun and fair; with the range of parameters we can control, it was quite easy to “overbalance” the system. The results helped us understand how to make the system better in the future, and that is what we consider a success.

But perhaps the greatest success of this work was that we could begin to see how iGYM might be adopted in the future. To this end, Professor Nebeling really enjoyed helping organize and host two play days with the larger team, including our consultant Betsy Howell, the kids, and of course their parents. It was an amazing atmosphere and really interesting to see our research prototype in action, having our participants “break test” it through creative use, and misuse, of the features and glitches of our implementation. 🙂

Many thanks to the team, especially to Roland Graf, Pallavi Benawri, Amy Whitesall, and Betsy Howell.

3 Papers at CHI 2020

UPDATE: Two of our papers at CHI 2020 received Best Paper Awards. Woohoo!

We are happy to announce that we have three papers conditionally accepted at CHI 2020. They feature our work on MRAT, our toolkit for recording user interaction sessions with AR/VR applications created with Unity and producing analytics that can be inspected in a dashboard or visualized in situ in mixed reality; XRDirector, a new collaborative immersive authoring system that adapts roles from filmmaking to coordinate multiple co-located designers, some working in VR and others in AR; and an interview study on key barriers to entry for AR/VR content creators with different backgrounds and levels of expertise, meant to guide the design of future tools specifically for end-user programming of AR/VR experiences.

  • M. Nebeling, M. Speicher, X. Wang, S. Rajaram, B.D. Hall, Z. Xie, A. Raistrick, M. Aebersold, E.G. Happ, J. Wang, Y. Sun, L. Zhang, L. Ramsier, R. Kulkarni: MRAT: The Mixed Reality Analytics Toolkit. In Proc. CHI 2020. Best Paper Award
  • M. Nebeling, K. Madier, Y. Chang, L. Zhu, M. Chung, P. Wang, J. Nebeling: XRDirector: A Role-Based Collaborative Immersive Authoring System. In Proc. CHI 2020.
  • N. Ashtari, A. Bunt, J. McGrenere, M. Nebeling, P.K. Chilana: Creating Augmented and Virtual Reality Applications: Current Practices, Challenges, and Opportunities. In Proc. CHI 2020. Best Paper Award
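MRAT itself is a Unity toolkit, and its actual API is described in the paper. Purely as a loose illustration of the kind of session recording and metrics such analytics toolkits work with, here is a hypothetical sketch in Python; all names in it are invented for illustration and are not part of MRAT:

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Event:
    """A single timestamped interaction event (e.g., a tap, gaze, or task marker)."""
    name: str
    timestamp: float
    data: dict = field(default_factory=dict)

class SessionRecorder:
    """Records interaction events during an AR/VR session for later analysis."""

    def __init__(self) -> None:
        self.events: List[Event] = []

    def log(self, name: str, **data) -> None:
        """Record an event with the current wall-clock time."""
        self.events.append(Event(name, time.time(), data))

    def count(self, name: str) -> int:
        """How many times a given event occurred in the session."""
        return sum(1 for e in self.events if e.name == name)

    def task_duration(self, start: str, end: str) -> Optional[float]:
        """Seconds between the first start marker and the last end marker."""
        starts = [e.timestamp for e in self.events if e.name == start]
        ends = [e.timestamp for e in self.events if e.name == end]
        return ends[-1] - starts[0] if starts and ends else None

# Example session: a user selects two virtual objects while completing a task.
rec = SessionRecorder()
rec.log("task_start")
rec.log("object_selected", object_id="cube")
rec.log("object_selected", object_id="sphere")
rec.log("task_end")
print(rec.count("object_selected"))  # → 2
```

Metrics like these (event counts, task completion times) are the sort of thing a dashboard can then aggregate per user and per session.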

Winter 2020 Research Positions

UPDATE: The available positions have been filled for Winter 2020. We will consider new applicants for Fall 2020, and will reach out to students in August 2020.

The Michigan Information Interaction Lab is looking to fill up to three new positions for master’s students from across U-M (please apply by filling in this form). Ideally, applicants have taken some of Professor Nebeling’s AR/VR courses, SI 559 and/or SI 659, and have significant experience with methods and tools in the desired areas:

  • AR/VR web developer: Help develop a set of web-based AR/VR tools that were previously published and that are now becoming part of a larger AR/VR online creation platform. The platform will be used to support a new AR/VR MOOC specialization on Coursera currently being designed and produced by Professor Nebeling with the Center for Academic Innovation. Significant experience with JavaScript is required. Experience with A-Frame and Unity is a plus.
  • AR/VR Unity developer: Help create a set of novel AR/VR experiences that make use of a variety of input and output technologies. Significant experience with Unity is required. Experience with 3D animation and modeling tools is a plus.
  • AR/VR designer: Help create interactive 3D content for a variety of new AR/VR experiences that will be created next year as part of Professor Nebeling’s research and his efforts to support the U-M wide XR Initiative. Significant experience with 3D modeling and animation tools is required. Experience with UX practices and AR/VR design methods is a plus.

Positions are paid research assistant positions. While the expected workload is 6-10 hours per week, there will not always be an equal amount of work every week, and students will contribute to multiple projects based on their skills rather than being assigned to one semester-long project. There will be a short interview process, and possibly a short trial period, to make sure that candidates are a good fit and have the required skills. Positions may also be taken for credit in the form of an independent study.

While we are an ambitious research group, we are also a fun team of students and faculty interested in experimenting with up-and-coming AR/VR technologies. Feel free to reach out to current or previous students working in the lab to learn about their experience.

IMPORTANT: Please apply by filling in this form. Don’t just send emails to express your interest; we will not be able to respond to them. We will review the applications entered via the form and get back to you after initial review in early January 2020. Thank you for your interest!

New XR Initiative at U-M

Professor Nebeling and members of the Information Interaction Lab are excited to share that they have been recognized as a key partner in U-M’s new major XR Initiative, which was recently announced by the Provost and endorsed by the President.

While the specifics and strategic goals are still in development, one of the most exciting first milestones for us in the lab will be a new set of online courses. These will provide significant training in the design and development of AR/VR technologies, together with frameworks for the ethical, social, privacy-, and security-aware design of AR/VR apps, as well as guidelines and recommendations for creating new AR/VR spaces for research and development as well as teaching and learning, which will be particularly helpful for resource-constrained environments such as many libraries and schools. These new learning materials will significantly expand on Professor Nebeling’s AR/VR/MR Teach-Out on Coursera, which has provided an introduction to AR/VR technologies to more than 2,400 active learners worldwide.

We have also started new collaborations with several key vendors in the AR/VR space.

We are so excited by the enthusiastic support and the new movements at U-M! If you are interested in supporting our efforts, please reach out to Professor Nebeling at

Disney Research|Studios Faculty Award

We’re happy to announce that Professor Michael Nebeling has received the Disney Research|Studios Faculty Award 2019.

Michael traveled to the ACM SIGGRAPH 2019 conference in Los Angeles, CA, in July and received the award at DisPLAY, the Disney Mixer social event held during the conference.

Disney Research|Studios Director and ETH Zurich Professor, Markus Gross, and Associate Director, Bob Sumner, announced the award at the event and handed it over to Michael in a very nice ceremony.

The award, “In recognition of your scientific excellence”, honors Michael’s important research at the intersection of HCI and AR/VR over the past few years, aimed at empowering non-programmers and user experience designers to create AR/VR experiences rapidly and cheaply.

Thank you, Disney!

New Postdoc Positions

The Michigan Information Interaction Lab directed by Professor Michael Nebeling is looking for 1-2 postdocs to join its efforts in designing and studying new interactive systems & technologies.

Over its first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for using Wizard of Oz and crowdsourcing in gesture design and recognition tasks. Building on the knowledge gained from these systems, the lab has also been working on conceptual frameworks and new ways of thinking about the design of future interfaces, through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], and Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], as well as studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award] and user-driven design principles for gesture representations [CHI’18].

We’re considering applications from successful PhD graduates (defended by the start date) who have been active in premier HCI conferences, notably CHI, UIST, and CSCW, but also in related, more specialized venues such as IMWUT (formerly UbiComp), ISS (formerly ITS), EICS, TEI, SUI, ISMAR, VR, and VRST.

As a new postdoctoral research fellow in the lab, you would work closely with Michael and his students to help develop and lead new research projects on novel interactive technologies and systems. We currently focus our research on designing, developing, and evaluating novel AR/VR experiences, but we’re always interested in broadening our research activities as well.

Interested applicants should get in touch with Michael directly via email (

2 Papers at CHI 2019

Update: Michael also contributed to an article summarizing some key papers at CHI 2019, which appeared in IEEE Pervasive and can be found here.

We are happy to announce that we have two papers accepted at CHI 2019. They feature our work on 360proto, a new AR/VR physical-digital prototyping tool that allows designers to make both AR and VR prototypes from paper and bring them to life on common AR/VR devices, and a critical reflection on the question of What is Mixed Reality?, written up as a paper that is part scientific literature review, part analysis of the state of the art in industry, and part interviews with leading experts on AR/VR in both academia and industry.

  1. M. Nebeling, K. Madier: 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper
  2. M. Speicher, B.D. Hall, M. Nebeling: What is Mixed Reality? Best Paper Honorable Mention

Star Trek example based on 360 sketches and digitized using 360proto to run on an ARCore phone

We also announced earlier that we will offer an AR/VR prototyping course at CHI this year! The idea to teach this course really grew out of our success teaching AR/VR interaction design to a diverse student body in Fall 2018. We have since started teaching a more advanced and technical version this semester.

The CHI 2019 course will be an exciting opportunity for new and experienced AR/VR researchers, designers, and students to come together and learn about our rapid prototyping techniques in a hands-on manner. Should be fun!

See you at CHI 2019 in Glasgow, Scotland, UK this year!

AR/VR Prototyping Course at CHI 2019

We are excited to announce our CHI 2019 AR/VR prototyping course. The course web site can be found here.

AR/VR technologies are starting to disrupt our everyday experiences and they are playing an increasingly important role in interaction design. The course taught by Professor Nebeling will introduce CHI 2019 attendees to state-of-the-art AR/VR prototyping methods and tools including ProtoAR, which we recently presented at CHI 2018 and are currently preparing for public release.

The course will be very hands-on, introducing participants to both physical and digital prototyping techniques. These techniques allow designers to rapidly create and try out AR/VR interface designs in order to decide how best to take a particular design concept further, or to abandon it early, before putting in all the effort currently required to program these interfaces.

Join us to learn more about ways of incorporating AR/VR design activities into existing courses, workshops, and student design jams!

Find out more about the course here

New Grant from Mozilla

We are really excited to receive a Mozilla Research Grant for our research on AR/VR.

The research gift from Mozilla will enable Professor Nebeling and his lab to conduct two important studies on the future of the web and its use as a platform for augmented reality applications. The first study will explore a range of desirable augmented reality applications and how they could be created by composing popular existing applications in interesting and useful new ways. The second study will focus on new interaction techniques and technical solutions for sharing links to, and content of, web applications through novel augmented reality interfaces. Together, these studies will inform the design of future web browsing interfaces and technologies with user-driven support for AR/VR.

We are very happy that Mozilla is supporting the research in the lab. The support will allow us to bring on several new MSI students, many of whom Professor Nebeling has been teaching in his new AR/VR course, and who are now excited to work on this project, practice the techniques they studied, and deepen their research skills.

Mozilla selected us because they believe in the work we do and in the impact it can have on the web for AR/VR. While many vendors try to shape the future of AR/VR around their specific platforms and technologies, the web needs to remain the open-access, device-agnostic platform that it is. This is very important to Mozilla, and to us, and requires new research like ours.