2020 Positions for PhD Students and Postdocs

UPDATE: We have received more than 30 applications from students interested in joining the research of Professor Michael Nebeling and his team. We’re very excited by that. Thank you! We will review applications internally and start contacting shortlisted candidates after January 9, 2020. We hope to make and communicate final decisions by the end of January.

The Michigan Information Interaction Lab will have several open positions in 2020 at different levels. People joining in these positions will work directly with me (Michael Nebeling):

  • 1-2 PhD student positions: If you’re thinking about doing a PhD and applying to UMSI or CSE at Michigan (I can supervise in both, and am happy to co-supervise with other professors in either department; in fact, I consider many close colleagues and friends!), and you have broadly defined interests in mixed reality (any angle really: UX/interaction design, arts, computer graphics, ML, and robotics are all fascinating to me!), definitely get in touch! Don’t just apply without contacting me first, and don’t limit yourself to either UMSI or CSE (they are both great, but also quite different, so you may increase your chances of finding the right spot for you at U-M by applying to both). I’m happy to chat with you before your application and share my experience.
  • 1 postdoc position: I originally posted about this here. I did my postdoc at CMU and it was one of the best parts of my academic career. U-M has great resources, and UMSI (my home department) is so interdisciplinary, it’s really eye-opening. If you want to push yourself after your PhD and work on your next career steps, whether in academia or industry, I would really like to meet you!
  • 1 visiting PhD researcher position: The next step after the PhD is a critical career decision; I have experienced this personally and know all too well how difficult it is. I would like to offer a 6-12 month funded research stay in my lab, with the possibility of conversion into a postdoc if it works for both of us. U-M is great, you should definitely consider it!
  • Presidential Postdocs: We also have the presidential postdoc fellowship program, which means you could end up in a position that will be converted to a tenure-track assistant professorship after 1-2 years. Applications are now open (until October 15, 2019).

For any of the positions above, don’t just apply. Reach out to me and express your interest first, so I can guide you through the process. Please include a short statement of interest and a CV, if you can.

I look forward to hearing from you!

Disney Research|Studios Faculty Award

We’re happy to announce that Professor Michael Nebeling has received the Disney Research|Studios Faculty Award 2019.

Michael traveled to the ACM SIGGRAPH 2019 conference in Los Angeles, CA, in July and received the award at DisPLAY, the Disney Mixer social event held during the conference.

Disney Research|Studios Director and ETH Zurich Professor, Markus Gross, and Associate Director, Bob Sumner, announced the award at the event and handed it over to Michael in a very nice ceremony.

The award, given “In recognition of your scientific excellence”, honors Michael’s important research at the intersection of HCI and AR/VR over the past few years, which aims to empower non-programmers and user experience designers to create AR/VR experiences rapidly and cheaply.

Thank you, Disney!

New Postdoc Positions

The Michigan Information Interaction Lab directed by Professor Michael Nebeling is looking for 1-2 postdocs to join its efforts in designing and studying new interactive systems & technologies.

Over its first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for using Wizard of Oz and crowdsourcing in gesture design and recognition tasks. Using the knowledge from building these systems, the lab has also been working on conceptual frameworks and new ways of thinking about the design of future interfaces, through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], and Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], as well as studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award] and user-driven design principles for gesture representations [CHI’18].

We’re considering applications from successful PhDs (defended by the start date) who have been active in premier HCI venues, notably CHI, UIST, and CSCW, but also related, more specialized venues such as IMWUT (formerly UbiComp), ISS (formerly ITS), EICS, TEI, SUI, ISMAR, VR, and VRST.

As a new postdoctoral research fellow in the lab, you would work closely with Michael and his students to help develop and lead new research projects on novel interactive technologies and systems. We currently focus our research on designing, developing, and evaluating novel AR/VR experiences, but we’re always interested in broadening our research activities as well.

Interested applicants should get in touch with Michael directly via email.

2 Papers at CHI 2019

Update: Michael also contributed to an article summarizing some key papers at CHI 2019, which appeared in IEEE Pervasive and can be found here.

We are happy to announce that we have two papers accepted at CHI 2019. The first features our work on 360proto, a new AR/VR physical-digital prototyping tool that allows designers to make both AR and VR prototypes from paper and bring them to life on common AR/VR devices. The second is a critical reflection piece on the question of What is Mixed Reality?, written up as a paper that is part scientific literature review, part analysis of the state of the art in industry, and part interviews with leading experts on AR/VR in both academia and industry.

  1. M. Nebeling, K. Madier: 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper
  2. M. Speicher, B.D. Hall, M. Nebeling: What is Mixed Reality? Best Paper Honorable Mention

Star Trek example based on 360 sketches and digitized using 360proto to run on an ARCore phone

We also announced earlier that we will offer an AR/VR prototyping course at CHI this year! The idea to teach this course really grew out of our success teaching AR/VR interaction design to a diverse student body in Fall 2018. We have since started teaching a more advanced and technical version this semester.

The CHI 2019 course will be an exciting opportunity for new and old AR/VR researchers, designers, and students to come together and learn about our rapid prototyping techniques in a hands-on manner. Should be fun!

See you at CHI 2019 in Glasgow, Scotland, UK this year!

AR/VR Prototyping Course at CHI 2019

We are excited to announce our CHI 2019 AR/VR prototyping course. The course web site can be found here.

AR/VR technologies are starting to disrupt our everyday experiences and they are playing an increasingly important role in interaction design. The course taught by Professor Nebeling will introduce CHI 2019 attendees to state-of-the-art AR/VR prototyping methods and tools including ProtoAR, which we recently presented at CHI 2018 and are currently preparing for public release.

The course will be very hands-on, introducing participants to both physical and digital prototyping techniques. These techniques allow designers to rapidly create and try out AR/VR interface designs in order to decide how best to take a particular design concept further, or to abandon it early, before putting in all the effort currently required to program these interfaces.

Join us to learn more about ways of incorporating AR/VR design activities into existing courses, workshops, and student design jams!

Find out more about the course here

New Grant from Mozilla

We are really excited to receive a Mozilla Research Grant for our research on AR/VR.

The research gift from Mozilla will enable Professor Nebeling and his lab to conduct two important research studies around the future of the web and its use as a platform for augmented reality applications. The first study will explore a range of desirable augmented reality applications and how they could be created by composing popular existing applications in interesting and useful new ways. The second study will focus on new interaction techniques and technical solutions for sharing links to, and content of, web applications through novel augmented reality interfaces. Together, these studies will inform the design of future web browsing interfaces and technologies with user-driven support for AR/VR.

We are very happy that Mozilla is supporting the lab’s research. The support will allow us to bring on several new MSI students, many of whom Professor Nebeling has been teaching in his new AR/VR course, and who are now excited to work on this project, practice the techniques they studied, and deepen their research skills.

Mozilla selected us because they believe in the work we do and in the impact it can have on the web for AR/VR. While many vendors try to shape the future of AR/VR around their specific platforms and technologies, the web needs to remain the open, device-agnostic platform that it is. This is very important to Mozilla, and to us, and it requires new research like ours.

Fall 2018/Winter 2019 Research Positions

Update: Thank you to everyone who applied to work with the lab. It was exciting to see so much interest. We have added 8 new research assistants and 8 new independent study students to the lab. This was really the maximum we could do for next semester.

The Information Interaction Lab at the University of Michigan School of Information is looking for up to three new master’s students to assist with a range of ongoing AR/VR projects. We offer two types of positions:

  • technical positions focus on technological aspects and typically involve a variety of programming and technical HCI research tasks around emerging toolkits and tools;
  • design positions focus on HCI/UX and interaction design aspects and typically involve complex UX design tasks and experimentation with new kinds of prototyping methods.

We are looking for students with solid AR/VR design and development experience. Students who have completed Professor Nebeling’s SI 559 AR/VR Application Design, or who have completed workshops as part of the Alternate Reality Initiative (ARI) run by Michael Zhang and others, are especially encouraged to apply. For technical positions, you should consider yourself an expert in Unity and/or Three.js/A-Frame, and have experience working with AR frameworks such as HoloToolkit, Tango, ARCore, ARKit, and AR.js. For design positions, we expect that you are familiar with state-of-the-art prototyping methods and tools.
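To give a sense of the web-based side of this stack, here is a minimal A-Frame scene, purely illustrative (the entities, colors, and library version are arbitrary examples, not taken from any lab project). A-Frame lets you describe a 3D/VR scene declaratively in HTML on top of Three.js:

```html
<!-- Minimal A-Frame scene: a few primitives and a sky, viewable in a WebVR/WebXR-capable browser -->
<html>
  <head>
    <!-- Load the A-Frame library (version here is just an example) -->
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Each primitive below maps to a Three.js object under the hood -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

This kind of declarative scene description is roughly the level of web-based AR/VR development the technical positions build on.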

We realize that only a tiny fraction of students at U-M have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience, along with the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

For students with the required background and skills, positions can be converted into paid temporary research assistant positions (pay commensurate with experience) after a short trial period. For students who do not yet have the necessary experience, but want to deepen their knowledge in AR/VR while experimenting with new devices and technologies, we also consider independent study projects. Finally, we have worked with several UROP students over the years and are always eager to consider new UROP applicants as well.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interests in working with us. If you have it ready, please also include a CV.

Attending UIST, ISMAR, and AWE EU

Attending the ISMAR workshop on “Creativity in Design with & for Mixed Reality”.

Last week, Michael Nebeling, director of the Information Interaction Lab, attended three conferences in Germany: UIST, the premier forum for innovative user interfaces, ISMAR, the premier conference for augmented reality, and AWE EU, the Europe edition of the Augmented World Expo.

At UIST, Walter Lasecki, director of the Crowds+Machines Lab, presented our paper on Arboretum, a shared browsing architecture that allows users with, for example, visual impairments to hand off a web browsing session to trusted crowd workers and friends.

At ISMAR, Michael Nebeling presented a paper co-authored with former postdoc Max Speicher on the trouble with AR/VR authoring tools. There is a rapidly growing landscape of diverse tools, but few, in our opinion, adequately address the needs of non-programmers such as user experience researchers and interaction designers. We reflected on two of our recent tools, ProtoAR and GestureWiz, both presented at CHI this year, presented a classification of existing tools, and discussed three major troubles:

  1. First, there is already a massive tool landscape, and it’s rapidly growing. This makes it hard for new designers to get started, and hard even for experienced developers to keep track (except for those who swear by Unity, which supports a wide range of AR/VR features, provided you’ve spent enough time with the tool to master the learning curve and are comfortable writing C# code to “prototype”).
  2. Second, design processes are unique patchworks. This is not unique to AR/VR interaction design, but it’s especially true there: essentially, every AR/VR app requires a unique tool chain. The tools we identified in lower classes are too limited for most apps, while tools in higher classes, such as A-Frame, Unity, and Unreal Engine, are out of reach for many designers.
  3. Third, there are significant gaps both within and between tools. Unfortunately, the tool chain is optimized in the upward direction, allowing export and import only into “higher” tools. This makes design iterations tricky and expensive. We need to build better integrations between tools to allow multiple different paths and rapid iteration, even if that means going back to an earlier tool.

A particular highlight of the ISMAR workshop was the presentation on WebXR by Blair MacIntyre, principal scientist at Mozilla and professor on leave from Georgia Tech. We are hoping to start a collaboration with Mozilla soon, so stay tuned!

It was definitely an exciting week, seeing many live demos and having great discussions with old and new friends!

Finally, AWE highlighted the difference between AR/VR in research and in industry. While the demos at UIST were all forward-thinking, highly experimental, and very diverse, the dominant theme across the AWE EU exhibitions was AR support for IoT applications. Almost every exhibitor brought along a physical model (some of which were definitely quite exciting) and then used an AR device to “look under the hood”, configuring it with live previews or demonstrating training and repair scenarios. While the research has finally matured enough to make this possible outside the lab, this was one of the first basic applications in research, demonstrated repeatedly for at least a decade.

The Trouble with AR/VR Authoring Tools

The lab is going to participate in ISMAR 2018, the premier AR conference in the field, and present a position paper at the “Creativity in Design with & for Mixed Reality” workshop.

Our paper, entitled “The Trouble with AR/VR Authoring Tools”, is essentially a survey of existing AR/VR authoring tools, providing a classification of the tools based on their features and a discussion of the problems based on our experience with them. Here’s a short summary from the paper’s introduction:

In this position paper, we classify existing authoring tools relevant to AR/VR, identify five classes of tools (Fig. 1), and characterize the main issues we see with how the tool landscape has been evolving. Both authors have a track record of research on interactive technologies with a more recent focus on AR/VR [20, 28–30]. For example, they created ProtoAR [20], a tool designed with the vision of making AR/VR prototyping as easy and versatile as paper prototyping, and GestureWiz [30], a Wizard of Oz gesture prototyping environment. The second author also contributed to the design and development of HoloBuilder [31] from 2015 to 2017. When he joined the company, the original idea was to create a “PowerPoint for AR,” enabling users without specific design and development skills to create AR experiences. For the future, we envision tools as simple yet powerful as PowerPoint or Keynote leveling the playing field for AR/VR.

The paper will be published in the ISMAR 2018 Adjunct proceedings; a pre-print is available here.

Over the past two years, we had multiple students try out several of these tools in our research projects, with mixed success. It seems that the only viable way to create AR/VR prototypes is knowing how to use Three.js/A-Frame or Unity. Many of the students we work with, however, are not experienced programmers and struggle with the steep learning curve.

Of course, in our own research, we have been addressing this by developing useful and effective tools that enable rapid prototyping of AR/VR interfaces without programming; ProtoAR is just the first example of this new stream of our research.

There have always been new tools; one promising one, Halo, is unfortunately just being shut down. I got the sad news this week, but I’m sure that Dror and Eran will come up with something else and pursue it with the same passion soon. I really enjoyed working with them, and they were very kind and willing to support my new AR/VR course here at Michigan.