Blog

New XR Initiative at U-M

Professor Nebeling and members of the Information Interaction Lab are excited to share that they have been recognized as a key partner in U-M’s new major XR Initiative, which was recently announced by the Provost and endorsed by the President.

While the specifics and strategic goals are still in development, one of the most exciting first milestones for us in the lab will be a new set of online courses. These will provide significant training in the design and development of AR/VR technologies, together with frameworks for the ethical, social, privacy, and security aware design of AR/VR apps, as well as guidelines and recommendations for creating new AR/VR spaces for research and development and for teaching and learning, which will be particularly helpful for resource-constrained environments such as many libraries and schools. These new learning materials will significantly expand on Professor Nebeling's AR/VR/MR Teach-Out on Coursera, which has provided an introduction to AR/VR technologies to more than 2,400 active learners worldwide.

We have also started new collaborations with several key vendors in the AR/VR space.

We are so excited by the enthusiastic support and the new movements at U-M! If you are interested in supporting our efforts, please reach out to Professor Nebeling at nebeling@umich.edu.

New 2020 Research Positions

The Michigan Information Interaction Lab will have several open positions in 2020 at different levels. People in these positions will work directly with me (Michael Nebeling):

  • 1-2 PhD student positions: If you’re thinking about doing a PhD and applying to UMSI or CSE at Michigan (I can supervise in both, and am happy to co-supervise with other professors in either department; in fact, I consider many close colleagues and friends!), and you have broadly defined interests in mixed reality (any angle really: UX/interaction design, arts, computer graphics, ML, and robotics are all fascinating to me!), definitely get in touch! Don’t just apply, and don’t just consider UMSI or CSE (they are both great, but also quite different, so you may increase your chances of finding the right spot for you at U-M by applying to both). I’m happy to chat with you before your application and share my experience.
  • 1 postdoc position: I originally posted about this here. I did my postdoc at CMU and it was one of the best parts of my academic career. U-M has great resources, and UMSI (my home department) is so interdisciplinary that it’s really eye-opening. If you want to push yourself after your PhD to work on your next career steps, be it in academia or industry, I would really like to meet you!
  • 1 visiting PhD researcher position: The next step after the PhD can be a critical career decision; I have personally experienced this and know all too well how difficult it is to make. I would like to offer a 6-12 month funded research stay in my lab, with the possibility of conversion into a postdoc if it works for both you and me. U-M is great, you should definitely consider it!
  • Presidential Postdocs: We also have the presidential postdoc fellowship program, which means you could end up in a position that will be converted to a tenure-track assistant professor position after 1-2 years. Applications are now open (until October 15, 2019).

For any of the positions above, don’t just apply. Reach out to me and express your interest first, so I can guide you through the process. Please include a short statement of interest and a CV, if you can.

I look forward to hearing from you!

Disney Research|Studios Faculty Award

We’re happy to announce that Professor Michael Nebeling has received the Disney Research|Studios Faculty Award 2019.

Michael traveled to the ACM SIGGRAPH 2019 conference in Los Angeles, CA, in July and received the award at DisPLAY, the Disney Mixer social event held during the conference.

Disney Research|Studios Director and ETH Zurich Professor Markus Gross and Associate Director Bob Sumner announced the award at the event and presented it to Michael in a very nice ceremony.

The award, “In recognition of your scientific excellence”, recognizes Michael’s important research at the intersection of HCI and AR/VR over the past few years, aimed at empowering non-programmers and user experience designers to create AR/VR experiences quickly and inexpensively.

Thank you, Disney!

New Postdoc Positions

The Michigan Information Interaction Lab directed by Professor Michael Nebeling is looking for 1-2 postdocs to join its efforts in designing and studying new interactive systems & technologies.

Over its first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for applying Wizard of Oz and crowdsourcing to gesture design and recognition tasks. Using the knowledge gained from building these systems, the lab has also been working on conceptual frameworks and new ways of thinking about the design of future interfaces, through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], and studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award], as well as user-driven design principles for gesture representations [CHI’18].

We’re considering applications from candidates with a successfully defended PhD (by the start date) who have been active in premier HCI conferences, notably CHI, UIST, and CSCW, but also related, more specialized venues such as IMWUT (formerly UbiComp), ISS (formerly ITS), EICS, TEI, SUI, ISMAR, VR, and VRST.

As a new postdoctoral research fellow in the lab, you would work closely with Michael and his students to help develop and lead new research projects on novel interactive technologies and systems. We currently focus our research on designing, developing, and evaluating novel AR/VR experiences, but we’re always interested in broadening our research activities as well.

Interested applicants should get in touch with Michael directly via email (nebeling@umich.edu).

2 Papers at CHI 2019

We are happy to announce that we have two papers accepted at CHI 2019. The first features our work on 360proto, a new AR/VR physical-digital prototyping tool that allows designers to make both AR and VR prototypes from paper and bring them to life on common AR/VR devices. The second is a critical reflection piece on the question of What is Mixed Reality?, written up as a paper that is part scientific literature review, part analysis of the state of the art in industry, and part interviews with leading experts on AR/VR in both academia and industry.

  1. M. Nebeling, K. Madier: 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper
  2. M. Speicher, B.D. Hall, M. Nebeling: What is Mixed Reality? Best Paper Honorable Mention

Star Trek example based on 360 sketches and digitized using 360proto to run on an ARCore phone

We also announced earlier that we will offer an AR/VR prototyping course at CHI this year! The idea to teach this course really grew out of our success teaching AR/VR interaction design to a diverse student body in Fall 2018. We have since started teaching a more advanced and technical version this semester.

The CHI 2019 course will be an exciting opportunity for new and experienced AR/VR researchers, designers, and students to come together and learn about our rapid prototyping techniques in a hands-on manner. Should be fun!

See you at CHI 2019 in Glasgow, Scotland, UK this year!

AR/VR Prototyping Course at CHI 2019

We are excited to announce our CHI 2019 AR/VR prototyping course. The course web site can be found here.

AR/VR technologies are starting to disrupt our everyday experiences and are playing an increasingly important role in interaction design. The course taught by Professor Nebeling will introduce CHI 2019 attendees to state-of-the-art AR/VR prototyping methods and tools including ProtoAR, which we recently presented at CHI 2018 and are currently preparing for public release.

The course will be very hands-on, introducing participants to both physical and digital prototyping techniques. These techniques allow designers to rapidly create and try out AR/VR interface designs in order to decide how best to take a particular design concept further, or to abandon it early before investing the effort that is currently required to program these interfaces.

Join us to learn more about ways of incorporating AR/VR design activities into existing courses, workshops, and student design jams!

Find out more about the course here

New Grant from Mozilla

We are really excited to receive a Mozilla Research Grant for our research on AR/VR.

The research gift from Mozilla will enable Professor Nebeling and his lab to conduct two important research studies around the future of the web and its use as a platform for augmented reality applications. The first study will explore a range of desirable augmented reality applications and how they could be created by composing popular, existing applications in interesting and useful new ways. The second study will focus on new interaction techniques and technical solutions for sharing links to and content of web applications based on novel augmented reality interfaces. Together, these studies will inform the design of future web browsing interfaces and technologies with user-driven support for AR/VR.

We are very happy that Mozilla is supporting the research in the lab. The support will allow us to bring on several new MSI students, many of whom Professor Nebeling has been teaching in his new AR/VR course, and who are now excited to work on this project, practicing the techniques they studied and deepening their research skills.

Mozilla selected us because they believe in the work we do and in the impact it can have on the web for AR/VR. While many vendors try to shape the future of AR/VR around their specific platforms and technologies, the web needs to remain the open-access, device-agnostic platform that it is. This is very important to Mozilla, and to us, and requires new research like ours.

Fall 2018/Winter 2019 Research Positions

Update: Thank you to everyone who has applied to the lab. It was exciting to see so much interest. We have added 8 new research assistants and 8 new independent study students to the lab. This was really the maximum we could do for next semester.

The Information Interaction Lab at the University of Michigan School of Information is looking for up to three new master’s students to assist with a range of ongoing AR/VR projects. We offer two types of positions:

  • technical positions focus on technological aspects and typically involve a variety of programming and technical HCI research tasks around emerging toolkits and tools;
  • design positions focus on HCI/UX and interaction design aspects and typically involve complex UX design tasks and experimentation with new kinds of prototyping methods.

We are looking for students with solid AR/VR design and development experience. Students who have completed Professor Nebeling’s SI 559 AR/VR Application Design, or who have completed workshops as part of the Alternate Reality Initiative (ARI) run by Michael Zhang and others, are especially encouraged to apply. For technical positions, you should consider yourself an expert in Unity and/or Three.js/A-Frame, and have experience working with AR frameworks including HoloToolkit, Tango, ARCore, ARKit, AR.js, etc. For design positions, we expect that you are familiar with state-of-the-art prototyping methods and tools.

We realize that only a tiny fraction of students at U-M have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience, as well as the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

For students with the required background and skills, positions can be converted into paid temporary research assistant positions (pay commensurate with experience) after a short trial period. For students who do not yet have the necessary experience, but want to deepen their knowledge in AR/VR while experimenting with new devices and technologies, we also consider independent study projects. Finally, we have been working with several UROP students over the years and are always eager to consider new UROP applicants as well.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interests in working with us. If you have it ready, please also include a CV.

Attending UIST, ISMAR, and AWE EU

Attending an ISMAR workshop on creativity in design with & for mixed reality.

Last week, Michael Nebeling, director of the Information Interaction Lab, attended three conferences in Germany: UIST, the premier forum for innovative user interfaces; ISMAR, the premier conference for augmented reality; and AWE EU, the European edition of the Augmented World Expo.

At UIST, Walter Lasecki, director of the Crowds+Machines Lab, presented our paper on Arboretum, a shared browsing architecture that allows users with, for example, visual impairments to hand off a web browsing session to trusted crowd workers and friends.

At ISMAR, Michael Nebeling presented a paper co-authored with former postdoc Max Speicher on the trouble with augmented reality and virtual reality authoring tools. There is a rapidly growing landscape of diverse tools, but not many so far, in our opinion, adequately address the needs of non-programmers such as many user experience researchers and interaction designers. We reflected on two of our recent tools, ProtoAR and GestureWiz, both presented at CHI this year, presented a classification of existing tools, and discussed three major troubles:

  1. First, there is already a massive tool landscape, and it’s rapidly growing. This makes it hard for new designers to get started, and hard even for experienced developers to keep track (except for those who swear by Unity, which supports a lot of AR/VR development, provided you’ve spent enough time with the tool to master the learning curve and are comfortable writing C# code to “prototype”).
  2. Second, design processes are unique patchworks. This is not unique to AR/VR interaction design, but it’s especially true there. Basically, every AR/VR app requires a unique tool chain. The tools we identified in lower classes are too limited for most apps, while tools in higher classes, such as A-Frame, Unity, and Unreal Engine, are out of reach for many designers.
  3. Third, there are significant gaps both within & between tools. Unfortunately, the tool chain is optimized in the upward direction, allowing export and import only into “higher” tools. This makes design iterations tricky & expensive. We need to build better integrations between tools, allowing multiple different paths and rapid iteration, even if that means going back to an earlier tool.

A particular highlight of the ISMAR workshop was the presentation on WebXR by Blair MacIntyre, principal scientist at Mozilla and professor on leave from Georgia Tech. We are hoping to start a collaboration with Mozilla soon, so stay tuned!

It was definitely an exciting week, seeing many live demos and having great discussions with old and new friends!

Finally, AWE highlighted the difference between AR/VR in research and industry. While the demos at UIST were all really forward-thinking, highly experimental, and very, very diverse, across the AWE EU exhibitions the dominant theme was AR support for IoT applications. Almost every exhibitor brought along a physical model (some of which were definitely quite exciting) and then used an AR device to “look under the hood”, configuring it with live previews or demonstrating training and repair scenarios. While it is true that the research has finally matured to make this possible outside the lab, in research this was one of the first basic applications, demonstrated repeatedly for at least a decade.