Disney Research|Studios Faculty Award

We’re happy to announce that Professor Michael Nebeling has received the Disney Research|Studios Faculty Award 2019.

Michael traveled to the ACM SIGGRAPH 2019 conference in Los Angeles, CA, in July and received the award at DisPLAY, the Disney Mixer social event held during the conference.

Disney Research|Studios Director and ETH Zurich Professor Markus Gross, together with Associate Director Bob Sumner, announced the award at the event and presented it to Michael in a lovely ceremony.

The award, given “In recognition of your scientific excellence”, honors Michael’s research at the intersection of HCI and AR/VR over the past few years, aimed at empowering non-programmers and user experience designers to create AR/VR experiences quickly and inexpensively.

Thank you, Disney!

CHI 2019 Videos Released

As announced earlier, several members of the Information Interaction Lab attended CHI 2019 in Glasgow, Scotland, UK. This week, the video recordings of our presentations were released:

1. 360proto: Making Interactive Virtual Reality / Augmented Reality Prototypes from Paper
Michael Nebeling, Katy Madier

2. What is Mixed Reality?
Maximilian Speicher, Brian D. Hall, Michael Nebeling

New Postdoc Positions

The Michigan Information Interaction Lab, directed by Professor Michael Nebeling, is looking for 1-2 postdocs to join its efforts in designing and studying new interactive systems & technologies.

Over its first three years at U-M, the lab has worked at the forefront of cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for using Wizard of Oz and crowdsourcing in gesture design and recognition tasks. Building on the knowledge gained from these systems, the lab has also developed conceptual frameworks and new ways of thinking about the design of future interfaces, through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], and studies on creating cross-device AR experiences, resulting in XD-AR [EICS’18 Best Paper Award] as well as user-driven design principles for gesture representations [CHI’18].

We’re considering applications from successful PhDs (defended by the start date) who have been active in the premier HCI conferences, notably CHI, UIST, and CSCW, as well as related, more specialized venues such as IMWUT (formerly UbiComp), ISS (formerly ITS), EICS, TEI, SUI, ISMAR, VR, and VRST.

As a new post-doctoral research fellow in the lab, you would work closely with Michael and his students to help develop and lead new research projects on novel interactive technologies and systems. We currently focus our research on designing, developing, and evaluating novel AR/VR experiences, but we’re always interested in broadening our research activities as well.

Interested applicants should get in touch with Michael directly via email (nebeling@umich.edu).

2 Papers at CHI 2019

We are happy to announce that we have two papers accepted at CHI 2019. The first features our work on 360proto, a new physical-digital prototyping tool that allows designers to make both AR and VR prototypes from paper and bring them to life on common AR/VR devices. The second is a critical reflection on the question of What is Mixed Reality?, written up as a paper that is part scientific literature review, part analysis of the state of the art in industry, and part interviews with leading experts on AR/VR in both academia and industry.

  1. M. Nebeling, K. Madier: 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper
  2. M. Speicher, B.D. Hall, M. Nebeling: What is Mixed Reality? (Best Paper Honorable Mention)

Star Trek example based on 360 sketches and digitized using 360proto to run on an ARCore phone

We also announced earlier that we will offer an AR/VR prototyping course at CHI this year! The idea to teach this course really grew out of our success teaching AR/VR interaction design to a diverse student body in Fall 2018. We have since started teaching a more advanced and technical version this semester.

The CHI 2019 course will be an exciting opportunity for new and experienced AR/VR researchers, designers, and students to come together and learn about our rapid prototyping techniques in a hands-on manner. Should be fun!

See you at CHI 2019 in Glasgow, Scotland, UK this year!

AR/VR Prototyping Course at CHI 2019

We are excited to announce our CHI 2019 AR/VR prototyping course. The course web site can be found here.

AR/VR technologies are starting to disrupt our everyday experiences, and they are playing an increasingly important role in interaction design. The course, taught by Professor Nebeling, will introduce CHI 2019 attendees to state-of-the-art AR/VR prototyping methods and tools, including ProtoAR, which we recently presented at CHI 2018 and are currently preparing for public release.

The course will be very hands-on, introducing participants to both physical and digital prototyping techniques. These techniques allow designers to rapidly create and try out AR/VR interface designs in order to decide how best to take a particular design concept further, or to abandon it early, before putting in all the effort currently required to program these interfaces.

Join us to learn more about ways of incorporating AR/VR design activities into existing courses, workshops, and student design jams!

Find out more about the course here

New Grant from Mozilla

We are really excited to receive a Mozilla Research Grant for our research on AR/VR.

The research gift from Mozilla will enable Professor Nebeling and his lab to conduct two important research studies on the future of the web and its use as a platform for augmented reality applications. The first study will explore a range of desirable augmented reality applications and how they could be created by composing popular existing applications in interesting and useful new ways. The second study will focus on new interaction techniques and technical solutions for sharing links to, and content of, web applications via novel augmented reality interfaces. Together, these studies will inform the design of future web browsing interfaces and technologies with user-driven support for AR/VR.

We are very happy that Mozilla is supporting the lab’s research. The support will allow us to bring on several new MSI students, many of whom Professor Nebeling has taught in his new AR/VR course, and who are now excited to work on this project, putting the techniques they studied into practice and deepening their research skills.

Mozilla selected us because they believe in the work we do and in the impact it can have on the web for AR/VR. While many vendors try to shape the future of AR/VR in terms of their specific platforms and technologies, the web needs to remain the open access and device-agnostic platform that it is. This is very important to Mozilla, and to us, and requires new research like ours.

Attending UIST, ISMAR, and AWE EU

Attending an ISMAR workshop on creativity in design with & for mixed reality.

Last week, Michael Nebeling, director of the Information Interaction Lab, attended three conferences in Germany: UIST, the premier forum for innovative user interfaces; ISMAR, the premier conference for augmented reality; and AWE EU, the European edition of the Augmented World Expo.

At UIST, Walter Lasecki, director of the Crowds+Machines Lab, presented our paper on Arboretum, a shared browsing architecture allowing users with, for example, visual impairments to hand off a web browsing session to trusted crowd workers and friends.

At ISMAR, Michael Nebeling presented a paper co-authored with former postdoc Max Speicher on the trouble with augmented reality and virtual reality authoring tools. There is a rapidly growing landscape of diverse tools, but, in our opinion, not many so far adequately address the needs of non-programmers such as user experience researchers and interaction designers. We reflected on two of our recent tools, ProtoAR and GestureWiz, both presented at CHI this year, presented a classification of existing tools, and discussed three major troubles:

  1. First, there is already a massive tool landscape, and it’s rapidly growing. This makes it hard for new designers to get started, and hard even for experienced developers to keep track (except for those who swear by Unity, which supports a lot of AR/VR, provided you’ve spent enough time with the tool to master the learning curve and are comfortable writing C# code to “prototype”).
  2. Second, design processes are unique patchworks. This is not unique to AR/VR interaction design, but it’s especially true there. Basically, every AR/VR app requires a unique tool chain. The tools we identified in lower classes are too limited for most apps, while tools in higher classes, such as A-Frame, Unity, and Unreal Engine, are out of reach for many designers.
  3. Third, there are significant gaps both within & between tools. Unfortunately, the tool chain is optimized in the upwards direction, allowing export and import only into “higher” tools. This makes design iterations tricky & expensive. We need to build better integrations between tools to allow multiple different paths and rapid iteration, even if that means going back to an earlier tool.
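To give a sense of what even A-Frame, arguably the most approachable of the “higher-class” tools, still asks of a designer, here is a minimal scene. This is a generic hello-world sketch, not an example from the paper, and the CDN version number is an assumption:

```html
<!-- Minimal A-Frame scene: a box, a sphere, and a sky backdrop.
     Hypothetical illustration; the release version below is an assumption. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Each entity needs 3D coordinates (x y z) and appearance attributes -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Even this simplest possible scene presumes comfort with HTML, 3D coordinates, and A-Frame’s entity-component markup; adding interactivity or AR tracking would additionally require writing custom JavaScript components, which is exactly the barrier many designers face.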

A particular highlight of the ISMAR workshop was the presentation on WebXR by Blair MacIntyre, principal scientist at Mozilla and professor on leave from Georgia Tech. We are hoping to start a collaboration with Mozilla soon, so stay tuned!

It was definitely an exciting week, seeing many live demos and having great discussions with old and new friends!

Finally, AWE highlighted the difference between AR/VR in research and in industry. While the demos at UIST were all forward-thinking, highly experimental, and very diverse, the dominant theme across the AWE EU exhibitions was AR support for IoT applications. Almost every exhibitor brought along a physical model (some of which were quite exciting) and then used an AR device to “look under the hood,” configuring it with live previews or supporting training and repair scenarios. While it is true that the research has finally matured enough to make this possible outside the lab, in research this was one of the first basic applications, repeatedly demonstrated for at least a decade.

The Trouble with AR/VR Authoring Tools

The lab is going to participate in ISMAR 2018, the premier AR conference in the field, and present a position paper at the “Creativity in Design with & for Mixed Reality” workshop.

Our paper entitled, “The Trouble with AR/VR Authoring Tools”, is essentially a survey of existing AR/VR authoring tools, providing a classification of the tools based on their features, and a discussion of the problems based on our experience with them. Here’s a short summary of the paper from the introduction:

In this position paper, we classify existing authoring tools relevant to AR/VR, identify five classes of tools (Fig. 1), and characterize the main issues we see with how the tool landscape has been evolving. Both authors have a track record of research on interactive technologies with a more recent focus on AR/VR [20, 28–30]. For example, they created ProtoAR [20], a tool designed with the vision of making AR/VR prototyping as easy and versatile as paper prototyping, and GestureWiz [30], a Wizard of Oz gesture prototyping environment. The second author also contributed to the design and development of HoloBuilder [31] from 2015 to 2017. When he joined the company, the original idea was to create a “PowerPoint for AR,” enabling users without specific design and development skills to create AR experiences. For the future, we envision tools as simple yet powerful as PowerPoint or Keynote leveling the playing field for AR/VR.

The paper will be published in the ISMAR 2018 Adjunct proceedings; a pre-print is available here.

Over the past two years, we have had multiple students try out several of these tools in our research projects, with mixed success. It seems that the only viable way to create AR/VR prototypes is to know how to use three.js/A-Frame or Unity. Many of the students we work with, however, are not experienced programmers and struggle with the steep learning curve.

Of course, in our own research, we have been addressing this by coming up with useful and effective tools that enable rapid prototyping of AR/VR interfaces without programming; ProtoAR is just the first example of this new stream of our research.

There have always been new tools. One promising tool, Halo, is unfortunately just being shut down. I got the sad news this week, but I’m sure that Dror and Eran will soon come up with something else and pursue it with the same passion. I really enjoyed working with them, and they were very kind and willing to support my new AR/VR course here at Michigan.

Fall 2018 Student Design Jams

Update: We have two design jams scheduled for September 2018: September 7 and September 14. Feel free to join our MCommunity list for updates on topics, dates, and times for design jams throughout the Fall/Winter semesters.

As we are getting ready for the next semester, I wanted to announce here that I will again host regular student design jams in my research lab starting in September 2018. The first two design jams will happen on Friday September 7 and 14 from 1-4pm. Sign up here.

Design jams are three-hour blocks on Fridays, 1-4pm, for students to work on user interface research and design challenges. This semester I would like to build teams that persist over the semester (no need to come in a team; we will form teams at the beginning of the semester), and I would like design challenges to continue over multiple weeks to achieve more significant results. In the past, students have frequently ended up doing a research project with me based on their initial design jam results.

The design jams will introduce you to state-of-the-art AR/VR technologies. Our lab has a variety of devices, from AR-capable smartphones and tablets to head-mounted AR/VR devices like the HTC Vive, Windows Mixed Reality headsets, and HoloLens (and we are in touch with Magic Leap, Meta, and other device providers), which students will have access to and be encouraged to use in projects. We also have new tools and toolkits developed in the lab’s research, and my goal is to drive these solutions forward with the help of students and these design jams.

Fall 2018 Research Assistant Positions

The Information Interaction Lab at the University of Michigan School of Information is looking for two master’s students to assist with a range of ongoing AR/VR projects. After a short trial period, these positions can be converted into paid temporary research assistant positions (pay commensurate with experience).

We are looking for students with solid AR/VR design and development experience. Ideally, you consider yourself an expert in Unity and/or three.js/A-Frame, and have experience working with AR frameworks such as HoloToolkit, Tango, ARCore, ARKit, and AR.js. We realize that only a tiny fraction of students have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience, along with the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interests in working with us. If you have it ready, please also include a CV.