The Trouble with AR/VR Authoring Tools

The lab is going to participate in ISMAR 2018, the premier conference in the AR field, and present a position paper at the “Creativity in Design with & for Mixed Reality” workshop.

Our paper, entitled “The Trouble with AR/VR Authoring Tools,” is essentially a survey of existing AR/VR authoring tools: it classifies the tools based on their features and discusses the problems we encountered based on our experience with them. Here’s a short summary of the paper from the introduction:

In this position paper, we classify existing authoring tools relevant to AR/VR, identify five classes of tools (Fig. 1), and characterize the main issues we see with how the tool landscape has been evolving. Both authors have a track record of research on interactive technologies with a more recent focus on AR/VR [20, 28–30]. For example, they created ProtoAR [20], a tool designed with the vision of making AR/VR prototyping as easy and versatile as paper prototyping, and GestureWiz [30], a Wizard of Oz gesture prototyping environment. The second author also contributed to the design and development of HoloBuilder [31] from 2015 to 2017. When he joined the company, the original idea was to create a “PowerPoint for AR,” enabling users without specific design and development skills to create AR experiences. For the future, we envision tools as simple yet powerful as PowerPoint or Keynote leveling the playing field for AR/VR.

The paper will be published in the ISMAR 2018 Adjunct proceedings; a pre-print is available here: http://michael-nebeling.de/publications/ismar18adj.pdf

Over the past two years, we have had multiple students try out several of these tools in our research projects, with mixed success. It seems that the only viable path to creating AR/VR prototypes is knowing how to use three.js/A-Frame or Unity. Many of the students we work with, however, are not experienced programmers and struggle with the steep learning curve.
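To give a concrete sense of the baseline these tools assume, here is a minimal three.js scene (a rotating cube, written in TypeScript). This sketch is purely illustrative and not from the paper; it only shows the kind of boilerplate a student faces before doing anything AR/VR-specific.

```typescript
// Minimal three.js "hello world": a rotating cube.
// Purely illustrative; the values and setup are typical, not prescriptive.
import * as THREE from 'three';

// Scene, camera, and renderer are the minimum moving parts.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// One object: a unit cube with a simple material.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial());
scene.add(cube);

// Render loop: rotate the cube and redraw every frame.
function animate(): void {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```

A-Frame wraps much of this in HTML tags, and Unity replaces it with a visual editor, but in each case turning a scene like this into an interactive AR/VR prototype still requires programming, which is the gap our tools aim to close.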

Of course, in our own research, we have been addressing this by building useful and effective tools that enable rapid prototyping of AR/VR interfaces without programming; ProtoAR is just the first example of this new line of our research.

New tools keep appearing. One promising one, Halo, is unfortunately just being shut down. I got the sad news this week, but I’m sure that Dror and Eran will soon come up with something else and pursue it with the same dedication. I really enjoyed working with them, and they were very kind and willing to support my new AR/VR course here at Michigan.

Fall 2018 Student Design Jams

Update: We have scheduled two design jams for September 2018, on September 7 and September 14. Feel free to join our MCommunity list for updates on topics, dates, and times for design jams throughout the Fall/Winter semesters.

As we are getting ready for the next semester, I wanted to announce here that I will again host regular student design jams in my research lab starting in September 2018. The first two design jams will take place on Fridays, September 7 and 14, from 1-4pm. Sign up here.

Design jams are three-hour blocks on Fridays from 1-4pm for students to work on user interface research and design challenges. This semester, I would like to build teams that persist over the semester (no need to come as a team; we will form teams at the beginning of the semester), and I would like design challenges to continue over multiple weeks so that we can achieve more significant results. In the past, students have frequently ended up doing a research project with me based on their initial design jam results.

The design jams will introduce you to state-of-the-art AR/VR technologies. Our lab has a variety of devices that students will have access to and will be encouraged to use in projects, from AR-capable smartphones and tablets to head-mounted AR/VR devices like the HTC Vive, Windows Mixed Reality headsets, and HoloLens (and we are in touch with Magic Leap, Meta, and other device providers). We also have new tools and toolkits developed in the lab’s research, and my goal is to drive these solutions forward with the help of students in these design jams.

Fall 2018 Research Assistant Positions

The Information Interaction Lab at the University of Michigan School of Information is looking for two master’s students to assist with a range of ongoing AR/VR projects. After a short trial period, these positions can be converted into paid temporary research assistant positions (pay commensurate with experience).

We are looking for students with solid AR/VR design and development experience. Ideally, you consider yourself an expert in Unity and/or three.js/A-Frame and have experience working with AR frameworks such as HoloToolkit, Tango, ARCore, ARKit, or AR.js. We realize that only a tiny fraction of students have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience and the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interest in working with us. If you have it ready, please also include a CV.

Mi2 Lab at EICS 2018

Several current and former members of the Information Interaction Lab contributed to two papers at the ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS). Last year we had to miss out on EICS, which was unfortunate, as this has been a regular conference for the lab’s PI, Michael Nebeling. This year, PhD student Brian Hall is going to represent us at EICS. Unfortunately, Michael has a conflict and will be at the ACM UIST program committee meeting instead. Still, it’s going to be an exciting time for Brian. Not only will he present both papers, but it is also his first time in Europe. And the conference is in Paris, the ville d’amour. So show him some love at EICS! 🙂

The schedule of our presentations is as follows:

Wednesday, June 20th:

XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development
Maximilian Speicher, Brian D. Hall, Ao Yu, Bowen Zhang, Haihua Zhang, Janet Nebeling, Michael Nebeling

Thursday, June 21st:

360Anywhere: Mobile Ad-hoc Collaboration in Any Environment using 360 Video and Augmented Reality
Maximilian Speicher, Jingchen Cao, Ao Yu, Haihua Zhang, Michael Nebeling (Presented by Brian D. Hall)

Mi2 Lab at CHI 2018

As announced previously, several members of the Information Interaction Lab have papers at CHI 2018. While three members of the lab were present last year, this year only Michael Nebeling is able to attend. Michael is always keen on meeting new people. This year he is particularly interested in talking to people who are considering a postdoc after their PhD, or an internship to see whether a PhD might be for them. A research stay with us would allow you to explore exciting technical HCI domains, such as AR/VR, or gain more independent research experience in general and be involved in setting up a new lab such as ours. Please get in touch!

If you are interested in seeing some of our work, the schedule of our presentations is as follows:

Sunday all-day

Michael Nebeling is participating in the workshop Rethinking Interaction organized by Michel Beaudouin-Lafon and Wendy Mackay.

Tuesday 9:00-9:20 AM

Erin McAweeney is going to present our paper User-Driven Design Principles for Gesture Representations by Erin McAweeney (REMS fellow visiting from UW), Haihua Zhang (summer intern from Tsinghua), and Michael Nebeling.

Tuesday 12:00-12:20 PM

Michael Nebeling is going to present our paper GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes by Maximilian Speicher and Michael Nebeling.

Wednesday 9:40-10:00 AM

Michael Nebeling is going to present our paper ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications by Michael Nebeling, Janet Nebeling, Ao Yu (summer intern from Tsinghua), and Rob Rumble.

Wednesday 4:00-5:20 PM

Michael Nebeling is going to be part of the Special Interest Group discussion on Redefining Natural User Interface with James Landay, Chen Zhao, and others from Alibaba.

We’ll publish the final versions of the papers on this web site soon after CHI 2018.

U-M Teach-Out on AR/MR/VR

Michael Nebeling and colleagues from U-M’s School of Information and the Stamps School of Art and Design have put together a U-M Teach-Out on Augmented, Virtual, and Mixed Reality. The teach-out will be delivered as a free, open-enrollment course on Coursera titled Augmented Reality, Virtual Reality, and Mixed Reality: Opportunities and Issues.

The course is a combination of discussion rounds, virtual lab tours, and an overview of relevant research and teaching at U-M. After an introductory lecture on the different terminologies and technologies, the course features a series of discussions and expert panels on why these topics are important and timely to talk about, and explores their impact on research and teaching as well as our daily lives in a number of domains, ranging from medicine and nursing, to landscaping and architectural design, to multimedia and entertainment.

We had a lot of fun creating this course and we hope you will be able to join us and participate in the discussion forums!

3 Papers at CHI 2018

We are happy to announce that we have three conditionally accepted papers at CHI 2018: one on ProtoAR, a new physical-to-digital AR prototyping tool; one on GestureWiz, a gesture interface prototyping tool; and one reporting two studies on user-driven gesture representations.

  1. ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications 
    M. Nebeling, J. Nebeling, A. Yu, R. Rumble
  2. GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes
    M. Speicher, M. Nebeling
  3. User-Driven Design Principles for Gesture Representations
    E. McAweeney, H. Zhang, M. Nebeling

See you at CHI 2018 in Montreal!