Blog

Fall 2018/Winter 2019 Research Positions

The Information Interaction Lab at the University of Michigan School of Information is looking for up to three new master’s students to assist with a range of ongoing AR/VR projects. We offer two types of positions:

  • technical positions focus on technological aspects and typically involve a variety of programming and technical HCI research tasks around emerging toolkits and tools;
  • design positions focus on HCI/UX and interaction design aspects and typically involve complex UX design tasks and experimentation with new kinds of prototyping methods.

We are looking for students with solid AR/VR design and development experience. Students who have completed Professor Nebeling’s SI 559 AR/VR Application Design, or who have taken workshops as part of the Alternate Reality Initiative (ARI) run by Michael Zhang and others, are especially encouraged to apply. For technical positions, you should consider yourself an expert in Unity and/or Three.js/A-Frame and have experience working with AR frameworks such as HoloToolkit, Tango, ARCore, ARKit, and AR.js. For design positions, we expect that you are familiar with state-of-the-art prototyping methods and tools.

We realize that only a tiny fraction of students at U-M have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience and the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

For students with the required background and skills, positions can be converted into paid temporary research assistant positions (pay commensurate with experience) after a short trial period. For students who do not yet have the necessary experience but want to deepen their knowledge in AR/VR while experimenting with new devices and technologies, we also consider independent study projects. Finally, we have been working with several UROP students over the years and are always eager to consider new UROP applicants as well.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interests in working with us. If you have it ready, please also include a CV.

Attending UIST, ISMAR, and AWE EU

Attending an ISMAR workshop on creativity in design with & for mixed reality.

Last week, Michael Nebeling, director of the Information Interaction Lab, attended three conferences in Germany: UIST, the premier forum for innovative user interfaces; ISMAR, the premier conference for augmented reality; and AWE EU, the Europe edition of the Augmented World Expo.

At UIST, Walter Lasecki, director of the Crowds+Machines Lab, presented our paper on Arboretum, a shared browsing architecture that allows users, for example those with visual impairments, to hand off a web browsing session to trusted crowd workers and friends.

At ISMAR, Michael Nebeling presented a paper co-authored with former postdoc Max Speicher on the trouble with augmented reality and virtual reality authoring tools. There is a rapidly growing landscape of diverse tools, but few so far, in our opinion, adequately address the needs of non-programmers such as many user experience researchers and interaction designers. We reflected on two of our recent tools, ProtoAR and GestureWiz, both presented at CHI this year; presented a classification of existing tools; and discussed three major troubles:

  1. First, there is already a massive tool landscape, and it’s rapidly growing. This makes it hard for new designers to get started, and hard for even experienced developers to keep track (except for those who swear by Unity, which supports a lot of AR/VR functionality, provided you’ve spent enough time with the tool to master the learning curve and are comfortable writing code in C# to “prototype”).
  2. Second, design processes are unique patchworks. This is not unique to AR/VR interaction design, but it’s especially true there. Basically, every AR/VR app requires a unique tool chain. The tools we identified in lower classes are too limited for most apps, while tools in higher classes, such as A-Frame, Unity, and Unreal Engine, are out of reach for many designers.
  3. Third, there are significant gaps both within and between tools. Unfortunately, the tool chain is optimized in the upwards direction, allowing export and import only in “higher” tools. This makes design iterations tricky and expensive. We need to build better integrations between tools to allow multiple different paths and rapid iteration, even if it means one has to go back to an earlier tool.

A particular highlight of the ISMAR workshop was the presentation on WebXR by Blair MacIntyre, principal scientist at Mozilla and professor on leave from Georgia Tech. We are hoping to start a collaboration with Mozilla soon, so stay tuned!

It was definitely an exciting week, seeing many live demos and having great discussions with old and new friends!

Finally, AWE EU highlighted the difference between AR/VR in research and in industry. While the demos at UIST were all forward-thinking, highly experimental, and very diverse, the dominant theme across the AWE EU exhibitions was AR support for IoT applications. Almost every exhibitor brought along a physical model (some of which were definitely quite exciting) and then used an AR device to “look under the hood,” configuring it with live previews or walking through training and repair scenarios. While it is true that the technology has finally matured enough to make this possible outside the lab, this was one of the first basic applications that research demonstrated repeatedly for at least a decade.

The Trouble with AR/VR Authoring Tools

The lab is going to participate in ISMAR 2018, the premier conference in the AR field, and present a position paper at the “Creativity in Design with & for Mixed Reality” workshop.

Our paper, “The Trouble with AR/VR Authoring Tools,” is essentially a survey of existing AR/VR authoring tools, providing a classification of the tools based on their features and a discussion of the problems based on our experience with them. Here’s a short summary of the paper from the introduction:

In this position paper, we classify existing authoring tools relevant to AR/VR, identify five classes of tools (Fig. 1), and characterize the main issues we see with how the tool landscape has been evolving. Both authors have a track record of research on interactive technologies with a more recent focus on AR/VR [20, 28–30]. For example, they created ProtoAR [20], a tool designed with the vision of making AR/VR prototyping as easy and versatile as paper prototyping, and GestureWiz [30], a Wizard of Oz gesture prototyping environment. The second author also contributed to the design and development of HoloBuilder [31] from 2015 to 2017. When he joined the company, the original idea was to create a “PowerPoint for AR,” enabling users without specific design and development skills to create AR experiences. For the future, we envision tools as simple yet powerful as PowerPoint or Keynote leveling the playing field for AR/VR.

The paper will be published in the ISMAR 2018 Adjunct proceedings; a pre-print is available here: http://michael-nebeling.de/publications/ismar18adj.pdf

Over the past two years, we have had multiple students try out several of the tools in our research projects, with mixed success. It seems that the only viable way to create AR/VR prototypes is to know how to use Three.js/A-Frame or Unity. Many of the students we work with, however, are not experienced programmers and struggle with the steep learning curve.
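
To give a sense of what that learning curve looks like, here is a minimal Three.js scene written in TypeScript. This is only an illustrative sketch of the boilerplate a newcomer faces before any AR/VR-specific features come into play; it is not code from the paper or from our tools.

    // Illustrative sketch only: a bare-bones Three.js scene with a spinning cube.
    import * as THREE from 'three';

    // A scene, a perspective camera, and a WebGL renderer are the minimum setup.
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(
      70, window.innerWidth / window.innerHeight, 0.1, 100);
    camera.position.z = 2;

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // A single cube stands in for the prototype's "content".
    const cube = new THREE.Mesh(
      new THREE.BoxGeometry(0.5, 0.5, 0.5),
      new THREE.MeshNormalMaterial());
    scene.add(cube);

    // Even this minimal example needs an explicit render loop.
    renderer.setAnimationLoop(() => {
      cube.rotation.y += 0.01;
      renderer.render(scene, camera);
    });

None of this is difficult for a web developer, but for designers who have never written JavaScript, even a spinning cube already involves cameras, renderers, and render loops, well before markers, anchors, or controller input enter the picture.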

Of course, in our own research we have been addressing this by developing useful and effective tools that enable rapid prototyping of AR/VR interfaces without programming; ProtoAR is just the first example of this new stream of our research.

There have always been new tools; one promising one, Halo, is unfortunately just being shut down. I got the sad news this week, but I’m sure that Dror and Eran will soon come up with something else and pursue it with the same dedication. I really enjoyed working with them, and they were very kind and willing to support my new AR/VR course here at Michigan.

Fall 2018 Student Design Jams

Update: We have two design jams scheduled for September 2018: September 7 and September 14. Feel free to join our MCommunity list for updates on topics, dates, and times for design jams throughout the Fall/Winter semesters.

As we are getting ready for the next semester, I wanted to announce here that I will again host regular student design jams in my research lab starting in September 2018. The first two design jams will happen on Friday September 7 and 14 from 1-4pm. Sign up here.

Design jams are three-hour blocks on Fridays, 1-4pm, for students to work on user interface research and design challenges. This semester I would like to build teams that persist over the semester (no need to come in a team; we will form teams at the beginning of the semester) and design challenges that continue over multiple weeks, to achieve more significant results. In the past, students have frequently ended up doing a research project with me based on some of their initial design jam results.

The design jams will introduce you to state-of-the-art AR/VR technologies. Our lab has a variety of devices, from AR-capable smartphones and tablets to head-mounted AR/VR devices like the HTC Vive, Windows Mixed Reality headsets, and HoloLens (and we are in touch with Magic Leap, Meta, and other device providers), which students will have access to and be encouraged to use in projects. We also have new tools and toolkits developed in the lab’s research, and my goal is to drive these solutions forward with the help of students and these design jams.

Fall 2018 Research Assistant Positions

The Information Interaction Lab at the University of Michigan School of Information is looking for two master’s students to assist with a range of ongoing AR/VR projects. After a short trial period, these positions can be converted into paid temporary research assistant positions (pay commensurate with experience).

We are looking for students with solid AR/VR design and development experience. Ideally, you consider yourself an expert in Unity and/or Three.js/A-Frame and have experience working with AR frameworks such as HoloToolkit, Tango, ARCore, ARKit, and AR.js. We realize that only a tiny fraction of students out there have such a background, so a compromise might be that you can at least demonstrate significant front-end design and development experience and the potential to quickly navigate the AR/VR space and learn a range of new technologies as required for our ongoing research projects.

If you think that could be you, please send an email to Michael Nebeling. Make sure to include a short paragraph about yourself and your interests in working with us. If you have it ready, please also include a CV.

Mi2 Lab at EICS 2018

Several current and former members of the Information Interaction Lab contributed to two papers at the ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS). Last year we had to miss out on EICS, which was unfortunate, as it has been a regular conference for the lab’s PI, Michael Nebeling. This year, PhD student Brian Hall is going to represent us at EICS. Unfortunately, Michael has a conflict and will be at the ACM UIST program committee meeting instead. But it’s going to be an exciting time for Brian. Not only will he present both papers, but it is also his first time in Europe. And the conference is in Paris, the city of love. So show him some love at EICS! 🙂

The schedule of our presentations is as follows:

Wednesday, June 20th:

XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development
Maximilian Speicher, Brian D. Hall, Ao Yu, Bowen Zhang, Haihua Zhang, Janet Nebeling, Michael Nebeling

Thursday, June 21st:

360Anywhere: Mobile Ad-hoc Collaboration in Any Environment using 360 Video and Augmented Reality
Maximilian Speicher, Jingchen Cao, Ao Yu, Haihua Zhang, Michael Nebeling (Presented by Brian D. Hall)

Mi2 Lab at CHI 2018

As announced previously, several members of the Information Interaction Lab have papers at CHI 2018. While three members of the lab were present last year, this year only Michael Nebeling is able to attend. Michael is always keen on meeting new people. This year he is particularly interested in talking to people who are considering doing a postdoc after their PhD, or an internship to see if a PhD could be something for them. A research stay with us would allow you to explore exciting technical HCI domains, such as AR/VR, or gain more independent research experience in general and be involved in setting up a new lab such as ours. Please get in touch!

If you are interested in seeing some of our work, the schedule of our presentations is as follows:

Sunday all-day

Michael Nebeling is participating in the workshop Rethinking Interaction organized by Michel Beaudouin-Lafon and Wendy Mackay.

Tuesday 9:00-9:20 AM

Erin McAweeney is going to present our paper User-Driven Design Principles for Gesture Representations by Erin McAweeney (REMS fellow visiting from UW), Haihua Zhang (summer intern from Tsinghua), and Michael Nebeling.

Tuesday 12:00-12:20 PM

Michael Nebeling is going to present our paper GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes by Maximilian Speicher and Michael Nebeling.

Wednesday 9:40-10:00 AM

Michael Nebeling is going to present our paper ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications by Michael Nebeling, Janet Nebeling, Ao Yu (summer intern from Tsinghua), and Rob Rumble.

Wednesday 4:00-5:20 PM

Michael Nebeling is going to be part of the Special Interest Group discussion on Redefining Natural User Interface with James Landay, Chen Zhao, and others from Alibaba.

We’ll publish the final versions of the papers on this website soon after CHI 2018.

Josh Guberman joining as PhD student

We’re happy to announce that Joshua Guberman will enter the UMSI PhD program in Fall 2018, and will be working in the Information Interaction Lab as a research assistant. He will be advised by Michael Nebeling and Sile O’Modhrain.

Josh describes himself as a social-sciences researcher, a tinkerer/inventor, and a UI/UX enthusiast. He is a recent Illinois Tech graduate with a B.S. in Psychology. Josh is very passionate about accessibility and will help the lab explore new research avenues with accessibility as an important application domain. We’re excited about this, and look forward to him joining our lab and the UMSI community!

Rob Rumble joining as PhD student

We’re happy to welcome Rob Rumble as a PhD student in the Information Interaction Lab starting in Fall 2018. He will be advised by Michael Nebeling and Steve Oney.

Rob has been working in the lab since Summer 2017; he then joined the MSI program as a master’s student and was recently accepted into the UMSI PhD program. This is an exciting switch and will allow Rob to spend more time on research.

Rob’s interests are in AR/VR. Over the last two semesters as a research assistant in the lab, he has worked on several smartphone-based AR projects, including ProtoAR, which we will present at CHI 2018, and has also been exploring the potential of VR and the ability to switch between AR and VR in an independent study project. We’re glad to have him as a PhD student in our lab!

AR/VR Courses starting in the Fall

We’re happy to announce that a new set of courses focused on AR/VR will be offered starting in the Fall 2018 semester. These and other courses we are teaching can also be found under Teaching.

First, there is an Intro to AR/VR course (SI 559). We are currently working hard to make it accessible to the entire community at Michigan. While there are a number of students on the waitlist, we will have to start small in the Fall semester. Based on this experience, our plan is to scale the course to larger student groups and perhaps multiple sections.

Second, there will be an Advanced AR/VR course. This course is targeted at more technical students interested in diving deeper into AR/VR interface development and content creation. Importantly, the Intro course is not a prerequisite for this course, though it is recommended. Prerequisites are SI 506 and SI 539, or similar courses, to make sure that students have a background in programming. Ideally, students would have experience with Three.js/A-Frame/JavaScript or Unity/C#.

Both courses will be taught by Professor Nebeling and will be supported by members of the Information Interaction Lab. We are working closely with various industry partners, ranging from prototyping tool developers to device manufacturers to content providers. It will be a very exciting experience for us, and we hope you will be able to join us on this journey!