MRAT (CHI 2020)

MRAT Dashboard showing a session recorded for the video accompanying the paper: (1) events recorded for each user, (2) modes, (3) the selected event's info (here an injury photo), (4) event statistics, (5) timeline and (6) floor plan visualizations, and (7) metrics computed for the selected events.

MRAT: The Mixed Reality Analytics Toolkit

Significant tool support exists for the development of mixed reality (MR) applications; however, there is a lack of tools for analyzing MR experiences. We elicit requirements for future tools through interviews with 8 university research, instructional, and media teams using AR/VR in a variety of domains. While we find a common need for capturing how users perform tasks in MR, the primary differences lie in the heuristics and metrics relevant to each project. Particularly in the early project stages, teams were uncertain about what data should, and even could, be collected with MR technologies. We designed the Mixed Reality Analytics Toolkit (MRAT) to instrument MR apps via visual editors without programming and to enable rapid data collection and filtering for visualizations of MR user sessions. With MRAT, we contribute flexible interaction tracking and task definition concepts, an extensible set of heuristic techniques and metrics to measure task success, and visual inspection tools with in-situ visualizations in MR. Focusing on a multi-user, cross-device MR crisis simulation and triage training app as a case study, we then show the benefits of using MRAT, not only for user testing of MR apps, but also for performance tuning throughout the design process.

Video Summary

Next Steps
Full citation (student names and mentees are underlined)

MRAT: The Mixed Reality Analytics Toolkit (CHI 2020)
M. Nebeling, M. Speicher, X. Wang, S. Rajaram, B.D. Hall, Z. Xie, A.R.E. Raistrick, M. Aebersold, E.G. Happ, J. Wang, Y. Sun, L. Zhang, L. Ramsier, R. Kulkarni