In Your Face! – Designing Future Interaction Models for Internet of Things and Augmented Reality
It has been estimated that the number of devices connected to the Internet will reach 50 billion by 2020. How can a not-so-tech-savvy end user discover and directly interact with a myriad of connected things in an intuitive and comfortable manner? Up until now, smartphones have shown potential for managing Internet of Things (IoT) environments, but we cannot rely on that technology alone. Wearable devices are maturing and are available in many different form factors, including head-worn displays (HWDs), smartwatches and wristbands. They enable access to information at a glance. They are intended to always be "on", to always be acting, and to always be sensing the surrounding environment in order to offer a better interface to the real world. A technology well suited to this kind of user interface is augmented reality (AR), owing to its ability to merge the real with the virtual. However, prototyping AR user interfaces for discovering and controlling connected things can be difficult and costly, because it involves a number of different devices and systems with varying levels of technological readiness.

The aim of the research presented in this thesis was to develop and explore three tools for prototyping AR and IoT interaction, and to introduce four interaction models for controlling IoT devices. One of the tools is based on the real-world Wizard of Oz (WOZ) prototyping method, in which a human operates the undeveloped components of a technical system; the other two build on virtual reality (VR)-based prototyping of an IoT environment. The interaction models were developed for different form factors.
One is based on a smartwatch form factor and an interaction model called UbiCompass; the other three are based on an HWD form factor and interaction models called Floating Icons, World in Miniature and Floating Menu, respectively.

The thesis is based on the five attached papers.

Paper 1 presents a WOZ prototyping tool called WozARd and the set of features it offers. The WozARd device allows the test leader to control the visual, tactile and auditory output presented to the test participant. The study described in Paper 1 is an initial investigation of the capability of the real-world WOZ prototyping method to simulate a believable illusion of a real, working AR city tour. A user study was carried out by collecting and analyzing qualitative and quantitative data from 21 participants who performed the AR city tour using WozARd with an HWD and a smartwatch. The data analysis focused on seven categories with a potential impact on how the WozARd method is perceived by participants: precision, relevance, responsiveness, technical stability, visual fidelity, general user experience, and human operator performance. Overall, the results indicate that the participants perceived the simulated AR city tour as a relatively realistic experience, despite a certain degree of technical instability and some human operator mistakes.

Paper 2 presents a proposed VR-based prototyping tool called IVAR (Immersive Virtual AR) for prototyping wearable AR and IoT interaction in a virtual environment (VE). IVAR was developed in an iterative design process that resulted in a testable setup in terms of hardware and software. Additionally, a basic pilot experiment with 24 participants was conducted to explore what it means to collect quantitative and qualitative data with the proposed prototyping tool.
The main contribution is that IVAR shows potential to become a useful wearable AR and IoT prototyping tool, but several challenges remain before meaningful data can be produced in controlled experiments. In particular, tracking technology needs to improve, both with regard to intrusiveness and to precision.

Paper 3 presents a proposed VR-based prototyping tool that uses room-scale tracking to prototype IoT interaction. It builds on the same idea as Paper 2; we refer to this tool as VRUbi. Three IoT interaction concepts were compared and evaluated in a controlled experiment with 21 participants. Some statistically significant differences and subjective preferences could be observed in the quantitative and qualitative data, respectively. The main contribution of this paper is knowledge about using VR as a prototyping tool to explore IoT interaction.

Paper 4 presents a novel IoT interaction concept called UbiCompass. A functional smartwatch-face prototype of UbiCompass was developed and integrated with an existing smart home system. It was then compared to a traditional smart home mobile application in a controlled experiment with 36 participants. The results showed statistically significant differences in favor of the proposed concept, highlighting the potential of UbiCompass as an IoT interaction concept.

Paper 5 presents three basic IoT interaction models, focusing on the discovery and selection of devices, implemented for Microsoft HoloLens. The models were compared in an experimental study with 20 participants, split into two groups: one with low device density and one with high device density. Each group solved the same task using each of the three interaction models.
The results showed that with just a few devices to interact with, the participants' interactions did not differ significantly between the models. With many devices to engage with, however, the World in Miniature model stood out as especially demanding and time-consuming. There was also high variability in which models the participants preferred, possibly implying that a combination of the three proposed models is desirable in a fully developed AR system for managing IoT devices.

Overall, the research presented in this thesis found the three prototyping tools – WozARd, IVAR and VRUbi – to be useful for prototyping AR and IoT interaction. One important takeaway for organizations that develop IoT systems or services is to use VR to simulate different scenarios and interactions. The two VR-based prototyping tools are suitable for simulating more complex scenarios, since registration and tracking can easily be simulated, while WozARd is suitable for prototyping simple AR user interfaces. The interaction models presented utilize two form factors – smartwatch and HWD – both of which performed well in the experiments. They focus on three aspects: discovering connected devices; selecting and controlling connected devices; and not requiring the user to explicitly start an application. An example of the latter is a user interface that simply appears when a person enters a smart office.