The Interactive Robotics and Novel Technologies (IRON) Laboratory is an interdisciplinary research group housed within the ATLAS Institute at the University of Colorado Boulder. Our mission is to advance knowledge regarding the design of new sensing, interface, and robotic technologies to improve user experience, productivity, and enjoyment.
We specialize in Aerial/Free-flying Robots, Human-Robot Collaboration, Space Robotics, Augmented & Virtual Reality, and Art/Technology Synthesis.
This project seeks to create a 3D reconstruction pipeline that enables robots to leverage sensors of opportunity—collocated, heterogeneous, consumer-grade sensors embedded in the operational environment—to augment onboard robot sensing capabilities. Our modular infrastructure integrates several popular visual odometry and point cloud generation algorithms, allowing customization based on environmental conditions. In addition, we have developed a novel iterative closest point (ICP) algorithm that supports fusing point clouds into a single data product without relying on a priori scale relationships, allowing diverse sensors to contribute to map construction.
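To illustrate the kind of scale-free alignment involved, the sketch below shows a closed-form similarity transform (Umeyama's method), which recovers scale, rotation, and translation between two corresponded point sets without assuming a known scale relationship. This is not the lab's actual ICP variant — in a full ICP loop this alignment step would alternate with nearest-neighbor correspondence search — but a minimal illustration of estimating scale alongside pose:

```python
import numpy as np

def umeyama_alignment(src, dst):
    """Estimate a similarity transform (scale s, rotation R, translation t)
    mapping src points onto dst, minimizing ||dst - (s * R @ src + t)||^2.
    Closed-form solution (Umeyama, 1991); no a priori scale required."""
    n = src.shape[0]
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst     # center both clouds
    cov = dst_c.T @ src_c / n                     # cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                            # avoid reflections
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / n              # variance of source cloud
    s = np.trace(np.diag(D) @ S) / var_src        # recovered scale factor
    t = mu_dst - s * R @ mu_src
    return s, R, t
```

Given corresponded points from two heterogeneous sensors, `s, R, t = umeyama_alignment(src, dst)` yields the transform `s * (R @ src.T).T + t` that fuses them into a common frame.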
In this work, we explore opportunities for bringing tangible input and haptic output to desktop GUI applications by addressing two challenges. First, we address the lack of tangible input devices by repurposing an existing technology, that of educational and toy robots such as Sphero, Wonder Workshop’s Dash, and Parrot’s AR Drone. These robots are inexpensive (often less than $100 USD) and support connecting to PCs via Bluetooth and Wi-Fi. Most importantly, these robots contain sensors such as accelerometers and gyroscopes that enable them to be repurposed as input devices, and actuators that can be used to provide haptic feedback. Second, we address the lack of software support for tangible input and haptic output by enabling our tangible input devices to be paired with existing applications without changing the underlying code, through a combination of input event emulation, GUI automation, and custom application APIs.
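As a hypothetical sketch of the input-event-emulation idea (the threshold and key names below are illustrative, not from the actual system), tilt readings from a toy robot's accelerometer can be translated into synthetic key events that an unmodified desktop application receives as ordinary input:

```python
# Illustrative sketch: map accelerometer tilt from a toy robot to emulated
# arrow-key events, so an unchanged GUI application sees ordinary input.
TILT_THRESHOLD = 0.3  # in g; dead zone so resting-noise tilt emits no events

def tilt_to_key_events(ax, ay):
    """Translate tilt readings (in g) into emulated key names.
    ax: left/right tilt axis, ay: forward/back tilt axis."""
    events = []
    if ax > TILT_THRESHOLD:
        events.append("RIGHT")
    elif ax < -TILT_THRESHOLD:
        events.append("LEFT")
    if ay > TILT_THRESHOLD:
        events.append("UP")
    elif ay < -TILT_THRESHOLD:
        events.append("DOWN")
    return events
```

In practice, events like these would be injected via an OS-level input emulation or GUI automation layer rather than returned as strings; the sketch only shows the sensor-to-event mapping step.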
In this project, we are exploring how augmented reality might mediate human-robot communication. We are designing interfaces that leverage augmented reality to visualize data collected by robots and investigating the use of augmented reality as a medium for designing intuitive robot supervisory and control interfaces.
This project explores how design decisions for rendering virtual objects using augmented reality technologies influence their perceived real-world locations. For example, object shaders may approximate real-world lighting conditions to varying degrees, potentially altering user perceptions concerning the positions of virtual objects. We seek to understand relationships between virtual object rendering and user perceptions to improve the effectiveness of augmented reality applications.
This project focuses on building robust datasets that capture natural human behaviors elicited during human-robot interactions. For example, we are capturing data on the various types of gestures that users might find intuitive in attempting to give navigational commands to different types of proximal robots, such as ground vehicles, humanoid robots, and free-flying platforms.
Free-flying robots hold great promise in assisting users in a variety of terrestrial and space exploration activities, for example mapping novel environments and collecting data from locations that are difficult to access or infeasible to instrument. However, we currently lack tools for effectively tasking, managing, and interacting with these robots. This research aims to develop scalable interface technologies for supervising aerial robots and new algorithms and techniques for communicating robot state to nearby users.
Current concepts in design for repair focus heavily on simple heuristics, such as the commonization of hardware and the reduced use of permanent adhesives. We are investigating ways to approach assembly, disassembly, and repair as a human-machine interaction problem.
Graphic Impulse is a live performance piece incorporating ground and aerial movement via pole dance with interactive projected graphics that react to the dancer in real time. The work explores how technology may extend our limited human capabilities and modes of expression through themes of duality, reflecting on human vs. non-human structures. Graphic Impulse debuted at Pole Theatre USA in August 2016 at the Boulder Theater, where it won first place in the Pole Art Semi-Pro division. We are now extending this work into a walk-through style, interactive installation.