Two of our team members, Amanda and Neeraj, had previously used the Sphero SPRK robot in a class teaching kids how to program. They wanted to use it during the hackathon, so we decided to build something for the 1D, 2D, 3D challenge. Specifically, we wanted to create a controller that would allow an individual to control and manipulate a digital 3-D object or visualization through movement of the Sphero ball.
The Sphero uses accelerometer and gyroscope data to determine its position for its intended use as a robotic toy. We instead captured the data provided by its Inertial Measurement Unit (IMU) to return relative orientation metrics such as pitch, yaw, and roll.
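To give a sense of what the IMU does under the hood, here is a minimal sketch of deriving pitch and roll from raw accelerometer readings at rest. The axis names and the gravity-based formulas are standard tilt-sensing math, not Sphero-specific code:

```javascript
// Hypothetical sketch: estimate pitch and roll (in degrees) from raw
// accelerometer readings (ax, ay, az), in units of g. These are the
// standard gravity-vector formulas; a real IMU fuses in gyroscope data
// as well to stay stable during motion.
function anglesFromAccel(ax, ay, az) {
  const toDeg = 180 / Math.PI;
  const pitch = Math.atan2(-ax, Math.sqrt(ay * ay + az * az)) * toDeg;
  const roll = Math.atan2(ay, az) * toDeg;
  return { pitch, roll };
}
```

With the ball at rest and gravity entirely on the z-axis (`anglesFromAccel(0, 0, 1)`), both angles come out as zero, as expected.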
Then, using Three.js, those movements were mapped to the rotation of a 3-D object in a viewer about a fixed point. By design, it should work with .stl, .obj, and most other three-dimensional object file formats.
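The core of that mapping can be sketched as follows, assuming the IMU angles arrive in degrees. Three.js expects Euler rotations in radians on an object's `rotation` property; `model` below is a plain stand-in for a loaded `THREE.Object3D` (e.g. the result of an STL or OBJ loader):

```javascript
// Hypothetical sketch: convert IMU angles (degrees) into the radian
// Euler rotation that Three.js expects on Object3D.rotation.
const degToRad = (d) => (d * Math.PI) / 180;

function applyImuRotation(model, { pitch, yaw, roll }) {
  model.rotation.x = degToRad(pitch); // tilt forward/back
  model.rotation.y = degToRad(yaw);   // spin about the vertical axis
  model.rotation.z = degToRad(roll);  // tilt side to side
  return model;
}

// Stand-in for a Three.js object; in the viewer this would be the
// loaded mesh, updated each time a new IMU reading streams in.
const model = { rotation: { x: 0, y: 0, z: 0 } };
applyImuRotation(model, { pitch: 90, yaw: 0, roll: -90 });
```

Calling this in the render loop on every streamed reading keeps the on-screen object locked to the ball's orientation.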
Our vision for this project extends beyond the functionality we've demonstrated in the video. The Sphero accepts commands triggered by events such as "On Collision," which could be sensitized to fire on a simple tap. With this added capability, an operator could tap the SpheriView to switch between common functions, such as Zoom In/Out, or to cycle through a time series of data (imagine a 3-D globe with decades of climate data, accessed with a tap and a twist of the wrist).
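The tap-to-cycle idea amounts to a tiny state machine. A minimal sketch, in which the mode names and the `onTap` wiring are our assumptions (the real integration would hook this into the Sphero SDK's collision-detection event):

```javascript
// Hypothetical sketch: cycle viewer modes each time the ball reports
// a collision (i.e. a tap). Mode names are illustrative.
const MODES = ['rotate', 'zoom', 'timeSeries'];
let modeIndex = 0;

// Called from the collision event handler; returns the new active mode.
function onTap() {
  modeIndex = (modeIndex + 1) % MODES.length;
  return MODES[modeIndex];
}
```

Starting from the default 'rotate' mode, two taps would land on 'timeSeries', at which point a twist of the wrist could scrub the globe through its decades of climate data.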
Beyond this early iteration of our product, we believe the form and functionality are well suited to virtual 3-D environments. For a consumer product, that matters: many people do not have the space at home to safely move around within a virtual world. Since childhood, we've all grown comfortable holding ball-shaped objects, which is why we feel manipulating the SpheriView to control movement will be intuitive to most users.
Another use for the controller would be a future version of live sporting events, where cameras are fixed around the field of play. This type of setup could work anywhere from a UFC Octagon to the opening ceremony of the Olympics. By letting users shuffle through an array of cameras to pick the vantage point they prefer, live VR events could cater to individual preferences, offering a far more immersive experience of live sports.
The SpheriView controller enables professionals in Oil and Gas Exploration, Medicine, Design, and a wide range of other fields to gain finer control of their 3-D visualizations, and it opens up new possibilities in how we experience virtual reality.
SpaceApps is a NASA incubator innovation program.