Beyond The Force: Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality
Parastoo Abtahi, Benoit Landry, Jackie (Junrui) Yang, Marco Pavone, Sean Follmer, James Landay
What is haptics?
Haptics refers to any form of interaction that involves the sense of touch, particularly in the context of computer applications. In virtual reality, the sense of touch is often missing: when you make contact with a virtual object, you don’t feel anything and your hand passes right through it. Haptic devices aim to recreate that sensation of touch in virtual reality, so that when you make contact with a virtual object you can feel it as well.
Why do we care about haptics?
Haptic devices enhance virtual reality applications in many ways, but I’ll list three reasons why we care about haptics here. Firstly, by adding the sensation of touch, virtual environments feel more real, and this helps users feel more immersed in the virtual world. Secondly, our understanding of the world is limited when we can only look at it; we gain a lot of information by interacting with objects haptically. For example, by touching the surface of objects we learn how smooth or rough they are, and by picking them up we learn how heavy they are. Finally, haptic devices make it easier for us to manipulate objects. For example, it’s much easier to pick up a cup if we can feel it in our hand.
What is encountered-type haptics?
There are many different types of haptic devices. Encountered-type haptic devices are a subset that move around and present themselves wherever you are about to contact a virtual object. With these devices you don’t have to hold something in your hand (like a controller) or wear something (like a glove); instead, when you reach for a virtual object, the device moves to the contact point, so that at the instant your virtual hand touches the virtual object, your real hand touches the encountered-type haptic device. These devices are usually grounded (attached to the ground) robotic arms.
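The core control loop behind this idea can be sketched in a few lines. Everything below is a hypothetical illustration rather than any particular system's code: the `Device` class, the `move_to` command, and the 30 cm trigger distance are all assumptions.

```python
import math

# Hypothetical sketch of encountered-type haptics: when the tracked hand
# gets close to a virtual object, command the device to the predicted
# contact point so the real hand meets it on arrival.

TRIGGER_DISTANCE = 0.3  # meters; assumed threshold for starting the move

class Device:
    """Stand-in for a robotic arm or drone that accepts position commands."""
    def __init__(self):
        self.target = None

    def move_to(self, position):
        self.target = position

def distance(a, b):
    """Euclidean distance between two 3D points given as tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update(hand_pos, contact_point, device):
    """Called every frame with the tracked hand position and the nearest
    point on the virtual object's surface."""
    if distance(hand_pos, contact_point) < TRIGGER_DISTANCE:
        device.move_to(contact_point)
```

In a real system the contact point would come from the VR scene and the command would go to the device's motion controller; the sketch only shows the triggering logic.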
What did we do?
As I mentioned, encountered-type haptic devices are usually grounded robotic arms. For this project, we instead used a drone as an ungrounded encountered-type haptic device, since it hovers in the air rather than being attached to the ground. We first built a cage for the drone to protect users’ fingers from its spinning propellers. I’ve shared the process, if you’re interested in building your own safe-to-touch drone cage. Our goal was then to show that, by attaching objects to this drone, you can enable interesting haptic interactions. We demonstrated these interactions in a virtual shopping experience: we attached different fabrics around the drone cage and also attached a hanger to it. When users touched a piece of clothing, the drone would fly there and rotate to match the texture of the fabric. When users wanted to purchase an item, they could pick it up by the hanger, feeling both the hanger in their hand and the weight of the drone. You can watch the video!
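One way to picture the texture interaction: if fabric swatches are mounted around the cage at known angular offsets, presenting a given texture amounts to flying to the garment and choosing a yaw that points that fabric at the user. This is a hypothetical sketch of that idea; the fabric names and angles are made up for illustration, not taken from the project.

```python
# Hypothetical layout: fabric swatches mounted around the cage at known yaw
# offsets (degrees). Names and angles are illustrative assumptions.
FABRIC_YAW = {"wool": 0.0, "silk": 90.0, "denim": 180.0, "cotton": 270.0}

def present_texture(fabric, yaw_facing_user):
    """Return the drone yaw (degrees) that points the requested fabric
    at the user, given the yaw at which the cage's front faces the user."""
    return (yaw_facing_user - FABRIC_YAW[fabric]) % 360.0
```

The real system would also fly the drone to the garment's position; the sketch covers only the rotation step.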
What did we learn?
First, we learned that there are many requirements for making this system work, but unfortunately these requirements are interdependent and can’t all be satisfied at the same time. For example, the drone has to be fast so it can quickly reach the next commanded position. But a fast drone is dangerous, because it could hit the user with a large impact force and injure them. A fast drone is also less accurate, and in this application the drone has to be very accurate: when we send it a position command, it needs to go to that exact position. If it’s not accurate, the user’s hand will miss the drone when they contact a virtual object.
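The tension between speed and safety comes down to basic physics: a collision's energy grows with the square of speed. A quick back-of-the-envelope check makes this concrete (the drone mass and speeds below are illustrative numbers, not measurements from the paper):

```python
def kinetic_energy(mass_kg, speed_m_s):
    """Kinetic energy in joules: impact energy scales with speed squared,
    so even a modest speed reduction makes a collision much gentler."""
    return 0.5 * mass_kg * speed_m_s ** 2

# Illustrative numbers for a ~0.5 kg caged drone:
fast = kinetic_energy(0.5, 4.0)  # 4.0 J at 4 m/s
slow = kinetic_energy(0.5, 1.0)  # 0.25 J at 1 m/s: 16x less energy
```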
We tried different methods to satisfy some of these requirements and to find a balance between them. For example, we lowered the speed of the drone so that it would be safer and more accurate; but to avoid long delays, it couldn’t be too slow either. This medium speed was still fast enough that there were safety concerns and accuracy issues. We addressed the safety concerns by implementing mechanisms like emergency modes and collision-avoidance algorithms. To compensate for the lack of position accuracy, we used visuo-haptic illusions. If you’re interested in learning more, check out the paper.
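To give a flavor of how a visuo-haptic illusion can hide position error: one common approach in the haptics literature is hand redirection, where the rendered hand is gradually offset from the real hand so that both "arrive" at their targets together. This simplified 1D sketch is my illustration of that general technique, not the paper's implementation:

```python
def rendered_hand(real_hand, virtual_target, physical_target, progress):
    """Shift the rendered hand by a growing fraction of the error between
    where the virtual object appears and where the drone actually hovers.
    progress runs from 0.0 (reach begins) to 1.0 (contact), so at contact
    the real hand is on the drone while the virtual hand is on the object.
    Positions here are 1D for clarity."""
    error = virtual_target - physical_target
    return real_hand + progress * error
```

At the start of the reach the rendered and real hands coincide, and the offset grows slowly enough per frame that users tend not to notice it.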
What did users think?
We ran a user study with 9 participants, which means 9 people came to our lab and tried out the virtual shopping experience with the drone. They got to feel the fabric of a scarf and a shirt in the virtual boutique. They picked up a shoebox, as well as a scarf by its hanger, and placed them in their shopping basket. Afterwards we asked them some questions and found that overall people liked the experience with the drone. Most of them felt safe, except one person who was worried about hitting the drone throughout the experience. The drone was a bit slower than people would have liked, so they had to wait longer than they expected to touch the different virtual objects. People’s favorite part was picking up the hanger, because instead of just touching the object, they could pick it up and move it around while feeling its weight.
I think that in the future drones will become smaller, quieter, safer, more affordable, and as a result more readily available. And if drones become even more powerful, they could have gripper mechanisms attached to them for picking up different objects. Imagine if you had a bunch of these drones: as you’re about to touch something in virtual reality, the system could predict what you’re about to touch, and the closest drone could identify a similar-looking object (either a real object somewhere in the room or a 3D-printed prop), pick it up, and fly quickly to where you are, so you could actually feel it just in time!