I’m a third-year Computer Science PhD candidate at Stanford University, co-advised by James Landay and Sean Follmer. My research interests are in virtual and augmented reality, tangible user interfaces, ubiquitous computing, and, more broadly, human-computer interaction. Previously, I was advised by Daniel Wigdor at the University of Toronto for my undergraduate thesis.
Beyond The Force: Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality
Parastoo Abtahi, Benoit Landry, Jackie (Junrui) Yang, Marco Pavone, Sean Follmer, James Landay
CHI 2019 | Best Paper Honorable Mention
We present HoverHaptics, an autonomous, safe-to-touch quadcopter, and its integration with a virtual shopping experience. We highlight three affordances of quadcopters that enable rich haptic interactions: dynamic positioning of passive haptics, texture mapping, and animation of passive props.
I’m a Giant: Walking in Large Virtual Environments at High Speed Gains
Parastoo Abtahi, Mar Gonzalez-Franco, Eyal Ofek, Anthony Steed
CHI 2019
We explore three methods for increasing users’ perceived walking speed in virtual reality: Ground-Level Scaling turns the user into a giant, Eye-Level Scaling lets them walk through a miniature world while maintaining their eye level, and Seven-League Boots makes every step the user takes cover a greater virtual distance.
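To give a flavor of the Seven-League Boots idea, here is a minimal, hypothetical sketch of a translation-gain technique: horizontal head movement is amplified by a gain factor while head height stays one-to-one. The class, names, and gain value are illustrative assumptions, not the paper’s implementation.

```python
import numpy as np

class SevenLeagueBoots:
    """Illustrative translation-gain sketch: each physical step covers
    `gain` times the distance in the virtual world. Hypothetical code,
    not the published implementation."""

    def __init__(self, start_pos, gain=7.0):
        self.virtual_pos = np.asarray(start_pos, dtype=float)
        self.prev_physical = np.asarray(start_pos, dtype=float)
        self.gain = gain

    def update(self, physical_pos):
        """Advance the virtual camera given a new tracked head position."""
        physical_pos = np.asarray(physical_pos, dtype=float)
        delta = physical_pos - self.prev_physical
        self.prev_physical = physical_pos
        # Amplify horizontal (X, Z) motion; keep head height (Y) unchanged.
        delta[0] *= self.gain
        delta[2] *= self.gain
        self.virtual_pos += delta
        return self.virtual_pos
```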
Visuo-Haptic Illusions for Improving the Perceived Performance of Shape Displays
Parastoo Abtahi, Sean Follmer
CHI 2018 | Best Paper Honorable Mention
Shape displays are matrices of actuated pins that move up and down to render physical shapes; however, they suffer from low resolution, small display size, and low pin speed. We employ illusions such as redirection, scaling, and retargeting, which take advantage of visual dominance, to improve the perceived performance of shape displays.
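As a rough illustration of the retargeting idea, the sketch below warps the rendered hand so that when the physical hand reaches the real pin, the virtual hand arrives at a differently positioned virtual target; visual dominance leads users to attribute the offset to their own motion. The function, parameters, and ramp are hypothetical assumptions, not the paper’s implementation.

```python
import numpy as np

def retargeted_hand(physical_hand, physical_target, virtual_target,
                    start_dist=0.4):
    """Hypothetical body-warping sketch: blend in an offset between the
    physical and virtual targets as the hand approaches, so physical
    contact and virtual contact coincide. Illustrative only."""
    physical_hand = np.asarray(physical_hand, dtype=float)
    physical_target = np.asarray(physical_target, dtype=float)
    offset = np.asarray(virtual_target, dtype=float) - physical_target
    dist = np.linalg.norm(physical_target - physical_hand)
    # Ramp the warp from 0 (beyond start_dist) to 1 (at the target).
    alpha = np.clip(1.0 - dist / start_dist, 0.0, 1.0)
    return physical_hand + alpha * offset
```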
Drone Near Me: Exploring Touch-Based Human-Drone Interaction
Parastoo Abtahi, David Zhao, Jane E, James Landay
UbiComp / IMWUT 2017
We ran an elicitation study with an unsafe and a safe-to-touch drone to find out whether participants would instinctively use touch as a means of interacting with the safe-to-touch drone. We found that in the safe condition, 58% of participants used touch, and that 39% of interactions across all tasks were touch-based.