Tapomayukh Bhattacharjee

Date:

Speaker

Tapomayukh “Tapo” Bhattacharjee is an Assistant Professor in the Department of Computer Science at Cornell University, where he directs the EmPRISE Lab. He completed his Ph.D. in Robotics at the Georgia Institute of Technology and was an NIH Ruth L. Kirschstein NRSA postdoctoral research associate in Computer Science & Engineering at the University of Washington. His goal is to enable robots to assist people with mobility limitations with activities of daily living. His work spans the fields of human-robot interaction, haptic perception, and robot manipulation, and focuses on the fundamental research question of how to leverage robot-world physical interactions in unstructured human environments to perform relevant activities of daily living. He is the recipient of an NSF CAREER Award (2023), and his work has won the Best RoboCup Paper Award at IROS’22, was a Best Paper Award Finalist and Best Student Paper Award Finalist at IROS’22, and received the Best Technical Advances Paper Award at HRI’19 and the Best Demonstration Award at NeurIPS’18. His work has also been featured in many media outlets, including the BBC, Reuters, the New York Times, IEEE Spectrum, and GeekWire, and his robot-assisted feeding work was selected as one of the best interactive designs of 2019 by Fast Company.

Speaker Links: Google Scholar | Lab YouTube Channel

Abstract

How do we build robots that can assist people with mobility limitations with activities of daily living? To successfully perform these activities, a robot needs to be able to physically interact with humans and objects in unstructured human environments. In this talk, I will cover various projects in my lab that showcase fundamental advances in the field of robotic caregiving. Specifically, I will show how we can build caregiving robots to perform activities of daily living such as feeding and bed-bathing. Both tasks require a robot to reason about the safety of complex physical interactions with humans in the presence of uncertainties due to perception and planning in cluttered assistive environments, as well as complexities due to disability conditions such as spasms and involuntary movements. Using insights from human studies, I will showcase algorithms and technologies that leverage multiple sensing modalities to perceive varied object properties and determine successful control policies for these tasks. Using feedback from all stakeholders, I will show how we built autonomous robot-assisted feeding and bed-bathing systems that use these algorithms and technologies, and how we deployed them to work with real users.