Maya Cakmak

PAUL G. ALLEN SCHOOL OF COMPUTER SCIENCE & ENGINEERING, UNIVERSITY OF WASHINGTON

The long-term goal of my research is to make personal robot assistants in the home a reality. Such robots can bring independence to persons with motor limitations, enable older adults to age in place, and improve the quality of people's lives. Despite tremendous progress towards this vision in all areas of robotics, there are still no robots (commercial or prototype) ready to be deployed in a real home, even though many people would immediately benefit from them. Tackling the challenges of developing assistive robots that can be deployed in the real world and perform useful tasks for people requires a human-centered approach combined with technical innovation and end-to-end system building.

Robots for Access

For millions of people living with a physical or motor impairment, robots present a unique opportunity to access the physical world. Although many roboticists are motivated by the potential impact of their work on people with disabilities (PwDs), most do not take the extra steps necessary to actualize their impact. That included me, until recently. I learned from my colleagues in the field of accessibility that the key to real impact is including PwDs in all aspects of the research. For the past five years, I have committed to this approach, working with PwDs across many projects to develop assistive robots for increasing their access and independence.

Accessible Robots

I have continued to pursue my long-time research agenda of end-user robot programming (EUP) with the goal of ensuring that robot programming tools are accessible to users of different backgrounds and abilities. Making robots programmable by end-users can provide a more feasible path to deployable robots, while also ensuring that assistive robots can be customized to every user's unique needs and preferences. My recent research within these themes has been published at the most selective robotics and Human-Robot Interaction (HRI) conferences, receiving two best paper awards (ICRA 2021 and HRI 2023).

One of the recent highlights of my career has been seeing many years of research come to fruition in the first-of-its-kind long-term deployment of a mobile manipulator in the home of a person with severe motor impairments. I describe some of this work below.

Physically Assistive Robots

The World Health Organization reports that 190 million people globally live with some form of physical or motor limitation, which can impact their ability to perform activities of daily living and make them dependent on caregivers. Robots, which can move about human environments and physically interact with objects and people, have the potential to assist in these activities. Although the vision of physically assistive robots has motivated research across sub-fields of robotics for decades, such robots have only recently become feasible in terms of capabilities, safety, and price. However, getting robots to perform useful tasks in homes requires important steps that go beyond proof-of-concept capabilities demonstrated in the lab. Further, since such robots are completely new to most potential users, there are still many open questions around how they can and should interact with their users.

In the past five years, I have taken inspiration from research in the field of accessibility, where the norm is to include people from the target population at every stage of the research. Many of my projects started with formative work aiming to better understand users' challenges and the opportunities for robots. For example, our ASSETS 2019 paper (link TBA) describes a contextual inquiry that involved having a meal with four individuals who need assistance with feeding, together with their caregivers, leading to many insights that influenced the course of our research on assistive feeding (link TBA).

Another project explored the use of a tele-operated Stretch mobile manipulator in a home setting to demonstrate the feasibility of different tasks and the potential utility for people with motor limitations to either receive remote assistance or directly control a robot themselves (link TBA). A unique collaboration with a team from Bangladesh involved several visits to an assisted-living and community center to understand the challenges of older adults and the potential of low-cost assistive robots in the developing world (link TBA).

One strategy that can lower the barrier for home deployments of assistive robots is to reduce their level of autonomy. This strategy not only allows robots to perform useful tasks without depending on universal autonomous capabilities, but is also preferred by users in some cases. A study we presented at HRI 2020 demonstrated that people with severe motor impairments do not always prefer a more autonomous feeding robot (link TBA).

Encouraged by those findings and previous findings about the potential of tele-operated assistive robots, we have been developing a tele-operation toolkit that allows full customization of interfaces to ensure accessibility for users with limited input abilities. Our IROS 2021 paper developed a unifying formalism in which very different tele-operation interfaces are all represented as finite state machines, revealing new variations of existing interfaces that might be more accessible to certain users (link TBA). In a recent study, we demonstrated the potential benefits of allowing users to customize even a few aspects of the interface (link TBA).
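
To give a flavor of this formalism, the sketch below shows how a tele-operation interface can be represented as a finite state machine whose states are interface modes and whose transitions are triggered by the user's (possibly very limited) input signals. The example interface, class names, and robot commands are illustrative assumptions for exposition; they do not reproduce the actual formalism or code from the IROS 2021 paper.

    # Minimal sketch: a tele-operation interface as a finite state machine (FSM).
    # States are interface modes; inputs are the signals a user can produce;
    # each transition may also emit a robot command. Illustrative only.

    from dataclasses import dataclass, field

    @dataclass
    class TeleopFSM:
        state: str
        # (state, input) -> (next_state, robot_command or None)
        transitions: dict = field(default_factory=dict)

        def step(self, user_input: str):
            next_state, command = self.transitions[(self.state, user_input)]
            self.state = next_state
            return command  # e.g., forwarded to the robot's velocity controller

    # Example: a single-switch "scanning" interface for a mobile manipulator.
    # One input ("click") cycles through modes; another ("hold") issues motion.
    scanning_interface = TeleopFSM(
        state="drive",
        transitions={
            ("drive",   "click"): ("arm", None),
            ("drive",   "hold"):  ("drive", "base_forward"),
            ("arm",     "click"): ("gripper", None),
            ("arm",     "hold"):  ("arm", "arm_extend"),
            ("gripper", "click"): ("drive", None),
            ("gripper", "hold"):  ("gripper", "gripper_close"),
        },
    )

    print(scanning_interface.step("hold"))   # -> "base_forward"
    print(scanning_interface.step("click"))  # -> None (now in "arm" mode)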

My lab's efforts in developing accessible and customizable tele-operation interfaces have been key to enabling the first-of-its-kind home deployment of an assistive Stretch robot, as part of an interdisciplinary effort in partnership with Hello Robot. Our user, Henry Evans, is quadriplegic as a result of a stroke many years ago. For our first deployment in Summer 2021, an occupational therapist from our team, Vy Nguyen, lived with Henry and his wife Jane for a total of four weeks, working with Henry every day to achieve his occupational goals via the Stretch robot. Using our interface, Henry was able to feed himself, wipe his face, scratch itches, apply lotion to his arms and legs, operate his percussion vest, use a printer, play a card game with his family, and hand a rose to Jane (link TBA).

Our second deployment in Summer 2022 involved a first prototype of an end-user programming tool we developed for Henry, allowing him to perform certain tasks more efficiently (link TBA). The third deployment in Summer 2023, during which one of my students was part of the team living with Henry and the robot, further expanded Henry's abilities, including breakthroughs in establishing a connection with his granddaughter via the robot. Although each deployment requires a great deal of effort that is not directly recognized in academia, the deployments have been invaluable in inspiring new research projects in my lab. I recently secured a grant from the NIH to further pursue some of those projects.

While tele-operation and end-user programming enable general-purpose assistance, frequently repeated activities are worth deeper exploration to develop autonomous capabilities. One micro-interaction that is key to many assistive tasks is the human-robot handover, which I have worked on for over a decade with several recent technical contributions (links TBA), including one that received the Best HRI Paper Award at ICRA 2021. In a recent paper, we developed a method that automatically selects object handover poses by taking into account the motor limitations of the human (link TBA).
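
As a rough illustration of the underlying idea (not the method from the paper), selecting a handover pose that respects a user's motor limitations can be framed as a constrained scoring problem: candidate poses outside the user's reachable region are rejected, and the remaining ones are ranked by a comfort cost. The spherical reachability model, the cost, and all names below are simplifying assumptions.

    # Conceptual sketch: pick a handover position that the user can reach and
    # that is close to a comfortable reference position. Illustrative only.

    import numpy as np

    def select_handover_pose(candidates, reach_center, reach_radius, preferred):
        """candidates: (N, 3) candidate object positions in the user's frame.
        reach_center/reach_radius: a sphere approximating the reachable space.
        preferred: a comfortable position (e.g., near the dominant hand)."""
        candidates = np.asarray(candidates, dtype=float)
        dist_to_reach = np.linalg.norm(candidates - reach_center, axis=1)
        reachable = dist_to_reach <= reach_radius        # hard feasibility test
        comfort_cost = np.linalg.norm(candidates - preferred, axis=1)
        comfort_cost[~reachable] = np.inf                # exclude unreachable poses
        best = int(np.argmin(comfort_cost))
        return best if np.isfinite(comfort_cost[best]) else None

    candidates = [[0.4, 0.1, 0.9], [0.8, 0.5, 1.4], [0.3, -0.2, 1.0]]
    idx = select_handover_pose(candidates,
                               reach_center=np.array([0.3, 0.0, 1.0]),
                               reach_radius=0.35,
                               preferred=np.array([0.35, 0.0, 0.95]))
    print("Selected candidate:", idx)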

Another activity we have focused on, in collaboration with the UW Personal Robotics Lab, is feeding (links TBA). In our recent paper, which received the best design paper award at HRI 2023, we explored the use of assistive feeding robots in social contexts (link TBA), a topic that was repeatedly brought to our attention by users in prior studies. We collaborated with a local community researcher, Tyler Schrenck, who is also paralyzed and directs an assistive technology nonprofit. The project not only revealed many insights about the design of assistive feeding robots, but also allowed us to develop new methodologies for involving community researchers like Tyler in academic research.

My students and I recently completed a systematic literature survey on physically assistive robots, currently in press at the Annual Review of Control, Robotics, and Autonomous Systems, which reinforced my excitement about the potential of robots for access and the need for further research in this area.

End-User Robot Programming

Enabling end-users to program their own robots after deployment has several benefits: It can eliminate the challenge of developing universal robotic capabilities that will work robustly in every possible deployment scenario. Instead, each robot needs to robustly work in only one specific environment. EUP also has the potential to greatly expand possible use-cases of general-purpose robots by empowering users to decide what their robots will do for them. Further, it allows users to customize their robot's behavior to meet their particular needs and preferences. These benefits come with one key challenge: enabling end-users to do what is currently done by skilled programmers. My research tackles this challenge.

My early work in EUP for robots explored new ways of programming robots while also developing intuitive interfaces for known ways of programming them. My work in the past five years continued to build on those foundations. In one project we developed FLEX-SDK: an open-source toolkit that allows creating a social robot from two simple tablet screens (link TBA). The toolkit has been used by interdisciplinary research teams (at UW and other institutions) to conduct HRI research, supporting a wide array of methods from participatory design with users to controlled experiments. Our recent UIST 2022 paper described nine such case studies from the past five years. In another project we developed ConCodeIt, expanding visual programming to support concurrency, which is critical in robot programming (link TBA).
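
To illustrate why concurrency matters in this setting (independently of ConCodeIt's visual language, whose constructs are not shown here), the sketch below uses plain Python asyncio: a greeting behavior that must speak and gesture at the same time, which a purely sequential program could not express. The behavior and timings are made up for exposition.

    # Generic sketch of concurrent robot behavior using asyncio; not ConCodeIt code.

    import asyncio

    async def say(text):
        print(f"[speech] {text}")
        await asyncio.sleep(2.0)        # stand-in for text-to-speech duration

    async def wave_arm():
        for _ in range(3):
            print("[gesture] waving")
            await asyncio.sleep(0.6)    # stand-in for motion execution

    async def greet():
        # Run the utterance and the gesture concurrently, then wait for both.
        await asyncio.gather(say("Hi, I'm your robot!"), wave_arm())

    asyncio.run(greet())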

As the topic became more popular, I had the opportunity to collaborate with researchers at other institutions on similar projects. With a team from the University of Toronto, we contributed an EUP tool for creating robot programs that can adapt to new scenarios by keeping a human in the loop (link TBA). With a team from the University of Wisconsin, we received a grant from the National Robotics Initiative to pursue the application of techniques from programming languages (program verification and synthesis) to programming interactive robots. One joint project led to an original interactive tabletop programming environment that allows users to demonstrate interactions using figurines that correspond to a human and a robot (link TBA).

One of the key ways our research expanded in this area is the application of program synthesis to reduce the burden on programmers. Our early work (IROS 2019) used task-based heuristics to synthesize an executable robot program from a single demonstration of a user performing the task themselves (link TBA). Our RSS 2020 paper introduced a method for expert programmers to create a program sketch with hole variables whose distributions can be tuned over time by end-users, explicitly or implicitly, simply by interacting with the robot (link TBA).
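
The following sketch conveys the general idea of a program sketch with holes whose value distributions are tuned through end-user interaction; the structure, names, and update rule are illustrative assumptions, not the RSS 2020 system.

    # Conceptual sketch: an expert-authored program with "holes" whose Gaussian
    # beliefs are tuned by implicit end-user corrections. Illustrative only.

    import random

    class Hole:
        """A program hole with a simple Gaussian belief over its value."""
        def __init__(self, mean, std):
            self.mean, self.std = mean, std

        def sample(self):
            return random.gauss(self.mean, self.std)

        def update(self, observed_value, rate=0.3):
            # Implicit feedback: nudge the belief toward the value the user
            # demonstrated or corrected (e.g., by adjusting the robot mid-task).
            self.mean += rate * (observed_value - self.mean)
            self.std = max(0.9 * self.std, 0.01)  # grow more confident over time

    # Expert-authored sketch: "approach the cup at speed ??, stop ?? m away".
    approach_speed = Hole(mean=0.10, std=0.05)   # m/s
    stop_distance  = Hole(mean=0.15, std=0.08)   # m

    def approach_cup():
        speed = approach_speed.sample()
        dist = stop_distance.sample()
        print(f"Approaching at {speed:.2f} m/s, stopping {dist:.2f} m away")

    # The user repeatedly slows the robot and lets it come closer; each
    # correction implicitly tunes the hole distributions.
    for corrected_speed, corrected_dist in [(0.06, 0.10), (0.05, 0.09), (0.05, 0.08)]:
        approach_cup()
        approach_speed.update(corrected_speed)
        stop_distance.update(corrected_dist)
    approach_cup()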

Another method we developed (HRI 2020) uses expected divergence maximization to automatically tune parameters of a robot program by querying the user in different ways that account for parameter types (link TBA). Similarly, in our collaborative project with University of Wisconsin we developed a method for synthesizing robot programs from multi-modal user input, including user path sketches on a tablet and natural language instructions (link TBA), which was published at HRI 2023.
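
As a generic illustration of the expected-divergence-maximization idea (not the HRI 2020 implementation), the sketch below chooses, among candidate yes/no questions, the one whose answer is expected to shift the belief over a program parameter the most, measured by the expected KL divergence between posterior and prior. The parameter, queries, and likelihoods are hypothetical.

    # Generic active-query sketch: pick the question with the largest expected
    # KL divergence between posterior and prior. Illustrative only.

    import numpy as np

    def expected_kl(prior, likelihoods):
        """likelihoods[a, v] = P(answer a | parameter value v) for one query."""
        score = 0.0
        for lik in likelihoods:                      # iterate over possible answers
            p_answer = float(np.dot(lik, prior))     # marginal probability of answer
            if p_answer == 0:
                continue
            posterior = lik * prior / p_answer
            kl = np.sum(np.where(posterior > 0,
                                 posterior * np.log(posterior / prior), 0.0))
            score += p_answer * kl
        return score

    # Belief over a discrete parameter (e.g., preferred wiping pressure level 0-3).
    prior = np.array([0.25, 0.25, 0.25, 0.25])

    # Two candidate yes/no queries with different informativeness.
    queries = {
        "Is the pressure too high?":  np.array([[0.9, 0.7, 0.3, 0.1],
                                                [0.1, 0.3, 0.7, 0.9]]),
        "Is the robot too far away?": np.array([[0.5, 0.5, 0.5, 0.5],
                                                [0.5, 0.5, 0.5, 0.5]]),
    }
    best = max(queries, key=lambda q: expected_kl(prior, queries[q]))
    print("Ask:", best)   # the first query is more informative here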

End-user robot programming has become an active topic at HRI and robotics conferences, with a recent literature survey from another team identifying 45 papers just on end-user program specification for robots (link TBA). I have continued efforts to make tools that others truly find useful (e.g., FLEX-SDK, link TBA), as well as to transfer tools to commercial robots (e.g., our EUP tool on the Hello Robot Stretch SE2).