Kenneth Shaw

CV | Google Scholar | Twitter | Formal Bio

Hi, I'm Kenny. I'm a final-year PhD student at the Robotics Institute at Carnegie Mellon University, advised by Prof. Deepak Pathak.

My research centers on dexterous manipulation. I have designed several low-cost, highly capable dexterous robotic hands aimed at making manipulation research and education more accessible. I develop dexterous AI policies for these hands by leveraging human demonstrations from internet videos, teleoperation, and simulation.

More broadly, I'm interested in how we can create new democratized robotic hardware systems whose capabilities are unlocked by machine learning. How does the design of a robot's hardware shape the way it learns, and how should that learning influence the hardware's design?

Previously, I graduated from Georgia Tech in Computer Engineering, where I worked on multi-agent systems and human-robot interaction (HRI) with Prof. Sonia Chernova and Prof. Harish Ravichandar. Please contact me via email at kshaw2 -at- andrew dot cmu dot edu.


  Publications

IFG: Internet-Scale Functional Grasping
Ray Muxin Liu*, Mingxuan Li*, Kenneth Shaw, Deepak Pathak

In Submission
website | abstract | bibtex | arXiv


Deep Reactive Policy: Learning Reactive Manipulator Motion Planning for Dynamic Environments
Jiahui Yang*, Jason Jingzhou Liu*, Yulong Li, Youssef Khaky, Kenneth Shaw, Deepak Pathak

CoRL 2025
website | abstract | bibtex | arXiv

@article{yang2025deep,
  title={Deep Reactive Policy: Learning Reactive Manipulator Motion Planning for Dynamic Environments},
  author={Yang, Jiahui and Liu, Jason Jingzhou and Li, Yulong and Khaky, Youssef and Shaw, Kenneth and Pathak, Deepak},
  journal={9th Annual Conference on Robot Learning},
  year={2025},
}
          

LEAP Hand V2 Advanced: Dexterous, Low-cost Hybrid Rigid-Soft Hand for Robot Learning
Kenneth Shaw, Deepak Pathak
IEEE Humanoids 2025

webpage | abstract | bibtex | IEEE Humanoids

   
@inproceedings{shaw2025leapv2adv,
  title={LEAP Hand V2 Advanced: Dexterous, Low-cost Hybrid Rigid-Soft Hand for Robot Learning},
  author={Shaw, Kenneth and Pathak, Deepak},
  booktitle={2025 IEEE-RAS International Conference on Humanoid Robots (Humanoids)},
  year={2025}
}
      

DexWild: Dexterous Human Interactions for In-the-Wild Robot Policies
Tony Tao*, Mohan Kumar Srirama*, Jason Jingzhou Liu, Kenneth Shaw, Deepak Pathak
RSS 2025

Best Paper Award at EgoAct Workshop 2025
website | abstract | bibtex | arXiv

@article{tao2025dexwild,
title={DexWild: Dexterous Human Interactions for In-the-Wild Robot Policies},
author={Tao, Tony and Srirama, Mohan Kumar and Liu, Jason Jingzhou and Shaw, Kenneth and Pathak, Deepak},
journal={Robotics: Science and Systems (RSS)},
year={2025},
}

FACTR: Force-Attending Curriculum Training for Contact-Rich Policy Learning
Jason Jingzhou Liu*, Yulong Li*, Kenneth Shaw, Tony Tao, Ruslan Salakhutdinov, Deepak Pathak
RSS 2025

website | abstract | bibtex | arXiv | code

@article{liu2025factr,
  title={FACTR: Force-Attending Curriculum Training for Contact-Rich Policy Learning},
  author={Liu, Jason Jingzhou and Li, Yulong and Shaw, Kenneth and Tao, Tony and Salakhutdinov, Ruslan and Pathak, Deepak},
  journal={arXiv preprint arXiv:2502.17432},
  year={2025},
}
        

Demonstrating LEAP Hand v2: Low-Cost, Easy-to-Assemble, High-Performance Hand for Robot Learning
Kenneth Shaw, Deepak Pathak
RSS 2025

webpage | abstract | bibtex | RSS

   
@inproceedings{shaw2025leapv2,
  title={Demonstrating LEAP Hand v2: Low-Cost, Easy-to-Assemble, High-Performance Hand for Robot Learning},
  author={Shaw, Kenneth and Pathak, Deepak},
  booktitle={2025 Robotics: Science and Systems},
  year={2025}
}
      

Bimanual Dexterity for Complex Tasks
Kenneth Shaw*, Yulong Li*, Jiahui Yang, Mohan Kumar Srirama, Ray Liu, Haoyu Xiong, Russell Mendonca†, Deepak Pathak†
CoRL 2024

webpage | abstract | bibtex | CoRL

   
        @inproceedings{shaw2024bimanual,
          title={Bimanual Dexterity for Complex Tasks},
          author={Shaw, Kenneth and Li, Yulong and Yang, Jiahui and Srirama, Mohan Kumar and Liu, Ray and Xiong, Haoyu and Mendonca, Russell and Pathak, Deepak},
          booktitle={8th Annual Conference on Robot Learning},
          year={2024}
        }
      

Adaptive Mobile Manipulation for Articulated Objects In the Open World
Haoyu Xiong, Russell Mendonca, Kenneth Shaw, Deepak Pathak
ArXiv 2024

webpage | abstract | bibtex | arXiv

    @article{xiong2024adaptive,
      title={Adaptive Mobile Manipulation for Articulated Objects In the Open World},
      author={Xiong, Haoyu and Mendonca, Russell and Shaw, Kenneth and Pathak, Deepak},
      journal={arXiv preprint arXiv:2401.14403},
      year={2024}
    }
  

SPIN: Simultaneous Perception Interaction and Navigation
Shagun Uppal, Ananye Agarwal, Haoyu Xiong, Kenneth Shaw, Deepak Pathak
CVPR 2024 (Oral)

webpage | abstract | bibtex | arXiv

    @InProceedings{Uppal_2024_CVPR,
      author    = {Uppal, Shagun and Agarwal, Ananye and Xiong, Haoyu and Shaw, Kenneth and Pathak, Deepak},
      title     = {SPIN: Simultaneous Perception Interaction and Navigation},
      booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
      month     = {June},
      year      = {2024},
      pages     = {18133-18142}
  }
  

DEFT: Dexterous Fine-Tuning for Real-World Hand Policies
Aditya Kannan*, Kenneth Shaw*, Shikhar Bahl, Pragna Mannam, Deepak Pathak
CoRL 2023

webpage | abstract | bibtex | CoRL

@article{kannan2023deft,
  title={DEFT: Dexterous Fine-Tuning for Real-World Hand Policies},
  author={Kannan, Aditya and Shaw, Kenneth and Bahl, Shikhar and Mannam, Pragna and Pathak, Deepak},
  journal={CoRL},
  year={2023}
}
  

Dexterous Functional Grasping
Ananye Agarwal, Shagun Uppal, Kenneth Shaw, Deepak Pathak
CoRL 2023

webpage | abstract | bibtex | arXiv

@inproceedings{agarwal2023dexterous,
  title={Dexterous Functional Grasping},
  author={Agarwal, Ananye and Uppal, Shagun and Shaw, Kenneth and Pathak, Deepak},
  booktitle={Conference on Robot Learning},
  pages={3453--3467},
  year={2023},
  organization={PMLR}
}

DASH: A Framework for Designing Anthropomorphic Soft Hands through Interaction
Pragna Mannam*, Kenneth Shaw*, Dominik Bauer, Jean Oh, Deepak Pathak, Nancy Pollard
IEEE Humanoids 2023
Best Oral Paper Award Finalist (Top 3)

webpage | abstract | bibtex | arXiv

@article{mannam2023Dashhand,
  title={DASH: A Framework for Designing Anthropomorphic Soft Hands through Interaction},
  author={Mannam, Pragna and Shaw, Kenneth and Bauer, Dominik and Oh, Jean and Pathak, Deepak and Pollard, Nancy},
  journal={IEEE Humanoids},
  year={2023}
}
  

LEAP Hand: Low-Cost, Efficient, and Anthropomorphic Hand for Robot Learning
Kenneth Shaw, Ananye Agarwal, Deepak Pathak
RSS 2023

Start your dexterous manipulation journey here!

webpage | abstract | bibtex | RSS

@article{shaw2023Leaphand,
  title={LEAP Hand: Low-Cost, Efficient, and Anthropomorphic Hand for Robot Learning},
  author={Shaw, Kenneth and Agarwal, Ananye and Pathak, Deepak},
  journal={RSS},
  year={2023}
}
  
Learning Dexterity from Human Hand Motion in Internet Videos
Kenneth Shaw*, Shikhar Bahl*, Aravind Sivakumar, Aditya Kannan, Deepak Pathak
IJRR Special Issue
Featured on the front page of IJRR Special Issue April 2024

abstract | bibtex

@article{shaw_internetvideos,
  title={Learning Dexterity from Human Hand Motion in Internet Videos},
  author={Shaw, Kenneth and Bahl, Shikhar and Sivakumar, Aravind and Kannan, Aditya and Pathak, Deepak},
  journal={IJRR},
  year={2022}
}

VideoDex: Learning Dexterity from Internet Videos
Kenneth Shaw*, Shikhar Bahl*, Deepak Pathak
CoRL 2022

webpage | abstract | bibtex | arXiv | demo

@article{shaw_videodex,
  title={VideoDex: Learning Dexterity from Internet Videos},
  author={Shaw, Kenneth and Bahl, Shikhar and Pathak, Deepak},
  journal={CoRL},
  year={2022}
}

Robotic Telekinesis: Learning a Robotic Hand Imitator by Watching Humans on Youtube
Aravind Sivakumar*, Kenneth Shaw*, Deepak Pathak
RSS 2022
Best Paper Award Finalist at the Scaling Robot Learning Workshop

webpage | abstract | bibtex | arXiv | demo | in the media

We build a system that enables any human to control a robot hand and arm, simply by demonstrating motions with their own hand. The robot observes the human operator via a single RGB camera and imitates their actions in real-time. Human hands and robot hands differ in shape, size, and joint structure, and performing this translation from a single uncalibrated camera is a highly underconstrained problem. Moreover, the retargeted trajectories must effectively execute tasks on a physical robot, which requires them to be temporally smooth and free of self-collisions. Our key insight is that while paired human-robot correspondence data is expensive to collect, the internet contains a massive corpus of rich and diverse human hand videos. We leverage this data to train a system that understands human hands and retargets a human video stream into a robot hand-arm trajectory that is smooth, swift, safe, and semantically similar to the guiding demonstration. We demonstrate that it enables previously untrained people to teleoperate a robot on various dexterous manipulation tasks. Our low-cost, glove-free, marker-free remote teleoperation system makes robot teaching more accessible and we hope that it can aid robots that learn to act autonomously in the real world.
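The smoothness idea in this abstract, picking robot poses that both track the human and stay close to the previous pose, can be illustrated with a toy sketch. This is not the paper's method: the two-link planar finger, its link lengths, and the closed-form inverse kinematics below are all illustrative assumptions, standing in for the learned retargeting described above.

```python
import numpy as np

L1, L2 = 0.05, 0.04  # link lengths of a toy planar two-link finger (meters)

def fk(theta):
    """Forward kinematics: joint angles -> planar fingertip position."""
    x = L1 * np.cos(theta[0]) + L2 * np.cos(theta[0] + theta[1])
    y = L1 * np.sin(theta[0]) + L2 * np.sin(theta[0] + theta[1])
    return np.array([x, y])

def ik_solutions(target):
    """Both closed-form IK branches (elbow-up / elbow-down) for a reachable target."""
    x, y = target
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    c2 = np.clip(c2, -1.0, 1.0)  # guard against round-off at the workspace boundary
    sols = []
    for t2 in (np.arccos(c2), -np.arccos(c2)):
        t1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(t2), L1 + L2 * np.cos(t2))
        sols.append(np.array([t1, t2]))
    return sols

def retarget(targets, theta0):
    """Map a sequence of (hypothetical) human fingertip positions to a joint
    trajectory, choosing at each step the IK branch closest to the previous
    pose so the resulting trajectory stays temporally smooth."""
    theta, traj = np.asarray(theta0, dtype=float), []
    for target in targets:
        theta = min(ik_solutions(target), key=lambda s: np.linalg.norm(s - theta))
        traj.append(theta)
    return traj
```

In this sketch the "semantic similarity" objective reduces to exact fingertip tracking and the smoothness objective to branch selection; the actual system instead learns these trade-offs from large-scale human hand video.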

@article{telekinesis,
  title={Robotic Telekinesis: Learning a Robotic Hand Imitator by Watching Humans on Youtube},
  author={Sivakumar, Aravind and Shaw, Kenneth and Pathak, Deepak},
  journal={RSS},
  year={2022}
}

Demos

Come meet me at CoRL, ICRA and RSS! I have attended CoRL 2023, 2024 and 2025, RSS 2023, 2024 and 2025, and ICRA 2025, and I plan to attend many more. I would love to chat with you, and you can even try out the robot hands yourself!

Education & Outreach

I am deeply committed to supporting educational use of low-cost robot hands, from university labs to K-12 classrooms. My goal is to inspire curiosity, teach core robotics and machine learning skills, and make hands-on research accessible to everyone. Please reach out if you need additional resources or support beyond what's available on our website. I'd love to help!


Modified version of template from here and here