Interactive Robotics Laboratory
Yu Gu, Professor

Chris Tatsch

Robotic Perception and Machine Learning
Mechanical Engineering

EDUCATION

    BS Electrical Engineering, Federal University of Santa Maria, Brazil (2017)

    MS Mechanical Engineering, West Virginia University (2020)

    Thesis: “Route Planning for Long-Term Robotics Missions”

    PhD Aerospace Engineering, West Virginia University (expected 2023)

BIO

Christopher Tatsch is a PhD student focused on autonomous navigation and perception, with a strong interest in building new robots; he has built and worked with many types of robotic systems. He previously worked on the development of the TurtleBot3 mobile platform at ROBOTIS in South Korea. During his undergraduate studies in Brazil, his home country, he helped develop a humanoid robot called Dimitri and applied machine learning techniques to control its torso.

He joined the IRL in 2018 for his MS degree, working on planning for the Bramblebee project, whose goal was to autonomously pollinate flowers using a robotic arm. He also helped develop the new version of the SMART robot, a mobile platform used for classes and multi-robot experiments. His thesis work supports the autonomous inspection of stone mine environments: he studied and developed a multi-layer optimization method for planning robot routes in very large environments and under uncertainty. Additionally, Chris participated in the NASA Space Robotics Challenge Phase 2 (SRC2) as the perception lead for Team Mountaineers; the competition's goal was to perform fully autonomous, cooperative in-situ resource utilization tasks with multiple robots in a simulated lunar environment.

His ongoing research centers on the relationship between exploration and semantics in support of autonomous robot operation. Given a mobile robot that can understand images in real time, the challenge is to formulate exploration methodologies that not only enable the robot to navigate the environment but also enhance its understanding of it, for example by building 3D scene graphs; a minimal sketch of such a graph follows below. This research finds practical application across diverse scenarios: in underground mines using the Rhino robot, in indoor building environments using the Stickbug robot, and in various simulation environments.
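As a rough illustration of the representations involved, the sketch below implements a minimal two-layer scene graph with place and object nodes. The Node and SceneGraph classes, their fields, and the example labels are hypothetical stand-ins for illustration, not the graph structure actually used on Rhino or Stickbug; real systems typically add further layers (buildings, rooms, agents).

    # Minimal two-layer 3D scene graph sketch (hypothetical structure).
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        node_id: str
        layer: str              # "place" or "object"
        position: tuple         # (x, y, z) in the map frame
        label: str = "unknown"  # semantic class from the perception stack

    @dataclass
    class SceneGraph:
        nodes: dict = field(default_factory=dict)
        edges: set = field(default_factory=set)  # undirected (id_a, id_b) pairs

        def add_node(self, node: Node) -> None:
            self.nodes[node.node_id] = node

        def connect(self, id_a: str, id_b: str) -> None:
            self.edges.add(tuple(sorted((id_a, id_b))))

        def objects_in(self, place_id: str):
            # Objects attached to a place node (its one-hop neighbors).
            for a, b in self.edges:
                other = b if a == place_id else a if b == place_id else None
                if other is not None and self.nodes[other].layer == "object":
                    yield self.nodes[other]

    # As the robot explores, new detections are folded into the graph:
    g = SceneGraph()
    g.add_node(Node("place_1", "place", (0.0, 0.0, 0.0), "corridor"))
    g.add_node(Node("obj_1", "object", (1.2, 0.5, 0.8), "door"))
    g.connect("place_1", "obj_1")
    print([o.label for o in g.objects_in("place_1")])  # -> ['door']

An exploration policy can then score candidate viewpoints by how much such a graph is expected to grow, coupling navigation decisions with the robot's semantic understanding of the scene.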

KEYWORDS

  • planning and exploration
  • perception
  • autonomous navigation
  • machine learning