RAMPA++: Improving the RAMPA XR Interface for Machine Learning-based LfD Applications

RAMPA is an extended reality (XR) interface developed by members of our lab, Colors, to control a UR10 robot through a Meta Quest 3 headset. RAMPA uses the Unity framework to render visualizations on the headset, while communication with the robot is handled via the Robot Operating System (ROS). All of the source code is written in Python.
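As a rough sketch of how this stack fits together, the snippet below replays a demonstrated trajectory on the UR10 from a ROS1 Python node. The controller topic name follows the standard ur_robot_driver convention and is an assumption on our part; RAMPA's actual publishing code may differ.

    #!/usr/bin/env python
    # Minimal sketch: replaying a demonstrated trajectory on the UR10 over ROS1.
    # The topic and controller names are assumptions based on the standard
    # ur_robot_driver setup; RAMPA's actual interface may differ.
    import rospy
    from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

    # Standard UR10 joint names from ur_description.
    UR10_JOINTS = [
        "shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint",
        "wrist_1_joint", "wrist_2_joint", "wrist_3_joint",
    ]

    def replay(waypoints, dt=0.1):
        """Publish a list of 6-DoF joint configurations as one trajectory."""
        pub = rospy.Publisher("/scaled_pos_joint_traj_controller/command",
                              JointTrajectory, queue_size=1)
        rospy.sleep(1.0)  # give the publisher time to connect
        traj = JointTrajectory()
        traj.joint_names = UR10_JOINTS
        for i, q in enumerate(waypoints):
            point = JointTrajectoryPoint()
            point.positions = list(q)
            point.time_from_start = rospy.Duration((i + 1) * dt)
            traj.points.append(point)
        pub.publish(traj)

    if __name__ == "__main__":
        rospy.init_node("rampa_trajectory_replay")
        home = [0.0, -1.57, 1.57, -1.57, -1.57, 0.0]  # example configuration
        replay([home])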

[Figure: A snapshot from the menu]


[Figure: (a) The user sees the real and simulated robots aligned. (b) The simulated robot follows a trajectory demonstrated through the XR interface. (c) The real robot follows the same trajectory as the simulated robot.]



In this project, we want to improve the RAMPA interface to facilitate training and testing of machine learning-based learning-from-demonstration (LfD) algorithms [1], such as [2, 3]. You are expected to use Unity, ROS1, and Python. You will also need to learn the deep LfD methods explained in [2, 3].
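For a concrete picture of the kind of model the interface should support, here is a compact PyTorch sketch of the core idea behind CNMP [2]: encode a few observed (time, state) pairs into an averaged latent representation, then decode a Gaussian distribution over the state at any query time. Layer sizes and names are illustrative assumptions, not the authors' reference implementation.

    # Compact sketch of the CNMP idea from [2]; sizes and names are assumptions.
    import torch
    import torch.nn as nn

    class CNMP(nn.Module):
        def __init__(self, d_state=7, d_latent=128):
            super().__init__()
            # Encoder maps one (t, state) observation to a latent vector.
            self.encoder = nn.Sequential(
                nn.Linear(1 + d_state, 128), nn.ReLU(),
                nn.Linear(128, d_latent))
            # Decoder maps (averaged latent, query t) to the mean and
            # log-std of the predicted state.
            self.decoder = nn.Sequential(
                nn.Linear(d_latent + 1, 128), nn.ReLU(),
                nn.Linear(128, 2 * d_state))

        def forward(self, obs_t, obs_x, query_t):
            # obs_t: (n_obs, 1), obs_x: (n_obs, d_state), query_t: (n_query, 1)
            r = self.encoder(torch.cat([obs_t, obs_x], dim=-1)).mean(dim=0)
            r = r.expand(query_t.shape[0], -1)
            out = self.decoder(torch.cat([r, query_t], dim=-1))
            mean, log_std = out.chunk(2, dim=-1)
            return mean, log_std.exp()

    # Training minimizes the negative log-likelihood of target points
    # sampled from the demonstrated trajectories.
    def nll(mean, std, target_x):
        return -torch.distributions.Normal(mean, std).log_prob(target_x).mean()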


[1]: Argall, B. D., Chernova, S., Veloso, M., & Browning, B. (2009). A survey of robot learning from demonstration. Robotics and Autonomous Systems, 57(5), 469-483.

[2]: Seker, M. Y., Imre, M., Piater, J. H., & Ugur, E. (2019). Conditional neural movement primitives. In Robotics: Science and Systems.

[3]: Yildirim, Y., & Ugur, E. (2024). Conditional neural expert processes for learning movement primitives from demonstration.

Project Advisor: Emre Uğur

Project Status:

Project Year: 2024, Fall
