Student Projects and HiWi Positions

Human-Centered Robotics

Combining MoCap-markers and Hand 3D-scans for dexterous manipulation representation

The complexity inherent to dexterous manipulation makes it one of the unsolved problems in robotics. To make significant progress, it is crucial to exploit knowledge of how humans manipulate objects. Effectively translating this knowledge into the control and design of robotic systems will have a direct impact on a variety of topics and will move robotics a step closer to genuine coexistence with people.

At MSRM we strive to understand human manipulation and represent it accurately. As part of this broader aim, this work focuses on developing a platform that can accurately adapt a 3D scan of a human hand according to the trajectories of motion capture (MoCap) markers measured with a VICON tracking system.

Being able to accurately recreate the motion of a surface model of a human hand opens up a wide range of possibilities, including human manipulation prediction, ergonomic design, tactile sensing, and robotic hand design, among others.

In this project, you will work with state-of-the-art algorithms for handling mesh models and other graphical object representations, while deepening your understanding of machine learning, which will be a crucial tool for your algorithm development. Additionally, you will get hands-on experience with 3D scanners and the VICON tracking system.

If you want to have a glimpse of what your outcome might look like, feel free to look into this research work.

Tasks may include:

  • Development of algorithms to handle and modify mesh models (.obj, .stl); see the sketch after this list
  • Implementation of algorithms to connect VICON motion tracking measurements with 3D scan models of human hands
  • Experimentation with 3D scans of human hands
  • Modeling of objects in CAD software and 3D printing them to validate your algorithms
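
For the mesh-handling and alignment tasks above, a first building block could be a per-frame rigid alignment of the scanned hand to the measured marker positions. Below is a minimal Python sketch, assuming the trimesh library for mesh I/O and hypothetical file names for the scan and the marker arrays; the actual project would go well beyond a single rigid transform (e.g., towards articulated or learned deformation of the hand surface):

    import numpy as np
    import trimesh  # assumed mesh library; any .obj/.stl loader would do

    def rigid_transform(src, dst):
        """Least-squares rigid transform (Kabsch) mapping marker set src -> dst, both (N, 3)."""
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
        R = Vt.T @ D @ U.T
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = c_dst - R @ c_src
        return T

    # hypothetical file names: a single-mesh hand scan and (N, 3) marker position arrays
    mesh = trimesh.load("hand_scan.obj")
    markers_scan = np.load("markers_scan_pose.npy")   # markers in the pose of the scan
    markers_t = np.load("markers_frame_t.npy")        # markers at MoCap frame t
    mesh.apply_transform(rigid_transform(markers_scan, markers_t))
    mesh.export("hand_frame_t.obj")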

Type: Research Internship, Master Thesis, Bachelor Thesis (The tasks can be split into different work packages)

What we expect from you:

  • First and foremost, motivation to contribute fundamentally to the scientific community
  • Motivation to work independently and come up with solutions as you go
  • Creativity and out-of-the-box thinking
  • Good C++/Python/Matlab programming skills
  • Good skills with machine learning algorithms
  • Basic knowledge of mesh representations (.obj, .stl) or other object graphic representations

What we can offer:

  • An open discussion environment, where your ideas will be heard.
  • Tasks that aim to bridge the worlds of humans and technology.
  • Resources to test the algorithms you develop.

How to apply:

If you are interested, please send your CV and transcripts, and let us know a bit more about yourself and your previous work.

Send your application to M.Sc. Diego Hidalgo (diego.hidalgo-carvajal@tum.de)

Robot Learning

Machine Learning Interface to the QB SoftHand

Robot hands are a central requirement for manipulation tasks such as dynamic grasping, object handover, catching, etc. However, an extra hand introduces additional complexity and uncertainty to the manipulation problem. This motivates integrating machine learning methods as well as probabilistic formulations into the pipeline so as to better understand and model the uncertainty and handle the extra complexity. In this work, you'll develop a machine learning interface to the QB SoftHand research platform.

Tasks may include one or more of the following:

  • Develop a Python communication interface to the hand
  • Integrate the developed interface into our reference manipulation platform (featuring the Franka Emika Panda robot)
  • Test machine learning methods for classifying different kinds of contacts with the hand based on the joint-torque sensors in the Panda's joints (see the sketch after this list)
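
As an illustration of the contact-classification task above, the following minimal sketch trains a generic scikit-learn classifier on logged external joint torques of the 7-DoF Panda; the file names, label scheme, and choice of classifier are assumptions, not part of the actual setup:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # hypothetical logged data: external joint torques of the 7-DoF Panda and contact labels
    tau = np.load("joint_torques.npy")        # shape (n_samples, 7)
    labels = np.load("contact_labels.npy")    # shape (n_samples,), e.g. 0 = no contact, 1 = palm, 2 = finger

    X_train, X_test, y_train, y_test = train_test_split(tau, labels, test_size=0.2, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))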

Type: Internship, IDP, Bachelor Thesis

Prerequisites:

  • Basic knowledge about machine learning
  • Basic knowledge about robotics and control theory
  • Very good Python and C++ skills

If interested, please contact: elie.aljalbout@tum.de

Human-Robot Interaction

Biomechanics Aware Sequential Planner for Human-Robot Interaction

Empowering robots to physically engage and interact with humans is one of the key challenges of today’s robotics. When autonomously interacting with humans, robots need better decision-making, which is hindered by the lack of a reliable quantitative assessment of human physical capabilities and limitations. At MSRM, we are developing planning and control algorithms for robot manipulators/humanoids based on human biomechanics and ergonomics which will push service robots and intelligent manufacturing to a new era of physical human-robot collaboration (pHRC).
This research internship/master thesis program is focused on implementing, developing, and contributing to pHRC algorithms driven by human biomechanics response, ergonomics, and motion recognition and prediction.
In this work, you will have the opportunity to work with state-of-the-art robots and algorithms and be part of the exciting community that drives robotics forward. Indeed, being excited and passionate about the research is a fundamental prerequisite. You will have the right support, but you will also be expected to work as hard as we do.


Tasks may include a few of the following (to be discussed depending on your interest and background):

- Implement/integrate grasp planners with motion planners;
- Implement/design impedance/admittance controllers;
- Implement sequential motion planner on a dual-arm system and/or humanoid robot;
- Integrate existing biomechanics model into ROS for biomechanics-aware planning;
- Implement biomechanics-based human manipulability metrics (see the sketch after this list);
- Design ergonomics-based hand-over and pHRC;
- Implement/Design biomechanics-intention-aware reactive motion planning;
- Integrate motion capture systems for pHRC experiments;
- Skeleton tracking integration with biomechanics metrics;
- Human motion intention recognition;
- EMG setup for experiments and post-processing of the EMG data;
- Inverse muscular-activity estimation based on experimental data.
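
As a small illustration of the manipulability-related task in the list above, the sketch below computes the classical Yoshikawa manipulability measure from a task Jacobian; biomechanics- and ergonomics-informed metrics such as those in [2] build on and extend this kind of measure. The Jacobian here is only a random placeholder:

    import numpy as np

    def yoshikawa_manipulability(J):
        """w(q) = sqrt(det(J J^T)) for a (6 x n) task Jacobian J."""
        return np.sqrt(np.linalg.det(J @ J.T))

    J = np.random.randn(6, 7)  # placeholder Jacobian of a 7-DoF arm (or upper-limb model)
    print(yoshikawa_manipulability(J))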


Type: Research Internship, Master Thesis, Master Internship
(The tasks can be split into different work packages)


Pre-requisites:
- Motivation to work in a team and in a driven environment
- Good C++ / Python / Matlab skills
- Basic knowledge about robotics OR biomechanics (particularly of the upper limb)


Helpful depending on the tasks/position: 
- Experience with grasp planners 
- Experience with control or planners in robotics 
- Experience with OpenSim
- Knowledge of sEMG data capture and analysis
- Experience with motion capture systems  


Related literature

[1] Chen, L., Figueredo, L.F.C. & Dogar, M.R. Manipulation planning under changing external forces. Auton Robot 44, 1249–1269 (2020). doi.org/10.1007/s10514-020-09930-z

[2] L. F. C. Figueredo, R. C. Aguiar, L. Chen, S. Chakrabarty, M. R. Dogar and A. G. Cohn, "Human Comfortability: Integrating Ergonomics and Muscular-Informed Metrics for Manipulability Analysis During Human-Robot Collaboration," in IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 351-358, April 2021, doi: 10.1109/LRA.2020.3043173.

[3] L. Chen, L. F. C Figueredo and M. R. Dogar, "Planning for Muscular and Peripersonal-Space Comfort During Human-Robot Forceful Collaboration," 2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), 2018, pp. 1-8, doi: 10.1109/HUMANOIDS.2018.8624978.

[4] Riddhiman Laha, Luis F.C. Figueredo, Juraj Vrabel, Abdalla Swikir, and Sami Haddadin. "Reactive Cooperative Manipulation using Set Primitives and Circular Fields." 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China.

 

For more information, please contact:

Dr Luis Figueredo

 

 

Point-to-Point Motion Planning and Visual Servoing using User Guidance

Despite the increasing number of collaborative robots in human-centered manufacturing, industrial robots today are still largely preprogrammed with very few autonomous features. At MSRM, we are developing novel strategies based on single-user-guidance motion generation that facilitate changes in the production line in a timely and easy-to-implement fashion.
This research internship/master thesis program is focused on implementing, developing, and contributing to point-to-point motion generation and visual servoing based on purely geometric information from a single user demonstration.
In this work, you will have the opportunity to work with state-of-the-art robots and algorithms and be part of the exciting community that drives robotics forward. Indeed, being excited and passionate about the research is a fundamental prerequisite. You will have the right support, but you will also be expected to work as hard as we do.


Tasks may include a few of the below (to be discussed depending on your interest and background):
- Implement motion interpolation algorithms using an existing dual quaternion library (available in C++, Python, Matlab); see the sketch after this list;
- Implement motion interpolation from one-shot user demonstrations;
- Extract geometric features from robotic kinesthetic demonstration;  
- Extract geometric features from skeleton tracking;  
- Design/implement path and motion planning from interpolated poses;
- Design/implement visual tracking algorithms;
- Design interpolation algorithms for bimanual manipulators;
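
As an illustration of the interpolation task referenced above, the following minimal sketch interpolates between two poses using SciPy's SLERP for the orientation and linear interpolation for the translation. This is only a baseline stand-in: the actual work targets dual-quaternion / screw linear interpolation (ScLERP) as in [2] and [4], e.g., via the DQ Robotics library [1]. Poses and step count are made-up examples:

    import numpy as np
    from scipy.spatial.transform import Rotation, Slerp

    # start/goal poses (positions in metres, orientations as rotations)
    p0, p1 = np.array([0.4, 0.0, 0.3]), np.array([0.5, 0.2, 0.4])
    R_key = Rotation.from_euler("xyz", [[0.0, 0.0, 0.0], [0.0, np.pi / 4, np.pi / 2]])
    slerp = Slerp([0.0, 1.0], R_key)

    for s in np.linspace(0.0, 1.0, 5):
        p = (1.0 - s) * p0 + s * p1   # linear interpolation of the translation
        R = slerp([s])[0]             # spherical linear interpolation of the orientation
        print(s, p, R.as_quat())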


Type: Research Internship, Master Thesis, Master Internship, Bachelor Thesis
(The tasks can be split into different work packages)


Pre-requisites:
- Motivation to work in a team and in a driven environment
- Good C++ / Matlab skills
- Good math (algebra) and/or control skills
- Basic experience with RGBD sensors

Helpful but not required
- Experience with ROS
- Experience with robot manipulators
- Python


Related literature

[1] B. V. Adorno and M. Marques Marinho, "DQ Robotics: A Library for Robot Modeling and Control," in IEEE Robotics & Automation Magazine, doi: 10.1109/MRA.2020.2997920.

[2] A. Sarker, A. Sinha and N. Chakraborty, "On Screw Linear Interpolation for Point-to-Point Path Planning," 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020, pp. 9480-9487, doi: 10.1109/IROS45743.2020.9341651.

[3] Riddhiman Laha, Luis F.C. Figueredo, Juraj Vrabel, Abdalla Swikir, and Sami Haddadin. "Reactive Cooperative Manipulation using Set Primitives and Circular Fields." 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China.

[4] R. Laha, A. Rao, L. F.C. Figueredo, Q. Chang, S. Haddadin, N. Chakraborty, "Point-to-point Path Planning based on User Guidance and Screw Linear Interpolation," in ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference IDETC/CIE2021.

 

For more information, please contact:

Dr Luis Figueredo

Riddhiman Laha (riddhiman.laha@tum.de)

Human-Motion Aware Collision Handling and Avoidance during Motion Planning

Recent advances in robotics technologies are closing the gap between humans and robots. Still, fluent and safe interactions of humans and robots require both partners to understand and anticipate each other's actions and to be able to move and react accordingly in a timely fashion. At MSRM, we are developing state-of-the-art human models, robotic controllers, and planners that will enhance robot capabilities and autonomy during human-robot interaction and collaboration.
One of the bottlenecks in existing methods is the proper integration of human motion estimation and prediction with the reactive motion behavior of the robotic system. In this work, you will have the opportunity to work with state-of-the-art robots and algorithms and be part of the exciting community that drives robotics forward. Indeed, being excited and passionate about the research is a fundamental prerequisite. You will have the right support, but you will also be expected to work as hard as we do.

Tasks may include a few of the below (to be discussed depending on your interest and background):
- RGBD-based skeleton tracking;
- Trajectory and skeleton prediction for human motion (unsupervised approach); see the sketch after this list;
- Imitation learning from human demonstration;
- Probabilistic intention-aware human motion from demonstrations;
- Motion-intention-aware reactive motion planning;
- Biomechanics-intention-aware reactive motion planning;
- Intention-aware human-robot collaboration.
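
As an illustration of the prediction task referenced above, the sketch below implements a trivial constant-velocity baseline for short-horizon prediction of tracked skeleton joints; learned predictors such as those in [1] would replace this baseline in the actual work. Joint counts and rates are placeholder values:

    import numpy as np

    def predict_constant_velocity(joints_t, joints_prev, dt, horizon, n_steps):
        """joints_* : (n_joints, 3) positions at time t and t - dt; returns (n_steps, n_joints, 3)."""
        vel = (joints_t - joints_prev) / dt
        steps = np.linspace(dt, horizon, n_steps)
        return joints_t[None] + steps[:, None, None] * vel[None]

    # placeholder skeleton: 25 tracked joints from an RGBD skeleton tracker
    joints_t = np.random.rand(25, 3)
    joints_prev = joints_t - 0.01 * np.random.rand(25, 3)
    pred = predict_constant_velocity(joints_t, joints_prev, dt=1 / 30, horizon=0.5, n_steps=15)
    print(pred.shape)  # (15, 25, 3)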


Type: Research Internship, Master Thesis, Master Internship, Bachelor Thesis
(The tasks can be split into different work packages)


Pre-requisites:
- Motivation to work in a team and in a driven environment
- Good C++ / Python skills
- Good experience with RGBD sensors


Helpful but not required
- Basic experience with ROS
- Basic knowledge about robotics/manipulators


Related literature

[1] Bütepage, J., Kjellström, H., & Kragic, D. (2018). Anticipating Many Futures: Online Human Motion Prediction and Generation for Human-Robot Interaction. Proceedings - IEEE International Conference on Robotics and Automation, 4563–4570. doi.org/10.1109/ICRA.2018.8460651

[2] D. Koert, J. Pajarinen, A. Schotschneider, S. Trick, C. Rothkopf and J. Peters, "Learning Intention Aware Online Adaptation of Movement Primitives," in IEEE Robotics and Automation Letters, vol. 4, no. 4, pp. 3719-3726, Oct. 2019, doi: 10.1109/LRA.2019.2928760.

[3] Riddhiman Laha, Luis F.C. Figueredo, Juraj Vrabel, Abdalla Swikir, and Sami Haddadin. "Reactive Cooperative Manipulation using Set Primitives and Circular Fields." 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China.

[4] L. F. C. Figueredo, R. C. Aguiar, L. Chen, S. Chakrabarty, M. R. Dogar and A. G. Cohn, "Human Comfortability: Integrating Ergonomics and Muscular-Informed Metrics for Manipulability Analysis During Human-Robot Collaboration," in IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 351-358, April 2021, doi: 10.1109/LRA.2020.3043173.

 

For more information, please contact:

Dr Luis Figueredo

Riddhiman Laha (riddhiman.laha@tum.de)

Prosthetics

Forschungspraxis: Robot Multibody Modelling

MSRM is currently developing a new prototype of an intelligent, fully mechatronic arm prosthesis.
The work focuses on the modeling of multi-body systems.

You will get the opportunity to work in:

  • Mechanical modeling (Matlab/Simulink, SimMechanics)

Prior knowledge in multi-body systems and robotics is helpful.

The work is offered as Forschungspraxis (FP).

For more information please contact:

alexander.toedtheide@tum.de

Tel.: +49 (89) 289 - 29414

Mechatronics System Development

Forschungspraxis: Experimental Control Performance Testing and Model Identification of a Pneumatic Actuation Unit

I am always looking for talented students who are interested in doing their internship (Forschungspraxis) in the following fields:

  • mechatronic systems and robot development (CAD + system design)
  • optimization and identification, 
  • control and observers,
  • pneumatic actuators,
  • modeling of multibody systems and actuators,
  • experimental data analysis and signal processing.

It would be beneficial if you have already gained some experience with Matlab/Simulink or SolidWorks. It might also be possible to write a master thesis after the internship.

Please contact me by the following address:
alexander.toedtheide@tum.de

Development of a Compliant Actuator Design Algorithm and Toolbox

Compliant actuators are a hot topic in robotics. Even though most of the developed systems are unique in their fields, they are mostly a product of integrating off-the-shelf parts, and the development process still relies on experience, intuition, and trial and error. The main goal of this research opening is to develop a comprehensive algorithm for selecting the necessary components of a compliant actuator, such as the gearbox, motor, cooling system, etc.
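
As a toy illustration of the component-selection idea (not the intended algorithm itself), the sketch below filters hypothetical motor/gearbox candidates by a required output torque and speed; a real toolbox would additionally incorporate thermal, fatigue, wear, and dynamic models. All names and numbers are made up:

    from dataclasses import dataclass

    @dataclass
    class Motor:
        name: str
        peak_torque: float  # Nm at the motor shaft
        max_speed: float    # rad/s at the motor shaft

    @dataclass
    class Gearbox:
        name: str
        ratio: float
        efficiency: float

    def feasible_pairs(motors, gearboxes, tau_load, omega_load):
        """Return (motor, gearbox) pairs that meet the required output torque and speed."""
        return [(m, g) for m in motors for g in gearboxes
                if m.peak_torque * g.ratio * g.efficiency >= tau_load
                and m.max_speed / g.ratio >= omega_load]

    # hypothetical catalogue entries and load requirements
    motors = [Motor("M1", 0.8, 600.0), Motor("M2", 1.5, 400.0)]
    gearboxes = [Gearbox("G100", 100.0, 0.80), Gearbox("G160", 160.0, 0.75)]
    print(feasible_pairs(motors, gearboxes, tau_load=90.0, omega_load=3.0))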

 

Type: Research Internship, Master Thesis, Master Internship

(The tasks can be split into different work packages)

 

Pre-requisites:

  • Prior knowledge of Matlab/Simulink
  • Basic knowledge of actuator models
  • Basic knowledge of mechanics, fatigue, wear, thermal models
  • Strong background in dynamic modeling
  • Knowledge of optimization

 

Application:

Please send these documents to [mehmet (dot) yildirim (at) tum (dot) de]

  • Transcript
  • CV
  • Letter of intention (max. 200 words)
  • Portfolio of previous projects

 

Legged Locomotion

Dynamics Algorithms – Closed-Form Computation of the EoM

Type: Research Internship

Description:

Due to the increasing complexity of robots, especially in the field of legged robots, the computation of the equations of motion has received more and more attention throughout the last decades. The spatial formulation of dynamic quantities and algorithms such as the recursive Newton-Euler algorithm (RNEA) and the composite rigid-body algorithm (CRB) are some examples that underline the progress that has been made in the field. However, the “algorithmic toolbox” that is available in most frameworks today still lacks, for example, the computation of state and partial derivatives, the computation of the regressor, the explicit computation of the Coriolis matrix, etc.

The Rigid Body Dynamics Library (RBDL) developed by [1] is based on Featherstone’s spatial formulation of the RNEA and CRB. It also contains methods to compute the forward kinematics and Jacobians for an arbitrary tree-like structure. However, like most other libraries, it lacks the explicit numerical computation of some quantities that might result in better control laws. To overcome these shortcomings, the goal of this student project is to augment the RBDL library with a novel iterative method introduced by [2]. With this algorithm, it is possible to obtain the closed-form solution of the EoM as well as their derivatives.
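
For orientation only (and not the method of [2]), the sketch below shows the classical Christoffel-symbol construction of the Coriolis matrix from a mass-matrix function, here with crude finite-difference partial derivatives and a toy 2-link mass matrix; in the project, the mass matrix would presumably come from RBDL and the derivatives from the closed-form algorithm of [2]:

    import numpy as np

    def coriolis_matrix(M_func, q, qd, eps=1e-6):
        """Christoffel-symbol Coriolis matrix from a mass-matrix function M_func(q)."""
        n = q.size
        dM = np.zeros((n, n, n))  # dM[:, :, k] = dM/dq_k (finite differences)
        for k in range(n):
            dq = np.zeros(n)
            dq[k] = eps
            dM[:, :, k] = (M_func(q + dq) - M_func(q - dq)) / (2.0 * eps)
        C = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                # C_ij = sum_k 0.5 * (dM_ij/dq_k + dM_ik/dq_j - dM_jk/dq_i) * qd_k
                C[i, j] = 0.5 * np.sum((dM[i, j, :] + dM[i, :, j] - dM[j, :, i]) * qd)
        return C

    def M_toy(q):  # toy mass matrix of a planar 2-link arm
        c2 = np.cos(q[1])
        return np.array([[3.0 + 2.0 * c2, 1.0 + c2], [1.0 + c2, 1.0]])

    q, qd = np.array([0.3, 0.5]), np.array([0.1, -0.2])
    print(coriolis_matrix(M_toy, q, qd))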

Prerequisites:

  • Strong background in kinematic and dynamic modeling
  • Strong C programming skills

Literature:

[1] Felis, Martin L. "RBDL: an efficient rigid-body dynamics library using recursive algorithms." Autonomous Robots 41.2 (2017): 495-511.

[2] Garofalo, Gianluca, Christian Ott, and Alin Albu-Schäffer. "On the closed form computation of the dynamic matrices and their differentiations." 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2013.

Contact:
Dennis Ossadnik
dennis.ossadnik@tum.de

Robot Safety: Bachelor/Master Thesis | Research/Engineering-Internship

Collision Analysis and Safe Control in Human-Robot Interaction

Currently, increasing effort is being made in the robotics community to understand injury mechanisms during physical human-robot interaction (pHRI). This is motivated by the fact that humans and robots will work closely and intensively together; therefore, one has to be aware of the potential threats of such close cooperation and take appropriate countermeasures to ensure human safety via planning and/or control. In the context of safety in pHRI, possible topics that can be addressed in the thesis/internship are:

  • Design and analysis of collision experiments and/or testing devices
  • Development and verification of collision simulations
  • Survey of biomechanics and forensics literature
  • Development of robot motion planning and/or control schemes for ensuring human safety (see the sketch after this list)
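
As a small illustration of the kind of quantity used in collision analysis and safety-oriented control, the sketch below computes the robot's reflected mass along a unit contact direction from the joint-space inertia matrix and the translational Jacobian of the contact point; both are placeholders here, and the concrete safety metrics used in the project may differ:

    import numpy as np

    def reflected_mass(M, J, u):
        """Reflected mass along unit direction u: m_u = 1 / (u^T J M^{-1} J^T u)."""
        u = u / np.linalg.norm(u)
        Lambda_inv = J @ np.linalg.solve(M, J.T)  # inverse Cartesian inertia (3 x 3)
        return 1.0 / (u @ Lambda_inv @ u)

    M = np.diag([2.5, 2.0, 1.5, 1.0, 0.5, 0.3, 0.2])  # placeholder joint-space inertia (7 DoF)
    J = np.random.randn(3, 7)                          # placeholder translational contact Jacobian
    print(reflected_mass(M, J, np.array([1.0, 0.0, 0.0])))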

Prerequisites:

  • Studies in Mechanics, Mechatronics, Electronics, Computer Science 
  • Knowledge in robotics & control (for topics on planning & control)
  • Good C++ programming skills
  • Matlab/Simulink
  • Working knowledge in ROS
  • Ability to work in a structured and organized manner
  • Creativity

Contact:
Mazin Hamad, M.Sc.
mazin.hamad@tum.de
Chair of Robotics Science and Systems Intelligence

AI-Enabled Lab-Automation

AI-Supported Robotic Lab Assistant: Reliable Automated Liquid Handling for the Everyday Lab of the Future

Type: Research/Engineering Internship (Forschungs-/Ingenieurpraktikum)

→ For more information, please contact us!

The automation of laboratory processes in chemistry, bio-, pharmaceutical, and food technology as well as in medicine is already a reality today. However, many solutions currently on the market are either too expensive and/or developed and optimized only for specific laboratory processes, making them inflexible. So that lab automation can be used as a tool for everyone in the dynamic everyday lab, this applied research project aims to develop an AI-supported automation workflow for a robotic lab assistant and to implement it in the laboratory. The first process to be automated is reliable automated liquid handling. To achieve this, one of the latest collaborative robots is available, as well as 3D-printing facilities for flexible gripper-finger development.

The following workflow is planned:

  • Comprehensive literature review on robotics in lab automation
  • Analysis and development of robot finger systems for using laboratory tools
  • Analysis, automation, and evaluation of process workflows related to liquid handling in the lab
  • And much more …

Prerequisites:

  • Background in electrical engineering and information technology, mechanical engineering, or mechatronics
  • Basic knowledge of robotics, control engineering, and systems theory
  • Good programming skills in C/C++, Python, Matlab
  • Good CAD design skills (SolidWorks, etc.)
  • Experience with 3D printing

Contact:
Dennis Knobbe
dennis.knobbe@tum.de
+49 (89) 289 - 29412

Human modeling

3D Modeling and Real-Time Visualization of Muscle Deformation

There are nowadays numerous 3D computer-graphics models of musculoskeletal systems, e.g.,
- static models,
- articulated but computationally expensive models,
- real-time-capable models with only an abstracted muscle representation.
This work is about algorithmically combining the advantageous features of these state-of-the-art models, i.e., a computational solution for real-time 3D visualization with both an anatomically correct (muscle) representation and diverse degrees of freedom of motion. The key challenge here is computationally efficient muscle deformation during motion.

What we expect from you
- Basic knowledge of mechanics and multi-body systems (kinematics, statics)
- Basic knowledge of numerical mathematics
- Object-oriented programming (C++ or C#)
- Interest in the visualization of anatomy and in the development of computer games

What we offer
- A well-equipped workplace
- Professional supervision

Contact:
M.Sc. M.Sc. Tingli Hu
tingli.hu@tum.de