Welcome to my website!
My name is Niklas and I am pursuing a PhD degree under the supervision of Jan Peters at the Intelligent Autonomous Systems Group at TU Darmstadt.
Generally speaking, I am interested in all sorts of algorithms and methods that enable and advance Intelligent Systems. In recent years, I have focused especially on the intersection of Machine/Reinforcement Learning and Control. Besides making these ideas work in simulation, I am particularly interested in real-world demonstrations.
BSc in Electrical Engineering and Information Technology, 2017
ETH Zurich
MSc in Robotics, Systems & Control, 2020
ETH Zurich
This work proposes a new event-based optical tactile sensor called Evetac. The main motivation for investigating event-based optical tactile sensors is their high spatial and temporal resolution and low data rates. Benchmarking experiments demonstrate Evetac's ability to sense vibrations up to 498 Hz, reconstruct shear forces, and significantly reduce data rates compared to RGB optical tactile sensors. Moreover, Evetac's output provides meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.
Finalist for IROS Best Paper Award on Mobile Manipulation
This work investigates the effectiveness of tactile sensing for the practical everyday problem of stably placing objects on flat surfaces starting from unknown initial poses. We devise a neural architecture that estimates a rotation matrix, which yields a corrective gripper movement that aligns the object with the placing surface for the subsequent manipulation. In real-world placing tasks, we compare models based on different sensing modalities, such as force-torque sensing and an external motion capture system, as well as two classical baselines. Evaluating our placing policies on a set of unseen everyday objects shows that the proposed pipeline generalizes well, suggesting that tactile sensing plays a vital role in an intrinsic understanding of dexterous robotic object manipulation.
This work presents a training procedure and a pretrained model that estimate the normal force distribution and the contact area acting upon the GelSight Mini visuotactile sensor. The representation maps from the raw output images of the sensor to the normal force distribution acting across the entire sensor surface. This is appealing as it enables classical force control directly from the output of visuotactile sensors. Moreover, it yields a physically grounded, interpretable representation of the tactile signals.
Best Paper Award in the Geometric Representations Workshop at ICRA 2023
We propose learning task-space, data-driven cost functions as diffusion models. Diffusion models represent expressive multimodal distributions and exhibit well-behaved gradients over the entire space. We exploit these properties for motion optimization by integrating the learned cost functions with other costs in a single objective function and optimizing all of them jointly by gradient descent.
We propose a novel hybrid method for robot assembly discovery that combines mixed-integer programming with a graph-based reinforcement learning agent.
We propose a novel method for learning to assemble arbitrary structures from scratch. A transformer-like, graph-based neural network jointly decides which blocks to use and how to assemble the structure, with the robot in the loop.