I’m Clemens Schuwerk, an engineer and expert in robotics. Having grown up in southern Germany, I studied Electrical Engineering and Information Technology at the Technische Universität München (TUM) and the University of Edinburgh. Driven by my passion for game-changing technologies, I joined the Chair of Media Technology at TUM in 2011 to pursue my PhD (Dr.-Ing.) in robotics and haptic technologies. My dissertation was awarded "summa cum laude" in 2016. After my PhD, I focused on transferring research results in robotic perception into business and worked as a project leader to spin off a robotics startup from the university. Together with my colleagues, I received the nationwide "Digital Innovations" award in 2018. Today, my interests include smart robots, robotic perception and artificial intelligence. I have also been working as a freelance web developer since 2009.
Smart, autonomous service robots will change our daily lives by taking over dull, dirty and dangerous jobs at work and at home. To achieve true autonomy, however, many open challenges remain, especially in perception and grasping.
Recent advances in machine learning and deep learning enable many novel applications such as autonomous cars and smart robots. From object detection to grasp planning, there are many exciting areas that are evolving rapidly.
The feedback we receive today while interacting with machines is primarily audio-visual, although many studies have proven the importance of the sense of touch for various tasks. Extending human-machine interaction beyond simple vibration feedback will therefore open up many new applications in the near future!
Remote-controlled robots are a prime example of machines that benefit from the integration of haptic feedback. The required low-delay data exchange between the human operator and the remote robot, however, poses severe challenges for the underlying communication and control systems.
Head-mounted displays allow us to immerse ourselves in virtual worlds, but today this immersion is mainly audio-visual. Novel haptic joysticks, algorithms and control strategies, however, enable us to also feel virtual environments, taking immersion to a new level.
Shared haptic virtual environments are often realized using a client/server architecture, which suffers from heavy communication load and reduced fidelity in the presence of communication delay. This thesis proposes human-centric data reduction schemes for haptic interaction with remotely simulated objects. The system analyses yield guidelines on the communication delay below which its effects are not perceivable. If this threshold is exceeded, the proposed algorithms compensate for the effects of delay.
Available online: mediaTUM
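To illustrate the general idea behind human-centric data reduction in haptics, the sketch below implements a perceptual deadband filter: a sample is transmitted only when it differs from the last transmitted value by more than a Weber-fraction threshold, exploiting the fact that sub-threshold changes are imperceptible to the human operator. This is a generic textbook-style example under the stated assumptions, not the specific scheme proposed in the thesis; the function name and the threshold value are illustrative.

```python
def deadband_filter(samples, k=0.1):
    """Perceptual deadband data reduction (illustrative sketch).

    Transmit a sample only if it deviates from the last transmitted
    value by more than a fraction k of that value's magnitude
    (k ~ 0.1 is a typical Weber fraction for force perception).
    """
    transmitted = []
    last = None
    for s in samples:
        # Always send the first sample; afterwards, send only
        # perceptually significant changes.
        if last is None or abs(s - last) > k * abs(last):
            transmitted.append(s)
            last = s
    return transmitted


# Small changes within the deadband are dropped; only the first sample
# and the jump to 1.2 exceed the 10% threshold.
forces = [1.0, 1.02, 1.05, 1.2, 1.21]
print(deadband_filter(forces, k=0.1))  # → [1.0, 1.2]
```

In a networked setup, the receiver would simply hold (or extrapolate) the last received value between updates, trading a bounded, imperceptible error for a substantial reduction in packet rate.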
Rated by a jury of entrepreneurs, investors and startup coaches, my three-minute pitch was awarded best startup pitch at the "UnternehmerTUM Demo Day" event.