Academics

Academic projects showcase

Projects developed by undergraduate students using neuromorphic sensing, algorithms, and processing hardware.

SpotiMotion - Inertial motion controller for Spotify

Embedded neural network algorithm design for inertial motion understanding and closed-loop control. SpotiMotion enables gesture-based interaction with Spotify using TinyML on an Arduino Nano 33 BLE board.

A machine learning model was trained with Edge Impulse to recognize gestures performed with the Arduino. The gestures are detected using the board's accelerometer and gyroscope, and the recognized gesture is sent to a connected computer running a Python script, which controls Spotify through its Web API using the SpotiPy library.
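As an illustration of the host side of this pipeline, here is a minimal sketch, assuming the Arduino prints one gesture label per line over the serial port. The port name, gesture labels, and credentials are placeholders, not the project's actual code:

```python
# Host-side sketch (illustrative): read gesture labels that the Arduino
# prints over serial and map them to Spotify playback actions.
import serial
import spotipy
from spotipy.oauth2 import SpotifyOAuth

# OAuth credentials come from a registered Spotify developer application.
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    redirect_uri="http://localhost:8888/callback",
    scope="user-modify-playback-state user-read-playback-state",
))

# Map gesture labels (as emitted by the on-device classifier) to actions.
ACTIONS = {
    "swipe_left": sp.previous_track,
    "swipe_right": sp.next_track,
    "tap": sp.pause_playback,
    "shake": sp.start_playback,
}

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    while True:
        label = port.readline().decode(errors="ignore").strip()
        action = ACTIONS.get(label)
        if action:
            action()  # issue the playback command via the Spotify Web API
```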


Lea Schaab: schaable79981@th-nuernberg.de 

Timo Maußner: maussnerti75065@th-nuernberg.de 

Jonas Reif: reifjo96249@th-nuernberg.de 


Camera-LiDAR fusion for embedded depth estimation

Embedded neural network algorithm design for fusing LiDAR and CMOS camera data on a mobile robot platform, serving as a tool to enhance planning and kinematic control of wheeled mobile robots.

The project combines 2D LiDAR depth data with camera images to generate accurate depth information. This is done using a monocular depth estimation model (such as MiDaS or Depth-Anything V2), which estimates the relative depth between objects in an image. These estimates are then calibrated against the LiDAR data to obtain metrically accurate depth information from the image.
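One simple way to perform such a calibration is a least-squares scale-and-shift fit between the model's relative depth and the metric LiDAR ranges at the pixels where the scan projects into the image. The sketch below assumes the LiDAR returns have already been projected into pixel coordinates; function and variable names are illustrative:

```python
import numpy as np

def calibrate_depth(rel_depth, lidar_uv, lidar_range):
    """Fit metric depth ~= a * rel_depth + b at the pixels hit by the LiDAR,
    then apply the fit to the whole relative-depth map.

    rel_depth   : (H, W) relative depth from MiDaS / Depth-Anything V2
    lidar_uv    : (N, 2) integer (u, v) pixel coordinates of projected returns
    lidar_range : (N,)   metric ranges measured by the LiDAR
    """
    samples = rel_depth[lidar_uv[:, 1], lidar_uv[:, 0]]
    # Least-squares solve for scale a and shift b.
    A = np.stack([samples, np.ones_like(samples)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, lidar_range, rcond=None)
    return a * rel_depth + b  # dense metric depth estimate
```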

The project uses ROS2 to process the sensor data and to communicate between the individual components, along with Python code that fuses the generated depth estimates with the LiDAR and camera data. The project's goal is to produce accurate 3D distance data even when no 3D sensors are present, making expensive 3D LiDAR systems redundant in certain applications.
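A minimal ROS2 skeleton for this kind of fusion node might look as follows; it pairs camera and LiDAR messages by timestamp using message_filters. Topic names and the time tolerance are assumptions, not the project's actual configuration:

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, LaserScan
from message_filters import ApproximateTimeSynchronizer, Subscriber

class FusionNode(Node):
    """Subscribes to camera and LiDAR topics and pairs messages by timestamp."""

    def __init__(self):
        super().__init__("depth_fusion")
        image_sub = Subscriber(self, Image, "/camera/image_raw")
        scan_sub = Subscriber(self, LaserScan, "/scan")
        # Pair messages whose stamps differ by at most 50 ms.
        sync = ApproximateTimeSynchronizer([image_sub, scan_sub],
                                           queue_size=10, slop=0.05)
        sync.registerCallback(self.on_pair)

    def on_pair(self, image, scan):
        # Run the depth model on `image`, then calibrate with `scan`
        # (see the calibration sketch above).
        self.get_logger().info("received synchronized image/scan pair")

def main():
    rclpy.init()
    rclpy.spin(FusionNode())

if __name__ == "__main__":
    main()
```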


Maximilian Werzinger: max.werzinger@franken-online.org 

Christopher Witzl: christopher.witzl@gmx.de 

Event-based camera-LiDAR fusion for clustering-based depth estimation

This project addresses the challenge of estimating depth using a single DVS (Dynamic Vision Sensor) camera and a 2D LiDAR. DVS cameras capture changes in light rather than full images, and they provide no native depth information.

Most existing solutions rely on two DVS cameras for stereo depth estimation. This project instead introduces a more cost-effective approach that pairs a single DVS camera with a 2D LiDAR, eliminating the need for expensive 3D LiDAR systems.


Built using ROS2 for communication and data processing, the system can run on various platforms, including wheeled robots with a Jetson Nano or similar hardware.

Since a 2D LiDAR measures depth only in a single plane, a custom clustering algorithm was developed to estimate the depth of objects that extend outside this plane, such as a person's arms.
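The project's clustering algorithm is custom; the sketch below only illustrates the general idea using off-the-shelf DBSCAN. Events are grouped spatially, and each cluster inherits the LiDAR range measured where it crosses the scan plane; all names and parameters are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_event_depths(events_xy, column_range):
    """Assign a LiDAR depth to each DVS event cluster.

    events_xy    : (N, 2) pixel coordinates of recent DVS events
    column_range : (W,)   metric LiDAR range per image column (NaN where
                          the scan plane has no return for that column)
    """
    # Events from one physical object tend to form one spatial cluster.
    labels = DBSCAN(eps=5, min_samples=10).fit_predict(events_xy)
    depths = np.full(len(events_xy), np.nan)
    for lab in set(labels) - {-1}:          # -1 marks DBSCAN noise
        mask = labels == lab
        cols = events_xy[mask, 0].astype(int)
        ranges = column_range[np.clip(cols, 0, len(column_range) - 1)]
        if np.any(np.isfinite(ranges)):
            # The whole cluster (including parts above or below the scan
            # plane, e.g. a person's arms) inherits the median in-plane range.
            depths[mask] = np.nanmedian(ranges)
    return labels, depths
```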

The fusion algorithm aligns the LiDAR's measurements with the dynamic events from the DVS, creating a unified point cloud that can be visualized in real time.
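Once each event has an assigned depth, building the point cloud amounts to standard pinhole back-projection. A minimal sketch, assuming known camera intrinsics (fx, fy, cx, cy):

```python
import numpy as np

def events_to_point_cloud(events_xy, depths, fx, fy, cx, cy):
    """Back-project DVS events with assigned depths into 3D camera
    coordinates using the pinhole model."""
    z = depths
    x = (events_xy[:, 0] - cx) * z / fx
    y = (events_xy[:, 1] - cy) * z / fy
    cloud = np.stack([x, y, z], axis=1)
    return cloud[np.isfinite(z)]  # drop events without a depth estimate
```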


This approach enhances depth perception for robotics and other applications while maintaining affordability.


Our e-mail addresses:

Annika Igl: aiglgg@web.de 

Timo Kapellner: timo.kapellner@gmail.com 

Embedded inertial data fusion neural network for gamepad development

Embedded neural network algorithm design for fusing and processing inertial data on a TinyML Arduino Nano 33 BLE board. The processed motion was used to control a game.
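The project's fusion is performed by an embedded neural network; as a point of reference for what fusing inertial data involves, a classic non-neural baseline is the complementary filter, which blends fast but drifting gyroscope integration with slow but drift-free accelerometer tilt. A minimal sketch with illustrative parameters:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One step of a complementary filter for a single tilt angle.

    angle     : previous tilt estimate (rad)
    gyro_rate : angular rate from the gyroscope (rad/s)
    accel_x/z : gravity components measured by the accelerometer
    dt        : time step (s)
    alpha     : blend factor; higher trusts the gyro more
    """
    accel_angle = math.atan2(accel_x, accel_z)  # tilt inferred from gravity
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```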

Philipp Kremling: kremlingph95027@th-nuernberg.de

Igor Bodyagin: bodyaginig67371@th-nuernberg.de


PlantControlling - Optimizing plant growth with embedded fuzzy logic control

Embedded fuzzy logic controller design for fusing and processing humidity and light data to control a water pump on a TinyML Arduino Nano 33 BLE board.
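The controller itself runs on the Arduino; the Python sketch below only illustrates the kind of fuzzy inference involved, using triangular memberships, min as AND, and weighted-average defuzzification. The membership breakpoints, rules, and output values are illustrative assumptions, not the project's tuned parameters:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b (a < b < c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def pump_duty(soil_moisture, light):
    """Tiny fuzzy controller: inputs in [0, 100], output duty in [0, 1].

    Illustrative rules:
      IF soil is dry AND light is bright THEN pump long
      IF soil is dry AND light is dim    THEN pump short
      IF soil is wet                     THEN pump off
    """
    dry = tri(soil_moisture, -50, 0, 50)
    wet = tri(soil_moisture, 50, 100, 150)
    dim = tri(light, -60, 0, 60)
    bright = tri(light, 40, 100, 160)

    # Rule activation strengths (fuzzy AND = min).
    w_long, w_short, w_off = min(dry, bright), min(dry, dim), wet

    # Weighted average over singleton output values (duty cycles).
    num = 1.0 * w_long + 0.4 * w_short + 0.0 * w_off
    den = w_long + w_short + w_off
    return num / den if den > 0 else 0.0
```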

Laurin Müller: muellerla89125@th-nuernberg.de

Marco Tong: tongma89472@th-nuernberg.de


SPICEnet neural network porting to iOS

The project focused on porting an open-source neural network called SPICEnet (Sensorimotor Processing, Intelligence, and Control at the Edge network) to a mobile setup. SPICEnet is a novel, lightweight neural network designed for sensor fusion and control, inspired by biological neural computation mechanisms. The project aimed to develop a mobile demonstrator codebase with a graphical user interface (GUI) that allows extensive network parameterization, selection of input sources, and visualization of network elements. The application was designed to be portable across operating systems, specifically Android and iOS.


Omar Ashour ashourom88075@th-nuernberg.de