Shape Tracking with Event Cameras
This lab project is a practical course, which can also be taken as a Digital Engineering project.
Intended Participants | Master students |
Instructors | Benjamin Noack |
SWS | 2 |
Credits | 6 |
Languages | English / German |
Required Knowledge | |
Desired Knowledge | |
Project Description
General description:
Event cameras, also known as neuromorphic cameras, differ from conventional cameras in that they do not capture images at a fixed rate. Instead, they output pixel-level brightness changes, enabling a very high dynamic range, avoiding motion blur, and offering latency on the order of microseconds. Traditional vision-based tracking algorithms are not applicable, and new algorithms must be designed to process the asynchronous sensor output with high temporal resolution.
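To make the asynchronous output concrete, the following is a minimal sketch of how an event stream is commonly represented and windowed for processing. The `Event` fields and the moving-edge example data are illustrative assumptions, not a specific camera's API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp in seconds (microsecond resolution)
    polarity: int  # +1 for brightness increase, -1 for decrease

def accumulate(events, t_start, t_end):
    """Collect events from a time window into a sparse, frame-like dict.

    Unlike a fixed-rate camera, the window boundaries are free parameters:
    the stream itself has no frames.
    """
    frame = {}
    for e in events:
        if t_start <= e.t < t_end:
            frame[(e.x, e.y)] = frame.get((e.x, e.y), 0) + e.polarity
    return frame

# Hypothetical stream: a bright edge sweeping right, one event per microsecond.
events = [Event(10 + i, 20, i * 1e-6, +1) for i in range(5)]
window = accumulate(events, 0.0, 3e-6)  # only the first three events fall inside
```

The key design point is that `accumulate` is one of many possible aggregations; real-time trackers often process each event individually instead of forming windows at all.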
The project pursues the goal of developing and implementing extended object tracking algorithms for an event camera. These algorithms jointly estimate the shape and pose of an object, describing its spatial extent by means of randomly scaled versions of the shape boundary. The measurements provided by event cameras typically appear at the boundaries of a moving object, which the extended object tracking algorithms shall exploit. Key challenges are the asynchronous sensor output and the computational efficiency required for real-time tracking.
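The measurement principle described above can be sketched as follows: each measurement is generated on a randomly scaled version of the shape boundary, plus pixel noise. The star-convex radial parameterization, the circular example shape, and all noise levels are illustrative assumptions, not part of the project specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def boundary_point(center, radius_fn, theta, scale):
    """Point on a randomly scaled version of the shape boundary.

    The shape is assumed star-convex: radius_fn(theta) gives the
    boundary distance from the center in direction theta.
    """
    r = scale * radius_fn(theta)
    return center + r * np.array([np.cos(theta), np.sin(theta)])

# Hypothetical circular object of radius 2 centered at (5, 3).
center = np.array([5.0, 3.0])
radius_fn = lambda theta: 2.0

def measure():
    """One event-like measurement: events cluster at the object's edges,
    so the random scale concentrates near 1 (the boundary itself)."""
    theta = rng.uniform(0.0, 2.0 * np.pi)
    scale = rng.normal(1.0, 0.05)           # assumed scale spread
    noise = rng.normal(0.0, 0.02, size=2)   # assumed pixel noise
    return boundary_point(center, radius_fn, theta, scale) + noise

z = np.array([measure() for _ in range(200)])
# Distances from the center should concentrate around the true radius.
d = np.linalg.norm(z - center, axis=1)
```

A tracking algorithm would invert this generative picture: given the measurements `z`, estimate the center, the boundary parameters, and their motion over time.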
Project goals:
- Calibration and setup of the event camera
- Implementation of extended object tracking algorithms
- Derivation of mathematical models for the event sensor
- Development of new tracking algorithms using event sensors
- Evaluation of the implemented methods
Subtasks:
- Hardware setup and software implementation
- Literature review for extended object tracking
- Experimental design and definition of evaluation criteria
- Development, testing, and evaluation of tracking algorithms
- Documentation of the project
Registration
For any additional questions regarding the project or for any issues with registration, please email