Accurately determining an object’s position and orientation, known as 6DoF pose tracking, is a crucial capability for a wide range of applications, yet conventional camera systems struggle with motion blur and poor lighting. Zibin Liu, Banglei Guan, and Yang Shang of the National University of Defense Technology, together with colleagues, address these limitations with a novel tracking method built on event cameras, bio-inspired sensors that excel in dynamic and low-light conditions. Their approach uses optical flow to precisely characterise object movement, combining two- and three-dimensional features extracted from the event data and from object models. By establishing correlations between key object features and iteratively refining the 6DoF pose, the technique achieves significantly better accuracy and robustness than existing event-based tracking methods, representing a substantial advance in the field.
Object pose tracking is a pivotal technology in multimedia applications and currently receives considerable attention from researchers. Traditional camera-based methods frequently encounter challenges including motion blur, sensor noise, partial occlusion, and fluctuating lighting conditions. Emerging bio-inspired sensors, notably event cameras, offer high dynamic range and low latency, potentially resolving these limitations. This work presents an optical flow-guided six-Degrees-of-Freedom (6DoF) object pose tracking method designed specifically for event cameras, leveraging optical flow information to enhance the accuracy and robustness of pose estimation.
Event Camera Tracks 6DoF Object Pose
Scientists have developed a novel method for tracking the six degrees of freedom (6DoF) of objects using event cameras, bio-inspired sensors that overcome limitations of traditional cameras. The research addresses challenges such as motion blur, sensor noise, and varying lighting conditions by leveraging the high dynamic range and low latency inherent in event camera technology. The team’s work centers on an optical flow-guided approach, enabling continuous and accurate pose tracking of objects in dynamic scenes. The method begins with a 2D-3D hybrid feature extraction strategy, precisely characterizing object motion by detecting corners from event-based Time Surfaces and edges from projected object models, effectively capturing object contours and identifying key geometric features for pose determination.
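The Time Surface representation mentioned above can be sketched in a few lines: each pixel stores the timestamp of its most recent event, and an exponential decay turns those timestamps into a grayscale-like image on which standard corner detectors can run. The function below is a minimal illustration under assumed conventions, not the authors' implementation; the event tuple layout, sensor size, and decay constant `tau` are assumptions for the example.

```python
import numpy as np

def time_surface(events, t_now, shape=(180, 240), tau=0.05):
    """Build an exponentially decayed Time Surface from an event list.

    events: iterable of (x, y, t, polarity) tuples (assumed layout);
    t_now:  reference time; tau: decay constant in seconds.
    """
    # Timestamp of the most recent event at each pixel; -inf means "never fired".
    t_last = np.full(shape, -np.inf)
    for x, y, t, _ in events:
        t_last[y, x] = max(t_last[y, x], t)
    # Recent events map to values near 1; stale or silent pixels decay toward 0.
    return np.exp((t_last - t_now) / tau)

# Toy stream: three events on a tiny 4x5 sensor.
evts = [(1, 1, 0.00, 1), (2, 1, 0.02, -1), (3, 2, 0.04, 1)]
ts = time_surface(evts, t_now=0.04, shape=(4, 5), tau=0.05)
```

Corners are then detected on `ts` exactly as on an ordinary intensity image, which is what makes the hybrid 2D-3D matching with projected model edges possible.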
The researchers then estimate the optical flow of these detected corners by maximizing event-associated probability within a spatio-temporal window, establishing correlations between corners and edges. This use of optical flow addresses the challenges posed by the asynchronous and discretized nature of event data. The method achieves robust 6DoF object pose tracking through iterative optimization, minimizing the distances between corners and edges to refine pose estimates. Tests on both simulated and real-world event streams show significant performance gains over existing event-based methods, delivering a more accurate and reliable solution for object tracking in challenging conditions and opening possibilities for applications in augmented reality, robotic grasping, and autonomous navigation.
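A toy two-dimensional analogue can illustrate the iterative refinement step: associate each tracked corner with its nearest projected edge point, update the pose estimate to shrink the corner-to-edge residuals, and repeat. The real method optimizes a full 6DoF pose against a 3D model; the helper below, including its name and its translation-only parameterization, is a hypothetical simplification for illustration.

```python
import numpy as np

def refine_translation(corners, edge_pts, iters=10):
    """Toy 2D analogue of iterative pose refinement: shift the projected
    model edge points so they align with tracked corners by repeatedly
    minimizing nearest-neighbour corner-to-edge distances.
    (Translation-only stand-in for the full 6DoF optimization.)
    """
    shift = np.zeros(2)
    for _ in range(iters):
        shifted = edge_pts + shift
        # Associate each corner with its nearest (shifted) edge point.
        d = np.linalg.norm(corners[:, None, :] - shifted[None, :, :], axis=2)
        nearest = shifted[np.argmin(d, axis=1)]
        residual = corners - nearest       # per-corner misalignment
        shift += residual.mean(axis=0)     # least-squares update of the shift
    return shift

# Model edge points, with corners observed 3 px right and 1 px down of them.
edges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
corners = edges + np.array([3.0, 1.0])
shift = refine_translation(corners, edges)  # recovers roughly (3, 1)
```

In the actual pipeline each such update acts on the 6DoF pose, so the projected edges move under a rigid-body transform rather than a simple image-plane shift.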
Event Camera Pose Tracking with 3D Models
This research presents a novel method for tracking the six-degree-of-freedom pose of objects using event cameras, which are bio-inspired sensors offering advantages over traditional cameras in challenging conditions. The team developed a system that combines two-dimensional event data with three-dimensional object models, extracting key features like corners and edges to precisely characterise object motion. By calculating optical flow based on the probability of events occurring in space and time, the method establishes correlations between these features and iteratively refines the object’s pose, achieving continuous tracking. Experimental results, conducted with both simulated and real-world event data, demonstrate that this approach outperforms existing event-based tracking methods, particularly when dealing with significant occlusion and complex backgrounds, indicating improved accuracy in pose estimation across various scenarios.
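The probability-guided flow search described above can be approximated with a simple voting scheme: hypothesize candidate velocities for a corner, predict where the corner would sit at each event's timestamp, and keep the velocity whose trajectory passes near the most events inside the spatio-temporal window. This is a hedged sketch rather than the paper's estimator; the candidate set, window radius, and event array layout are all assumptions.

```python
import numpy as np

def flow_by_event_support(corner, events, candidates, radius=1.0):
    """Pick the candidate velocity best supported by nearby events.

    corner:     (x, y, t0) of the tracked corner;
    events:     array of rows (x, y, t) inside the spatio-temporal window;
    candidates: array of (vx, vy) velocities in pixels/second (assumed grid).
    """
    x0, y0, t0 = corner
    best_v, best_score = None, -1
    for v in candidates:
        # Predicted corner position at each event's timestamp under velocity v.
        dt = events[:, 2] - t0
        pred = np.stack([x0 + v[0] * dt, y0 + v[1] * dt], axis=1)
        # Score = number of events falling near the predicted trajectory.
        score = np.sum(np.linalg.norm(events[:, :2] - pred, axis=1) < radius)
        if score > best_score:
            best_v, best_score = v, score
    return best_v

# Events traced by a corner drifting at 10 px/s in x over 0.3 s.
t = np.linspace(0.0, 0.3, 7)
evts = np.stack([5.0 + 10.0 * t, np.full_like(t, 5.0), t], axis=1)
cands = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
v = flow_by_event_support((5.0, 5.0, 0.0), evts, cands)  # selects (10, 0)
```

The paper's continuous probability maximization plays the role of this discrete vote: it scores how well each flow hypothesis explains the asynchronous events, sidestepping the need for intensity-based flow methods.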
👉 More information
🗞 Optical Flow-Guided 6DoF Object Pose Tracking with an Event Camera
🧠 arXiv: https://arxiv.org/abs/2512.21053
