Event-based vision, utilising cameras that output sparse, asynchronous data, promises major advances in sensing speed and efficiency. Researchers Nimrod Kruger, Nicholas Owen Ralph, and Gregory Cohen, all from the International Centre for Neuromorphic Systems at Western Sydney University, alongside Paul Hurley from the Centre for Research in Mathematics and Data Science, have tackled a key challenge in this field: integrating this nonlinear event representation with established linear optical system models. Their new work details a physics-grounded pipeline that maps event streams into measurable log-intensity and intensity derivatives, effectively embedding them within a dynamic linear systems framework and allowing for inverse filtering directly from event data. This approach, validated through both simulation and real-world telescope data, offers a crucial bridge between event sensing and model-based imaging, potentially unlocking a new era of dynamic, high-performance optical systems.
Event streams mapped to dynamic linear models offer direct scene inference
The study establishes a physics-grounded approach, formalising the relationship between event streams and log-intensity derivatives via pixel event interpolations. This innovative method allows for the estimation of not only log-intensity but also its temporal changes, providing a richer dataset for computational imaging. Crucially, the research embeds these estimations into a dynamic linear systems model, representing the optical system as an operator T(t) projecting an input field to an intensity map. This model accounts for the dynamic nature of many optical systems, such as those with tunable focus, and facilitates the application of frequency-domain Wiener deconvolution using a known or parameterised dynamic transfer function.
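To make this concrete, here is a minimal sketch of such a dynamic forward model in Python. The Gaussian defocus PSF, the sinusoidal modulation parameters, and the helper names (gaussian_psf, apply_T) are illustrative assumptions, not the paper's exact optics:

```python
# Sketch of the dynamic linear forward model: the optical system is a
# time-varying operator T(t) projecting a scene to a sensor-plane
# intensity map. Here T(t) is assumed (for illustration) to be
# convolution with a Gaussian defocus PSF whose width is sinusoidally
# modulated, mimicking a tunable-focus lens.
import numpy as np

def gaussian_psf(size, sigma):
    """Normalised 2D Gaussian PSF of a given width."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def apply_T(scene, t, f_mod=10.0, sigma_min=0.5, sigma_max=4.0):
    """Apply the time-varying operator T(t): FFT-based convolution with a
    defocus PSF whose width follows a sinusoidal modulation."""
    sigma = sigma_min + 0.5 * (sigma_max - sigma_min) * (1 + np.sin(2 * np.pi * f_mod * t))
    psf = gaussian_psf(scene.shape[0], sigma)
    # Convolution via the frequency domain (circular boundaries assumed).
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(np.fft.ifftshift(psf))))

# Example: a single point source blurred by the modulated system.
scene = np.zeros((64, 64))
scene[32, 32] = 1.0
intensity = apply_T(scene, t=0.01)   # sensor-plane intensity at t = 10 ms
```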
Experiments demonstrate the framework’s efficacy in both simulated and real-world scenarios. Simulations involving single and overlapping point sources under modulated defocus confirmed the approach’s ability to accurately model dynamic optical systems. Furthermore, the team validated their pipeline on real event data captured by a tunable-focus telescope imaging a star field, successfully demonstrating source localisation and separability. The proposed framework provides a practical bridge between event sensing and model-based imaging for dynamic optical systems, enabling direct scene inference from event-stream data.
This work opens exciting possibilities for applications requiring high-speed, low-bandwidth imaging, such as robotics, industrial inspection, and medical diagnostics. By moving beyond traditional image reconstruction techniques, the researchers have created a pathway for Computational Neuromorphic Imaging (CNI) to overcome latency and bandwidth limitations. The ability to directly process event data through a physics-grounded model, utilising known optical transfer functions and scene priors, promises to expand the design possibilities for machine vision systems and advanced optical sensors, ultimately leading to more efficient and powerful imaging technologies.
Event Data Reconstruction via Dynamic Linear Systems
Scientists developed a novel processing pipeline to directly extract scene information from event-based vision sensors, overcoming limitations inherent in traditional imaging approaches. These event cameras output asynchronous streams of ON/OFF events triggered by log-intensity threshold crossings, offering microsecond resolution and high dynamic range, but presenting challenges for integration with conventional linear imaging models. The research team engineered a method to map these event streams into estimates of per-pixel log-intensity and its derivatives, embedding these measurements within a dynamic linear systems model featuring a time-varying point spread function. This approach enables inverse filtering directly from event data, utilising frequency-domain Wiener deconvolution with a known or parameterised dynamic transfer function.
The methodology begins by formalising pixel log-intensity threshold-crossing events into estimates of log-intensity derivatives, intensity, and intensity derivatives via pixel event interpolations, as sketched below. The study then introduces a dynamic imaging system model: a linear forward model in which optical signals pass through a dynamic system, denoted by the operator T(t), and are projected onto the sensor plane, relating the expected intensity and its changes to the information encoded in the event stream. On top of this, the researchers design a scene-informed spatial-temporal filter as an inverse operator on event data, leveraging controlled system parameters and scene priors to extract scene estimates directly from the event stream. To validate the framework, the team first conducted simulations of single and overlapping point sources under modulated defocus, assessing the system's ability to localise and separate the sources.
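As a rough illustration of that interpolation step, the sketch below reconstructs one pixel's log-intensity, intensity, and their derivatives from its event times and polarities. The piecewise-linear interpolation, default threshold values, and the helper name interpolate_log_intensity are assumptions for illustration, not the authors' implementation:

```python
# Hedged sketch of per-pixel event interpolation: between consecutive
# events at one pixel, log-intensity is assumed to vary linearly, so the
# event times and polarities yield a piecewise-linear log-intensity
# estimate and, from it, intensity and derivative estimates.
import numpy as np

def interpolate_log_intensity(event_times, polarities, theta_on=0.2,
                              theta_off=0.2, L0=0.0, t_query=None):
    """Reconstruct log-intensity L(t) at query times from one pixel's events."""
    # Each ON event raises the running log-intensity by theta_on,
    # each OFF event lowers it by theta_off.
    steps = np.where(np.asarray(polarities) > 0, theta_on, -theta_off)
    L_at_events = L0 + np.cumsum(steps)
    if t_query is None:
        t_query = np.asarray(event_times)
    L_est = np.interp(t_query, event_times, L_at_events)
    dL_est = np.gradient(L_est, t_query)   # log-intensity derivative
    I_est = np.exp(L_est)                  # intensity estimate (up to scale)
    dI_est = I_est * dL_est                # intensity-derivative estimate
    return L_est, dL_est, I_est, dI_est

# Example: a ramp of ON events followed by OFF events.
times = np.array([0.001, 0.002, 0.003, 0.006, 0.009])
pols  = np.array([+1, +1, +1, -1, -1])
L, dL, I, dI = interpolate_log_intensity(times, pols,
                                         t_query=np.linspace(0, 0.01, 11))
```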
Subsequently, they applied the method to real event data captured by a tunable-focus telescope imaging a star field, demonstrating successful source localisation and separability in a practical setting. Formally, each event is a tuple e_k ≡ (x_k, t_k, p_k), where x_k is the pixel location, t_k the time instance, and p_k = +1 or −1 indicates a log-intensity increase or decrease upon crossing the threshold θ_ON or θ_OFF, respectively, subject to a refractory period t_rf; an illustrative generator is sketched below. Combined, event interpolation, known optical transfer functions, and scene priors provide a direct pathway from event streams to scene inference, expanding design possibilities for machine vision, microscopy, and advanced optical sensors. The work demonstrates a practical bridge between event sensing and model-based imaging for dynamic optical systems, offering a refreshed view of log-intensity change events within the context of linear optical systems and paving the way for Computational Neuromorphic Imaging (CNI) models and methods.
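The following single-pixel sketch illustrates this event definition under the stated thresholds and refractory period; the sampling rate, test signal, and function name generate_events are illustrative choices, not the authors' implementation:

```python
# Illustrative single-pixel event-generation model matching the
# definition above: an event (t_k, p_k) is emitted whenever log-intensity
# crosses theta_ON (p = +1) or theta_OFF (p = -1) relative to the level
# at the last event, after which the pixel is blind for a refractory
# period t_rf.
import numpy as np

def generate_events(t, log_I, theta_on=0.2, theta_off=0.2, t_rf=1e-4):
    """Emit (time, polarity) events from a sampled log-intensity signal."""
    events = []
    L_ref = log_I[0]           # reference level at the last event
    t_ready = t[0]             # time the pixel is ready again
    for ti, Li in zip(t, log_I):
        if ti < t_ready:
            continue           # still within the refractory period
        if Li - L_ref >= theta_on:
            events.append((ti, +1))
            L_ref, t_ready = Li, ti + t_rf
        elif L_ref - Li >= theta_off:
            events.append((ti, -1))
            L_ref, t_ready = Li, ti + t_rf
    return events

# Example: a sinusoidally varying intensity drives ON/OFF events.
t = np.linspace(0, 0.1, 10_000)   # 100 ms at 10 us sampling
log_I = np.log(1.6 + 0.5 * np.sin(2 * np.pi * 50 * t))
events = generate_events(t, log_I)
```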
Event data enables star localisation and separation with a tunable-focus telescope
Scientists have developed a physics-grounded processing pipeline that successfully maps event streams from vision sensors to estimates of per-pixel log-intensity and intensity derivatives. This breakthrough embeds these measurements within a dynamic linear systems model, utilising a time-varying point spread function to enable inverse filtering directly from event data. The team validated this approach through simulations of single and overlapping point sources under modulated defocus, and crucially, on real event data captured by a tunable-focus telescope imaging a star field, demonstrating source localisation and separability. This work provides a practical bridge between event sensing and model-based imaging for dynamic optical systems, opening new avenues for high-speed, low-bandwidth vision applications.
In simulation, the event stream was generated using a sinusoidally modulated tunable-lens model, mimicking the optical parameters of a liquid-lens telescope imaging a star field. Pixel parameters were aligned with the Prophesee Gen4 event vision sensor (EVS), with deliberate non-uniformity introduced across pixel contrast threshold values, set at 0.2 ± 0.02, to avoid symmetries in event projections. Following event interpolation, the researchers produced log-intensity estimation frames and intensity derivative estimation images, then performed a 2D Fast Fourier Transform (FFT) to estimate the point-source image at infinity, using a uniform mean spectral density value and a low-pass filter (see the sketch below). Typical images resulting from this process demonstrate the feasibility of reconstructing source information from event data, with a full video of the image sequence provided as supplementary material.
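A hedged sketch of that frequency-domain step, assuming a centred, frame-sized PSF and illustrative noise-to-signal and cutoff constants, might look like this:

```python
# Sketch of frequency-domain Wiener deconvolution with a uniform mean
# spectral-density ratio, followed by a simple circular low-pass mask.
# The PSF is assumed centred and the same size as the frame; the nsr and
# cutoff values are illustrative, not the paper's.
import numpy as np

def wiener_deconvolve(frame, psf, nsr=1e-2, cutoff=0.25):
    """Wiener filter W = H* / (|H|^2 + NSR), then a low-pass mask."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=frame.shape)
    W = np.conj(H) / (np.abs(H)**2 + nsr)     # uniform spectral ratio
    F_est = np.fft.fft2(frame) * W
    # Circular low-pass filter in normalised frequency coordinates.
    fy = np.fft.fftfreq(frame.shape[0])[:, None]
    fx = np.fft.fftfreq(frame.shape[1])[None, :]
    F_est[np.sqrt(fx**2 + fy**2) > cutoff] = 0.0
    return np.real(np.fft.ifft2(F_est))

# Example: for a centred point source, the blurred frame equals the PSF,
# so deconvolution should recover a compact spot near the centre.
ax = np.arange(64) - 32
g = np.exp(-(ax[:, None]**2 + ax[None, :]**2) / (2 * 2.0**2))
g /= g.sum()
estimate = wiener_deconvolve(g, g)
```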
Results show that during periods of limited dynamics, such as when the simulated spot reaches its maximum or minimum size, the source estimation diverges in both shape and intensity. To further test the framework's capabilities, the team simulated data with two point sources positioned 5.6 pixels apart, one with double the intensity of the other, inducing overlapping events during lens modulation. Analysis of the event stream, intensity derivative, and resulting source estimation (Figure 4) showed that the output spots were clearly visible and separable, with the normalised intensity of a line crossing both source locations plotted as a function of time. Optimal source estimation occurred when the spots were also separable on the image plane, such as at t = 380 ms, where the reduced intensity of the right source matched the expected 50% intensity compared to the left source.
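For intuition, here is a toy version of that separability check on a static line profile; only the 5.6-pixel separation and 2:1 intensity ratio come from the paper, while the spot width, positions, and resolvability criterion are assumptions:

```python
# Toy separability check: two Gaussian spots 5.6 pixels apart, one at
# twice the intensity of the other, with the normalised intensity along
# the line through both centres used to judge resolvability.
import numpy as np

x = np.arange(64, dtype=float)
left_centre, separation, sigma = 29.2, 5.6, 1.5   # sigma is illustrative
profile = (2.0 * np.exp(-(x - left_centre)**2 / (2 * sigma**2)) +
           1.0 * np.exp(-(x - left_centre - separation)**2 / (2 * sigma**2)))
profile /= profile.max()                           # normalised line profile

# A simple criterion: the dip between the peaks must fall below the
# height of the weaker (right) peak, which sits near 50% as expected.
dip = profile[int(left_centre):int(left_centre + separation)].min()
right_peak = profile[int(round(left_centre + separation))]
print(f"dip={dip:.2f}, right peak={right_peak:.2f}, separable={dip < right_peak}")
```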
Tests demonstrate the method's applicability to real-world data, using event data collected from a 2800 mm focal-length telescope equipped with an Optotune EL-10-30-C-VIS liquid lens, modulated while imaging the star field M47. Application of the deconvolution method revealed deviations from the thin-lens approximation, highlighting the challenges of imaging broad-spectrum light through a liquid-lens system. Slight spherical aberrations resulted in a non-uniform point spread function, breaking the symmetry assumed for focusing and indicating that a realistic PSF derived from optical simulation would improve performance. Nevertheless, the framework successfully processed the measured event data, demonstrating its potential for dynamic optical system modelling and image reconstruction.
👉 More information
🗞 Optical Linear Systems Framework for Event Sensing and Computational Neuromorphic Imaging
🧠 ArXiv: https://arxiv.org/abs/2601.13498
