Vision sensor uses event-based, not frame-based, algorithms


Prophesee SA has reportedly introduced the industry's first standard-package vision sensing chip that leverages event-based vision technology, considered an advance over frame-based vision approaches. The chip, under development for five years, is aimed at camera developers to enable next-generation vision in industrial automation and IoT systems such as robots, inspection equipment, and monitoring and surveillance devices.

The part leverages Prophesee's neuromorphic vision technology to offer highly efficient capabilities for various use cases, including ultra-high-speed part counting, vibration measurement and monitoring, and kinematic monitoring for predictive maintenance.

The chip, available in a 13 x 15 mm mini PBGA package, integrates Prophesee's third generation CMOS Image Sensor (CIS) vision module. It features 640 x 480-pixel resolution with 15-μm pixels in a 3/4 in. optical format. It is manufactured in a 0.18-micron specialized CIS process. 



"This is a major milestone for Prophesee and underscores the progress in commercializing our pioneering Event-Based Vision sensing technology. After several years of testing and prototyping, we can now offer product developers an off-the-shelf means to take advantage of the benefits of our machine vision inventions that move the industry out of the traditional frame-based paradigm for image capture,” said Luca Verre, co-founder and CEO of Prophesee, in a statement.

In Prophesee Metavision sensors, each pixel is independent and asynchronous, activating only when it senses a change in the scene: a movement, an event. According to the company, this allows major reductions in the power, latency, and data processing requirements imposed by traditional frame-based systems. It reportedly enables sensors to achieve much higher dynamic ranges than are commonly associated with high-speed vision, and it allows cost-efficient sensors and systems to record events that would otherwise require conventional cameras running at 10,000 images/second.
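The per-pixel change-detection principle can be illustrated with a minimal simulation. The sketch below is not Prophesee's API; the function name, event tuple layout, and threshold are illustrative assumptions. It mimics how an event-based pixel fires only when its log-intensity changes by more than a contrast threshold, emitting (x, y, timestamp, polarity) tuples instead of full frames:

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, t, threshold=0.15):
    """Emit DVS-style events (x, y, t, polarity) for pixels whose
    log-intensity changed by more than `threshold` between two frames.
    Illustrative sketch only; not Prophesee's actual interface."""
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarities = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarities)]

# A 640 x 480 static scene in which a single pixel brightens:
# only that pixel produces an event; the rest of the array stays silent.
prev = np.full((480, 640), 0.5)
curr = prev.copy()
curr[100, 200] = 0.9
events = frames_to_events(prev, curr, t=0.001)
```

In this toy run, `events` contains a single tuple for the one changed pixel, which is the data-reduction argument in miniature: a static scene generates no output at all, regardless of resolution.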

The sensor can be used by system developers to improve existing industrial applications and, in some cases, create entirely new ones, including accelerating quality assessment on production lines; positioning, sensing, and movement guidance for robots to enable better human collaboration; and equipment condition monitoring (e.g., vibration, kinematic deviations), suiting the system for predictive maintenance and reduced machine downtime.
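One way the vibration-monitoring use case could work is sketched below. This is an assumption-laden simplification, not Prophesee's method: it bins event timestamps into an event-rate signal and takes the FFT peak as the dominant vibration frequency, whereas production pipelines typically track per-pixel timing with much finer resolution.

```python
import numpy as np

def dominant_frequency(event_times, window=1.0, bins=1000):
    """Estimate the dominant vibration frequency (Hz) from event
    timestamps by histogramming the event rate over `window` seconds
    and locating the FFT peak. Simplified sketch for illustration."""
    counts, _ = np.histogram(event_times, bins=bins, range=(0.0, window))
    spectrum = np.abs(np.fft.rfft(counts - counts.mean()))  # drop DC term
    freqs = np.fft.rfftfreq(bins, d=window / bins)
    return freqs[np.argmax(spectrum)]

# Synthetic event stream whose rate is modulated by a 50 Hz vibration.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 20000))
keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(2 * np.pi * 50 * t))
estimate = dominant_frequency(t[keep])
```

With 1,000 bins over a one-second window, the frequency resolution is 1 Hz and the estimate lands on the 50 Hz modulation; a drift in this peak over time is the kind of kinematic deviation a predictive-maintenance system would flag.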

The sensor is supported by a comprehensive software development kit (SDK), a full set of drivers, data recording tools and an online knowledge center.
