Embedded vision tech pops up all over, from robots to smart cars

Embedded vision technology is hot.

It runs inside everything from smartphones to manufacturing lines, where high-resolution images of finished products are used to detect imperfections and other irregularities in milliseconds.
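
As a rough illustration of that kind of inspection step, here is a minimal sketch in Python with OpenCV; the reference image, threshold and minimum blob size are illustrative assumptions, not any particular vendor's pipeline.

```python
import cv2

# Illustrative values only; a real inspection line would calibrate these,
# and the test image is assumed to be aligned with the reference image.
REFERENCE_PATH = "golden_sample.png"   # hypothetical image of a known-good part
TEST_PATH = "captured_part.png"        # hypothetical image from the line camera
DIFF_THRESHOLD = 40                    # per-pixel intensity difference treated as a defect
MIN_DEFECT_AREA = 25                   # ignore blobs smaller than this many pixels

reference = cv2.imread(REFERENCE_PATH, cv2.IMREAD_GRAYSCALE)
test = cv2.imread(TEST_PATH, cv2.IMREAD_GRAYSCALE)

# Pixel-wise difference against the known-good reference image.
diff = cv2.absdiff(reference, test)
_, mask = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)

# Group defect pixels into blobs and keep only those large enough to matter.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
defects = [c for c in contours if cv2.contourArea(c) >= MIN_DEFECT_AREA]

print(f"{len(defects)} candidate defect(s) found")
```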

Today, engineers are also developing applications for smart cities and smart cars. Tiny cameras can capture and process precise images of objects both near and far, recognizing everything from bar codes to license plates.

Internet of Things applications are also on the rise in factory, medical and retail settings. Many require image capture that is processed locally and connected to the cloud for further processing, data analytics or storage. Industrial embedded vision applications are similar to what happens in a smartphone, but they are customized to a specific application need, ruggedized and designed for a longer life cycle.
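
A minimal sketch of that split, processing a frame locally and pushing only a compact result to the cloud, might look like the following; the endpoint URL, the payload fields and the edge-count analysis are hypothetical placeholders, not any particular product's API.

```python
import json
import time
import urllib.request

import cv2

CLOUD_ENDPOINT = "https://example.com/api/inspections"  # hypothetical endpoint

def analyze_frame(frame):
    """Process the image locally at the edge; only a compact result goes upstream."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)          # stand-in for a real analysis step
    return {"timestamp": time.time(), "edge_pixels": int(cv2.countNonZero(edges))}

def upload(result):
    """Send only the small JSON result to the cloud, not the raw image."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(result).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

cap = cv2.VideoCapture(0)                      # local camera at the edge
ok, frame = cap.read()
if ok:
    upload(analyze_frame(frame))
cap.release()
```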

“We see a lot of value in robotics for embedded vision with applications that need to respond quickly,” said Brian Geisel, CEO of Geisel Software. Robots are often used in remote settings, from space to coal mining.

Advanced Driver Assistance Systems are also incorporating embedded vision. “We’re going to see embedded vision become more mainstream as hardware becomes smaller, faster, cheaper,” Geisel said. Algorithmic improvements will also allow developers to tackle these tasks with less powerful devices.

“We’ll have a lot more ability to make use of embedded vision as we are able to shrink the necessary compute footprint,” Geisel added. “There are so many places where streaming a mass amount of data isn’t feasible, so we will see an explosion of new applications as we can enable more computer vision at the edge.”

Embedded vision for industry “is moving from bleeding edge to cutting edge and will become mainstream in the coming decade,” said Tim Coggins, head of sales for embedded imaging modules in the Americas for Basler AG, an industrial camera manufacturer that produces a range of software and hardware products. “Early adopters have a strong, compelling business case to do it now and not wait.”

Engineering students may know Basler for its web-based Vision Campus tutorials, which explain camera technology, interfaces and standards, vision systems and components, and applications in simple terms. Some are presented with text and diagrams, while others are explained in quick videos by expert presenter Thies Moeller. There are tutorials on everything from “What’s the best way to compare modern CMOS cameras?” to “At which point does software come into play during image processing?”

One recent video offers five tips for building embedded vision systems while avoiding common pitfalls. One tip: Have the system developed by a single source instead of having the development of key components carried out separately. With separate development, components may not interact with each other in a high-performance manner, causing costly delays, Moeller explains.

A growing number of IoT applications that involve image capture require non-recurring engineering (NRE), a one-time cost to research, design, develop and test a new approach. “Many early adopters have high-volume requirements or strategic application requirements, and the NRE in these cases is not an obstacle,” he said. “They can justify it by cost.”

Without standard plug-and-play solutions on the market, custom embedded vision can take longer to reach market and require NRE costs that vary with the complexity of the application. “The primary challenge for developers is a large variety of hardware and software variables that need to come together to adopt a common connectability,” Coggins said.

OpenCV is a good example of standard embedded vision software for image processing, noted Adam Taylor, founder of Adiuvo Engineering. First released in 2000 by Intel, OpenCV is a cross-platform, open-source library of programming functions for real-time computer vision, available under an Apache 2 license. Developers use it to process images, capture video and analyze that video for object detection, facial detection and other purposes.
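
For example, a minimal face-detection loop built on OpenCV's bundled Haar cascade might look like this; it is a sketch that assumes a locally attached camera.

```python
import cv2

# Haar cascade for frontal faces, shipped with the OpenCV package.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

cap = cv2.VideoCapture(0)  # first attached camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Draw a box around each detected face.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```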

“Standards are how you scale and define the maximum benefit of accelerated development and easier engineering development,” Taylor said. “Embedded vision should be just plug and play – and boring to a large extent – allowing companies to focus on value-added activities and not just trying to get an image from yet another sensor/camera with a different interface.”

Basler is driving standards in the embedded vision industry, and a quick look at its website shows just how involved this standardization effort can be. “The embedded ecosystem is already in place and continues to grow with many talented companies and individuals who can educate and provide help, answer questions or develop systems solutions,” Coggins said.

Basler offers complete system design as well as mass production, but so do many of Basler’s partners, Coggins said. Basler’s partners include companies like Nvidia, with its Nvidia Jetson platform. In one example, Basler last June announced an embedded vision development kit that extends Basler’s support for Jetson products to provide AI at the edge in robotics, logistics, smart retail and smart cities. Nvidia boasts nearly half a million AI-at-the-edge developers.

The kit comes with a Basler dart BCON for MIPI camera with a 13-megapixel lens and an adapter board developed for the Jetson Nano module.
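
A bare-bones frame grab from a Basler camera in Python might look like the sketch below, assuming the camera is reachable through Basler's pylon SDK and its pypylon wrapper; exact device setup on a Jetson board may differ.

```python
from pypylon import pylon

# Connect to the first pylon-compatible camera found on the system.
camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()

# Grab a single frame, keeping only the latest image in the buffer.
camera.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
grab = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
if grab.GrabSucceeded():
    frame = grab.Array  # image as a NumPy array, ready for OpenCV or AI inference
    print("Grabbed frame with shape", frame.shape)
grab.Release()

camera.StopGrabbing()
camera.Close()
```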

The chief advantage of embedded vision technology is its ability to bring scalability to the overall computer vision market, combining low cost, high performance and real-time operation, bolstered by edge-to-cloud connectivity.

“Companies moving to embedded vision technology are quite content, and early adopters have good reason to be so,” Coggins said. “Most of our clients want the reliability that comes in an industrial, ruggedized solution with a long life cycle. They can’t get this from the consumer market.”

Grand View Research valued the overall global computer vision market at $10.6 billion in 2019, with 70% coming from hardware such as high-resolution cameras. The firm expects growth of 7% a year through 2026. That 2019 total does not break out embedded vision systems specifically, but Grand View says the industrial vertical made up half of all revenues in 2019.

The researchers count Intel, Omron, Sony and Texas Instruments among the most prominent players in computer vision.

Embedded vision will be the subject of digital keynotes and a panel discussion on January 27 as part of Embedded Innovation Week. Sign up for the free event online.