Sensors Midwest AV Workshop: Will Tu & Crew Show You What To Do


At the risk of sounding like typical tradeshow self-promotion, the Automotive & Autonomous Vehicle Workshop held at Sensors Midwest this week was one of the most informative sessions on where autonomous-vehicle development stands today and where the technology is heading. A panel of four industry experts in diverse areas of autonomous vehicles (AVs), hosted and moderated by Will Tu, Senior Director of Xilinx’s Automotive Business Unit, walked attendees through the intricate maze of technical issues AV engineers and designers face and how those issues may be dealt with successfully.


First up, Egil Juliussen, IHS Markit’s Director of Research & Principal Analyst in Automotive Technology, gave a brief outline of the AV market with details as to the disruptive nature of its unique technologies. His presentation migrated seamlessly into the relationships between various components such as automotive infotainment systems, driver assistance systems, intra- and extra-vehicle connectivity, and autonomous driving safety features.


The concepts laid out clearly show how critical it is for these systems to work together flawlessly to ensure safe, smooth, and enjoyable rides. His forty-year forecast for the automotive/AV arena: “In 2057, if you live in an urban area, you will need to go to a driving track to drive your own car, just like you do with horses today.”


Jada Tapley, Vice President of Advanced Engineering & External Relations at Delphi, outlined and clearly explained how sensor systems and software need to interact to ensure accurate and safe AV operation. Working in conjunction with all of the heavy hardware that makes up the average car, software and sensors work hand in hand to make automated driving work.


Ms. Tapley’s presentation placed heavy emphasis on the importance of software for accurately interpreting and acting upon sensor data. One important aspect brought to light: the ability of smart-vehicle architectures to address Level 4 computing requirements.


Attendees of Jada’s session came away understanding how the convergence of megatrends is driving exponential demand for computing power, how smart cities create near-term opportunities for automated driving applications, why reducing electrical-architecture complexity is essential to enable content growth, and what future opportunities lie in connected services and smart-city integration.


Leonard Nieman, In-Vehicle Apps Architect Manager at General Motors, discussed aspects of the in-vehicle infotainment (IVI) experience, particularly its future. Once considered a luxury option, IVI is becoming standard on most new vehicles. Mr. Nieman believes the technologies that will drive the most change in IVI are natural-language voice, larger display screens, new apps, projection, hand-gesture recognition, and, obviously, autonomous vehicles.


Mr. Nieman followed with a look at what will shape consumers’ choices: wireless features, better and more relevant data, more apps, ease of use, familiarity, and more relevant capabilities.


Michael Blicher, Director of Business Development at Autotalks, addressed how vehicle-to-vehicle (V2V) communication may affect the future of IVI, autonomous driving, and the passenger experience. Considered a major safety technology, V2V communication enables unrelated vehicles to exchange messages when in close proximity on the road. The advantages are obvious: you’re getting too close, are you going to turn, collision imminent, and so on.


One of the exciting possibilities of V2V is the ability of vehicles to alert their drivers to events that are not visible. For example, a car approaching a corner can learn whether a vehicle is coming around the other side, how fast it is traveling, and which way it may turn. In short: “V2X is about knowing ahead without visibility.”
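To make the idea concrete, here is a minimal sketch of the kind of logic a V2V safety application could run once it receives another vehicle’s broadcast position and speed. The names (`VehicleState`, `time_to_collision`, the 3-second warning threshold) are illustrative assumptions, not part of Autotalks’ actual products or any V2V standard:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleState:
    position_m: float   # position along a shared road axis, meters
    speed_mps: float    # speed along that axis, meters/second

def time_to_collision(follower: VehicleState,
                      leader: VehicleState) -> Optional[float]:
    """Seconds until the follower closes the gap to the leader,
    or None if the gap is not closing on current trajectories."""
    gap = leader.position_m - follower.position_m
    closing_speed = follower.speed_mps - leader.speed_mps
    if closing_speed <= 0:
        return None  # follower is not gaining on the leader
    return gap / closing_speed

def collision_warning(follower: VehicleState, leader: VehicleState,
                      threshold_s: float = 3.0) -> bool:
    """Raise a warning when time-to-collision drops below the threshold."""
    ttc = time_to_collision(follower, leader)
    return ttc is not None and ttc < threshold_s
```

For example, a follower doing 30 m/s with a leader 25 m ahead at 20 m/s closes the gap in 2.5 s, which is inside the 3-second threshold, so a warning fires; the same follower 60 m behind would have 6 s and no warning. The point of V2V is that the leader’s state arrives by radio, so this check works even when the leader is around a corner and out of sight.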


Senior Director of Xilinx’s Automotive Business Unit Will Tu wrapped things up with an in-depth look at the compute engines that will drive machine learning and autonomous driving. Will really gets under the hood of autonomous vehicles, covering the interrelationships of software (development tools to aid machine learning), emerging sensor technologies such as LiDAR, and processing (semiconductors).


The presentations were followed by a panel discussion involving all of the participants, and attendees were given the opportunity to ask questions and discuss the issues with the experts. Have questions? Well, you had to be there. ~MD
