How Machine Vision Is Benefitting The Automotive Industry
By John Oncea, Editor
Machine vision is making its impact felt in every industry, and that impact will only grow more dramatic. Let’s take a look at how it’s making its mark on the automotive industry.
Fill my eyes with that machine vision, no disguise for that machine vision. *
Machine vision (MV) definitions vary, but let’s think of it as both a technology and a method that can be used to automate the extraction of information from an image. Put more simply, TechTarget considers it the “ability of a computer to see; it employs one or more video cameras, analog-to-digital conversion (ADC), and digital signal processing (DSP). The resulting data goes to a computer or robot controller. Machine vision is similar in complexity to voice recognition.”
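The TechTarget definition describes a pipeline of camera, ADC, DSP, and controller stages. This toy sketch walks one “analog” scanline through those stages; the stage names mirror the definition, while the moving-average filter and detection threshold are illustrative assumptions, not any particular vendor’s processing chain.

```python
def adc(samples, levels=256):
    """Quantize analog intensities in [0.0, 1.0] to integer levels (the ADC stage)."""
    return [min(int(s * levels), levels - 1) for s in samples]

def dsp(digital, window=3):
    """Smooth the digitized signal with a moving average -- a stand-in for
    real digital signal processing such as filtering or edge detection."""
    out = []
    for i in range(len(digital)):
        lo = max(0, i - window + 1)
        out.append(sum(digital[lo:i + 1]) // (i - lo + 1))
    return out

def controller(processed, threshold=128):
    """The 'computer or robot controller' acts on the processed data."""
    return "object detected" if max(processed) >= threshold else "clear"

# One bright feature in an otherwise dark scanline triggers a detection.
scanline = [0.1, 0.1, 0.9, 0.95, 0.2]
print(controller(dsp(adc(scanline))))  # object detected
```

A production system would of course replace each stage with real camera hardware and image-processing libraries, but the data flow is the same.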
The MV market, according to a recent Grand View Research report, was valued at just under $17 billion last year and is expected to expand at a compound annual growth rate of 12.3% from 2023 to 2030. “The technology is gaining considerable traction across food and packaging, automotive, pharmaceutical, and other industrial verticals owing to abilities such as improved detection of objects, enhanced analysis, monitoring tolerance, and accurate component measuring,” writes Grand View Research. “All these factors are expected to boost the market growth over the forecast period. However, the lack of efficient system operators due to inadequate training is a restraining factor that is likely to obstruct the smooth growth of the market.”
The automotive industry is expected to be the biggest driver of MV market growth, followed by pharmaceuticals/chemicals, printing/labeling, and electronics/semiconductors. Let’s take a look at one of these industries – automotive – and the benefits MV will bring to it over the next half-decade.
* Read this as if you’re singing Foreigner’s Double Vision.
MV – A Critical Component Of Industry
“In Industry 4.0 and associated digital industry transition, every step in the process, including manufacturing, inventory control of the supply chain, and more, involves a different and innovative approach,” writes ResearchGate. “One of the aims is to develop MV capable of seeing, communicating, and working with more accuracy better than human beings. Enabling robots to perceive and help people in dynamic systems provides the way for many opportunities. In the smart plant of the future, MV plays a significant role, in which automated production lines will adapt themselves to optimize productivity, performance, and profitability.”
The aims ResearchGate describes can be exemplified in the automotive industry which, as noted earlier, is expected to be at the forefront of MV implementations, and where expectations of ROI are high. Drivers & Controls reports on one specific use case, SiLC Technologies’ launch of “what it claims is the most compact and powerful coherent machine vision system available, with the highest resolution and precision, and the longest operating range.”
SiLC’s Eyeonic Vision System delivers the levels of vision perception needed to identify and avoid objects with extremely low latency, even at distances of over half a mile. “At the heart of the system is a silicon photonics chip. With around 10 milli-degrees of angular resolution and mm-level precision, it provides more than ten times the definition and precision of existing LiDAR systems. This enables the sensor to measure the shape and distance of objects with high precision over long distances. The architecture enables the synchronization of multiple vision sensors for unlimited points/second. Uniquely, it also can provide polarization information.”
SiLC’s founder and CEO, Mehdi Asghari, said, “When bringing vision to machines, the criticality of ranging precision, direct monitoring of motion through instantaneous velocity, spatial resolution for recognition of fine features, and polarization for material detection, cannot be understated.”
Another look at MV’s value to the automotive industry is how it can be combined with artificial intelligence (AI) to make the once-unfeasible process of inspecting every component coming off the line possible. EPP Europe reports that pairing MV with AI enables a camera feed to be reviewed in real time, with faulty widgets identified and tagged either physically or virtually.
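The inspect-every-part loop described here can be sketched in a few lines: review each frame from the camera feed, score it, and tag the part. The brightness heuristic standing in for a trained vision model, and names like `classify_frame`, are illustrative assumptions, not any vendor’s actual API.

```python
def classify_frame(frame):
    """Return a defect score in [0, 1]. A real system would run a trained
    vision model here; this brightness-deviation heuristic is a stand-in."""
    mean = sum(frame) / len(frame)
    return min(abs(mean - 0.5) * 2, 1.0)  # far from nominal -> suspicious

def inspect_feed(frames, threshold=0.6):
    """Tag each frame 'ok' or 'faulty' so downstream stations can divert
    flagged parts -- the physical or virtual tagging the article mentions."""
    tags = []
    for i, frame in enumerate(frames):
        score = classify_frame(frame)
        tags.append((i, "faulty" if score > threshold else "ok"))
    return tags

# Three frames: nominal, over-bright (simulated defect), nominal.
feed = [[0.5, 0.5, 0.5], [0.95, 0.9, 1.0], [0.48, 0.52, 0.5]]
print(inspect_feed(feed))  # [(0, 'ok'), (1, 'faulty'), (2, 'ok')]
```

The key property is that every part is scored inline, in real time, rather than a sample being pulled for offline inspection.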
Both the Audi A3 line at Audi’s Neckarsulm plant and Bosch VHIT, the vacuum and oil pump manufacturing subsidiary of Bosch, are using this method: Audi to inspect welds, and Bosch to drive digital transformation.
Audi assembles about 1,000 cars a day, each with 5,000 welds, and it’s impossible to manually inspect each and every car. The fix was to devise a way to inspect all 5,000 welds per car inline and infer the result of each weld within microseconds.
“A machine-learning algorithm was created and trained for accuracy by comparing the predictions it generated to actual inspection data that Audi provided,” writes EPP Europe. “The machine learning model used data generated by the welding controllers, which showed electric voltage and current curves during the welding operation. The data also included other parameters such as the configuration of the welds, the types of metal, and the health of the electrodes.
“These models were then deployed at two levels, firstly at the line itself and also at the cell level. The result was that the systems were able to predict poor welds before they were performed. This has substantially raised the bar in terms of quality.”
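The approach EPP Europe describes, extracting features from the welding controllers’ voltage and current curves and training a model against labelled inspection data, can be sketched as below. The feature set and the simple nearest-centroid classifier are illustrative assumptions; Audi’s actual models and parameters (weld configuration, metal types, electrode health) are not public.

```python
def weld_features(voltage, current):
    """Summarize a weld's electrical curves into scalar features
    (illustrative: delivered energy and peak current)."""
    energy = sum(v * c for v, c in zip(voltage, current))
    peak_current = max(current)
    return [energy, peak_current]

def train_centroids(samples):
    """samples: list of (voltage, current, label) with label 'good'/'poor',
    where labels come from actual inspection data. Returns per-class mean
    feature vectors -- a minimal nearest-centroid 'model'."""
    sums, counts = {}, {}
    for voltage, current, label in samples:
        feats = weld_features(voltage, current)
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, x in enumerate(feats):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [x / counts[lbl] for x in acc] for lbl, acc in sums.items()}

def predict(model, voltage, current):
    """Classify a new weld by its nearest class centroid."""
    feats = weld_features(voltage, current)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda lbl: dist(feats, model[lbl]))

model = train_centroids([
    ([2, 2, 2], [5, 5, 5], "good"),   # nominal current profile
    ([2, 2, 2], [9, 9, 9], "poor"),   # excessive current profile
])
print(predict(model, [2, 2, 2], [5.5, 5, 5]))  # good
```

Deploying such a model at both the line and cell level, as the article notes, is what lets poor welds be flagged before they are even performed.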
Bosch is trialing something similar, testing a new proof-of-concept camera-based quality program for use with real-time decision making in industrial settings. “The program captures data from cameras on manufacturing plant floors and logistics warehouses and harnesses machine learning algorithms to identify quality issues and feed information into the MES system, to generate an optimal decision in real time,” EPP Europe writes. “When securely connected to the cloud, the system benefits from continued access to advanced artificial intelligence algorithms and data analytics packages.”
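The plumbing EPP Europe describes, camera-derived verdicts feeding the MES so the line can act in real time, can be modeled with a simple event pipeline. The in-process queue, JSON event schema, and names like `publish_quality_event` are assumptions for illustration, not Bosch’s actual interfaces.

```python
import json
import queue

# Stand-in for the transport between plant-floor cameras and the MES.
mes_queue = queue.Queue()

def publish_quality_event(station, part_id, verdict):
    """Package an inspection verdict as a JSON event bound for the MES
    (hypothetical schema for illustration)."""
    event = json.dumps({"station": station, "part": part_id, "verdict": verdict})
    mes_queue.put(event)

def mes_decision():
    """Drain one event and return the action the MES might take in real time."""
    event = json.loads(mes_queue.get())
    return "divert" if event["verdict"] == "faulty" else "pass"

publish_quality_event("cam-03", "P-1001", "faulty")
print(mes_decision())  # divert
```

In the cloud-connected variant the article mentions, the same events would also stream to hosted analytics so the classification models can keep improving.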
That said, MV isn’t perfect, and there are pitfalls to avoid. These include choosing the right hardware, accounting for image sensor and lighting choices, discerning objects from the background, handling object positioning, orientation, and scaling, and managing action and movement.
But, truth be told, MV is coming and there’s probably not an industry that won’t be impacted by the ramping up of its deployment.