Machine Vision Comes of Age

Vision technology is taking its place as a major sensing technology. Control engineers are finding it can be an indispensable tool for gathering system status information that can feed automatically into control systems—especially when there is motion involved. As vision technology has matured, image sensors have grown to multi-megapixel resolution, while frame grabbers have grown in sophistication and merged into the camera’s electronics.


When I first started dealing with machine vision technology about 35 years ago, digital cameras were a laboratory curiosity, frame-grabber boards were just arriving as commercial products, and image processing was a black art. As vision technology matured, image sensors grew to multi-megapixel resolution, while frame grabbers grew in sophistication and eventually merged into the camera’s electronics. Finally, cameras even absorbed image processing computers as well. Imaging software—originally a library of utilities for functions called “thresholding,” “blob analysis,” and other arcane operations—morphed into wizards that engineers could use to create useful applications even if they had only rudimentary understanding of machine vision concepts.
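
For readers who never met those arcane operations, a minimal sketch of what thresholding and blob analysis amount to, written with the open-source OpenCV library, may help; the image file name and the threshold value below are illustrative assumptions, not anyone's production settings.

    import cv2

    # Load a grayscale image of the part to be inspected ("part.png" is an
    # illustrative file name).
    image = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)

    # Thresholding: pixels brighter than 128 become white (255), the rest black.
    _, binary = cv2.threshold(image, 128, 255, cv2.THRESH_BINARY)

    # Blob analysis: find connected white regions and report where they are
    # and how big they are.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        print(f"blob at ({x}, {y}), size {w}x{h}, area {cv2.contourArea(contour):.0f} px")

Early users had to string such primitives together by hand; today's wizard-driven tools generate the equivalent behind the scenes.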

Verifying the proper packing of aluminum cans in cartons using laser-based point scanners would require N+1 sensors, where N is the number of cans per case. A single smart camera can do the job.

Today, vision technology is ready to take its place as a major sensing technology. Control engineers are finding it can be an indispensable tool for gathering system status information that can feed automatically into control systems—especially when there is motion involved.

Many vision suppliers have dropped terms like “camera” and “machine vision system” in favor of terms like “vision appliance” and “vision sensor,” which better describe what their offerings can accomplish for engineers. Along the way, emphasis has shifted from system specifications to discovering what vision can do for you. What vision can do is collect huge amounts of real-world data very rapidly and turn it into information concisely describing what engineers need to know.

With the new emphasis on what vision can do, instead of what it is, prices for some units have actually come down. Inexpensive units designed with just enough resources to do a specific class of tasks can now compete on price as well as performance with more traditional point-sensor arrays gathering real-time data in control applications. The advantage is that vision systems can be easier to set up, easier to train, and easier to maintain than complex multi-unit point-sensor arrays.

To be clear on the difference, point sensors measure a single parameter at a single location in space. Examples include thermocouples, inductive proximity sensors, position encoders, and laser beams. Vision systems, on the other hand, use area or line-scan camera technology to acquire data from a large number of points within a clearly defined field of view (FOV). They also cover a wide spectral range, from infrared (IR), through visible light (VIS), to ultraviolet (UV). This image-acquisition technology is backed up by powerful computing resources that extract exactly the information needed by the control system, whether that is measuring the temperature of a certain point on a motor, reporting the statistical distribution of quality-parameter variations, or checking proper placement of components on a printed circuit board.

Harley engineers catch the vision

The idea that vision, when used cleverly, can replace a surprising number of sensors is not new. Seven years ago I wrote an article on what was then a new robotic inspection system at the Harley-Davidson motor assembly plant. The system had been built by a Midwestern system integrator, whom I interviewed for the article.

Three ways to verify a threaded fastener: observe proper seating, count threads protruding from the assembly, or observe deformation of a lock washer.

Initially, the system was installed to verify correct installation of the twin-cam 88 engine timing chain. Needless to say, word spread fast among the engineers at Harley that a vision system was being installed and nearly everyone had an idea on how they would like to use it. By the time installation was complete, the system was programmed to perform 30 tests, including checking to make sure the sprockets were fully tightened down.

Ensuring that a fastener is tightened does not seem the sort of thing a vision system would be good at, but it is. There are at least three ways vision can verify that a nut is torqued in a manufacturing environment:

  • Verify proper seating — When an automated system installs a nut, it drives the nut home against something else, such as a washer or a captured part that the nut is intended to retain. A machine vision system can easily determine whether the nut is seated against that surface: if it is not, there will be at least a dark shadow exposing a gap, instead of a thin line where the nut meets the surface it should be pressed against. (A rough sketch of this check appears after this list.)

  • Count threads — Today’s precise and repeatable automated manufacturing systems make every component the same, including the number of threads on any threaded fastener. By measuring the amount of thread protruding from a nut, a vision system can actually observe how much a bolt stretches when properly torqued.

  • Observe deformation — If the nut captures a split-ring or star lock washer, a wave washer, or even a captured deformable component, a vision system can see whether and by how much the captured part has deformed due to compression.
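
As a rough illustration of the first check, a vision system can sample pixel intensities along the expected seating line and flag any dark run long enough to be the shadow of a gap. The sketch below assumes a grayscale image held in a NumPy array; the line coordinates and thresholds are illustrative, not taken from any real installation.

    import numpy as np

    def nut_is_seated(gray, row, col_start, col_end, dark_level=60, max_gap_px=5):
        """Scan pixel intensities along the expected seating line of a nut.

        Returns True if no dark run (the shadow of a gap) longer than
        max_gap_px pixels is found. All thresholds are illustrative.
        """
        line = np.asarray(gray)[row, col_start:col_end]  # intensities along the seating line
        dark = line < dark_level                         # True where a shadow could hide a gap
        longest, run = 0, 0
        for is_dark in dark:
            run = run + 1 if is_dark else 0
            longest = max(longest, run)
        return longest <= max_gap_px

The thread-counting and deformation checks reduce to similar one-dimensional measurements taken from the same image.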

Since writing that article, I’ve seen a large number of applications where engineers used vision to solve difficult sensing problems by thinking outside the box. The box, for machine vision, is, of course, automated inspection.

Seeing outside the box

Everyone knows that you can use machine vision to inspect assemblies, gauge dimensions, and read logos. It’s out-of-the-box applications that are the most interesting, and potentially the most useful for control engineers.

For example, last year I ran across an application where an electric utility wanted to monitor the level in oil-filled insulators atop high-voltage transformers. To transport large amounts of electric power efficiently, utilities boost the voltage into the megavolt range. Resistive losses in transmission lines scale with the square of the current carried, and the current required to transport a given amount of power is inversely proportional to the voltage. A transmission line operating at, say, 1.2 megavolts can carry 10,000 times as much power as a line operating at 120 V before exceeding the same safe current level. To boost the voltage, or return it to service level, utilities use transformers.
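
A back-of-the-envelope calculation makes the point. The delivered power and line resistance below are illustrative assumptions; only the scaling matters.

    # Same delivered power, same line resistance, two very different voltages.
    # The 100 MW load and 10-ohm line resistance are illustrative assumptions.
    power_w = 100e6
    resistance_ohm = 10.0

    for voltage_v in (120.0, 1.2e6):
        current_a = power_w / voltage_v            # I = P / V
        loss_w = current_a**2 * resistance_ohm     # resistive loss = I^2 * R
        print(f"{voltage_v:>10,.0f} V -> {current_a:>12,.1f} A, line loss {loss_w:,.0f} W")

Raising the voltage by a factor of 10,000 cuts the current by the same factor and the resistive loss by a factor of 100 million, which is why bulk power never travels at service voltage.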

High-voltage transformers, however, are subject to power-robbing internal discharges. To prevent such discharges, transformer builders fill all the interior spaces with high-purity oil, which is a non-conductor. Liquid oil also resists damage from electrical discharges, so it has a much longer service life than solid insulators such as epoxy or even glass.

The utility realized that the transformer oil picks up heat from the current-carrying copper transformer windings and transports it out to the case. If the oil level is low, the case will not be completely filled. The side wall will be warm up to the oil level, and cooler above it. The utility used an infrared camera to measure the height at which this abrupt temperature change occurred, and thereby measured the oil-fill level remotely without having to go past the safety fence enclosing the substation. Nothing had to be shut down, and a single picture could measure levels in many transformers. While the particular application was not automated at the time, it is easy to see how such a technique could be automated and used to control the level of any warm or cool liquid.
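
The level-finding step could be automated along these lines: given a thermal image as an array of temperatures, scan one column of the case wall from top to bottom and look for the sharpest cool-to-warm jump. The column index and the minimum temperature step in this sketch are illustrative assumptions.

    import numpy as np

    def oil_level_row(temps, column, min_step_c=3.0):
        """Find the image row where the case goes from cool (above the oil)
        to warm (below it) in a thermal image.

        temps is a 2-D array of temperatures, rows running from the top of
        the case downward. Returns the row index of the sharpest warm-up,
        or None if no step is big enough (e.g., the case is completely full).
        """
        profile = np.asarray(temps, dtype=float)[:, column]
        steps = np.diff(profile)                  # temperature change from row to row
        idx = int(np.argmax(steps))               # largest cool-to-warm jump going down
        return idx if steps[idx] >= min_step_c else None

Converting that row index to a physical height is a simple calibration against the known dimensions of the transformer case.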

Because machine vision can measure any variable whose value translates into a difference in UV/VIS/IR emission or reflection, and observe up to millions of locations simultaneously, its range of applicability is limited only by the imagination of the engineers using it.

Today’s machine vision systems leverage advanced computer technology, including compact, powerful processors, advanced software algorithms, sophisticated user interfaces, and high-speed networks to acquire data, reduce it to the information needed, and deliver it to the control system on an as-needed basis. Packaged as “smart vision” products, these systems offer control engineers flexible sensing technology at a competitive cost-of-ownership level. ce

Also read:

Vision Sensors Error-Proof Oil Cap Assembly

Vision for Process Control

Author Information

C.G. Masi is a senior editor with Control Engineering. Contact him by email at
