Machine vision boosts quality for mass-produced robotic workcells

Inside Machines: Feedback from machine vision adjusts robotic movement and enhances manufacturing quality. Vision-guided robots have greater position accuracy, providing closed-loop control.


On any given day, I can walk out of my office and see dozens, if not hundreds, of robots standing in neat rows waiting to become integration ready. They all look exactly the same, but the truth is, they’re not. Each piece of metal, each servo joint, is subtly different than the next. Put those robots to work, and depending on the temperature in the workcell, the robot’s physical dimensions will change again as metals expand (or contract) and electrical efficiencies vary.

Mass-produced industrial robots are expected to all be the same, but in reality, subtle differences in metal part construction, servo response, and real-time changes in ambient conditions will directly influence reach, speed, and movement path for an industrial robot.

These mechanical and electrical variations (or errors) stack one atop the next to determine the overall accuracy of the robot. Robotic accuracy is the ability to go to a programmed spot in space; repeatability is the ability to return to a given spot in space on demand. Customers regularly fail by trying to use a robot with 0.1 mm repeatability to position a part within 0.1 mm accuracy, overlooking the fact that the robot's true accuracy is usually much worse than its repeatability.
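To illustrate how such errors stack, a common engineering approximation treats each source as independent and combines them as a root-sum-square. The error budget below is hypothetical, for illustration only:

```python
import math

def stack_error_rss(errors_mm):
    """Estimate combined positioning error from independent sources (root-sum-square)."""
    return math.sqrt(sum(e * e for e in errors_mm))

# Hypothetical error budget for one robot, in mm:
# link machining tolerance, servo following error, thermal expansion, gear backlash
budget = [0.08, 0.05, 0.06, 0.04]
print(round(stack_error_rss(budget), 3))  # 0.119 mm combined error
```

Even with individually small contributors, the combined stack error here already exceeds a 0.1 mm positioning requirement, which is why repeatability alone is a misleading specification.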

Vision-guided robotics (VGR), or the use of industrial cameras connected to computers running image processing software to determine an offset for robot control, counters these effects by providing an objective method for determining the robot's position and orientation in 3D space versus where it is supposed to be. But machine vision also has a stack error that depends on many factors, from intrinsic changes in lighting and sensor response to extrinsic variations in surface finish and part presentation due to material handling systems.
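As a minimal sketch of what "providing an offset" means in 2D (the poses, pivot convention, and helper functions are hypothetical, not any robot OEM's API), the correction can be computed from the measured part pose versus the taught nominal pose and then applied to a taught pick point:

```python
import math

def pose_offset(nominal, measured):
    """Correction (dx_mm, dy_mm, dtheta_deg) from the nominal part pose
    to the pose the vision system actually measured."""
    return (measured[0] - nominal[0],
            measured[1] - nominal[1],
            measured[2] - nominal[2])

def apply_offset(point, pivot, offset):
    """Move a taught point rigidly with the part: rotate it by dtheta
    about the nominal part origin (pivot), then translate by (dx, dy)."""
    dx, dy, dth = offset
    th = math.radians(dth)
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    rx = px * math.cos(th) - py * math.sin(th)
    ry = px * math.sin(th) + py * math.cos(th)
    return pivot[0] + rx + dx, pivot[1] + ry + dy

# Part taught at (100, 50) mm, 0 deg; camera finds it at (102, 47) mm, 10 deg.
off = pose_offset((100.0, 50.0, 0.0), (102.0, 47.0, 10.0))
pick = apply_offset((120.0, 50.0), (100.0, 50.0), off)  # corrected pick point
```

The same idea extends to 2.5D and 3D with additional height and orientation terms; the vision system's own stack error limits how good this correction can be.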

Unfortunately, most end users do not understand the sources of accuracy and repeatability, or how to account for stack errors in robotic and vision systems to create a VGR solution that works. It takes experience with machine vision and robot programming to define a complete application specification based on true robotic stack error matched with the right machine vision system.

Embracing the unique

Every VGR application is different. It’s different because every robot, environment, manufactured part, and process is different. As a result, there are many ways to peel the proverbial onion, but the ultimate goal is to design a system that accomplishes specific tasks, at a specific rate, based on known parameters, for the least cost.

And it all starts with a complete and thorough understanding of the application needs. What is the part? How does it vary in size, texture, and orientation to the robot based on actual production, not just CAD files? From temperature to changes in light, what are the ambient conditions of the workcell? What does the robot need to do with the part, and how will that affect your choice of robot, including speed, force, and the effect of part mass and momentum on robotic position? (See related article on inertia measurements.)

Armed with this information (and more), most customers will have a preference for a specific robot original equipment manufacturer (OEM) based on what's already installed on their plant floor. Based on the part variations and part position requirements, an experienced designer can help select the specific robot model for the application. Each robot is a one-of-a-kind kinematic model composed of unique mechanical segments and unique electrical (or hydraulic) controls. Most robot OEMs provide an absolute accuracy service that will determine that individual robot's absolute accuracy, which can be useful for applications where the robotic and vision stack errors are very close to the application's material handling accuracy and repeatability requirements.

After defining the application requirements and selecting the right robot, the designer has to figure out how to program the robot to do its job. The robot will need help finding incoming parts, either through fixtures that consistently present the part to the robot in a given 3D location and orientation, or through the use of a vision system to provide an offset to the standard robot path to accommodate variations in part position and orientation.

Today, more manufacturers are using vision rather than fixtures because fixtures are a custom expense, often lack the flexibility to handle different parts on the same line without additional cost, and rarely allow robotic workcells to be reused in other parts of the plant. Machine vision systems can be reprogrammed and, assuming the system and its components meet the specific needs of the new application (a big if), can be redeployed around the plant like any other asset.

Put vision into VGR

Robotic racking applications like the doorframe application rendered here are good examples of how part position can vary despite well-designed dunnage and mechanical fixtures. Machine vision systems can account for position variations and ensure a successful pick.

Once the application has been clearly defined, the next step is to determine what sort of information the robot needs from the vision system to perform to the necessary specification. Is the part relatively flat on a flat conveyor, so that a 2D vision system will be sufficient? Does the application require orientation and relative height information in addition to X and Y information, and therefore a 2.5D vision solution? Or do you require absolute 3D information for hole inspection in addition to providing an offset for the lugs or pick points on the part?

While 2D and 2.5D solutions are relatively straightforward and usually can be solved with one camera, assuming that the minimum spatial resolution per pixel can be achieved across the necessary field of view, designers have several options when it comes to 3D vision, namely single-camera 3D, single- or multi-camera 3D with structured light triangulation, and multi-camera stereoscopic vision. Each of these approaches offers advantages and disadvantages. For example, single-camera 3D solutions can be extremely accurate across relatively narrow fields of view but may require multiple images to create the 3D point set. Stereoscopic vision is highly accurate for large-area fields of view and can be further improved with the use of structured light sources, such as light-grating projectors, LED, or laser line generators, but requires more hardware. All these systems depend on frequent calibration routines to ensure bumps, thermal expansion, and other factors do not generate inaccurate 3D data.
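The sensitivity to calibration can be seen in the standard rectified-stereo depth relation Z = f·B/d (focal length in pixels, baseline, disparity). The rig numbers below are made up for illustration:

```python
def stereo_depth_mm(focal_px, baseline_mm, disparity_px):
    """Depth from a calibrated, rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Hypothetical rig: 1200 px focal length, 150 mm baseline.
z_true = stereo_depth_mm(1200, 150.0, 90.0)  # 2000.0 mm
z_off = stereo_depth_mm(1200, 150.0, 89.0)   # ~2022.5 mm: a 1-px disparity
                                             # error shifts depth by over 20 mm
```

A bump or thermal drift that changes the effective baseline or disparity by even a pixel produces millimeter-scale depth error at working distance, which is why recalibration routines matter.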

One of the least understood factors of a machine vision system involves lighting. Lighting and, more importantly, changes in lighting, will greatly affect machine vision systems, regardless of dimensional aspects of the vision solution. Lighting is often considered as the last part of the vision solution but should be considered early in the design since light interaction with the part as perceived by the camera is the basis for a successful machine-vision solution.

For example, if your workcell is in a room with windows, infrared lights may not be the best choice because sunlight is strongest toward the red and infrared end of the spectrum. To determine the best "color" of light (white, blue, amber, red, etc.), understand the physics of light and optics. Does the VGR workcell need to sense very similar colors on the part, for example, requiring a color camera and light? Or are the colors different enough that a grayscale camera with a bandpass filter and a complementary colored light can offer a cheaper solution with less data processing?

Much can be said about the art of matching colored illumination, but a basic rule of thumb is: don't use a light source similar in color to the ambient light in the room, and don't use a light whose color is opposite (complementary) to the part's color, because the part will absorb that light (unless you're deliberately using backlight or darkfield illumination).
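A toy grayscale model makes the rule of thumb concrete. This is not a rigorous photometric calculation, and the reflectance triples are invented, but it shows why a matched light brightens a part while a complementary light darkens it:

```python
def perceived_intensity(reflectance_rgb, light_rgb):
    """Grayscale intensity a mono camera sees: per-channel reflectance
    times illumination, summed over R, G, B."""
    return sum(r * l for r, l in zip(reflectance_rgb, light_rgb))

red_part = (0.9, 0.1, 0.1)   # reflects mostly red
blue_part = (0.1, 0.1, 0.9)  # reflects mostly blue
red_light = (1.0, 0.0, 0.0)

# Under red light, the red part appears bright, the blue part nearly black:
print(perceived_intensity(red_part, red_light))   # 0.9
print(perceived_intensity(blue_part, red_light))  # 0.1
```

That large intensity gap is exactly what a grayscale camera with a bandpass filter exploits, and why lighting the part with its complementary color (without a backlight or darkfield intent) throws away signal.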

Simple is as simple does

This vision-guided robot demo developed by Leoni Vision Solutions for the 2013 Automate conference demonstrates the capabilities of visual servoing, or the use of a machine vision system to guide a robot to a moving target. In the case of this demo, a vision system continuously reports the target's position so the robot can track it.

A successful VGR solution requires careful consideration of the application and the specific performance requirements for the robot and the vision system, as well as the total performance of the combined VGR solution with respect to the application and associated production equipment. The solution is often complex. And while it would be useful for your VGR designer to have both robotic programming and vision-system design expertise, few companies offer both. If you cannot find such an integrator to help guide your system development, be sure to ask your vision or robotic integrator about its partners on the other side of the design equation. What is their experience? What can they demonstrate?

In all fairness, VGR solutions are not necessarily the most complex automation problems that machine vision will help solve. Many robotic suppliers provide optional machine vision systems that are well integrated into their robotic control systems. However, a vision system is not a vision solution. The physics necessary to optimize the light, camera, and optics part of the equation alone can require considerable knowledge and expertise. Don’t be afraid to ask suppliers about their past experiences and client referrals. Also, associations [such as Automated Imaging Association (AIA), the North American trade association for the machine vision industry, and the Control Systems Integrator Association (CSIA)] have lists of companies that have passed certified vision professional and certified systems integrator courses. These companies have proven their system design knowledge across a wide range of applications and design environments. Working with the right supplier, a VGR solution can put the competitive edge back into an operation.

- Nick Tebeau is manager, vision solutions, business unit industrial solutions, Leoni. Edited by Mark T. Hoske, content manager, CFE Media, Control Engineering and Plant Engineering.


See this article online in August for links and more information.

Key concepts

  • Using feedback from machine vision positions robots more precisely for higher quality.
  • Match the technology to the application.
  • Proper lighting helps machine vision accuracy.

Consider this

If vision guides robotics, what other motion-control applications could it enhance?
