Vision-guided robots automate oil tool assembly

Machine vision locates parts for picking by one robot and checks the diameter and location of the pipe before it is threaded by a second robot, without fixturing or accurate locating. The application may lead to a new generation of vision-enabled robots to improve productivity and quality for oil tools.

By John Lewis April 19, 2012

Installing a thread protector onto exposed threaded pipe on oil tools is a difficult and time-consuming job. No automation technology currently performs this operation in industry because of the complexity of the task and the many sizes and styles of thread protectors and pipes. JMP Engineering worked with an oil industry manufacturer to develop a flexible automation process that uses two robots guided by machine vision to process a wide range of parts and that can easily be configured to handle future variants without programming.

“The key to the success of the application is the use of machine vision to locate parts for picking by the first robot and to check the diameter and location of the pipe before it is threaded by the second robot,” said Scott Pytel, project manager at JMP Engineering.

Oil tool manufacturing automation

Oil tool manufacturing is characterized by large families of parts whose members are typically produced in relatively low production volumes. Oil industry parts are also not typically produced to the close tolerances required for precision part locating. For these reasons, the hard automation systems that are commonly used in the automotive industry and for other high-volume production tasks are not an option for most oil tool production jobs.

Flexible automation systems based on industrial robots offer more potential, but they too have seen relatively little use in the oil tool business because of difficulties such as the need to pick unfixtured parts and to handle many part numbers. As a result, few industrial robots are used in the oil industry.

Thread protectors have all of the characteristics of a typical oil industry part family. They are installed on oil and gas pipes to prevent the threads from being damaged during shipping. The family as a whole is assembled at relatively high volumes, but none of the individual part numbers has the volume normally needed to justify automation. The oil tool manufacturer wanted to assemble thread protectors at a rate of about three per minute. Assembling the protector to the pipe is done manually with pneumatic tools, and the high torque involved makes it a physically demanding job.

The oil tool manufacturer talked to JMP Engineering to see if the company had any ideas on how to automate the task. JMP designs and builds industrial control, turnkey automation systems, and plant information solutions for the food and beverage, life sciences, environmental, automotive, metal processing, and other industries. JMP integrates machine vision and robots to handle applications where parts are not precisely located, not fixtured, and not clearly separated from each other.

Picking parts from a bin

In the bin-picking operation, thread protectors are packed in bins in layers divided by cardboard sheets. The machine vision system rides on the robot arm. It consists of an industrial machine vision camera that connects, over a high-speed digital interface standard for industrial cameras, to a frame grabber card in an industrial personal computer. A light-emitting diode (LED) inside the camera enclosure supplies red light that helps overcome ambient lighting during image capture.

JMP programmers wrote a graphical user interface for the workcell in Microsoft Visual Basic that performs vision operations by calling vision tools from a machine vision software library, which provides preconfigured, tightly integrated acquisition support for the complete range of industrial cameras and video formats. The machine vision software application development environment makes it possible to configure acquisition tools, define vision tasks, and make pass/fail decisions without any programming. The machine vision library includes software tools to quickly and accurately gauge, guide, identify, and inspect parts despite variations in part appearance due to the manufacturing process.

Traditional pattern matching technology relies upon a pixel-grid analysis process commonly known as normalized correlation. This method looks for statistical similarity between a gray-level model (or reference image) of an object and portions of the image to determine the object’s X/Y position. Though effective in certain situations, this approach limits the ability to find objects and the accuracy with which they can be found under conditions of varying appearance common to production lines, such as changes in object angle, size, and shading.
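
For illustration, here is a minimal sketch of normalized correlation search written in Python with the open-source OpenCV library rather than the commercial vision software used in the workcell; the image file names and the acceptance threshold are assumptions.

```python
# Minimal sketch of normalized correlation (template) matching with OpenCV.
# The image files and the 0.9 acceptance threshold are assumptions.
import cv2

scene = cv2.imread("bin_image.png", cv2.IMREAD_GRAYSCALE)        # hypothetical picture of the bin
model = cv2.imread("protector_model.png", cv2.IMREAD_GRAYSCALE)  # gray-level reference image

# Slide the model across the scene and score statistical similarity at each offset.
scores = cv2.matchTemplate(scene, model, cv2.TM_CCORR_NORMED)

# The best-scoring offset gives the object's X/Y position (top-left corner of the model).
_, best_score, _, best_xy = cv2.minMaxLoc(scores)
if best_score > 0.9:
    print("part found at", best_xy, "score", best_score)
```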

Geometric pattern matching technology learns an object’s geometry using a set of boundary curves that are not tied to a pixel grid and then looks for similar shapes in the image without relying on specific gray levels. The result is a significant improvement in the ability to accurately find objects despite changes in angle, size, and shading.
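
As a rough, simplified stand-in for geometric pattern matching (the commercial tool is far more sophisticated), the sketch below compares extracted object boundaries rather than gray levels, using OpenCV contours and Hu-moment shape matching; the file names and edge thresholds are assumptions.

```python
# Simplified stand-in for geometric pattern matching: compare object boundaries
# (contours) instead of raw gray levels. Real geometric pattern matching tools
# use boundary curves and are far more capable; this only sketches the idea.
import cv2

def largest_contour(gray):
    edges = cv2.Canny(gray, 50, 150)                       # boundary extraction (thresholds assumed)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

model = largest_contour(cv2.imread("protector_model.png", cv2.IMREAD_GRAYSCALE))
part  = largest_contour(cv2.imread("candidate_crop.png", cv2.IMREAD_GRAYSCALE))

# matchShapes compares Hu moments of the two boundaries; the score is largely
# insensitive to rotation, scale, and uniform changes in shading.
score = cv2.matchShapes(model, part, cv2.CONTOURS_MATCH_I1, 0.0)
print("shape distance (smaller is more similar):", score)
```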

A multifunction robot with 55.2-in. horizontal reach and 80-kg payload capacity moves the camera above the bin and signals that it is in position to take a picture. The PLC passes a request to the vision system, the camera captures an image, and the vision tool identifies each thread protector in the bin and calculates its location. The Visual Basic interface converts the pixel coordinates from the camera image into the millimeter coordinates required by the robot control system.
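
The pixel-to-millimeter conversion can be pictured as fitting an affine map between image coordinates and robot coordinates from a few points known in both frames. The sketch below, in Python with NumPy, uses invented values and is not the workcell's actual Visual Basic routine.

```python
# Hypothetical sketch of the pixel-to-millimeter conversion: fit an affine map
# from image coordinates to robot coordinates using points known in both frames.
# All coordinate values here are invented for illustration.
import numpy as np

pixel_pts = np.array([[100, 120], [800, 130], [110, 600], [790, 610]], dtype=float)
robot_mm  = np.array([[250.0, 400.0], [600.0, 395.0], [255.0, 160.0], [595.0, 155.0]])

# Solve [u v 1] @ A = [x y] for the 3x2 affine matrix A in a least-squares sense.
ones = np.ones((len(pixel_pts), 1))
A, *_ = np.linalg.lstsq(np.hstack([pixel_pts, ones]), robot_mm, rcond=None)

def pixels_to_mm(u, v):
    """Convert an image point (pixels) to the robot frame (millimeters)."""
    return np.array([u, v, 1.0]) @ A

print(pixels_to_mm(450, 360))
```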

A PLC directs the robot to pick one of the thread protectors from the bin. The thread protectors come in 11 sizes ranging from 4 in. to 8 in. dia. The vision system is trained on each part number. It identifies the location of good parts and detects the presence of parts of the wrong size that are intermingled with good parts.
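
The size check can be thought of as comparing the measured diameter against the part number being run and against the other trained sizes. The following sketch illustrates that logic only; the nominal diameters and the tolerance shown are assumptions, not the manufacturer's values.

```python
# Illustrative logic for accepting the part number being run and flagging
# wrong-size parts intermingled in the bin. The nominal diameters and the
# tolerance are assumptions, not the manufacturer's actual values.
NOMINAL_SIZES_IN = [4.0, 4.5, 5.0, 5.5, 6.0, 6.25, 6.625, 7.0, 7.5, 7.625, 8.0]  # 11 sizes, 4-8 in.
TOLERANCE_IN = 0.1

def classify_part(measured_dia_in, expected_dia_in):
    """Return 'good', 'wrong size', or 'unknown' for one located part."""
    if abs(measured_dia_in - expected_dia_in) <= TOLERANCE_IN:
        return "good"
    if any(abs(measured_dia_in - s) <= TOLERANCE_IN for s in NOMINAL_SIZES_IN):
        return "wrong size"   # a valid part number, but not the one being run
    return "unknown"

print(classify_part(6.02, 6.0))   # -> good
print(classify_part(5.48, 6.0))   # -> wrong size
```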

Robotic assembly

The robot hands off the part to a second robot (same model) that assembles the thread protector to the pipe. The PLC stores the positions of all parts in one layer of the bin and commands the first robot to pick them up one by one. When the layer is empty, the robot removes the cardboard divider and the camera takes an image to determine the locations of the parts in the next layer.
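
The layer-by-layer sequencing might be sketched as follows; the class and the placeholder values are hypothetical stand-ins for the PLC's stored positions and the robot commands.

```python
# Conceptual sketch of the layer-by-layer sequencing; the class and values are
# hypothetical stand-ins for the PLC's stored positions and robot commands.
class LayerPicker:
    def __init__(self):
        self.positions = []              # robot-frame positions from one camera image

    def load_layer(self, positions_mm):
        self.positions = list(positions_mm)

    def next_pick(self):
        """Return the next stored position, or None when the layer is exhausted."""
        return self.positions.pop() if self.positions else None

picker = LayerPicker()
picker.load_layer([(250.0, 400.0), (600.0, 395.0)])   # positions found in one image
while (pos := picker.next_pick()) is not None:
    print("pick part at", pos)                        # command the robot to this position
# Layer exhausted: remove the cardboard divider and re-image the next layer.
```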

The second robot carries the thread protector to a fixture holding the oil tool assembly, which exposes the sections of pipe where a thread protector is to be installed.

“Assembling two threaded fasteners is a challenging operation for a robot because the robot does not have the human operator’s ability to feel the connection between the threads,” said Kevin Ackerman, machine vision specialist at JMP Engineering. “The vision system helps overcome these challenges.”

An industrial machine vision camera attached to the second robot locates the pipe for thread protector installation. A red light shines on the pipe at an oblique angle to create a shadow that enables accurate measurement of the pipe diameter. The machine vision software's circle tool checks the diameter of the pipe to ensure it matches the thread protector and also determines the location of the pipe more accurately. The robot arm has a compliance device that allows the pipe thread to pull the arm and the thread protector along as the protector screws onto the pipe.
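
As an illustration of what a circle tool does, the sketch below locates the pipe end and estimates its diameter with OpenCV's Hough circle transform; the scale factor and detection parameters are assumptions, and the commercial circle tool works differently.

```python
# Stand-in for a circle tool: locate the pipe end and estimate its diameter with
# OpenCV's Hough circle transform. The scale factor and detection parameters
# are assumptions; the commercial tool works differently.
import cv2

MM_PER_PIXEL = 0.45                      # from calibration (assumed value)

gray = cv2.imread("pipe_end.png", cv2.IMREAD_GRAYSCALE)
gray = cv2.medianBlur(gray, 5)           # suppress noise before circle detection

circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                           param1=120, param2=60, minRadius=80, maxRadius=400)
if circles is not None:
    x, y, r = circles[0][0]              # strongest circle: center (px) and radius (px)
    print(f"pipe center ({x:.0f}, {y:.0f}) px, diameter {2 * r * MM_PER_PIXEL:.1f} mm")
```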

The most recent camera image is displayed on the screen along with results such as the size of the thread protector and the size of the pipe. The image and results from the part-picking robot appear on the left side of the screen, and those from the thread-assembly robot appear on the right. A configuration menu lets the operator set up the camera.

The automated calibration procedure takes advantage of a fixed, permanent target located near each robot. The camera mounted on each robot acquires four images of the target, moving a known distance between pictures. Based on these four images, the calibration routine determines the position of the robot in relation to the target.
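
One way to picture this calibration is to relate the known robot moves to the apparent shift of the fixed target in the image and solve for the transform in a least-squares sense. The sketch below uses invented numbers and is only a conceptual approximation of the actual routine.

```python
# Conceptual sketch of the calibration: relate the known robot moves between the
# four pictures to the apparent shift of the fixed target in the image, and solve
# for the transform by least squares. All numbers are invented for illustration.
import numpy as np

robot_moves_mm = np.array([[50.0, 0.0], [0.0, 50.0], [-50.0, 50.0]])          # moves between the 4 images
target_px      = np.array([[640, 480], [530, 478], [531, 367], [641, 256]], dtype=float)
target_shift_px = np.diff(target_px, axis=0)        # how the target appears to move in the image

# Fit the 2x2 matrix M that maps a robot move (mm) to an image shift (pixels).
M, *_ = np.linalg.lstsq(robot_moves_mm, target_shift_px, rcond=None)

# Its inverse converts an observed image displacement back to a robot-frame move.
px_to_mm = np.linalg.inv(M.T)
print("a 10-pixel shift in X corresponds to", px_to_mm @ np.array([10.0, 0.0]), "mm")
```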

“Calibration is a manual process that is executed on demand whenever someone believes that one of the robots has become inaccurate, perhaps because a gripper was bent or the camera was bumped out of position,” Ackerman said.

“The robot was commissioned in JMP’s plant,” Pytel said, “then shipped to the customer’s plant, where it is now running in production. It has demonstrated the ability to successfully pick and assemble thread protectors without fixturing or accurate locating in conditions that are common in oil tool manufacturing. There’s a good chance this application will lead to a new generation of vision-enabled robots that will help improve productivity and quality in the oil tool industry.”

Editor’s note: Applications where robots perform repetitive functions reduce the risk of human injuries and of errors caused by distraction or boredom.

– John Lewis is market development manager, Cognex Corp. Edited by Mark T. Hoske, content manager, CFE Media, Control Engineering.

www.cognex.com 

www.jmpeng.com 

https://www.controleng.com/machinevision has more case studies and application articles about machine vision.

Related article: Technology checklist for oil tool automated assembly, below.


Author Bio: John Lewis, contributing editor, Association for Advancing Automation (A3).