
Robotics, embedded PC make this handle applicator tick

'Dog-bone'-style carrier handles are applied at speeds up to 60 cases/min by a pair of vision-guided robots, and it's all governed by an industrial PC.

PakTech has turned to a robotic solution from Fanuc to add to its range of carrier-handle applicators a machine capable of operating at 60 cases/min. As with many robotic applications in packaging, this one relies on a machine vision system to communicate the exact location of the cases holding the bottles that are to receive handles. The vision system comes from Sony.

“Fanuc suggested the Sony camera and we elected to go with it partly because it has a wide field of view,” says Dan Shook, director of operations at PakTech. “The space where we capture the digital image is only about 3 or 3½ feet above the conveyor, so to get a good image to analyze, we needed that wide angle.” Control of the PakTech handle applicator falls to a Beckhoff CX1020 embedded industrial PC.

Shook says this is PakTech’s first use of robotics. “We used to use linear actuators on a system that was what you might call ‘robot-like’ but was really more of a pick-and-place system. Speeds were limited because it was an intermittent-motion solution,” says Shook. “We tried a few options that would get us to continuous motion, but the price started to reach a point where who would buy it?” By integrating the Fanuc robots into its machine, PakTech brought both continuous motion and affordability within reach.

“One of the key benefits with the controller from Beckhoff is that we can build our own HMI applications, including an ability to offer customers remote assistance capabilities,” says Shook. The HMI panel, he notes, is also from Beckhoff.

The two robots on the PakTech system operate as a tag team, with the upstream robot functioning as master and the downstream robot as slave. As a case passes through the Sony camera’s field of view, the camera takes a picture at fixed encoder-count intervals. That data is used to mark the position and orientation of the case. Because this data is shared across the network, and the robots are on the network, they, too, know the position and orientation of each case. From there it’s a matter of tracking that position down the line. If the upstream robot is occupied and can’t handle a case, the downstream robot “knows” this and will apply the handles itself, because the two robots share information over the EtherCAT network.
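The tag-team logic described above can be sketched in simplified form. This is not PakTech's actual control code; it is a minimal Python illustration of the idea, assuming a case's position is recorded as an encoder count at capture time, that each robot services a fixed encoder window downstream of the camera, and that the upstream robot gets first refusal on every case. All names, windows, and values are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Case:
    position: float      # encoder count at the moment the camera imaged the case
    orientation: float   # case orientation in degrees, as reported by vision
    handled: bool = False

@dataclass
class Robot:
    name: str
    reach_start: float   # start of this robot's service window (encoder counts
    reach_end: float     # downstream of the camera) and its end
    busy: bool = False

    def can_take(self, case: Case, encoder: float) -> bool:
        # A case is reachable when the distance it has traveled since capture
        # falls inside this robot's window and the robot is free.
        traveled = encoder - case.position
        return (not self.busy and not case.handled
                and self.reach_start <= traveled <= self.reach_end)

def dispatch(cases: List[Case], robots: List[Robot], encoder: float) -> None:
    # Robots are listed upstream-first, so the master gets first refusal;
    # if it is busy, the slave sees the same shared case data and takes over.
    for case in cases:
        for robot in robots:
            if robot.can_take(case, encoder):
                case.handled = True
                break
```

In the real system this shared state would live on the EtherCAT network rather than in a Python list, but the handoff decision reduces to the same check: each robot compares the tracked case position against its own reach and availability.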
