An important goal when applying robotics in packaging is to avoid having to create separate vision and robot programs in different software environments that then must communicate with each other.
So it’s welcome news that DENSO’s binary control access protocol (b-CAP) robot interface is now supported by Matrox Imaging’s Design Assistant machine vision software. The Design Assistant application development environment can be used to implement vision-based guidance of DENSO robots without the need to directly program a DENSO robot controller.
Matrox Design Assistant 4 is the first hardware-independent integrated development environment (IDE) that lets users easily create an application flowchart and HMI and take projects from concept to completion in record time, without the need for conventional programming. Flexible project deployment options include Matrox Iris GT smart cameras, Matrox 4Sight GPm industrial imaging computers, and PCs with standard GigE Vision® or USB3 Vision® cameras.
“Matrox Imaging’s implementation of our b-CAP interface within Design Assistant means that all logic for the vision subsystem and the robot itself can be implemented within a single development environment,” says Yousuke Sawada, Manager, Controller Business Unit, DENSO WAVE Incorporated. “Integrators will appreciate the significant reduction in development time and cost this affords.”
“Demand for robots is ever-increasing as companies seek the benefits of automation, and DENSO robots are widely used in manufacturing industries around the world,” says Sam Lopez, director of sales and marketing, Matrox Imaging. “With our support of DENSO’s b-CAP interface in Design Assistant, our joint customers can now more quickly and cost-effectively deploy DENSO robots with vision-based guidance to improve quality, increase productivity, ensure safety and provide flexibility in their operations.”
In a typical screenshot of the Matrox Design Assistant flowchart-based vision software development environment, the main element is a completed flowchart for locating and picking parts lying on a planar surface. The flowchart has steps to connect to and initialize the robot, capture an image, locate the parts in the image, and instruct the robot to pick one part at a time until no parts remain; if an abort is requested, the program parks the robot and closes the connection. The configuration panel drills into the flowchart step that instructs the robot to pick a part, showing the command sent to the robot in the robot’s native language.
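The control flow of the flowchart described above can be sketched in plain Python. This is a minimal illustration only: every function name below is a hypothetical stand-in for a flowchart step, not an actual Design Assistant or b-CAP API call, and the vision and robot steps are mocked rather than real.

```python
# Sketch of the pick-and-place flowchart logic described in the article.
# All names are hypothetical stand-ins; in practice Design Assistant
# handles these steps graphically and talks to the robot over b-CAP.

def connect_robot():
    """Stand-in for the 'connect to and initialize the robot' step."""
    return {"connected": True, "parked": False}

def capture_image():
    """Stand-in for the image-capture step; returns a dummy frame."""
    return "frame"

def locate_parts(image):
    """Stand-in for the vision step: pretend three parts were found,
    each reported as an (x, y) pose on the planar surface."""
    return [(10.0, 20.0), (35.5, 12.0), (50.2, 48.7)]

def pick_part(robot, pose):
    """Stand-in for sending a native-language pick command to the robot."""
    return pose

def park_and_disconnect(robot):
    """Stand-in for the abort/finish step: park and close the connection."""
    robot["parked"] = True
    robot["connected"] = False

def run_pick_cycle():
    robot = connect_robot()
    picked = []
    parts = locate_parts(capture_image())
    while parts:                      # pick one part at a time
        pose = parts.pop(0)           # until no parts remain
        picked.append(pick_part(robot, pose))
    park_and_disconnect(robot)
    return picked, robot
```

The point of the single-environment approach the article describes is that both halves of this loop, the vision step (`locate_parts`) and the robot commands (`pick_part`), live in one project rather than in two programs that must exchange coordinates over a separate link.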