
Basic Introduction

The FANUC robot is an LR Mate 6-axis robotic arm with an R-30iA Mate controller.
The robot can be controlled using a teach pendant. The teach pendant is used to
increment each joint angle; the speed of the robot can also be changed there.

Control Block Diagram

MATLAB → Visual Basic → PCDK libraries → Robot Controller
MATLAB is used to calculate or generate the coordinates of the end effector. The coordinates are then transferred to Visual Basic (VB), where the program to move the robot is written. The compiled program uses the PCDK libraries to call the required functions. The program's commands are then transferred to the robot controller, which moves the robot.
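As a rough illustration of the MATLAB side of this chain, the sketch below generates a set of end-effector coordinates and writes them to a plain text file for the VB program to read. The circular trajectory, file name, and file-based hand-off are only assumptions for illustration; the slides do not specify how the coordinates are actually passed to VB.

```matlab
% Sketch of the MATLAB step in the control chain (assumed hand-off via a
% CSV file that the Visual Basic / PCDK program reads; not the actual interface).

% Generate example end-effector coordinates: way-points on a circle of
% radius 50 mm around an assumed centre in the user frame, at a fixed height.
theta = linspace(0, 2*pi, 36).';        % 36 way-points around the circle
cx = 200; cy = 0; z = 50;               % assumed centre and height [mm]
x = cx + 50*cos(theta);
y = cy + 50*sin(theta);
coords = [x, y, repmat(z, numel(theta), 1)];

% Write the way-points to a text file for the VB program to consume.
writematrix(coords, 'endEffectorCoords.csv');   % columns: X, Y, Z [mm]
```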

Form And Code

Code's Structure
First we made buttons on the form, and then, by defining functions for those
buttons, we programmed the robot's motion.
Outline:
1) Connect to the robot controller through its IP address
2) Read the joint angles
3) Read the gripper position
4) Move the end effector

We have defined a new user frame; we want to move the robot according to this user frame. We also want to take images and calibrate them. The coordinates of this user frame relative to the world frame, and the joint positions, are shown in this picture:

Define your desired user frame on the teach pendant

iRvision

By typing the IP address of the robot controller (192.168.1.18) into the browser's address bar, we can access the iRvision software:

Put the IP in the address bar

First of all, we have to calibrate the images we get from the camera by starting a new calibration procedure:

This button creates Camera Calibration tools; because of software issues, it has no specific icon on it.

Created as a new Calibration Tool

With the sheet that belongs to the calibration method, we calibrate the images (note that the robot's gripper has to be set at the origin of the user frame while the procedure is running). Also note the data and combo-box adjustments:

Read the grid spacing from here

The focal distance is an important parameter, which can be found in Fujinon's catalogue. It has to be correct to give us the right "Z" height.

Read it from Fujinon's catalogue
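The reason the focal distance must be right can be seen from the pinhole-camera relation that grid calibration relies on. The sketch below is only a numerical illustration of that relation; the focal distance, grid spacing, measured spacing, and pixel size are all assumed example values, not data from our setup.

```matlab
% Pinhole-camera relation:  Z = f * S / (p * mu), where
%   f  = lens focal distance [mm] (from the Fujinon catalogue),
%   S  = calibration grid spacing [mm],
%   p  = measured spacing between grid dots in the image [pixels],
%   mu = physical size of one sensor pixel [mm/pixel].
% All values below are assumed, for illustration only.
f  = 12;        % assumed focal distance [mm]
S  = 15;        % assumed grid spacing [mm]
p  = 40;        % assumed dot spacing in the image [pixels]
mu = 0.0047;    % assumed pixel size [mm]

Z = f * S / (p * mu);   % estimated camera height above the grid [mm]
fprintf('Estimated Z height: %.1f mm\n', Z);
```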

Note the camera and application frame (user frame) positions. These data are generated after completing the setup (you can always check their validity):

The height of the camera relative to the calibration grid (surface)

Calibration gives the coordinates of the points on the screen:

Coordinates of calibration points

The point-coordinates bar, seen from a closer view:


Don't forget to save the calibration after the calibration procedure is complete. After removing the calibration sheet, we can still see that the calibrated points remain on the screen:

Save the calibration after you are done with it.

From "Vision Process tools" section open up a new "2D single view
locator". We named it "FRIDAY" as you can see the name in the picture.
In the following slides we are going to adjust the adjustments to get the
desired locating tool to detect the position of objects below the camera
relative to Cal Grid:
This button is for creation of Vision
Process tools. Because of software
issues, it has no specific shape on it.

Created as 2D
single view locator
process tool

In the beginning there is one GPM locator, which can only be adjusted to detect one specific object; you can see the adjustment toolbar for this specific object among the other objects in the next slide.

Red box: search window

After adjusting the search window, press OK.

Set the search window

One should look up the use of each button in the manual. In brief, three buttons are used the most:

1) Teach Pattern
2) Set Origin
3) Set Search Window

After setting the adjustments, we can detect the specific objects among the other objects; by adjusting the contrast we can detect the other three similar objects.

Not detected because it is out of the search window
Finding Model ID 1
Not detected because of low contrast

Here the contrast has been adjusted and seven similar objects have been detected by the process. Note that the 8th object is outside the blue search window adjusted in the previous steps, so it has not been detected.

Finding Model ID 1

Here we have three kinds of objects. In the following slides we are going to detect each one of them; here, the objects of model ID 1 have been detected:

You can add a new GPM (geometric pattern match) locator tool for each new object. Then just adjust the pattern, the origin, and, if necessary, the search window.

The new object is detected according to its GPM locator:

Finding Model ID 2

GPM locator adjustment and the third detected object in the image:

Finding Model ID 3

After everything is done with the GPM locators, we can detect all the objects together:

Each item has its own number on the screen; here we have eight objects of model ID 1, one of model ID 2, and one of model ID 3:

Use the proper calibration (check slides 12-18)

The coordinates of the items relative to the calibration grid frame, together with their model ID numbers, are listed in the table below iRvision's screen:

Example: in this random image, we want to detect objects of model ID 1 among the other objects:

Using the FTP connection between the PC and the robot's controller, we can obtain this generated picture:

Not detected because it is out of the search window
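A minimal sketch of how the generated picture could be pulled over that FTP connection from within MATLAB is shown below. The controller-side directory, file name, and anonymous login are assumptions for illustration; only the controller IP comes from the slides.

```matlab
% Sketch: fetch the iRvision image from the robot controller over FTP.
% Directory, file name, and login details are assumed for illustration.
f = ftp('192.168.1.18');           % controller IP from the iRvision setup
cd(f, 'md:');                      % assumed device/directory on the controller
mget(f, 'vision.png');             % assumed name of the generated picture
close(f);

img = imread('vision.png');        % load the downloaded picture
imshow(img);
```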

Detected model ID 1 objects among the other ones; compare the pictures:

We can open the generated picture with MATLAB and, using image processing commands, detect the origin of each cross:

The directory in which the FTP-generated picture is stored.

Pixel coordinates of the 7th detected object's middle cross
The origin of each cross has been detected, and a dot has been placed there to show that we can obtain the origin coordinates of each detected item. In this picture you can see the dots in the middle of the crosses:

Dots in the middle of the crosses show that we know their exact pixel coordinates.
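The slides do not reproduce the actual MATLAB script, but a minimal sketch of this cross-detection step could look like the following. The file name, threshold, and minimum blob size are assumptions and would need tuning to the real images.

```matlab
% Sketch: find the centre ("origin") of each cross in the FTP-generated
% picture and mark it with a dot. File name and tuning values are assumed.
img = imread('vision.png');                 % assumed file name
if size(img, 3) == 3
    gray = rgb2gray(img);                   % colour image from the controller
else
    gray = img;                             % already grayscale
end

bw = imbinarize(gray);                      % threshold; may need inverting or
                                            % colour-based selection depending
                                            % on how the crosses are drawn
bw = bwareaopen(bw, 50);                    % assumed minimum blob size [pixels]

stats   = regionprops(bw, 'Centroid');      % centroid of each connected blob
centres = cat(1, stats.Centroid);           % N-by-2 list of [x, y] pixel coords

imshow(img); hold on
plot(centres(:,1), centres(:,2), 'r.', 'MarkerSize', 12);  % dots on the crosses
hold off
```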

The future goals are:

1) Distinguishing the crosses that belong to different objects.
2) Transferring the pixel coordinates into 2-D MATLAB coordinates (a possible starting point is sketched after this list).
3) Moving the robot's gripper above the object, picking it up, and then placing it in a desired location.
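For the second goal, one possible starting point is a simple scale-and-offset mapping from pixel coordinates to calibration-grid millimetres. The scale factor, grid origin, and example points below are placeholders; in practice these values would come from the iRvision calibration data shown earlier, and the image y-axis may need flipping.

```matlab
% Sketch for future goal 2: map pixel coordinates to 2-D coordinates in the
% calibration grid (user) frame. Scale and origin are assumed placeholders;
% in practice they come from the grid calibration data.
mmPerPixel = 0.35;                 % assumed scale from the grid calibration
originPx   = [320, 240];           % assumed pixel location of the grid origin

pixelPts = [412 318; 150 96];      % example pixel coords (e.g. cross centres)
gridPts  = (pixelPts - originPx) * mmPerPixel;   % N-by-2 [X, Y] in mm
disp(gridPts)                      % coordinates relative to the grid origin
```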
