
MECHATRONICS
ROBOTICS

BY Khalil Raza Bhatti

November 23, 2010

QUAID-E-AWAM UNIVERSITY OF ENGINEERING, SCIENCE & TECHNOLOGY, NAWABSHAH-PAKISTAN


Q No. 1: Differentiate between robots and robotics. How are robots classified? Define robot components and robot degrees of freedom. Identify robot coordinates and reference frames.
The word robot is often applied to any device that works automatically or by remote control,
especially a machine (automaton) that can be programmed to perform tasks normally done by
people.

Before the 1960s, robot usually meant a manlike mechanical device (mechanical man or
humanoid) capable of performing human tasks or behaving in a human manner. Today robots
come in all shapes and sizes, including small robots made of LEGO, and larger wheeled robots
that play robot football with a full-size ball.

What many robots have in common is that they perform tasks that are too dull, dirty, delicate
or dangerous for people. Usually, we also expect them to be autonomous, that is, to work using
their own sensors and intelligence, without the constant need for a human to control them.
Looked at this way, a radio-controlled airplane is not a robot, nor are the radio-controlled
combat robots that appear on television. However, there is no clear dividing line between fully
autonomous robots and human-controlled machines. For example, the robots that perform
space missions on planets like Mars may get instructions from humans on Earth, but since it can
take about ten minutes for messages to travel back and forth, such a robot has to be autonomous
during that time.

Robotics is the engineering science and technology of robots: their design, manufacture,
application, and structural disposition. Robotics is related to electronics, mechanics, and software.

Three Laws of Robotics
Asimov proposed three "Laws of Robotics", and later added a 'zeroth law'.
Law Zero: A robot may not injure humanity, or, through inaction, allow humanity to come to harm.
Law One: A robot may not injure a human being, or, through inaction, allow a human being to come to harm, unless this would violate a higher-order law.
Law Two: A robot must obey orders given to it by human beings, except where such orders would conflict with a higher-order law.
Law Three: A robot must protect its own existence as long as such protection does not conflict with a higher-order law.
The image of the "electronic brain" as the principal part of the robot was pervasive. Computer
scientists were put in charge of robot departments of robot customers and of factories of robot
makers. Many of these people knew little about machinery or manufacturing but assumed that
they did.

Main Parts of a Robot


A robot has five main parts:

1. Arm
2. Controller
3. Drive
4. End Effector
5. Sensor

Arm
The arm of the robot is a significant part of the robotic architecture. It positions the end
effector and the sensors that the robot requires. Most robot arms resemble the human arm;
some have many complex parts, including fingers, wrists, and elbows. This gives the robot
different methods of movement.

Controller
The controller functions as the "brain" of the robot. It can also network to
other systems so that the robot may work together with other robots or
machines. Controllers can become very complicated. There are many computer-based
controllers on the market and many robot languages, such as Prolog.

Drive
The drive is the engine of the robot. It enables mobility and
movement between the joints of the arm. It can be powered by
compressed air (pneumatic), electricity (electric motors), or pressurized fluid (hydraulic).

End Effector
The end effector is the hand connected to the arm. In humans, the end effector is the hand;
in robots, the end effector can be many different things, ranging from a tweezer to a blowtorch.

Sensors
Sensors provide a robot with feedback so that it can "understand" its surroundings; otherwise a
robot would be not only blind, but also deaf to its environment.

A few common kinds of sensors are listed below.

 Cameras - Cameras are inexpensive and usable for many kinds of imaging applications.
They enable a robot to process its environment so that it can move freely without
bumping into something.
 Range finding devices - There are four basic techniques for distance measurement using
electromagnetic radiation: Doppler methods, interferometry, phase comparison, and
pulse timing.
 Sonar sensors - These sensors measure the time it takes for an acoustic pulse to
propagate through air or water, reflect from the environment, and return to a detector;
this round-trip time is proportional to the distance to the reflecting object (see the sketch below).
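
To illustrate the time-of-flight principle used by sonar sensors, here is a minimal Python sketch; the speed-of-sound value and the function name are assumptions made for illustration, not any particular sensor's API.

# Minimal sketch: sonar (time-of-flight) ranging.
# Assumes the sensor reports the round-trip time of the acoustic pulse in seconds.

SPEED_OF_SOUND_AIR = 343.0    # m/s at roughly 20 degrees C (assumed medium)

def sonar_distance(round_trip_time_s, speed_of_sound=SPEED_OF_SOUND_AIR):
    """Return the distance (in metres) to the reflecting object.

    The pulse travels to the object and back, so the one-way distance
    is half of speed * time.
    """
    return 0.5 * speed_of_sound * round_trip_time_s

# Example: an echo received 5.8 ms after the pulse was emitted
print(sonar_distance(0.0058))   # about 0.99 m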

Robot Degrees of Freedom:


In general, a rigid body in d dimensions has d(d + 1)/2 degrees of freedom (d translations
and d(d − 1)/2 rotations). One line of reasoning for the number of rotations is that rotational
freedom is the same as fixing a coordinate frame. The first axis of the new frame is
unrestricted, except that it has to have the same scale as the original, so it has (d − 1) DOFs. The
second axis has to be orthogonal to the first, so it has (d − 2) DOFs. Proceeding in this way, we
get d(d − 1)/2 rotational DOFs in d dimensions. In 1, 2 and 3 dimensions, then, we have one,
three, and six degrees of freedom respectively.
A non-rigid or deformable body may be thought of as a collection of many minute particles
(infinite number of DOFs); this is often approximated by a finite DOF system. When motion
involving large displacements is the main objective of study (e.g. for analyzing the motion of
satellites), a deformable body may be approximated as a rigid body (or even a particle) in order
to simplify the analysis.

In three dimensions, the six DOFs of a rigid body are sometimes described using these nautical
names:

1. Moving up and down (heaving);


2. Moving left and right (swaying);
3. Moving forward and backward (surging);
4. Tilting forward and backward (pitching);
5. Turning left and right (yawing);
6. Tilting side to side (rolling).
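
To make the counting argument concrete, the small Python sketch below computes d(d + 1)/2 for d = 1, 2, 3 and labels the six motions of the three-dimensional case; the structure and names are illustrative only.

# Sketch: degrees of freedom of a rigid body in d dimensions.
def rigid_body_dof(d):
    translations = d                # one translation per axis
    rotations = d * (d - 1) // 2    # one rotation per pair of axes
    return translations + rotations

for d in (1, 2, 3):
    print(d, "D:", rigid_body_dof(d), "DOF")   # 1, 3, 6

# The six DOFs of a rigid body in 3-D, using the nautical names above.
six_dof_names = {
    "heave": "up/down translation",
    "sway":  "left/right translation",
    "surge": "forward/backward translation",
    "pitch": "tilt forward/backward",
    "yaw":   "turn left/right",
    "roll":  "tilt side to side",
}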
Robot Coordinates or Frame of Reference:
The basic idea of perceptive planning and control theory is to introduce the concept of a
perceptive action reference, a parameter that is directly relevant to the measured sensory
outputs and the task. Instead of time, the control input is parameterized by the perceptive
action reference. Since the action reference is a function of the real-time measurement, the
values of the desired vehicle states are functions of the measured data. This creates a
mechanism to adjust or modify the plan based on the measurements. Thus, the planning
becomes a closed-loop real-time process. The planner generates the desired
values of the system according to the on-line computed action reference parameter s. This
perception-based planning and control scheme has been successfully applied to deal with
unexpected obstacles during robot motion and multi-robot coordination (Xi et al., 1996). When
multiple robots in a formation are involved in the same mission, perceptive reference
projection can be used for coordinated motion control of the multiple vehicles. To extend the
perceptive planning and control theory to formation control composed of heterogeneous
robots, the perceptive motion reference has to be chosen such that all the information about the
robots in the formation is properly represented. As shown in Figure 2(b), the perceptive
reference not only considers the system output of one robot, but also the mission of the
formation described by the system output of all the robots in the formation. The motion of the
robot in this formation is coordinated by the common motion reference, s, which is related to
the system output of the robots. Since the task planner is driven by s, instead of time, the
behavior of one robot in the formation will affect the mission of the formation by affecting the
motion reference s. For example, if the motion of one robot is stopped by an unexpected event,
such as an obstacle in unstructured environments, this event will affect the computation of s
according to the specification of the coordination scheme.
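
To make this idea concrete, here is a minimal Python sketch (not the scheme of Xi et al., 1996; the names, the path, and the "minimum progress" rule are all assumptions made for illustration): the planner is driven by an action reference s computed from measurements, so a blocked robot automatically holds back the desired states of the whole formation.

# Minimal sketch of a perceptive action reference for formation motion.
# Assumption: each robot reports its measured progress along a shared path,
# normalised to [0, 1]; the common reference s is the minimum progress, so a
# robot that is blocked by an obstacle holds the whole formation back.

def action_reference(measured_progress):
    """Compute the common motion reference s from measured robot progress."""
    return min(measured_progress)

def desired_state(s, path):
    """Planner output: the desired pose is a function of s, not of time."""
    return path(s)

# Example: a straight-line path parameterised by s, and three robots where
# one has been stopped at s = 0.30 by an unexpected obstacle.
path = lambda s: (10.0 * s, 0.0)          # (x, y) in metres
progress = [0.42, 0.30, 0.45]
s = action_reference(progress)
print("s =", s, "-> desired pose", desired_state(s, path))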

Q No. 2: Define robot applications and the typical workspace for common robot configurations.
Robot Applications:
Robotics has been of interest to mankind for over one hundred years. However, our perception
of robots has been influenced by the media and Hollywood. One may ask what robotics is
about. In my eyes, a robot's characteristics change depending on the environment it operates
in. Some of these environments are:
Outer Space - Manipulative arms that are controlled by a human are used to unload the
docking bay of space shuttles, to launch satellites, or to construct a space station.
The Intelligent Home - Automated systems can now monitor home security, environmental
conditions and energy usage. Doors and windows can be opened automatically, and appliances
such as lighting and air conditioning can be pre-programmed to activate. This assists occupants
irrespective of their state of mobility.
Exploration - Robots can visit environments that are harmful to humans. An example is
monitoring the environment inside a volcano or exploring our deepest oceans. NASA has used
robotic probes for planetary exploration since the early sixties.
Military Robots - Airborne robot drones are used for surveillance in today's modern army. In
the future automated aircraft and vehicles could be used to carry fuel and ammunition or clear
minefields.
Farms - Automated harvesters can cut and gather crops. Robotic dairies are available allowing
operators to feed and milk their cows remotely.
The Car Industry - Robotic arms that are able to perform multiple tasks are used in the car
manufacturing process. They perform tasks such as welding, cutting, lifting, sorting and
bending. Similar applications, but on a smaller scale, are now being planned for the food
processing industry, in particular the trimming, cutting and processing of various meats such as
fish, lamb and beef.
Hospitals - Under development is a robotic suit that will enable nurses to lift patients without
damaging their backs. Scientists in Japan have developed a power-assisted suit which will give
nurses the extra muscle they need to lift their patients and avoid back injuries.

Automated Hauling
Several home robots will carry dishes and other small loads from room to room. A friend,
recovering from hip surgery, used his Cye to carry food from the kitchen to the living room, and
the dirty dishes back into the kitchen again. Since he was on crutches, this was a real lifesaver.
Security
Home robots could easily be tied into a computerized home security system, and the robot's
mobility would allow more areas in the home to be protected.
Alarm Clock
With a little work I will soon be able to use Cybert as an alarm clock. Every morning he will roll
into my bedroom and wake me up; once he senses that I'm out of bed he will follow me into the
bathroom and deliver up-to-the-minute news, weather, sports, and stock market information.
Home Automation
It would be a fairly easy task to connect a robot to an X10 home automation system. The robot,
linked to your PC, would then have access to lights, security features, and more.
Entertainment
Robotics is an exciting hobby for many people around the world. There are countless clubs,
websites, and books that have been written for those who are interested in the topic.
Education
Using a home robot like Cye not only teaches about robotics; it teaches spatial navigation,
mapping, dead reckoning, programming, and more.
Hazard Detection
It would be fairly easy to attach fire, smoke, carbon monoxide, and other detectors to a home
robot. Every night the robot could "make the rounds" to ensure that everything is okay.

Typical Workspace for Robot Configuration:


In order to study the workspace of a robot, the structure of the robot can be considered as
consisting of the arm and the hand. The arm is the large regional structure for global positioning
of the hand, which is the small orientation structure for orientating the tool.

The primary workspace of such a robot with a large regional structure and a small orientation
structure is determined by the arm. The hand generates the secondary workspace of a robot.

In performing tasks, a manipulator has to reach a number of workpieces or fixtures. The
workspace is the volume of space which the end-effector of the manipulator can reach.
Workspace is also called work volume or work envelope. The size and shape of the workspace
depend on the coordinate geometry of the robot arm, and also on the number of degrees of
freedom. Some workspaces are quite flat, confined almost entirely to one horizontal plane.
Others are cylindrical; still others are spherical. Some workspaces have very complicated shapes.

When choosing a robot arm for a certain industrial purpose, it is important that the workspace
be large enough to encompass all the points that the robot arm will need to reach. But it is
wasteful to use a robot arm with a workspace much bigger than necessary.
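
As a toy illustration of this check, the following Python sketch tests whether a set of required points lies inside a highly simplified spherical workspace; the reach value, the target points, and the spherical shape itself are assumptions, since real workspaces are usually far more complicated.

import math

# Sketch: verify that every point a task requires lies inside a (very
# simplified) spherical workspace of given reach, centred on the robot base.
def inside_spherical_workspace(point, reach, base=(0.0, 0.0, 0.0)):
    return math.dist(point, base) <= reach

targets = [(0.4, 0.2, 0.5), (0.9, 0.9, 0.3)]   # required workpiece positions (m)
reach = 1.0                                     # assumed maximum reach (m)
print([inside_spherical_workspace(p, reach) for p in targets])   # [True, False]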

The problem presented here is for the robot described in the previous section. We have seen that
the hand of the robot moves with respect to its base by using linear actuators that allow one to
change the distance between the pairs of points. But in practice these actuators have a
limited stroke, and consequently the distance between the points has to lie within a given
range:

Furthermore, each actuator is attached at its end points by joints (typically ball-and-socket or
universal joints) that have limited motion. Hence the platform may reach only a limited set of
positions/orientations, which is called its workspace.

Assume now that the hand has to follow a time-dependent trajectory: it is clearly important to
verify that all the points of this trajectory lie within the workspace of the robot. The trajectory is
defined in the following manner: the position/orientation parameters are written as analytic
functions of the time T (which is assumed to lie in the range [0,1] without loss of generality).
Using the solution of the inverse kinematics (see the previous section) it is then possible to
express the distances as functions of time. For the trajectory to lie in the workspace we
have to verify the 12 inequalities:

when T lies in the range [0,1]. As the analytical form of the position/orientation parameters may
be arbitrary, we are looking for a generic algorithm that can deal with such an arbitrary trajectory.

This can easily be done with interval analysis (see [9] for a detailed version). First we define the
trajectory in Maple and compute the analytical form of the distances (and of any other constraint
that may limit the workspace of the robot). We get a set of inequalities that has to be satisfied for
any T in [0,1] if the trajectory lies within the workspace. The analytical forms of these inequalities
are written to a file: this allows their interval evaluation for any T range by using the ALIAS parser.
Then the general solving procedure of ALIAS may be used to determine if there is a T such that
at least one constraint is violated.

An important point is that the algorithm allows one to deal with the uncertainties in the problem. A
first uncertainty occurs when controlling the robot along its nominal trajectory. Indeed, the robot
controller is not perfect and there will be a positioning error: for a nominal value of the
position/orientation parameters, the reached pose will differ by an amount that can be
bounded. A second source of uncertainty is due to the differences between the theoretical
geometrical model of the robot and its real geometry. Indeed, to solve the inverse kinematics we
use the coordinates of the attachment points, expressed in the reference frame and in the model
frame. In practice, however, these coordinates are known only up to a given accuracy: hence these
coordinates for the real robot may have any value within given ranges. Hence the inequalities of
the problem do not have fixed-value coefficients but interval coefficients. But this is no problem for
interval analysis, as the general solving procedures can deal with such inequalities. Hence, if the
algorithm finds that all inequalities are verified for any T in [0,1], then this means that, whatever
the real robot and the positioning error of the robot controller, the trajectory
followed by the robot will fully lie within the robot workspace.
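
The Python sketch below is not ALIAS and does not use true interval arithmetic; it only illustrates the underlying idea under stated assumptions: a constraint g(T) <= 0 is verified over T in [0, 1] by recursive bisection, bounding g on each sub-interval with an assumed Lipschitz constant. The sample constraint and leg-length profile are made up for the example.

# Sketch: verify that a constraint g(T) <= 0 holds for every T in [0, 1],
# in the spirit of the interval-analysis check described above.
# Instead of true interval arithmetic we bound g on a sub-interval using a
# (user-supplied, assumed) Lipschitz constant L: on [a, b] with midpoint m,
#     g(t) <= g(m) + L * (b - a) / 2.

def holds_everywhere(g, L, a=0.0, b=1.0, tol=1e-6):
    m = 0.5 * (a + b)
    if g(m) > 0.0:
        return False                      # counterexample found at T = m
    if g(m) + L * (b - a) / 2.0 <= 0.0:
        return True                       # safe on this whole sub-interval
    if b - a < tol:
        return True                       # undecided below resolution; a real tool would flag this
    return holds_everywhere(g, L, a, m, tol) and holds_everywhere(g, L, m, b, tol)

# Example: a (made-up) leg length rho(T) must stay below rho_max = 1.5
# along the whole trajectory.
rho = lambda T: 1.0 + 0.4 * T * (1.0 - T)
print(holds_everywhere(lambda T: rho(T) - 1.5, L=0.4))   # True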

Q No. 3: How is a point in space represented in matrix form? How is a vector in space represented in matrix form? Represent a frame at the origin of a fixed reference frame.

Representing a point in space


[Figure: a point P located by cylindrical-polar coordinates (r, θ, z), with Cartesian basis vectors i, j, k and origin O on the cylinder axis]

To specify the location of a point in cylindrical-polar coordinates, we choose an origin at
some point on the axis of the cylinder, select k to be parallel to the axis of the cylinder, and
choose a convenient direction for the basis vector i, as shown in the figure. We then use
the three numbers (r, θ, z) to locate a point inside the cylinder.



In words:
r is the radial distance of P from the axis of the cylinder;
θ is the angle between the i direction and the projection of OP onto the i,j plane;
z is the length of the projection of OP onto the axis of the cylinder.

By convention, r > 0 and 0 ≤ θ < 2π.

II.2.2 Converting between cylindrical-polar and rectangular Cartesian coordinates

The formulas below convert from Cartesian (x, y, z) coordinates to cylindrical-polar
coordinates and back again:

r = sqrt(x^2 + y^2),   θ = tan^-1(y/x),   z = z

x = r cos θ,   y = r sin θ,   z = z
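
A small Python sketch of these conversions (using atan2 rather than tan^-1 so the angle falls in the correct quadrant; the function names are illustrative):

import math

# Sketch: convert a point between Cartesian (x, y, z) and cylindrical-polar
# (r, theta, z) coordinates, following the formulas above.
def cart_to_cyl(x, y, z):
    r = math.hypot(x, y)
    theta = math.atan2(y, x)      # quadrant-aware version of tan^-1(y/x)
    return r, theta, z

def cyl_to_cart(r, theta, z):
    return r * math.cos(theta), r * math.sin(theta), z

print(cart_to_cyl(1.0, 1.0, 2.0))                  # (1.414..., 0.785..., 2.0)
print(cyl_to_cart(*cart_to_cyl(1.0, 1.0, 2.0)))    # back to (1.0, 1.0, 2.0) approximately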

Cylindrical-polar representation of vectors

[Figure: cylindrical-polar basis vectors e_r, e_θ, e_z at point P, shown together with the Cartesian basis i, j, k and origin O]

When using cylindrical-polar coordinates, all vectors are expressed as components in the
basis {e_r, e_θ, e_z} shown. In words:
e_r is a unit vector normal to the cylinder at P;
e_θ is a unit vector circumferential to the cylinder at P, chosen to make {e_r, e_θ, e_z}
a right-handed triad;
e_z is parallel to the k vector.

You will see that the position vector of point P would be expressed as

p = r e_r + z e_z

Note also that the basis vectors are intentionally chosen to satisfy

e_r = ∂p/∂r,   e_θ = (1/r) ∂p/∂θ,   e_z = ∂p/∂z

and are therefore the natural basis for the coordinate system.

II.2.4 Converting vectors between cylindrical and cartesian bases

[Figure: cylindrical-polar and Cartesian basis vectors at P, as in the previous figure]

Let a = a_r e_r + a_θ e_θ + a_z e_z be a vector, expressed as components in {e_r, e_θ, e_z}.
It is straightforward to show that the components of a in {i, j, k} (a = a_x i + a_y j + a_z k) are

a_x = a_r cos θ − a_θ sin θ
a_y = a_r sin θ + a_θ cos θ
a_z = a_z

As a matrix,

[a_x]   [cos θ   −sin θ   0] [a_r]
[a_y] = [sin θ    cos θ   0] [a_θ]
[a_z]   [0        0       1] [a_z]

The reverse of this transformation is

a_r = a_x cos θ + a_y sin θ
a_θ = −a_x sin θ + a_y cos θ
a_z = a_z

In matrix form,

[a_r]   [ cos θ   sin θ   0] [a_x]
[a_θ] = [−sin θ   cos θ   0] [a_y]
[a_z]   [ 0       0       1] [a_z]
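
The same transformation can be written as a short numpy sketch; the matrix is the one given above, and the variable names are illustrative only:

import numpy as np

# Sketch: convert vector components between the cylindrical basis
# {e_r, e_theta, e_z} and the Cartesian basis {i, j, k} at angle theta.
def cyl_to_cart_matrix(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0],
                     [s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

theta = np.pi / 4
a_cyl = np.array([1.0, 0.0, 2.0])     # components (a_r, a_theta, a_z)
Q = cyl_to_cart_matrix(theta)
a_cart = Q @ a_cyl                    # components (a_x, a_y, a_z)
a_back = Q.T @ a_cart                 # Q is orthogonal, so Q^T reverses the transformation
print(a_cart, a_back)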
