
IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 41, NO. 1, FEBRUARY 1994

Intelligent Robotics in Manufacturing, Service, and Rehabilitation: An Overview
William A. Gruver, Senior Member, IEEE

Abstract- Advances in intelligent robotics are resulting in a new generation of programmable, sensory-interactive, computer-controlled machines capable of operating with human supervision or autonomously from sensed information. The design and integration of these machines require knowledge of actuators, control, mechanisms, mobility, programming, and sensors. The application of intelligent robotic technologies can increase the productivity, safety, and quality of life for people in a wide range of tasks for land, space, and undersea environments. This paper provides an overview of recent applications of intelligent robotics to manufacturing systems, robotic aids for the disabled, and service. The references highlight advances in robot control, sensor integration, mechanical hands, manufacturing automation, walking machines, and powered prostheses.

I. INTRODUCTION

In laboratories and companies throughout the world, engineers are developing intelligent robotic systems for space,
land, and undersea environments. The emergence of intelligent
robotics as an engineering discipline has been motivated by
the increasing complexity of automated systems. Previously,
it was possible to view a robot as a computer-controlled
machine tool or a peripheral device operating alone. Robots
were preprogrammed to perform a specific task or a series
of tasks with only low-level feedback from external sensors.
This approach is no longer sufficient, due to the interdependence of the mechanical, electronic, and computational requirements of modern computer-controlled, sensory-interactive machines.
Robots produce mechanical motion that results in manipulation or locomotion. Industrial robots manipulate parts and tools
to perform manufacturing tasks such as material handling,
welding, spray painting, and assembly. Automated guided
vehicles transport materials in factories and warehouses. Telerobotic mechanisms provide manipulation capabilities in space
and undersea. Walking robots have application in hazardous
environments. In contrast to early preprogrammed robots, intelligent robots can operate in partially structured and unstructured environments by the use of advanced sensory feedback
mechanisms, and make decisions using learning and reasoning
algorithms.

II. INDUSTRIAL ROBOTS

Mechanical characteristics for robotic mechanisms include degrees of freedom, size and shape of the workspace, stiffness
and strength of the structure, lifting capacity, and velocities
and accelerations under load. Mechanical design is a major factor in determining performance measures such as repeatability,
positioning accuracy, and freedom from vibration.
The first justification for robots was to replace humans in
dangerous and repetitive jobs, and to replace fixed-automation
systems with more flexible equipment that could be reprogrammed for new situations. Because of their limited sensory
capabilities, early robots were best suited to tasks with clearly
prescribed paths of motion. Uses of these devices occurred
mainly in material transfer, machine loading and unloading,
spot welding, and spray painting. Applications of industrial
robots have been dominated by simple manufacturing tasks
for many years [1], [2]. The robots employed for these
purposes range from small models with lifting capacities of
several pounds to very large machines that can lift hundreds
of pounds. For heavy lifting tasks involving payloads up to
1000 lb, overhead gantry robots were developed. Kinematic
configurations vary according to the size and shape of the
workspace envelope.
With later developments in robotics, industrial applications exploited the robot's speed and accuracy in precision drilling, light machining tasks, gluing and fastening, small parts assembly, packing, sealing, testing, and measuring. Robots developed for these tasks have repeatabilities better than ±0.025 mm and move at high speeds. They are often fitted with customized end-effectors that can be quickly connected or disconnected under programmed control.
III. ROBOT SENSING AND CONTROL

Due to advances in microelectronic fabrication and the low cost of microprocessors, robot controllers are implemented by computer-based systems. Modern robot control systems
generate paths based on mathematical descriptions of the tasks
in a computer database [3]. The computer may also generate
trajectories that are not fixed, but vary with conditions of the
environment monitored by sensors. This sensory-interactive
type of control permits the robot to act appropriately in relation
to actual conditions, rather than relying on assumptions about
the workspace. For example, without custom fixtures and timing of the work flow, the location of parts may vary from one instance to the next. Without sensory-interactive control in such cases, the robot could proceed blindly through a set of actions at preprogrammed but incorrect positions. Although robot control systems have been developed to process a variety of sensory information [4], integrated sensors combining force,
tactile, acoustic, and visual information, for example, are not
provided on commercially available robots.
When separate microprocessors are used for joint control, the coordination of axes can be obtained by an upper-level microprocessor that converts a description of the end-effector position to joint angles through an inverse kinematic model of the robot and generates the path for each joint. The robot controller receives inputs from a teach pendant that enables the user to record joint positions for a desired path. An I/O module monitors on-off signals from sensors such as noncontact switches indicating part presence prior to gripping. The upper-level control interpolates the path so that the robot moves smoothly, a feature called continuous path control. Many robot controllers offer a high-level programming language for specifying motion, for programming external sensors such as vision and force, and for using computer-aided design systems to program the robot off-line [5].
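To make the inverse-kinematics and interpolation steps concrete, the sketch below works through them for a hypothetical two-link planar arm; the link lengths, target points, and function names are illustrative assumptions rather than part of any commercial controller.

```python
import math

# Hypothetical two-link planar arm used only to illustrate the idea:
# the upper-level controller converts an end-effector position to joint
# angles (inverse kinematics) and interpolates each joint so the robot
# moves smoothly between recorded points.
L1, L2 = 0.4, 0.3  # assumed link lengths in meters

def inverse_kinematics(x, y):
    """Return (shoulder, elbow) angles in radians for a 2R planar arm."""
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))          # clamp numerical error
    theta2 = math.acos(c2)                # elbow-down solution
    k1, k2 = L1 + L2 * math.cos(theta2), L2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

def interpolate(q_start, q_goal, steps):
    """Linear joint-space interpolation, one setpoint per control cycle."""
    return [tuple(a + (b - a) * i / steps for a, b in zip(q_start, q_goal))
            for i in range(1, steps + 1)]

# Example: move from a taught point to a point sensed in the workspace.
q0 = inverse_kinematics(0.5, 0.1)
q1 = inverse_kinematics(0.3, 0.4)
for setpoint in interpolate(q0, q1, steps=50):
    pass  # each setpoint would be sent to the per-joint controllers
```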
Motion planning algorithms that avoid obstacles and coordinate multiple robots are needed for applications of intelligent
robots. Although commercially available robot controllers
currently do not have these capabilities, techniques based on
graph search and nonlinear programming have been developed
to determine trajectories that avoid obstacles [6], [7]. If robot
motion were planned to accommodate moving obstacles, multiple devices could be operated safely in the same workspace
by planning a task for one robot with all other motions
regarded as obstacles. Such a capability could lead to higher
productivity in manufacturing. It also could result in increased
robot mobility for navigation.
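As a minimal illustration of the graph-search idea (not the specific algorithms of [6], [7]), the following sketch plans a collision-free path on a toy grid in which occupied cells stand for obstacles or the swept volume of another robot.

```python
from collections import deque

# Toy grid world: 0 = free cell, 1 = obstacle (e.g., another robot's swept volume).
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def plan(start, goal):
    """Breadth-first search over free cells; returns a collision-free path."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no collision-free path exists

print(plan((0, 0), (4, 4)))
```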
The robot sensory system gathers specific information
needed by the control system, and, in more advanced robotic
systems, maintains an internal model of the environment to
enable prediction and decision making. Sensors that detect
position, velocity, acceleration, visual features, proximity,
sound intensity, force, torque, heat, and even X-ray radiation
are valuable feedback elements that can be used to control the
motion of industrial robots, robot vehicles, machine tools, and
special-purpose inspection machines [8], [9].
Tactile sensors can be mounted on the robot end-effector
to detect contact with objects. These devices may be simple
switches or transducers indicating the magnitude and direction
of forces. Arrays of transducers can be used to sense force
patterns, thereby enabling the robot to identify and localize
objects. Force/torque sensors, mounted at the robot's wrist,
can sense forces generated by the object being handled.
These forces may be due to the weight of the object being
manipulated or contact with other objects or surfaces. Such
sensors are used to adjust grasping, to avoid applying excessive
forces, and to guide the proper mating of surfaces and parts.
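A rough sketch of force-limited grasping follows; the force threshold, increment, and the sensor and actuator functions are hypothetical placeholders for whatever wrist sensor and gripper interface a particular robot provides.

```python
# Minimal sketch of force-limited grasping, assuming a hypothetical
# read_wrist_force() that returns the measured gripping force in newtons
# and set_grip_command() that adjusts the gripper actuator.

MAX_FORCE = 15.0   # assumed limit to avoid crushing the part (N)
STEP = 0.5         # grip command increment per control cycle

def close_until_contact(read_wrist_force, set_grip_command):
    grip = 0.0
    while True:
        force = read_wrist_force()
        if force >= MAX_FORCE:
            break                      # firm grasp reached; stop squeezing
        grip += STEP                   # otherwise keep closing gradually
        set_grip_command(grip)
    return grip

# Example with stand-in sensor/actuator functions for testing the logic.
measurements = iter([0.0, 2.0, 6.0, 11.0, 16.0])
final = close_until_contact(lambda: next(measurements), lambda g: None)
print("grip command at contact:", final)
```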
A common means of noncontact sensing for robots uses
photoelectric diodes to provide on-off signals and determine
proximity. In robot vision, solid-state video cameras capture
an image such as a part to be grasped or a distance to be
measured. The image can be processed by computer analysis
using pattern recognition techniques, and used to evaluate
grasp positions and automatically determine flaws based on

Pre-mired Gp(

Fig. 1.

Experimental infrared fire detection system.

a comparison with stored images in the computer. Computer


vision may use ambient light or structured light. Ambient light
systems rely on normal sources of illumination, while structured light systems provide special patterns of illumination
whose shape and orientation are known to the sensory system.
The advantage of structured light is that the special patterns
of illumination may be chosen to simplify and accelerate the
processing required to interpret the image. Procedures for
determining depth in images rely on geometric calculations
associated with triangulation procedures, but in structured light
systems, the triangulation occurs between the camera and the
light source. In ambient light systems, corresponding points
in two images taken from different viewing positions must be
triangulated, and this is much more difficult to achieve. Speed
is important in robot vision because visual information must
be used in real time by the control system to correct the robot's
movements, a process called visual servoing.
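The triangulation geometry can be summarized in a few lines. The sketch below assumes a known baseline between the light source and the camera and the two observation angles; it is an idealized calculation, not a complete structured-light system.

```python
import math

def triangulate_depth(baseline_m, angle_source_deg, angle_camera_deg):
    """Depth of an illuminated point above the source-camera baseline.

    The projector and camera are separated by a known baseline; each sees
    the point at a known angle measured from that baseline, so the point
    is fixed by simple triangle geometry (law of sines).
    """
    a = math.radians(angle_source_deg)
    c = math.radians(angle_camera_deg)
    return baseline_m * math.sin(a) * math.sin(c) / math.sin(a + c)

# Example: 0.5 m baseline, light projected at 60 deg, camera sees the point at 70 deg.
print(round(triangulate_depth(0.5, 60.0, 70.0), 3), "m")
```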
Whatever system of illumination is used, the techniques
of image processing are similar. At low levels, they include
algorithms to threshold, and to find lines, edges, corners,
and connected regions. These low-level vision processes are
sufficient to perform many fundamental visual servoing operations. More complex systems include image-understanding
operations that allow the robot to identify objects, determine
orientation, and identify defects [10]. This requires the robot to
have knowledge about the types of objects in its environment.
Such a knowledge base can be used, and combined with
inputs from sensors, to generate a world model describing
the state of the environment. The world model of the robot
can be continually updated on the basis of new sensory input,
and can also generate expectations about how the view will
transform under movements in progress. The world model's hypotheses in turn assist the interpretation of incoming sensory
data. Advanced robot sensory systems process information
from many different sensors. A major obstacle in using these
systems is the difficulty of implementing algorithms and computer
hardware that can deliver useful information at rates fast
enough for real-time control.
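The following sketch illustrates two of the low-level operations named above, thresholding and connected-region labeling, on an assumed toy image; it is not drawn from any particular vision package.

```python
# Minimal sketch of two low-level vision steps: binary thresholding
# followed by connected-region labeling. The 6x6 "image" below is an
# assumed toy example, not real sensor data.

image = [
    [10, 12, 200, 210, 11, 10],
    [11, 13, 205, 215, 12, 11],
    [10, 11,  12,  13, 11, 10],
    [180, 190, 11,  10, 220, 225],
    [185, 195, 12,  11, 218, 222],
    [10,  11, 10,  11,  12,  10],
]

def threshold(img, level):
    return [[1 if p > level else 0 for p in row] for row in img]

def label_regions(binary):
    """4-connected component labeling by flood fill; returns a label map."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                current += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not labels[y][x]:
                        labels[y][x] = current
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return labels, current

binary = threshold(image, 100)
labels, count = label_regions(binary)
print("connected bright regions:", count)   # three separate blobs here
```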
Computer vision has many applications in addition to manufacturing. A key factor in predicting the spread of fires in
buildings is the temperature distribution of walls. To measure
this distribution, a digital image analysis system, shown in Fig. 1, was developed to analyze infrared images generated by burning standard materials in a scale model under controlled conditions [11]. The infrared camera was fitted with a bandpass filter enabling the camera to see through the flame and thereby measure the wall temperature, which is 500° cooler than the flame.

Fig. 2. Sensor signals generated from burning wool cloth and newspaper.
Neural network and fuzzy logic methods were used to
integrate sensor data for the detection of building fires and to
develop a strategy for extinguishment. A reasoning system was
developed with data from four common types of fire sensors:
infrared, ultraviolet, thermocouple, and a radiation-type smoke
detector [12]. Fig. 2 is an example of the output signals
produced by cloth and newspaper for each of the fire sensors.
The fuzzy reasoning system generates a warning signal that
can select a mode of extinguishment matched to the type of
fire, and thereby provide a basis for robotic fire fighting.
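A drastically simplified fuzzy combination of two sensor signals is sketched below to show the flavor of such reasoning; the membership functions and rules are invented for illustration and are not those of the system in [12].

```python
# Highly simplified fuzzy combination of two fire-sensor readings
# (infrared intensity and smoke density). Membership shapes and rules
# are assumed for illustration only.

def mu_high(x, low, high):
    """Ramp membership function: 0 below `low`, 1 above `high`."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def fire_warning(infrared, smoke):
    """Degree of 'fire present' in [0, 1] from two normalized sensor signals."""
    mu_ir = mu_high(infrared, 0.2, 0.7)
    mu_smoke = mu_high(smoke, 0.3, 0.8)
    # Rule: fire is likely if infrared is high AND smoke is high (min operator),
    # OR if infrared alone is very high (max with a stricter ramp).
    return max(min(mu_ir, mu_smoke), mu_high(infrared, 0.6, 0.95))

print(fire_warning(infrared=0.75, smoke=0.55))  # smoldering-like reading
print(fire_warning(infrared=0.97, smoke=0.10))  # open-flame-like reading
```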

IV. MANUFACTURING AUTOMATION
Flexible manufacturing also requires high levels of sensory
interaction and processing. Tool management is important
because machine down-time often results from tooling-related
problems. Moreover, the costs for perishable and durable
tooling in a flexible manufacturing system (FMS) may be
greater than labor and raw material combined. Losses due to
operator error, damaged tools, overstocking and understocking
of tooling, lost or misplaced tools, under-used inserts, and poor
quality parts can be avoided through the use of modern tool management methods [13].
Robots must be capable of locating and positioning three-dimensional parts, mating them to high tolerances, and performing complex joining operations, such as screwing in bolts.
Robots with redundant degrees of freedom can reach a specific
position and orientation in more than one way, for example,
as needed for inspection inside the body of a car that is being
assembled. Robots are usually equipped with sensors to enable
them to feel and adaptively respond to forces from sticking
and jamming of parts during positioning and mating. Since
assembly usually requires the presence of many parts in the
workspace, intelligent robots with visual sensing can identify and locate parts without the need for complex, inflexible part-presentation equipment such as feeders, positioners, and custom end-effectors. Vision capabilities required for assembly range from simple to complex. In many cases, parts have only a few stable positions in which they rest, so that recognition and location of two-dimensional outlines are sufficient to acquire and orient them.

Fig. 3. Diagnostic monitoring system.
Computer-controlled machinery can increase the productivity of manufacturing; however, failures can cause catastrophic
effects to production and may result in serious injuries to
workers. Using sensor-based monitoring, computer numerically controlled (CNC) machines and industrial robots can
be controlled in real time. Potential malfunctions of robotic
machines should also be monitored and the resulting patterns
evaluated so that scheduled maintenance is implemented before a breakdown occurs, as shown in Fig. 3. When used in
situations characterized by imprecise or fuzzy information,
different pattern recognition methods applied to the same
situation may result in ambiguous results. Fuzzy logic can be
used to combine this information by assigning a membership
function to each of the pattern recognition methods and basing
the evaluation on a multiattribute assessment method [14].
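The sketch below shows one simple way such a fuzzy-weighted, multiattribute combination might look; the method names, scores, and weights are assumptions for illustration and do not reproduce the assessment method of [14].

```python
# Sketch of combining several diagnostic pattern-recognition methods with
# fuzzy weights, as a weighted multiattribute score. All values are
# illustrative assumptions.

def combined_assessment(scores, weights):
    """Weighted aggregation of per-method fault likelihoods in [0, 1]."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

# Each method reports how strongly the machine's signature matches a
# developing fault; each weight reflects trust in that method.
scores = {"spectral": 0.8, "statistical": 0.55, "template": 0.4}
weights = {"spectral": 0.5, "statistical": 0.3, "template": 0.2}

risk = combined_assessment(scores, weights)
print("fault risk:", round(risk, 2))
if risk > 0.6:
    print("schedule maintenance before breakdown")
```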
Software development is a significant part of the cost
of engineering a robotic workcell. System architectures and


programming environments must be able to handle the demands of flexibility, performance, and complexity that occur in modern production processes. For this reason, textual
programming languages are provided on industrial robots,
CNC machines, and other components of flexible automation
systems [14]. During the last decade, various programming
languages have been proposed for use in factory automation and computer integrated manufacturing (CIM) systems.
Some of these languages have attained several generations of
development, and they now include concepts from knowledge-based systems, model-based systems, discrete-event systems
and Petri nets, graphical languages and object-oriented programming languages, as well as related methodologies and
tools from database engineering and communication networks
[15]. Building on textual languages developed for industrial
robots and numerically controlled machine tools, manufacturing programming environments provide a common interface
to industrial devices and sensors that would otherwise require
specialized programming by different people with different
levels of skills and experience.

Fig. 4. SFU/BUAA dextrous robot hand.


V. DEXTROUS ROBOT HANDS
Dexterity is important for industrial assembly, prosthetics,
and human movement. The usefulness of multifingered robot
hands to perform fine manipulation and grasping tasks is
widely recognized. Multifingered robot hands can accomplish
complex assembly tasks, achieve robotic hand cooperation for
fixtureless assembly, operate in hazardous environments, explore remote locations, and perform other important functions.
Dextrous robot hands have been developed for many uses.
Significant progress has occurred in mechanism and structure
design, low-level control, and integration. We shall describe
four designs for robot hands that have been developed for use
as research tools. The design of the Stanford/JPL Dextrous Hand [17] was motivated by anthropomorphic considerations
as well as kinematic and control issues. This mechanism is
composed of two fingers and a thumb, each with three degrees
of freedom, and actuated by 12 dc motors and 12 cables. The
Utah/MIT Dextrous Hand [18] is one of the most ambitious
efforts to develop an anthropomorphic robotic hand. This
device has three fingers and one thumb, each with four degrees
of freedom, and actuated by 32 independent tendons and 32
pneumatic cylinders. Each finger has three parallel-axis joints to provide flexion/extension motion and an additional joint, perpendicular to the other axes, to provide radial/ulnar motion.
Because of the large number of axes to coordinate, dextrous
hands usually require complex control systems [19].
Combining features of dextrous hands and prosthetic hands, the Belgrade/USC Hand [20] is an anthropomorphic end-effector for robot manipulators with five fingers and four motors: one for each pair of fingers, one for rotation, and one for flexion/extension. Each finger has three parallel-axis
joints with one degree of freedom to provide curling action.
When a finger pad contacts a grasped object, the other fingers
continue to close until the pressure on all the finger pads is
approximately equal.

Fig. 5. Fiber optic method for object recognition with a dextrous hand.

In an effort to simplify the mechanism and the controller,


an improved design for a dextrous robotic hand has been
developed at Simon Fraser University in collaboration with
the Beijing University of Aeronautics and Astronautics, China
[21]. Based on an open-loop three-bar linkage, the SFU/BUAA Dextrous Hand has three degrees of freedom for each finger, two rotary joints to provide flexion/extension motion, and a third rotary joint, perpendicular to the other axes, for radial/ulnar motion. In the prototype mechanism, shown in
Fig. 4, three joints are actuated by three motors, four tension cables for the two flexion/extension joints, and the same four tension cables plus two additional tension cables for the radial/ulnar joint. The device uses a new method for two-way tendon-operated actuation with one motor for each joint
so that a nine-degree-of-freedom finger mechanism requires
exactly nine independent actuators to achieve the compound
motion of the joints of the finger mechanism, grasping, and
fine manipulation. Each joint uses a precision potentiometer to
measure joint angular position for control feedback. Encoders
on each motor provide signals for position and speed control.
Dextrous hands can be applied, not only to grasp and
manipulate, but also to determine 3D coordinates of an object
by sensing forces at the fingertips and acquire mass features
while grasping the object [22]. In addition, grasped objects
can be recognized by the use of fiber optic bundles placed at
the fingertips of the dextrous hand [23], as shown in Fig. 5.


Since the dynamic formulation of grasping has the same form as that of a manipulator, controllers for dextrous hands can
be based on algorithms for robot control [4]. The computational effort required, however, to implement advanced control
methods can be very large. Not only must inertial terms be
evaluated, but we also have to process nonlinear terms having
many trigonometric calculations. Some of these difficulties
can be overcome by the use of feedforward neural networks
[24] in which the dynamic terms are learned from experience.
Future implementations of controllers for intelligent robots
will contain special-purpose hardware to implement advanced
control laws.
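As an illustration of the idea, the sketch below trains a small feedforward network to approximate a single nonlinear dynamic term (a gravity torque) from sampled data; the network size, training loop, and torque model are assumptions, not the method of [24].

```python
import numpy as np

# Minimal sketch: a one-hidden-layer feedforward network learns a nonlinear
# dynamic term (here a gravity torque, tau = m*g*l*sin(q)) from sampled data,
# so that the controller can evaluate the learned mapping instead of computing
# trigonometric terms online. Network size, gains, and data are assumptions.

rng = np.random.default_rng(0)
q = rng.uniform(-np.pi, np.pi, size=(200, 1))      # sampled joint angles
tau = 2.0 * 9.81 * 0.3 * np.sin(q)                 # "measured" torque values

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.02

for _ in range(3000):                              # plain full-batch gradient descent
    h = np.tanh(q @ W1 + b1)                       # hidden layer
    pred = h @ W2 + b2                             # predicted torque
    err = pred - tau
    dW2 = h.T @ err / len(q); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)               # backpropagate through tanh
    dW1 = q.T @ dh / len(q); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

test_q = np.array([[0.5]])
estimate = (np.tanh(test_q @ W1 + b1) @ W2 + b2).item()
print(estimate, 2.0 * 9.81 * 0.3 * np.sin(0.5))    # learned estimate vs. true value
```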
VI. MOBILE ROBOTS
Simple mobile robots have long been employed for material
movement in factories, mail delivery in office buildings, and
similar tasks requiring intelligent choice of path and a limited
ability to avoid obstacles. Mobile robots for the home are
being developed, but economic justification of these devices
is still difficult. Developments in robot control and sensing,
combined with advances in mechanisms for robot locomotion,
will generate a significant expansion of applications for mobile
robots. Among these applications are robots for guard duty,
commercial cleaning, home vacuuming, and other forms of
service. Service robots have been developed for applications in
agriculture, house cleaning, construction, mining, fire fighting,
rescue, surveillance, handling of hazardous materials, military
uses, and undersea exploration [16]. In some cases, sensory,
world-modeling, and control capabilities have been added to
existing machines that already possess actuation and mobility,
such as mining equipment. In other cases, entirely new devices
were designed. Service robots represent the greatest potential
application of intelligent robotics in this decade.
Mobile robots may have wheels, tracks, and legs. They
can be designed to crawl in pipes and other crowded spaces.
Tracked mobile robots developed for military applications and
surveillance are the most stable and rugged of these devices.
An automated guided vehicle (AGV) is a wheeled mobile robot
that typically operates in a factory by following a guide wire
embedded in the floor. Payload capacity of these vehicles
varies from 500 lb to several tons. An AGV for light duty
work, such as mail delivery in office buildings, uses a stripe
attached to the floor. While an AGV can be programmed like
industrial robots to follow a preprogrammed trajectory, and
to make decisions along the way depending on inputs from
external signals, its motion is usually determined by the guide
wire or stripe. There is a trend toward development of systems
that are guided by passive markers, active communication
beacons, as well as other forms of wireless guidance.
One of the difficulties in using autonomous mobile robots
is the complex sensory requirement imposed by navigation in
unstructured environments. Control requirements for mobile
robots are considerably more difficult than those for industrial
robots. Since the control and perception processes in a mobile
robot decompose naturally across a set of layers, a hierarchical
structure can be utilized for control. At the lowest level
are the motor controllers and processes that describe and use feedback from sensor signals. These processes occur
with cycle times on the order of 10-100 ms. At the next
layer, perceptual information is brought together and overall
control is computed to direct the vehicle. These processes
occur at 100 ms-1 s. The third layer in the architecture
consolidates perception and locomotion. Cycle times in this
layer operate at 100 ms for road following to 10 s for
local path planning. Approaches for planning these actions
have been based on preprogrammed algorithmic methods,
while other methods have employed artificial intelligence rule-based decision making. At the highest level, the supervisory
computer operates at control cycles of 1-100 s. Because this
level requires decision making with large amounts of poorly
structured knowledge, expert systems and fuzzy logic are
useful tools.
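A skeletal version of such a layered loop is sketched below, with placeholder layer functions and the cycle times quoted above; it only illustrates the scheduling structure, not any particular vehicle controller.

```python
import time

# Sketch of the layered control idea: fast servo-level processing every
# cycle, perception/guidance less often, and high-level planning rarely.
# The periods follow the rough figures quoted above; the layer functions
# are placeholders.

SERVO_PERIOD = 0.05      # ~10-100 ms: motor control, sensor feedback
GUIDANCE_PERIOD = 0.5    # ~100 ms-1 s: fuse perception, steer the vehicle
PLANNING_PERIOD = 5.0    # ~1-100 s: route and task-level decisions

def servo_layer():    pass   # read encoders, command wheel velocities
def guidance_layer(): pass   # combine sensor data, compute heading
def planning_layer(): pass   # choose the next waypoint or task

def control_loop(duration_s):
    start = last_guidance = last_plan = time.monotonic()
    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        servo_layer()                            # runs every cycle
        if now - last_guidance >= GUIDANCE_PERIOD:
            guidance_layer(); last_guidance = now
        if now - last_plan >= PLANNING_PERIOD:
            planning_layer(); last_plan = now
        time.sleep(SERVO_PERIOD)

control_loop(duration_s=1.0)   # short run for demonstration
```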
VII. WALKING MACHINES
Walking machines, a category of mobile robots, have been
built with two, four, and six legs [25]. One company markets a
four-legged walking machine that can walk over obstacles, lift
up to five times its weight while stationary, and two times
its weight while moving. The control of gait stability for
legged locomotion systems, however, is a major research issue
for the implementation of legged walking machines. Static
stable locomotion during low-speed motion is characterized
by the center of gravity of the machine occurring within the
stable region determined by the location of the feet. Dynamic
stable locomotion occurs during high-speed motion, typically
for machines with four or fewer legs. Since dynamic stable
locomotion is necessary for the walking of biped robots,
methods have been developed to test and evaluate its use [28].
Research in biped locomotion also has provided numerous
results on kinematic and dynamic modeling, control algorithms, and other forms of stability. Methods for synthesizing
biped walking trajectories include recording human kinematic
data, feedback control laws, and methods based on passive
interaction of gravity and inertia. The author and his colleagues
have investigated the form of trajectories that should be
followed, and the choice of appropriate speeds and torques.
Solutions to these research problems have resulted in a better
understanding of high-level planning and control requirements and their implementation for biped walking machines
[26]-[28].
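The static-stability criterion mentioned above (the center of gravity projected inside the support polygon of the feet) can be checked with a standard point-in-polygon test, as in the sketch below; the foot and center-of-gravity coordinates are assumed example values.

```python
# Sketch of the static-stability test: the machine is statically stable
# when the projection of its center of gravity lies inside the polygon
# formed by the supporting feet. All coordinates are assumed examples.

def point_in_polygon(px, py, polygon):
    """Ray-casting test; polygon is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Three feet in ground contact (m) and the projected center of gravity.
support = [(0.0, 0.0), (0.6, 0.0), (0.3, 0.5)]
cog = (0.3, 0.2)
print("statically stable:", point_in_polygon(*cog, support))
```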
These analytical results were tested on an experimental
biped robot shown in Fig. 6. Each leg of this walking machine
has a thigh, shank, and sole, and six degrees of freedom with
three actuators in the hip joint, one in the knee joint, and two
in the ankle joint. The total height of the walking machine is
92 cm and its total weight is 45 kg. Twelve actuators provide
the capability to change the position and orientation of the
legs. Eight load cell sensors are located under each foot to
detect the vertical ground reaction force. The relative angles
between adjacent joints are monitored by optical encoders
and their speeds are measured by tachometers. The motors
are driven by servoamplifiers for velocity control. A harmonic
drive between each motor and its joint provides gear reduction,
and microswitches monitor joint limits.



Fig. 6. Experimental 12-degree-of-freedom biped walking machine.

A computer controller has been implemented for this walking machine. A 386 PC served as a host computer for trajectory
planning, joint coordination, and data collection. Twelve Intel
8097 microcontrollers were used for joint controllers with the
dc servomotors, optical encoders, and joint sensors, and an
additional 8097 collects sensory data. Data were collected in
real time and stored in data files for recall and evaluation.
Data communication between the PC and the 8097 controllers
was implemented by 13 serial channels. Serial transfer was
chosen because it simplifies cabling and is supported by a
wide range of commercial products. The joint controller is
based on a modular design which, because of the I/O features of the 8097, was easily implemented. The joint controller communicates with the host computer and carries out low-level distributed control functions. Within a 10 ms sample
time, the controller performs the required calculations and
sends an analog velocity signal to the dc motor driver. Except
for the communication driver, other high-level software for
the PC was programmed in C. Joint control software was
developed in PL/M-96 on the PC, and then downloaded to
the controllers.
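A minimal sketch of one 10 ms joint control cycle is given below; the gains, limits, and the idealized velocity-controlled drive are assumptions used only to show the position-error-to-velocity-command structure, not the actual PL/M-96 joint software.

```python
# Sketch of the per-joint control cycle described above: every 10 ms sample
# the controller compares the planned joint angle with the encoder reading
# and sends a velocity command to the motor driver. Gains, limits, and the
# idealized drive model are assumptions for illustration.

SAMPLE_TIME = 0.01   # 10 ms joint control cycle
KP, KD = 8.0, 0.5    # illustrative proportional and damping gains
V_MAX = 1.5          # rad/s command limit sent to the servoamplifier

def joint_control_step(q_ref, q_meas, qdot_meas):
    """One control cycle: position error in, velocity command out."""
    error = q_ref - q_meas
    command = KP * error - KD * qdot_meas
    return max(-V_MAX, min(V_MAX, command))

# Simulated check of the loop converging on a fixed reference.
q, qdot = 0.0, 0.0
for _ in range(300):                       # 3 s of simulated joint motion
    v_cmd = joint_control_step(q_ref=0.4, q_meas=q, qdot_meas=qdot)
    qdot = v_cmd                           # ideal velocity-controlled drive
    q += qdot * SAMPLE_TIME
print(round(q, 3))                         # settles near the 0.4 rad reference
```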
Prosthetic devices have been developed to provide replacements for lost or damaged human functions. Prostheses for
upper limbs must be accurate and versatile so that they can
partially replace the functionalities of a human arm and hand.
The earliest upper-limb prostheses offered simple motional
functionalities without providing anthropomorphic features
[29], [30]. Command signals for electrical prostheses are provided by electromyographic (EMG) signals on the surface of
the skin resulting from muscle action or directly by nerve sig-

nals that are obtained from implanted sensors. Commercially


available prostheses, however, have not exploited modern
robotic technologies, e.g., tactile sensing at the fingertips, a
feature that humans routinely use in daily tasks.
A one-degree-of-freedom mechanical hand having automatic shape adaptation and fingertip force feedback has been
developed in collaboration between Simon Fraser University
and Tsinghua University, Beijing, China [31]. In a prototype
implementation, shown in Fig. 7, a mechanical hand has been
equipped with a multiple-speed transmission for changing
fingertip forces and a mechanism for grasping objects of
different shapes. The mechanism also has applications as a
flexible end-effector for robots [32].
The SFU/TU mechanical hand has three fingers controlled
by a single motor, and it provides the following four design and operational features: internal synergy of the finger
motion, autonomous shape adaptation, three hand operating
functions, and automatically increasing fingertip forces. The
configuration of the finger mechanism can approximate the
motions of the three phalanges of the human finger. The
motions of each finger segment are not individually controlled,
but they are connected by means of linkages and gears to
achieve motions similar to human fingers. The second feature
of this hand is its design for autonomous shape adaptation
introduced in the Belgrade Hand by Tomovic and Bekey [20].
This feature provides equivalent additional degrees of freedom
for the fingers without the need for active control. When any
finger pad contacts a grasped object, the other fingers close
until the pressure on all of the finger pads is approximately
equal. Finally, through a self-adaptable palm mechanism, the
hand can perform grasping, pinching, and holding operations.
In another project with Tsinghua University, a prosthetic
arm with an integrated four-finger hand is being developed
[33]. The prosthesis has three degrees of freedom capable of
the basic motions of elbow flexion-extension, wrist rotation,
and hand gripping. As shown in Fig. 8, the hand mechanism
has four fingers resembling a human hand, and each finger
has three anthropomorphic joints. To stably and reliably grasp
an object, the hand mechanism incorporates a shape-adapting
palm mechanism. When a finger pad contacts an object, all
fingers close until the pressure on all the finger pads is
approximately equal. To ensure that the fingertip forces and
speeds of the prosthesis approximate the characteristics of
a human hand, a multiple-speed transmission switches to a
slower speed when the hand touches an object.

Fig. 8. SFU/TU four-finger prosthetic hand.

Fig. 9. Exoskeletal walking aid.

VIII. EXOSKELETAL WALKING AID
The largest number of disabilities involves the loss or
impairment of lower limbs. An active exoskeletal walking
mechanism could significantly improve a person's capability for movement in daily living. Although such mechanisms are not commercially available, prototype mechanisms have been built [34]. A device to assist persons with lower-limb
disorders has been developed in collaboration with faculty
at Tsinghua University, China [35]. With the aid of this
mechanism, a person can walk smoothly using a rolling
support or crutches. The device also can be used as a training
device for patients with functional disabilities. Exoskeletal
mechanisms have applications for other purposes, for example,
to provide therapeutic exercising of the arm and shoulder
during the rehabilitation of stroke patients.
The exoskeletal walking aid, illustrated in Fig. 9, is based
on a planar multibar mechanism with one degree of freedom,
integrated with the waist socket as a frame for the mechanism.
The hip and the knee joints are driven by the same motor. To
simplify the mechanical structure, the ankle is passive. The
soles of the feet on the walking mechanism incorporate a
medium-hard plate spring with a rubber surface. The multibar
mechanism of the walking mechanism has been designed for
optimal motion, and the mechanism is capable of generating
preprogrammed, human walking gaits.
The walking mechanism consists of a waist socket fabricated from a lightweight material and a multibar exoskeletal
mechanism fabricated from a lightweight alloy. It includes a
motor, a transmission with self-locking capability, and a gait
control system. The hip and knee joints have been designed
to allow sitting.
To achieve alternating cyclic motion of the legs, the device
is controlled by a closed-loop system consisting of a comparator, trigger generator, counter, D/A converter, and pulse width
modulator. The controller changes the walking speed within
a prescribed range and automatically coordinates both legs.

The walking mechanism is equipped with force feedback from each foot to realize the gait and provide stable locomotion. The
device is equipped with a rechargeable battery that can operate
for 3 h. In its prototype form, the total weight of the walking
mechanism is 10.5 kg including batteries.
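The alternating, speed-limited gait coordination can be sketched as below; the speed range, sinusoidal drive profile, and duty-cycle mapping are illustrative assumptions rather than the actual controller design.

```python
import math

# Sketch of alternating-gait coordination: a single cyclic phase variable
# drives both legs half a cycle apart, and the commanded walking speed is
# clamped to a prescribed range before it sets the cycle rate. All values
# and the duty-cycle mapping are illustrative assumptions.

SPEED_MIN, SPEED_MAX = 0.2, 0.8     # assumed permissible gait cycles per second

def leg_commands(t, requested_speed):
    """Return (left, right) joint-drive commands in [-1, 1] at time t."""
    speed = max(SPEED_MIN, min(SPEED_MAX, requested_speed))
    phase = 2 * math.pi * speed * t
    left = math.sin(phase)              # hip/knee drive for the left linkage
    right = math.sin(phase + math.pi)   # right leg half a cycle later
    return left, right

def duty_cycle(command):
    """Map a signed drive command to a pulse-width-modulation duty cycle."""
    return 0.5 + 0.5 * command          # 0 = full reverse, 1 = full forward

for t in (0.0, 0.5, 1.0):
    l, r = leg_commands(t, requested_speed=1.5)   # request above the limit
    print(round(duty_cycle(l), 2), round(duty_cycle(r), 2))
```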

IX. CONCLUSIONS
Intelligent robotics is a highly interdisciplinary subject
requiring knowledge from different fields of engineering.
Mechanical design is critical for applications of robotics, but
expertise is also needed in power electronics, computer integration, and software engineering. Robotics will continue to
be influenced by advances in actuators, control, mechanisms,
mobility, programming, and sensing. The development and
commercialization of these technologies for diverse applications will occur worldwide.
The development and integration of robotic systems will be
accelerated by the use of approaches involving knowledge-based systems, neural networks, and fuzzy logic. As indicated
in this paper, there are many applications for robotics in manufacturing automation, aids for the disabled, and service. Further
research into multirobot coordination, robots with parallel
structures and redundant degrees of freedom, and sensor-based
control will be important to successful applications of robots
in these areas.
Finally, collaborative research between universities and
industry, particularly at the international level, will encourage
the application of robotic technologies to new fields relevant
to societal needs.
REFERENCES
[1] I. D. Meyer, "Applications of robots," in International Encyclopedia of Robotics: Applications and Automation, R. Dorf and S. Nof, Eds. New York: Wiley Interscience, 1988.
[2] J. Engleberger, Robotics in Practice. AMACOM Div., Amer. Management Assoc., New York, NY, 1980.
[3] C. S. G. Lee, R. C. Gonzalez, and K. S. Fu, Tutorial on Robotics, 2nd ed. New York: IEEE Computer Society Press, 1986.


[4] A. Koivo, Fundamentals for Control of Robot Manipulators. New York: Wiley, 1991.
[5] W. A. Gruver and B. I. Soroka, "Programming high level languages," in International Encyclopedia of Robotics: Applications and Automation, R. C. Dorf and S. Nof, Eds. New York: Wiley, 1988, pp. 1203-1235.
[6] C. L. Shih, T. T. Lee, and W. A. Gruver, "A unified approach for robot motion planning among moving polyhedral obstacles," IEEE Trans. Syst., Man, Cybern., vol. 20, pp. 903-915, July/Aug. 1990.
[7] C. L. Shih, J. P. Sadler, and W. A. Gruver, "Collision avoidance for two SCARA manipulators," in Proc. 1991 IEEE Int. Conf. Robotics and Automation, Sacramento, CA, Apr. 1991, pp. 674-679.
[8] F. K. Weigand and W. A. Gruver, "Force/torque sensors for assembly robots," in Computers in Mechanical Engineering. New York: ASME Press, Dec. 1986, pp. 49-54.
[9] T. A. Deane and W. A. Gruver, "Using X-ray vision to verify SMD board quality," in Electronics Test. Miller Freeman, Feb. 1987.
[10] W. E. L. Grimson, Object Recognition by Computer. Cambridge, MA: M.I.T. Press, 1990.
[11] A. Arakawa, K. Saito, and W. A. Gruver, "Automated infrared imaging temperature measurement with application to upward flame spread studies," J. Combustion and Flame, vol. 92, pp. 222-230, 1993.
[12] T. Oizumi, K. Saito, and W. A. Gruver, "A fire detection and extinguishment system using fuzzy inference and neural networks," in Proc. IFToMM-jc Int. Symp. Theory of Machines and Mechanisms, vol. 1, Nagoya, Japan, Sept. 1992, pp. 81-86.
[13] W. A. Gruver and M. T. Senninger, "Tooling management in a flexible manufacturing system," in Mechanical Engineering. New York: ASME Press, Mar. 1990, pp. 40-44.
[14] H. S. Tzou, W. A. Gruver, M. Fang, and Y. Rong, "Diagnostic monitoring of industrial robots," in Computer-Aided Production Engineering, V. C. Venkatesh and J. A. McGeough, Eds. Amsterdam: Elsevier Science, 1991, pp. 353-362.
[15] W. A. Gruver and J. C. Boudreaux, Eds., Intelligent Manufacturing: Programming Environments for CIM, Advanced Manufacturing Ser. London: Springer-Verlag, 1993.
[16] J. Engleberger, Robotics in Service. Cambridge, MA: M.I.T. Press, 1989.
[17] J. K. Salisbury and J. J. Craig, "Articulated hands: Force control and kinematic issues," Int. J. Robotics Res., vol. 1, no. 1, 1982.
[18] S. C. Jacobsen et al., "Design of the Utah/MIT dextrous hand," in Proc. 1986 IEEE Int. Conf. Robotics and Automation, May 1986.
[19] S. C. Jacobsen et al., "Control strategies for tendon-driven manipulators," IEEE Contr. Syst. Mag., vol. 10, no. 2, 1990.
[20] S. T. Venkataraman and T. Iberall, Dextrous Robot Hands. New York: Springer-Verlag, 1990.
[21] G. Guo and W. A. Gruver, "A new design for a dextrous robotic hand mechanism," IEEE Contr. Syst. Mag., pp. 35-38, July 1992.
[22] L. Hassebrook, G. Guo, and W. A. Gruver, "Adaptive sampling pattern and mass feature extraction for tactile object recognition with a three-fingered hand," in Proc. 1992 Int. Conf. Intell. Robots Syst., Raleigh, NC, July 1992, pp. 1591-1596.
[23] Z. Katz, G. Guo, and W. A. Gruver, "Fiber optics method for object recognition with multifingered robot hands," in Proc. IEEE Int. Conf. Indust. Electron., Contr., Automation, San Diego, CA, Nov. 1992, pp. 762-766.
[24] G. Guo, K. Jin, and W. A. Gruver, "Control of dynamic grasping systems using neural network approximation," in Proc. 1991 IEEE Int. Symp. Intell. Contr., Arlington, VA, Aug. 1991, pp. 196-202.


[25] M. Vukobratovic, Legged Locomotion Robots and Anthropomorphic Mechanisms, Monograph. Belgrade, Yugoslavia: Mihailo Pupin Inst., 1976.
[26] C. L. Shih, W. A. Gruver, and T. T. Lee, "Inverse kinematics and inverse dynamics for control of a biped walking machine," J. Robotic Syst., vol. 10, pp. 531-555, June 1993.
[27] C. L. Shih and W. A. Gruver, "Fuzzy logic force control for a biped robot," in Proc. 1991 IEEE Int. Symp. Intell. Contr., Arlington, VA, Aug. 1991, pp. 860-866.
[28] ——, "Control of a biped robot in the double support phase," IEEE Trans. Syst., Man, Cybern., vol. 22, pp. 729-735, July/Aug. 1992.
[29] Amer. Acad. Orthopaedic Surgeons, Atlas of Limb Prosthetics: Surgical and Prosthetic Principles. St. Louis, MO: Mosby, 1981, pp. 219-258.
[30] D. S. Childress, "Powered limb prostheses: Their clinical significance," IEEE Trans. Biomed. Eng., vol. BME-20, no. 3, pp. 220-227, 1973.
[31] G. Guo, X. Qian, and W. A. Gruver, "A single-DOF multi-functional prosthetic hand," in Proc. 1992 ASME Biennial Mechanisms Conf., Phoenix, AZ, Sept. 1992, pp. 149-154.
[32] G. Guo, X. Qian, and W. A. Gruver, "Multi-function mechanical hand with shape adaptation," U.S. Patent pending, Aug. 1993.

William A. Gruver (S'68-M'70-SM'80) received the B.S.E.E., M.S.E.E., and Ph.D. degrees from the University of Pennsylvania, and the Diploma in Automatic Control Systems from Imperial College, University of London.
He is a Professor of Engineering Science at Simon Fraser University, Burnaby, British Columbia, Canada. Previously, he was Director of the Center for Robotics
and Manufacturing Systems, University of Kentucky. His industrial experience includes senior positions with General Electric Company in robot
control and computer integrated manufacturing, IRT Corporation in electronic
inspection, LTI Robotic Systems which he cofounded, and aerospace engineering positions with NASA and the DLR German Space Research Establishment.
He is the author of many publications on robotics and manufacturing automation, including a new book entitled Intelligent Manufacturing: Programming Environments for CIM (Springer-Verlag, 1993). His research treats intelligent
robotics in manufacturing and service with emphasis on grasping, motion
planning, and sensor-based control. Current areas of applications in his
research include multifingered hands, assembly automation, inspection and
defect recognition, and programming environments for manufacturing systems
integration.
Dr. Gruver received the Alexander von Humboldt Senior Scientist Award for international contributions to research. He is an Associate Editor of the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS and has served as an AdCom member of the SMC Society. He was an Associate Editor for the IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION and a founding officer of the IEEE Robotics and Automation Society. He is General Chairman of the 1994 IEEE International Conference on Robotics and Automation to be held in San Diego, CA, and General Chairman of the 1995 IEEE International Conference on Systems, Man, and Cybernetics to be held in Vancouver, Canada.
