Engineers have made significant strides in replicating vision and touch in automated systems. Assembly line robots may soon be equipped with other humanlike features, such as hearing, smell and taste. This article surveys the latest research in automated sensory technology and explores potential assembly applications.

It's a cool, crisp early-October morning in 2020. Jordan Roswell steps out of his aerocar after landing at the vast Futura Manufactronics assembly plant in the middle of Lake Michigan. It's been a relaxing 20-minute ride from his rustic log cabin in northern Wisconsin. During his short commute, Roswell thought about some of the products he saw on display a few days earlier at the 41st annual Assembly Technology Expo in Chicago.

Roswell is a manufacturing engineer at Futura's floating factory that assembles vehicles for the Pluto division of GENMOTO Corp. [a company formed after the mega-mega-merger of General Electric, General Motors and Motorola in 2010]. Pluto specializes in 1950s-era vehicles that feature retro-styling on the outside coupled with state-of-the-art fuel cell technology and drive-by-wire electronics under the hood.

Each vehicle is made entirely from a hybrid composite of recycled plastic, steel and aluminum. The vehicles are the hottest things to hit the personal transportation industry since the minivan debuted 36 years earlier.

Futura Manufactronics is a contract assembler that makes vehicles for several different auto companies [GENMOTO hasn't owned or operated an assembly line in more than 10 years]. Products are produced on a build-to-order basis. Customers specify the features they want by e-mail, videophone or telegraph [the latest nostalgia fad] and receive their custom-made vehicles less than 3 days later.

As he bites into a Starbuckrogers doughnut fresh from his desktop microbakery, Roswell glances out his glass-encased office that overlooks the entire assembly line. A legion of mobile, humanoid roboassemblers acting with complete autonomy perform a wide variety of tasks with incredible speed and total accuracy.

Two seconds after a retina scanner verifies Roswell's identity, a heads-up display appears on the glass wall. Voice-activated charts and graphs provide updates on all the robots' sensory systems. Vision, tactile, auditory and olfactory sensors appear to be functioning without any problems. Satisfied that everything is normal, Roswell reacts to his sense of hunger, and returns to his doughnut.

While this scenario may sound a bit far-fetched, assembly robots capable of emulating all five human senses are more fact than fiction. Automated sensory technology could be coming soon to an assembly line near you. In fact, within 20 years, perhaps the only humanlike feature robots won't possess will be a sense of humor.

Within 20 years, autonomous humanoid robots equipped with basic human senses might perform multiple assembly tasks. Photo courtesy Honda Motor Co.

Back to the Present

Few observers ever expected machines to come close to replicating human senses. "Nothing is more indisputable than the existence of our senses," proclaimed 18th century French mathematician and philosopher Jean Le Rond d'Alembert. And, as recently as 1980, industrial robot pioneer Joseph Engelberger made the bold statement: "No robot can hope to match man with his acute senses." But, when it comes to robotic technology, 20 years is a long time.

The fields of robotics and human sensory automation are converging. The result will be machines that act more and more like human assemblers.

Sensors that correspond to the human senses of sight, hearing, touch, smell and taste are allowing robots to take on a new dimension. Faster processing power has improved not only the speed of robots, but their ability to sense and negotiate the task at hand. "More sensors or senses enable the gathering of more data to use to make decisions," says David Wright, Ph.D., chief scientist at Braintech Inc. (Vancouver, BC).

Today, most industrial robots still perform a limited range of repetitive assembly tasks with few sensors and little or no flexibility. They typically require a tremendous amount of programming to make them useful. Many tasks cannot be programmed in advance, because real-world operating conditions cannot be predicted exactly.

Current industrial robot arms are position-controlled. They go where they've been told to go, even if there's something in the way. With sensory perception, however, if a robot arm finds an unexpected object in the work envelope, it can take appropriate action without damaging the object.

Human senses can be very difficult to replicate in robots because vision, touch, hearing, smell and taste are extremely subjective. For instance, something that looks blue and feels cold to one person may look black and feel warm to another.

Scientists huddled in university research centers and government laboratories believe artificial sensors will soon surpass the acuity of human senses. So far, they've made the most progress in machine vision systems. Touch research is also making tremendous strides and is the focus of some of the sexiest experiments. Developments in automated smell, hearing and taste systems are not far behind.

While it's becoming easier to develop new sensory technology, coming up with practical applications is a slower process. "Cost is one of the biggest barriers to commercial application," says Rick Maxwell, program manager for assembly at Fanuc Robotics North America Inc. (Rochester Hills, MI). "The challenge is to make automated sensory systems more cost-effective and easier to apply to robots. For instance, cameras equipped with features such as auto-focus and auto-iris currently exist, but the price tag limits their widespread use in industrial robots."

"At present, it's much more cost effective to use humans than create a robot with all human senses," adds Wright. "Current generations of pick-and-place robots already carry out complex assembly tasks without need for all human senses."

Vision

"With the sense of sight, the idea communicates the emotion . . . ."
-- Alfred North Whitehead

More than 50 percent of the human brain is dedicated to visual intelligence. Humans infer knowledge from the shapes and movement patterns of objects seen through the eye, and that knowledge guides their actions.

Adding vision systems to robots allows more complex tasks to be accomplished. For instance, a digital camera that can recognize problems, such as a part that has shifted, can relay messages that allow a robotic arm to make last-second adjustments.
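
To make the idea concrete, here is a minimal sketch of that kind of last-second adjustment, assuming a hypothetical camera that reports how far a part has shifted from its nominal position. The pose fields, function names and offsets are illustrative, not taken from any particular vision system or robot controller.

```python
# Hypothetical sketch of vision-guided pose correction, not any vendor's API.
# A camera reports how far a part has shifted from its nominal position;
# the robot's taught pick pose is offset by the same amount before the move.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # mm
    y: float      # mm
    theta: float  # degrees, rotation about the vertical axis

def correct_pick_pose(taught: Pose, dx: float, dy: float, dtheta: float) -> Pose:
    """Shift the taught pose by the offset the vision system measured."""
    return Pose(taught.x + dx, taught.y + dy, taught.theta + dtheta)

# Example: the camera sees the part 2.5 mm right and 1.2 mm forward of nominal.
taught_pose = Pose(x=350.0, y=120.0, theta=0.0)
corrected = correct_pick_pose(taught_pose, dx=2.5, dy=1.2, dtheta=-1.5)
print(corrected)  # the pose actually sent to the robot controller
```

In practice the measured offset would first be converted into the robot's coordinate frame, but the principle is the same: measure the shift, then move the taught pose by the same amount.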

Artificial vision is enabling industrial robots to do things that were impossible just a few years ago. Until recently, robotic vision systems were expensive and provided limited visual acuity. Cheaper components, such as PCs and digital cameras, and faster processors have dramatically reduced the cost of producing a robot with artificial vision. According to Wright, vision systems are now easier to install and easier to integrate with robots.

Machine vision technology and robotic motion control are becoming more intertwined. Vision is being used to enhance assembly accuracy. But, robotic vision applications have only scratched the surface. The next step is to enable a robot to recognize and react to an object without having to feed it the CAD file of the part. This way, the robot would be able to teach itself to assemble almost any object.

"Vision is the easiest sense to duplicate and integrate in a robot today," adds Fanuc's Maxwell. Today's cameras provide very high resolution. In some respects, they are better than human eyes, especially in applications that call for infrared or ultraviolet vision.

Despite recent advancements in machine vision, industrial robots still lack stereo vision and the ability to easily track moving objects. But, those capabilities, which most humans take for granted, will soon be integrated into robots on the factory floor.

One unique aspect of the human eye is its ability to follow a moving object. Researchers in France recently developed a machine vision system that improves the ability of a robot to emulate that kind of tracking movement. With a visual guidance system that can adjust for variations in speed and trajectory of the target, a robot could perform tasks that would be impossible with blind placement.

The higher the sampling rate of a vision system, the more closely a robot can follow a target and respond to changes in speed and direction. A research team at the Massachusetts Institute of Technology (Cambridge, MA) has also been tackling the motion-vision coordination challenge. The Vision and Touch Guided Manipulation group at MIT's Artificial Intelligence Lab has developed robots that can catch free-falling objects.

A robot equipped with a whole arm manipulator (WAM) has been programmed to catch fast, randomly moving objects. The WAM relies on a pair of fast eye gimbals (FEGs) to locate and track a wide variety of objects, such as sponge balls, long cylindrical cans and paper airplanes. The FEGs provide stereo vision similar to human eyes.
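
The core of such tracking can be reduced to a simple proportional loop, sketched below under broad assumptions; the function names, gain and frame rate are invented for illustration and are not the MIT or French groups' actual code.

```python
# A minimal sketch of proportional visual tracking, assuming a camera that
# reports the target position each frame and a robot that accepts velocity
# commands. The gain and frame rate are illustrative assumptions.

SAMPLE_RATE_HZ = 60.0   # assumed camera frame rate; higher rates mean less lag
GAIN = 2.0              # proportional gain on the position error

def velocity_command(target_xy, tool_xy):
    """Velocity (mm/s) that moves the tool toward the target, proportional to error."""
    ex = target_xy[0] - tool_xy[0]
    ey = target_xy[1] - tool_xy[1]
    return (GAIN * ex, GAIN * ey)

# One cycle of the loop: the camera sees the target at (120, 45) mm and the
# gripper is at (100, 40) mm, so the robot is told to move toward it.
print(velocity_command((120.0, 45.0), (100.0, 40.0)))   # -> (40.0, 10.0)
```

The faster the camera delivers new positions, the more often this correction can be applied, which is why sampling rate is so closely tied to tracking performance.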

The next generation of vision-guided industrial robots is slowly making their way out of the lab. Last year, Fanuc Robotics unveiled what it claims is the first intelligent robot with both vision and touch-sensing capabilities. "The I-21i robot has sensory control using a 3D visual sensor and a six-axis force sensor," says Maxwell. The visual sensor can locate randomly positioned parts in any orientation. According to Maxwell, the force sensor can be used to insert parts in horizontal, diagonal and vertical directions.

Touch

"Nothing that we love over-
much is ponderable to our touch."
--William Butler Yeats

Although robot hands emulate the structure of human hands, they are far from dexterous. Indeed, traditional robotic grippers lack flexibility. Their fixed-jaw geometry cannot grasp parts with changing orientation and securely hold them.

The solution to the problem lies in haptic devices. The term haptic is based on the Greek word, haptikos, meaning to grasp or perceive. Researchers are developing tactile sensing devices that allow robots to touch things and actually feel them. Special sensors attached to the end of a robot arm can detect whether the machine is touching something soft, such as a sponge, or something hard, such as a round metal part, and apply the appropriate force.
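
A rough sketch of that decision, with made-up thresholds and a hypothetical probing step, might look like the following; a real tactile controller would be far more sophisticated.

```python
# Illustrative sketch only: estimate whether a grasped object is soft or hard
# from how much it deflects under a small probing force, then pick a grip force.
# Thresholds and force values are invented for the example.

def estimate_stiffness(force_n: float, deflection_mm: float) -> float:
    """Stiffness ~ force / deflection (N/mm); soft objects deflect more."""
    return force_n / max(deflection_mm, 1e-6)

def choose_grip_force(stiffness_n_per_mm: float) -> float:
    # Squeeze a sponge gently, a metal part firmly (thresholds are assumptions).
    if stiffness_n_per_mm < 1.0:
        return 2.0    # newtons, gentle grip for compliant objects
    elif stiffness_n_per_mm < 50.0:
        return 10.0
    return 30.0       # rigid part, firm grip

k = estimate_stiffness(force_n=0.5, deflection_mm=3.0)   # result of a light probe
print(choose_grip_force(k))   # -> 2.0 N, the object is treated as soft
```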

Haptic applications for industrial robots have taken longer to evolve than machine vision, because it is a more difficult challenge. According to Ralph Hollis, Ph.D., principal research scientist at the Robotics Institute of Carnegie Mellon University (Pittsburgh), rendering visual images is a one-way street, while haptic devices are two-way.

"Eyes take in photons, but don't shoot them out," explains Hollis. "A hand manipulates, but there is force feedback, too. So, any kind of haptic device that we use to interface with robots must take input from users as well as deliver output through the same mechanism."

Tactile sensing is more complex than vision or hearing. For example, to satisfy the eye that an image is moving, it is enough to display 15 still pictures a second. The haptic equivalent--fooling a fingertip into believing it's feeling a surface--takes a thousand impulses a second. In addition, while eyes respond exclusively to light, hands and fingers respond to force, vibration and temperature.

The human touch system also is very sensitive. "There are about 2,000 receptors in each of our fingertips whose only role is to gauge qualities like texture, shape and the ability to cause friction," says Mandayam Srinivasan, Ph.D., director of MIT's Laboratory for Human and Machine Haptics. "There may be even more sensors for gauging warmth or coolness, and for detecting mechanical, chemical or heat stimuli."

The BioRobotics Laboratory at Harvard University (Cambridge, MA) is attempting to define the ways that tactile information can improve robot dexterity. Much of the research revolves around vibratory information that can signal important events, such as the first instant of contact and the onset of slip.

"These events convey information about the state of the hand-object system that is essential for robust control of manipulation," says Robert Howe, director. "Vibrations also provide perceptual information about properties such as surface texture and friction."

The Dextrous Manipulation Lab at Stanford University (Palo Alto, CA) is using a cyber glove to interact with a two-fingered robotic arm that feeds back force. Whatever the user does, the robot does. The robot feeds back what it "feels" to flexible sensors embedded in the glove's fingers.

Despite the challenges confronting haptic researchers, Hollis believes grippers that mimic human dexterity will open new doors for industrial robots. Bill Townsend, president of Barrett Technology Inc. (Cambridge, MA), is converting laboratory research into real-world products. His company's BarrettHand is an intelligent, highly flexible eight-axis gripper based on haptic research conducted at MIT and the University of Pennsylvania.

The hand reconfigures itself in real time to conform to a wide variety of part shapes without tool-change interruptions. Each of the grasper's three humanlike fingers, capable of 180 degrees of movement, is independently controlled by one of three servomotors, which are housed in the palm body, along with microprocessors, sensors and other electronics. The fingers grip objects by curling together in a fist and applying pressure.

Hearing

"With sound, the emotion communicates the idea . . . ."
--Alfred North Whitehead

Most efforts to integrate industrial robots and auditory sensory systems have focused on voice recognition. Voice-activated controls allow human operators to tell robots what, where and when to do different tasks. Robots that "listen" to spoken instructions and rules can learn what to do when they encounter certain situations.

Industrial robots equipped with auditory sensors can also provide valuable feedback to human operators. For instance, hearing devices can determine if a product "sounds" correct. Certain sounds can be recorded and added to a database.

If a door lock installed on a car does not sound right, for instance, engineers could be alerted to the potential problem before the vehicle leaves the assembly line. Automotive assemblers also use electronic ears to detect missing parts, such as connecting rod bearing inserts, in engines.
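
One plausible way to implement such a check, sketched here as an assumption rather than any assembler's actual method, is to compare the frequency spectrum of a recorded event against a library of known-good signatures.

```python
# Hedged sketch of acoustic signature checking: compare the spectrum of a
# recorded event (e.g., a door lock engaging) against stored "good" spectra.
# The threshold and the synthetic test signals are illustrative assumptions.

import numpy as np

def spectrum(signal: np.ndarray) -> np.ndarray:
    """Magnitude spectrum, normalized so loudness differences are ignored."""
    mag = np.abs(np.fft.rfft(signal))
    return mag / (np.linalg.norm(mag) + 1e-12)

def sounds_correct(recording: np.ndarray, reference_spectra: list[np.ndarray],
                   threshold: float = 0.15) -> bool:
    """Pass if the recording is close to any known-good signature."""
    s = spectrum(recording)
    distances = [np.linalg.norm(s - ref) for ref in reference_spectra]
    return min(distances) < threshold

# Example with synthetic signals: a "good" 800 Hz click versus a detuned one.
t = np.linspace(0, 0.1, 4410)
references = [spectrum(np.sin(2 * np.pi * 800 * t))]
print(sounds_correct(np.sin(2 * np.pi * 800 * t), references))    # -> True
print(sounds_correct(np.sin(2 * np.pi * 1200 * t), references))   # -> False
```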

However, giving robots the ability to hear sounds in an assembly environment can be very challenging. Ambient background noise, such as parts handling equipment and ventilation systems, can interfere with automated listening devices.

Ultrasonic sensors provide an effective way to limit such interference. Researchers at Pacific Northwest National Laboratories (Richland, WA) have developed an ultrasound inspection device that detects cracks in bolts more easily and less expensively than alternatives.

The sonar-like sensor can look through the end of a fastener to detect small cracks and corroded areas. It allows fasteners to be inspected while in place. Assemblers are provided with a visual representation of the fastener and any fractures or degradation.

Smell

"It takes little talent to see clearly what lies under one's nose, a good deal of it to know in which direction to point that organ."
--Wystan Hugh Auden

Many products, such as automobile seats and interior modules, need to be checked for unpleasant smells before and after assembly. Traditionally, this task is carried out by professional sniffers who possess an acute sense of smell.

Unfortunately, the human olfactory system is extremely subjective. Different people are affected in different ways by similar odors. And, humans are limited by the number of products they can smell in one day.

Automated systems, such as electronic noses, have been developed to supplement human sniffers. An electronic nose is a device used to analyze the content of air through the classification of odors. It uses an array of very small sensors to detect gaseous molecules and simulate the odor-sensing capabilities of the human nose.

Electronic noses have been used successfully in the food industry for applications such as checking the freshness of cheese, fish and fruit. But, the technology is also applicable to assembly processes, especially in the auto industry.

Engineers at Ford Motor Co. (Dearborn, MI) are using electronic noses to identify good and bad samples of carpet, cloth, leather, plastic, wood and other materials used in automotive interiors. The e-nose uses an array of 12 chemical sensors. Each sensor responds to different components within an aroma to produce a "fingerprint" that identifies the material under test. The machine can then seek a match with other fingerprints in its memory.

The individual sensors have a polymer surface that acts as a conductor between two electrodes. The polymers react with the aroma molecules in an air sample, varying their electrical resistance. The change in voltage across the polymer is then measured by passing a current between the two electrodes. Each of the 12 conductive polymer sensors has a different structure and responds to different molecules.
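
Conceptually, matching a 12-element sensor fingerprint against stored references can be as simple as a nearest-neighbor comparison; the sketch below uses invented numbers and a plain Euclidean distance, not Ford's actual algorithm.

```python
# Sketch of how a 12-sensor "fingerprint" might be matched against a library of
# known materials. The readings, labels and matching rule are assumptions.

import numpy as np

def normalize(responses: np.ndarray) -> np.ndarray:
    """Scale the 12 resistance changes so overall concentration drops out."""
    return responses / (np.linalg.norm(responses) + 1e-12)

def identify(sample: np.ndarray, library: dict[str, np.ndarray]) -> str:
    """Return the name of the stored fingerprint closest to the sample."""
    s = normalize(sample)
    return min(library, key=lambda name: np.linalg.norm(s - normalize(library[name])))

library = {
    "leather_good": np.array([0.2, 0.5, 0.1, 0.9, 0.3, 0.4, 0.7, 0.2, 0.6, 0.1, 0.3, 0.8]),
    "leather_off_odor": np.array([0.6, 0.2, 0.8, 0.3, 0.9, 0.1, 0.2, 0.7, 0.1, 0.5, 0.9, 0.2]),
}
reading = np.array([0.21, 0.48, 0.12, 0.88, 0.31, 0.41, 0.69, 0.22, 0.58, 0.12, 0.28, 0.79])
print(identify(reading, library))   # -> "leather_good"
```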

The electronic nose operates in conjunction with an autosampler, where test materials in small glass vials are heated for 3 hours at 80°C. A needle is then automatically inserted through the rubber seal in the vial to draw off a sample of air, including aroma molecules, and feed it to the sensor head for testing.

Ford is encouraging its suppliers to start using electronic noses when assembling interior components and modules.

Taste

"Everyone carries his own inch-rule of taste . . .."
--Henry Adams

Compared to the senses of sight, hearing, touch and smell, scientists know relatively little about human taste. Indeed, taste is the most mysterious human sense and the hardest to replicate with robots or sensors. It is subject to more personal nuance than vision, touch and other senses.

Not surprisingly, robotic taste applications lag behind other sensory research. That probably is a good thing. After all, an industrial robot with a taste disorder could devour a lot of expensive parts.

As with the sense of smell, the sensory process behind taste is based on chemistry. A fourfold classification system is used to describe taste: sweet, salty, bitter and sour. Researchers have tried to mimic human taste buds by linking together sensors that detect a variety of compounds.
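
A toy illustration of that fourfold scheme, with invented sensor names and no claim to how a real electronic tongue works, might simply report the strongest-responding category.

```python
# Toy illustration of the fourfold classification: map an array of chemical
# sensor readings to the dominant basic taste. Sensor names are invented.

def classify_taste(readings: dict[str, float]) -> str:
    """Pick the basic taste whose associated sensor responds most strongly."""
    scores = {
        "sweet": readings.get("sugar_sensor", 0.0),
        "salty": readings.get("sodium_sensor", 0.0),
        "sour": readings.get("acidity_sensor", 0.0),
        "bitter": readings.get("alkaloid_sensor", 0.0),
    }
    return max(scores, key=scores.get)

print(classify_taste({"sugar_sensor": 0.1, "acidity_sensor": 0.8}))  # -> "sour"
```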

Alpha MOS America (Hillsborough, NJ) recently unveiled what it claims is the first electronic tongue. The Astree sensing system tests liquids and conducts taste analysis. Dissolved organic and inorganic compounds are tested for qualitative and quantitative applications.

Most applications for automated taste systems center around R&D and quality control in the food, beverage, cosmetic and pharmaceutical industries. But, possible industrial applications for a robotic arm equipped with taste sensors would include testing paints, coatings, sealants, adhesives and solders before or after they are applied to parts. Leak detection and testing is another potential application that could benefit from automated taste systems.

Calling All Humanoids

In 1950, Super Science Stories magazine published an illustration of humanoid robots working on an assembly line some 50 years in the future. Today, that scene remains futuristic, but the sight of humanoids on the plant floor may not be too far away.

As sensory perception systems improve, industrial robots will take on increasingly humanlike qualities. Within 20 years, there's a good chance that autonomous humanoid robots equipped with all five basic human senses will be performing numerous assembly tasks. On the other hand, tomorrow's robots may look more like insects. Robotics and physiology researchers at several universities are developing a new class of biologically inspired machines called biomimetic robots.

Most robots that look and act like humans exist only in the basements of university laboratories. But, humanoids that combine intelligence and mobility hold commercial potential. Honda Motor Co. (Tokyo) has been at the forefront of developing humanoids to generate publicity for its technical prowess and to explore possible manufacturing applications.

Honda recently unveiled a state-of-the-art humanoid called Asimo (the name stands for "advanced step in innovative mobility"). The two-legged robot walks upright, responds to sudden movements and performs basic assembly tasks, such as turning a screw.

It's the third generation of humanoid robots made by Honda engineers. The company previously developed robots dubbed P2 and P3. Asimo stands 3.95 feet tall and weighs 94.8 pounds. It can walk 1 mph with a natural, stable gait. Each of Asimo's two legs features six degrees of freedom. The humanoid boasts 105 degrees of vertical arm movement.

Honda engineers developed intelligent, real-time flexible walking technology called "i-Walk." With the ability to predict and react to sudden movements, Asimo can shift its center of gravity and change direction.

"More computational power and more proven, richer and varied applications are required before anthropomorphic or humanoid robots become cost effective," says Braintech's Wright.

While researchers have been progressing steadily with artificial sensation, the final frontier for industrial robots is artificial intelligence. No one knows when or if robots will be able to make conscious ethical decisions and evoke emotional reactions on the assembly line.

But, scientists are making strides in areas such as commonsense reasoning, introspective consciousness and neural networks. When cognitive and manipulatory abilities converge with sensory perception, robots will be able to recognize arbitrary objects by sight and know how they will interact.

Seventh Generation Technologies Inc. (Boulder, CO) recently unveiled a digital brain that integrates vision, auditory, touch and speech recognition within a neural network software package. The software enables a robot to recognize and name objects in its field of view.

Despite such promising technology, many observers are conservative when they gaze into their crystal ball. "It will be at least 20 years before we have a robot that can do intuitive things that humans do well," predicts Barrett Technology's Townsend. "And, we must be careful to think of these new robots as just tools, not as a replacement for human assemblers."

Electromagnets, activated by a computer, attract permanent magnets and cause the sphere to move into a new position. Photo courtesy Johns Hopkins University

Sidebar: Spherical Motor Boosts Flexibility

In addition to sensory perception and cognitive reasoning, dexterity distinguishes humans from robots. However, a new globe-shaped motor capable of rotating in any direction could lead to more precise robotic joints.

The spherical motor consists of a globe lined with 80 permanent magnets resting atop a saddle of 16 circular electromagnets. Shaped like a ball in a socket, the motor turns in any direction, allowing a full 360 degrees of motion from a single device. It promises to give robotic arms greater flexibility and precision with unhindered mechanical motion.

"A conventional motor turns on an axis, moving in one direction," says Gregory Chirikjian, an associate professor of mechanical engineering at Johns Hopkins University (Baltimore) who developed the magnet-guided device. "Our design is able to achieve a much wider range of motion."

Traditional robotic arms need six or more conventional motors to position and orient objects in three dimensions. According to Chirikjian, the spherical motor behaves like a human shoulder joint, rather than an elbow joint. As a result, he claims "three spherical motors could give a robotic arm a greater range of motion than arms that have six traditional motors.

"You'd be able to use far fewer joints because each spherical motor would have more freedom of motion," explains Chirikjian. "This would also enable the robotic arm to be more accurate, because every time you have a joint you introduce a little bit of play, a little bit of wiggle to the arm. When you have six or more traditional motors, that little bit of wiggle adds up."

Chirikjian and his team use a 12-inch diameter plastic sphere as the rotor. The rotor is assembled by fixing 0.75-inch diameter permanent magnets to the inside of the sphere. A tapered pedestal acts as the base, houses the stator structure and supports the rotor. An adjustable saddle holds the 2-inch diameter stator electromagnets.

By activating two or more electromagnets, an operator causes them to attract certain permanent magnets inside the sphere. This attraction pulls the sphere into a new position. The saddle can slide up and down in the pedestal to adjust the reluctance of the motor. Power for the electromagnets is supplied by a standard 24-volt power supply.
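
The control problem boils down to choosing which electromagnets to energize so that the resulting attraction turns the sphere the right way. The sketch below is loosely modeled on that idea, not on the Johns Hopkins control code; the positions, the torque approximation and the scoring rule are all assumptions.

```python
# Loosely modeled sketch: pick the stator coils whose attraction of nearby
# rotor magnets produces torque toward a desired rotation axis.

import numpy as np

def torque_for_pair(coil_pos: np.ndarray, magnet_pos: np.ndarray) -> np.ndarray:
    """Approximate torque on the sphere if this coil attracts this magnet."""
    force_dir = coil_pos - magnet_pos        # attraction pulls the magnet toward the coil
    return np.cross(magnet_pos, force_dir)   # torque about the sphere's center

def coils_to_energize(coil_positions, magnet_positions, desired_axis, n_active=2):
    """Score every coil by how well its best pairing turns the sphere the desired way."""
    scores = []
    for i, c in enumerate(coil_positions):
        best = max(np.dot(torque_for_pair(c, m), desired_axis) for m in magnet_positions)
        scores.append((best, i))
    return [i for _, i in sorted(scores, reverse=True)[:n_active]]

# Tiny example: two coils on the x- and y-axes, one magnet near the x-axis,
# and a desired spin about the vertical (z) axis.
coils = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
magnets = [np.array([0.9, -0.1, 0.0])]
print(coils_to_energize(coils, magnets, desired_axis=np.array([0.0, 0.0, 1.0]), n_active=1))
```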

Sidebar: Self-Programming Robots

The field of biotechnology may spawn the development of intelligent, self-programming industrial robots. Genetic programs can enable robots to make smooth, efficient movements.

Supplied with the necessary initial data, a genetic program autonomously generates a number of different procedures. It selects the best of these variants, develops them further, and again keeps only the best.

Several generations of programs evolve, of which the youngest is invariably superior to the previous generations. This procedure is repeated until the optimal program solution is reached. The programs can autonomously design algorithms for solving complex problems that otherwise could either not be solved at all or would be very time consuming.
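
In code, that evolutionary loop follows a familiar skeleton. The sketch below is a generic version with placeholder fitness, mutation and crossover functions, not DaimlerChrysler's system.

```python
# Generic sketch of the evolutionary loop described above; the fitness measure,
# mutation, crossover and the movement-sequence encoding are placeholders.

import random

def evolve(initial_population, fitness, mutate, crossover,
           generations: int = 50, keep_fraction: float = 0.2):
    population = list(initial_population)
    for _ in range(generations):
        # Score every candidate and keep only the best fraction as survivors.
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[: max(2, int(len(ranked) * keep_fraction))]
        # Breed and mutate the survivors to form the next, usually better, generation.
        children = []
        while len(children) < len(population) - len(survivors):
            a, b = random.sample(survivors, 2)
            children.append(mutate(crossover(a, b)))
        population = survivors + children
    return max(population, key=fitness)

# Toy usage: evolve a list of joint offsets toward zero (a stand-in for a "smooth" path).
pop = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(20)]
best = evolve(
    pop,
    fitness=lambda p: -sum(abs(x) for x in p),
    mutate=lambda p: [x + random.gauss(0, 0.05) for x in p],
    crossover=lambda a, b: [random.choice(pair) for pair in zip(a, b)],
)
print(best)
```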

Researchers at DaimlerChrysler (Auburn Hills, MI) believe genetic programs will benefit numerous assembly applications. Traditionally, a programmer has to train a robot in a "teach-in" session lasting several days. Each individual movement sequence is programmed, and the robot has to keep executing the appropriate movements until they are sufficiently refined by the program.

Those same movements can be carried out with the help of a genetic program. First, a robotics expert provides the program with all necessary spatial data. The program then sets up a virtual scenario of the factory floor on its own, and the quest for the optimal movement sequence begins. It only takes about half a day for a genetic program to be developed.