As a professor in the University of Pennsylvania’s School of Engineering and Applied Science, Daniel Lee works on important — and sometimes well-funded — research projects that advance humankind’s scientific knowledge and understanding.
Over the last five years, Lee’s projects have included coaching a highly competitive canine soccer team, entering a challenging road race in the California desert and teaching a dog how to do a back flip. That last one is even harder than it sounds, as Lee recently demonstrated. The pooch successfully flipped backwards on a rubber mat on top of a three-foot-high demonstration table, but then caught the edge and fell to the floor with a loud thud. Said Lee: “Sometimes they’re not so intelligent.”
Lee’s subject was not a real dog, but a Sony Aibo — a robot. And while there is an element of whimsy in his research, Lee’s projects — teaching robotic dogs to play soccer as a cohesive team, or programming a car with sensors and small motors to navigate a traffic-laden city street with no driver — all advance the professor’s ambitious goal. Quite simply, he wants to learn how to make robots think and act like humans.
At a recent lecture titled “Smart Robots: What’s Next?” sponsored by the Executive Master’s in Technology Management program — a partnership of Penn Engineering and Wharton — Lee said there is much to be learned before robots can routinely behave like humans in a broad range of tasks. For many years, the challenge was in the basic technology that makes robots work. But despite advances in that technology, human-like intelligence — even the ability to recognize a face — remains beyond our current reach, Lee noted as he posed a key question for his field: “What makes it so hard to build something intelligent?”
Replacing Human Soldiers
The answer is more than merely academic. In the United States, the Pentagon is spending billions — as much as $100 billion over a number of years, according to some technology analysts — to develop robots that can aid or replace human soldiers. The U.S. Department of Defense’s Future Combat Systems (FCS) modernization initiative is funding research to develop that technology. Among its recipients are Lee and others on a Penn team that was awarded $22 million by the Army Research Lab to create robots that can operate in combat zones with little supervision.
While military uses have tended to dominate commercial development of autonomous robots in America, business opportunities for smart robots are also sizable, according to Lee, who points out that Japan’s research into intelligent robotics has been oriented towards helping that nation’s rapidly aging population perform domestic tasks.
Advanced research into artificial intelligence may take commercial development of robots beyond what one expert recently called “the three Ds — anything dull, dirty or dangerous.” On March 31, Honda demonstrated a helmet-like device that can read human brain waves and transmit them to a humanoid robot, also built by Honda. With such a device, a person can make the robot, named Asimo, perform simple tasks, including moving its arm.
“In Japan, they see robots as a kind of service,” said Lee, while in the United States there is slower movement toward commercial tasks for robots. “In the U.S., the National Science Foundation is funding a lot of the work so there is a mixture, but it’s more from the defense budget.”
Lee envisions commercial applications down the road. Current research on these technologies has great potential to impact future industries and businesses. “We can now build miniaturized devices and equipment containing many sensors, actuators and computational electronics; the only problem is getting them to perform intelligently and robustly in a variety of environments. Not only will future gadgets store information and allow users to play games, but they will also be more helpful,” he says.
A physicist by training with an undergraduate degree from Harvard and a PhD in condensed matter physics from MIT, Lee was drawn toward the study of robots after he went to work for Bell Labs during the 1990s. He wanted to learn more about why — despite rapid advances in computer technology — robots were still unable to perform tasks that humans and other animals can handle with ease. In particular, Lee now studies the biology of living creatures to better understand how we compute information, and works to transfer that understanding to the world of robots.
“You [have] all these devices with all these computerized [components] and all these sensors, but people don’t know what to do with them,” Lee said of the current state of robot development. The best road map for building a smarter machine, he argues, is the human brain.
The notion of mimicking biology to advance technology is not new. Lee noted that before the idea of aerodynamic lift was first applied in the early 20th century, people believed you only needed wings to be able to fly. “You would see a guy put on a pair of wings and jump from a building — but that didn’t work very well.” The Wright Brothers’ successful flight in 1903 was, in part, a result of studying the real mechanics of bird flight more closely. Birds flap their wings for stability and propulsion, but they fly because the curved shape of their wings moving through the air creates higher pressure below the wings than above them, providing the lift necessary for flight. “So the pilot’s main job became to pull a flap to warp the wings of the flying machine.”
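For readers who want the principle in symbols, textbook aerodynamics summarizes the lift Lee alludes to as

    L = \tfrac{1}{2}\, \rho\, v^{2}\, S\, C_L

where \rho is the air density, v the speed of the wing through the air, S the wing area and C_L a lift coefficient that depends on the wing’s curvature and angle of attack. This is the standard formula rather than anything specific to the lecture; the point is simply that speed and wing shape, not flapping, are what generate lift.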
But the human brain is much more difficult to replicate than a bird’s wings.
Computers can apply “brute force” number crunching to outplay a chess champion, Lee noted, but that approach can only be taken so far. The brain, he said, is a much more elegant machine that is far from fully understood. “Traditional computer algorithms which do fast search and brute computation will not make machines intelligent. So we need to develop algorithms that approach these problems in different ways in order to build robots that can perform in complex environments. There has been some nice work already, but there is still a very long ways to go before a machine can be as smart as a dog.”
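To make the “brute force” approach concrete, here is a minimal Python sketch of the minimax idea behind classic chess programs, applied to a toy take-away game (players remove one to three stones; whoever takes the last stone wins). The game and scoring are invented for illustration; this is not code from Lee’s lab or from any real chess engine, but it shows how such programs simply enumerate every possible line of play.

    def legal_moves(pile):
        # In this toy game a player may remove 1, 2 or 3 stones.
        return [m for m in (1, 2, 3) if m <= pile]

    def minimax(pile, maximizing):
        """Exhaustively search every sequence of moves and return +1 if the
        maximizing player can force a win from this position, else -1."""
        if pile == 0:
            # The player to move has nothing left to take: the previous
            # player took the last stone and has already won.
            return -1 if maximizing else +1
        scores = [minimax(pile - m, not maximizing) for m in legal_moves(pile)]
        return max(scores) if maximizing else min(scores)

    print(minimax(5, True))   # +1: the first player can force a win from 5 stones

The same enumerate-everything strategy, scaled up with enormous computing power, is what beat human chess champions; Lee’s point is that it does not generalize to the open-ended situations robots face.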
However, other feats of biological creatures have been mastered with robots — as Lee demonstrated with his Sony Aibo. “Hello,” he shouted to the electronic canine, causing the device to pivot its head directly toward him. He walked to the other side and said, “Hey, over here,” and the robot spun its head around. To develop this skill, designers had to build sound sensors that mimic human hearing: tiny differences in the arrival time of sound waves at the left and right sensors tell the robot where the sound originated, and which way to turn its head.
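The head-turning trick rests on simple geometry: sound from one side reaches the nearer microphone a fraction of a millisecond earlier, and the size of that gap reveals the direction. The Python sketch below is a hypothetical illustration of the calculation, not Sony’s or Lee’s actual code; the microphone spacing and delay are made-up example numbers.

    import math

    SPEED_OF_SOUND = 343.0   # meters per second in room-temperature air

    def bearing_from_delay(delay_s, mic_spacing_m):
        """Estimate the direction of a sound source, in degrees, from the
        difference in arrival time at two microphones (0 = straight ahead,
        positive = toward the right microphone)."""
        # A positive delay means the sound reached the right microphone first.
        path_difference = SPEED_OF_SOUND * delay_s
        # Clamp to the physically possible range before taking the arcsine.
        ratio = max(-1.0, min(1.0, path_difference / mic_spacing_m))
        return math.degrees(math.asin(ratio))

    # Example: microphones 10 cm apart, sound arrives 150 microseconds earlier on the right.
    print(bearing_from_delay(150e-6, 0.10))   # roughly 31 degrees to the right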
Over time, Lee said, the steady improvement in robots’ ability to sense their surroundings and perform motorized tasks will enhance their artificial intelligence — their ability to perform certain tasks without continuous human guidance. This is where international soccer competitions involving four-robot teams of the Sony Aibo dogs have proved both a fun diversion and an invaluable learning tool for Lee and his students.
The “UPennalyzers” — a soccer team created by Lee, Penn engineering students and faculty — consists of four robotic dogs, each carrying an onboard computer programmed to use the robot’s sensors to analyze the field of play. The robots use wireless communications to talk with their teammates — but they are not controlled by humans during the course of the 20-minute games, which are played on a three-meter by five-meter field.
Lee said the robot games were crude at first. “Ever see a six- or seven-year-old [kids’] soccer game, where everybody goes after the ball at the same time? That’s how this was, so the ball would go into the corner for a half-hour, or they would kick it into their own goal.” But over time, the robots developed the intelligence to play soccer at a higher level, as Lee demonstrated with a video of a sophisticated goal by the UPennalyzers in their most recent match. “Now they can communicate: ‘You go for the ball and I’ll play defense.'”
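The kind of division of labor Lee describes can be produced by a very simple protocol: each robot broadcasts its distance to the ball, and only the closest one attacks while the rest defend. The Python sketch below is a hypothetical illustration of that idea, with invented names and coordinates, not the UPennalyzers’ actual code.

    from math import hypot

    def assign_roles(robot_positions, ball_position):
        """Make the robot closest to the ball the attacker; everyone else defends."""
        bx, by = ball_position
        distances = {name: hypot(x - bx, y - by)
                     for name, (x, y) in robot_positions.items()}
        attacker = min(distances, key=distances.get)
        return {name: ("attacker" if name == attacker else "defender")
                for name in robot_positions}

    # Example with made-up field coordinates in meters.
    team = {"aibo1": (0.5, 1.0), "aibo2": (2.0, 2.5), "aibo3": (4.0, 0.5)}
    print(assign_roles(team, (2.2, 2.4)))
    # -> {'aibo1': 'defender', 'aibo2': 'attacker', 'aibo3': 'defender'}

In practice each robot would run the same rule on the shared, wirelessly broadcast estimates, so all four reach the same assignment without a human in the loop.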
One reason the team uses robotic dogs instead of humanoid robots is that two-legged locomotion has proved extremely difficult for robots. In recent years, Lee and his team have received at least $3.5 million in grant money from DARPA — the Pentagon’s Defense Advanced Research Projects Agency — to teach robots to walk and also to develop a class of robots that can use sensors to autonomously navigate difficult terrain. The military advantages of such devices, which could transport heavy loads over mountains or across deserts in a dangerous combat zone, are obvious, but successful two-legged navigation remains something of a Holy Grail for robot programmers. “They want a machine that can walk on its own with 200 pounds over a mountain, because you can’t take a car up a trail,” he said.
New Use for Cigarette Lighters?
However, developing driverless smart cars — a project that’s also the recipient of significant DARPA funding, with the ambitious goal of making one-third of U.S. military vehicle transportation driverless by 2015 — has proven both more successful and a source of competitive pride for Lee and his students. They recently were among just six teams to complete the entire course at the 2007 DARPA Urban Challenge in Victorville, Calif., held on the streets of an abandoned U.S. Air Force base.
“Little Ben” — the name that the Ben Franklin Team, composed of students and faculty from both Penn and Lehigh, gave to their overhauled Toyota Prius — had to navigate city streets while obeying traffic laws and signs and avoiding collisions with the other finalists. To make the event more realistic and challenging, there were also 50 human test drivers, wearing crash helmets, cruising the streets in their vehicles. A crowd of onlookers stood behind thick concrete barriers.
Carnegie Mellon University, working with General Motors, won the event and the $2 million cash prize; Stanford University had received federal funding to develop its robotic smart cars. With less money available, the Ben Franklin Team developed a car that required less power — and burned less gas — through the clever placement of fewer sensors. Said Lee: “With the Prius, we were able to use the cigarette lighter to power all the computers on board.”
Lee told his Penn audience there will be a sure way to know when robots have reached the next level of intelligence, and that is when they have mastered the highly complex human task of loading a dishwasher. “Industrial automated robots took us a long time,” he said, “and now we need a way to get a robot to load a dishwasher, when every night you have a different set of dishes and glasses.”