When people think of self-driving cars, the image that usually comes to mind is a fully autonomous vehicle with no human drivers involved. The reality is more complicated: Not only are there different levels of automation for vehicles — cruise control is an early form — but artificial intelligence is also working inside the car to make a ride safer for the driver and passengers. AI even powers a technology that lets a car understand what the driver wants to do in a noisy environment: by reading lips.
In Silicon Valley, there is a race to develop the best technology for autonomous vehicles. “It’s perhaps among the most exciting times to be talking about autonomous vehicles,” said Wharton professor of operations, information and decisions Kartik Hosanagar on a panel at the recent AI Frontiers conference in Silicon Valley. “Ten years back, most of the work with autonomous vehicles was just going on in research labs and various educational institutions.” About five years ago, only Google and a handful of companies were testing them. “Today, there’s a frenzy of activity,” he said. “Just in California, the number of companies that have licenses to do testing and operating of driverless vehicles is already somewhere in the 30 to 50 range.”
Globally, the U.S. and China are ahead in the self-driving race. Germany and Japan, despite being famous for their autos, are behind. “The key difference is AI,” said Tony Han, co-founder of China-based autonomous vehicle company JingChi. “China and the U.S. are leading in AI.” When it comes to self-driving regulations, China and the U.S. also lead. What’s driving this intense interest are three mega-trends: the rising popularity of electric vehicles, emergence of the shared economy that is powering ride-sharing firms like Uber and Lyft, and advancements in artificial intelligence. If you think about it, he said, autonomous driving is really about combining a robot driver with an electric car.
According to Han, most autonomous vehicle firms are developing technology that is suitable for what he calls a level 4 roadster. There are five levels of automation in self-driving cars. Level 1 is the most minimal, with a typical feature being cruise control that has been around for years. Level 5 is the most advanced, with the vehicle being fully autonomous. Level 4 is a notch below — a highly automated level where the car can operate in certain situations without driver intervention or attention, such as in specially fenced off areas or in traffic.
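The level taxonomy Han refers to can be sketched as a simple lookup table. This is an illustrative Python sketch only; the level summaries paraphrase the article, and the names are not from any company's API.

```python
# Illustrative mapping of automation levels (1 = minimal, 5 = fully
# autonomous) to the degree of driver involvement each requires.
AUTOMATION_LEVELS = {
    1: "Driver assistance: a single feature such as cruise control",
    2: "Partial automation: combined steering and speed control; driver monitors",
    3: "Conditional automation: driver may look away but must retake control",
    4: "High automation: no driver attention needed in geofenced areas or traffic",
    5: "Full automation: no human driver required anywhere",
}

def describe(level: int) -> str:
    """Return a short description of an automation level."""
    if level not in AUTOMATION_LEVELS:
        raise ValueError(f"Unknown automation level: {level}")
    return AUTOMATION_LEVELS[level]
```

A level 4 roadster in Han's sense would map to `describe(4)`: highly automated, but only within defined areas or conditions.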
“This is not a recommendation engine for Netflix. The AI has to be spot on.” –Danny Shapiro
AI Inside the Car
Danny Shapiro, senior director of automotive at chipmaker Nvidia, said tech companies take the development of autonomous vehicle technology seriously because the stakes are high. “This is not a recommendation engine for Netflix,” he said at the conference. “The AI has to be spot on.” That means it requires “extreme” computing power and a lot of code, Shapiro said. In the self-driving vehicle’s trunk are powerful computers and graphics processing units doing deep learning to parse all the data coming in — to determine such things as whether the object ahead is a person, another car, a fire hydrant and so on.
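The final step of the perception pipeline Shapiro describes, deciding whether an object is a person, a car, or a fire hydrant, can be sketched in miniature. A real system runs a deep neural network over camera frames; this toy version (labels and logit values are invented for illustration) just converts raw class scores into probabilities with a softmax and picks the most likely label.

```python
import math

# Hypothetical label set for the toy classifier.
LABELS = ["person", "car", "fire hydrant"]

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the most likely label and its probability."""
    probs = softmax(logits)
    best = max(range(len(LABELS)), key=lambda i: probs[i])
    return LABELS[best], probs[best]
```

For example, `classify([0.2, 3.1, 0.4])` returns `"car"` with high confidence; in a real vehicle, thousands of such decisions per second feed the planning system.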
Even if it will take some time for fully autonomous vehicles to hit the market, AI is already transforming the inside of a car. Front-facing cameras can identify people in the vehicle and track the driver’s eye position to see whether he or she is falling asleep or distracted — and even read the driver’s lips. Sensors and cameras outside the car work with interior technology to enhance safety. For example, the car warns audibly of “cross traffic danger” if another vehicle is about to run a red light. It can also say things like “Careful! There is a motorcycle approaching the center lane!” to alert a driver who wants to change lanes. “There’s going to be a whole host of guardian angel-type features even if we’re not fully self-driving,” Shapiro said.
Indeed, a major goal of self-driving companies is to make driving safer. Human error is responsible for 94% of car crashes, said Jeff Schneider, senior engineering manager at Uber and a research professor at Carnegie Mellon University. He noted that half of the mistakes leading to accidents were due to recognition errors — the driver was not paying attention or did not see something coming. The other half was the result of a decision error: The driver was going too fast or misunderstood the situation.
According to Schneider, self-driving vehicles can address these two types of errors. Problems of recognition would be mitigated by using sensors, radar, cameras, Lidar (a remote sensing system) and other tools. The cars can see 3D positioning of objects and other things around them, receive 360-degree camera views in high resolution and access other pertinent data such as velocities of objects. Meanwhile, sophisticated computing systems analyze the landscape to make the right driving decisions.
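Combining radar, camera and Lidar readings into one estimate, as Schneider describes, is the classic sensor-fusion problem. A minimal sketch, with made-up noise figures purely for illustration: weight each sensor's position estimate by the inverse of its variance, so the more precise sensor dominates the fused result.

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Inverse-variance weighted average of two independent estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# Hypothetical readings: radar says the car ahead is 50.0 m away but is
# noisy (variance 4.0); Lidar says 49.2 m and is precise (variance 0.25).
fused = fuse(50.0, 4.0, 49.2, 0.25)  # lands close to the Lidar estimate
```

Production systems use far more sophisticated filters (and many more sensors), but the principle is the same: no single sensor is trusted on its own.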
“Put yourself in the [position] of the person writing code [for driverless cars]. You have absolute chaos.” –Jeff Schneider
One way to help accuracy is by incorporating redundancy in the systems. For example, if a road sign were somehow obscured, measures are put in place to make sure the self-driving car does not get confused. Schneider said the car’s own map would inform it that there is a road sign at that location. Also, these vehicles go through enormous amounts of data to train them to operate under various conditions such as snow, rain, sleet and floods. Autonomous vehicle companies even use computer-generated conditions to train the car to drive through such things as a blinding sunset. “Using a rack of servers, we can generate over 300,000 miles [of driving] in just five hours, and test algorithms on every paved road in the U.S. in just two days,” Nvidia’s Shapiro noted.
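The map-based redundancy Schneider describes can be sketched in a few lines. All of the names here (`detect_sign`, `map_signs`, the street locations) are hypothetical, purely for illustration: if the live detector returns nothing because the sign is obscured, the car falls back to the sign recorded in its map for that location.

```python
# Hypothetical prebuilt map: signs recorded by intersection.
map_signs = {("oak_st", "3rd_ave"): "stop"}

def detect_sign(camera_frame: dict):
    """Stand-in for a vision model; returns None when the sign is obscured."""
    return camera_frame.get("visible_sign")

def sign_at(location, camera_frame):
    """Prefer the live detection; fall back to the map when it fails."""
    detected = detect_sign(camera_frame)
    if detected is not None:
        return detected               # trust the camera first
    return map_signs.get(location)    # redundancy: consult the stored map
```

So even with the stop sign fully obscured (`camera_frame = {}`), `sign_at(("oak_st", "3rd_ave"), {})` still returns `"stop"`.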
To be sure, these are complicated tasks for the car. “Put yourself in the [position] of the person writing code” who has to account for people crossing the street, other cars on the road, billboards, traffic signs ahead and lanes for cars, bikes and pedestrians, among others, Schneider said. “You have absolute chaos.”
Safety and Security
To skeptics who see a fully autonomous vehicle as a pipe dream, it would be helpful to look back at how far autonomous vehicles have come, Schneider said. As early as the 1980s, Carnegie Mellon University’s NavLab project already was equipping vans with computers and sensors for automated and assisted driving. “It was the age of robotics when the rule was to keep the video running just in case something good happens,” he said. In 1995, the university’s “No Hands Across America” drive from Pittsburgh to Southern California was 98% autonomous and included a 70-mile stretch without human intervention, Schneider said.
To skeptics who see a fully autonomous vehicle as a pipe dream, it would be helpful to look back at how far autonomous vehicles have come.
In 2000, the university moved to off-road vehicles, adding GPS and Lidar to make it easier to pinpoint objects and get around them. Seven years later, at the DARPA Urban Challenge, a contest for autonomous vehicles, a major development was the addition of detailed maps that provided a full reconstruction of the environment. “AI took a step forward,” Schneider said. CMU won the contest. It was also at this point that Google recognized the potential of autonomous vehicles and started its self-driving project, he said. Since then, AI, machine learning and deep learning have gotten even better.
Still, will consumers feel comfortable riding in a self-driving car? Based on Uber’s experience testing autonomous vehicles in Pittsburgh and Phoenix, Schneider said, the public seems to be open to riding in them. While there was some concern initially that people would be scared of these cars, “what we found was exactly the opposite,” he said. For example, since riders cannot choose a self-driving Uber, some customers would chase these vehicles while calling for rides in hopes of landing the car.
However, what could put a damper on the development of mass market self-driving cars is the business model. For now, it’s still more economical to own a car than take Uber everywhere. “If you just run the numbers, financially it’s not cheaper to do that than to own your own car,” Schneider said. “Once autonomous vehicles work and they’re everywhere … it won’t make sense to own a car.”
Anumakonda Jagadeesh
Excellent.
An autonomous car (also known as a driverless car, self-driving car or robotic car) or unmanned ground vehicle is a vehicle that is capable of sensing its environment and navigating without human input. On November 7, 2017, Waymo announced that it had begun testing driverless cars without a safety driver in the driver position, though there is still an employee in the car.
Autonomous cars use a variety of techniques to detect their surroundings, such as radar, laser light, GPS, odometry and computer vision. Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. Autonomous cars must have control systems that are capable of analyzing sensory data to distinguish between different cars on the road.
The potential benefits of autonomous cars include reduced mobility and infrastructure costs, increased safety, increased mobility, increased customer satisfaction and reduced crime: specifically, a significant reduction in traffic collisions, the resulting injuries, and related costs, including less need for insurance. Autonomous cars are predicted to increase traffic flow; provide enhanced mobility for children, the elderly, the disabled and the poor; relieve travelers from driving and navigation chores; lower fuel consumption; significantly reduce the need for parking space; reduce crime; and facilitate business models for transportation as a service, especially via the sharing economy.
Among the main obstacles to widespread adoption are technological challenges, disputes concerning liability; the time period needed to replace the existing stock of vehicles; resistance by individuals to forfeit control; consumer safety concerns; implementation of a workable legal framework and establishment of government regulations; risk of loss of privacy and security concerns, such as hackers or terrorism; concerns about the resulting loss of driving-related jobs in the road transport industry; and risk of increased suburbanization as travel becomes less costly and time-consuming. Many of these issues are due to the fact that autonomous objects, for the first time, allow computers to roam freely, with many related safety and security concerns.
A classification system based on six different levels (ranging from fully manual to fully automated systems) was published in 2014 by SAE International, an automotive standardization body, as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems. This classification system is based on the amount of driver intervention and attentiveness required, rather than the vehicle’s capabilities, although these are very loosely related. In the United States, the National Highway Traffic Safety Administration (NHTSA) released a formal classification system in 2013, but abandoned it in favor of the SAE standard in 2016. Also in 2016, SAE updated its classification, called J3016_201609.
Levels of driving automation
In SAE’s autonomy level definitions, “driving mode” means “a type of driving scenario with characteristic dynamic driving task requirements (e.g., expressway merging, high speed cruising, low speed traffic jam, closed-campus operations, etc.)”
• Level 0: The automated system issues warnings and may momentarily intervene but has no sustained vehicle control.
• Level 1 (“hands on”): The driver and the automated system share control of the vehicle. An example would be Adaptive Cruise Control (ACC), where the driver controls steering and the automated system controls speed. With Parking Assistance, steering is automated while speed is manual. The driver must be ready to retake full control at any time. Lane Keeping Assistance (LKA) Type II is a further example of Level 1 self-driving.
• Level 2 (“hands off”): The automated system takes full control of the vehicle (accelerating, braking and steering). The driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. The shorthand “hands off” is not meant to be taken literally; in fact, contact between hand and wheel is often mandatory during SAE Level 2 driving, to confirm that the driver is ready to intervene.
• Level 3 (“eyes off”): The driver can safely turn their attention away from the driving tasks; e.g., the driver can text or watch a movie. The vehicle will handle situations that call for an immediate response, like emergency braking. The driver must still be prepared to intervene within some limited time, specified by the manufacturer, when called upon by the vehicle to do so. In 2017, the Audi A8 luxury sedan was the first commercial car to claim to be capable of Level 3 self-driving. The car has a so-called Traffic Jam Pilot: when activated by the human driver, the car takes full control of all aspects of driving in slow-moving traffic at up to 60 kilometers per hour. The function works only on highways with a physical barrier separating oncoming traffic.
• Level 4 (“mind off”): As Level 3, but no driver attention is ever required for safety; i.e., the driver may safely go to sleep or leave the driver’s seat. Self-driving is supported only in limited areas (geofenced) or under special circumstances, like traffic jams. Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, i.e., park the car, if the driver does not retake control.
• Level 5 (“steering wheel optional”): No human intervention is required. An example would be a robotic taxi.
Public opinion surveys
In a 2011 online survey of 2,006 US and UK consumers by Accenture, 49% said they would be comfortable using a “driverless car”.
A 2012 survey of 17,400 vehicle owners by J.D. Power and Associates found 37% initially said they would be interested in purchasing a fully autonomous car. However, that figure dropped to 20% if told the technology would cost $3,000 more.
In a 2012 survey of about 1,000 German drivers by automotive researcher Puls, 22% of the respondents had a positive attitude towards these cars, 10% were undecided, 44% were skeptical and 24% were hostile.
A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% “stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver”, with Brazil, India and China the most willing to trust autonomous technology.
In a 2014 US telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an autonomous car was available instead.
In a February 2015 survey of top auto journalists, 46% predicted that either Tesla or Daimler would be first to market with a fully autonomous vehicle, while Daimler (at 38%) was predicted to produce the most functional, safe and in-demand autonomous vehicle.
In 2015 a questionnaire survey by Delft University of Technology explored the opinion of 5,000 people from 109 countries on automated driving. Results showed that respondents, on average, found manual driving the most enjoyable mode of driving. 22% of the respondents did not want to spend any money for a fully automated driving system. Respondents were found to be most concerned about software hacking/misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicle transmitting data. The survey also gave results on potential consumer opinion on interest of purchasing an automated car, stating that 37% of surveyed current owners were either “definitely” or “probably” interested in purchasing an automated car.
In 2016, a survey in Germany examined the opinion of 1,603 people, who were representative in terms of age, gender, and education for the German population, towards partially, highly, and fully automated cars. Results showed that men and women differ in their willingness to use them. Men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. The gender difference towards anxiety was especially pronounced between young men and women but decreased with participants’ age.
In 2016, a PwC survey of 1,584 people in the United States highlighted that “66 percent of respondents said they think autonomous cars are probably smarter than the average human driver”. People are still worried about safety, mostly about having the car hacked. Nevertheless, only 13% of the interviewees see no advantages in this new kind of car (Wikipedia).
In 2016, motor vehicle-related crashes on U.S. highways claimed 37,461 lives. Our research tells us that 94 percent of serious crashes are due to dangerous choices or errors people make behind the wheel. NHTSA and our State partners have worked for decades to help drivers make safer choices – to wear seat belts on every trip, to avoid driving intoxicated, drowsy, or distracted – but people still make those choices and still crash. Driver assistance technologies seek to address these errors and save lives. Today’s new vehicles already include proven automated safety features that help drivers avoid crashes by warning them of crash risk and, in some cases, helping drivers brake or steer when they don’t react quickly enough. As driver assistance technologies improve, they may eventually result in vehicles that can control all aspects of the driving task: truly “self-driving” vehicles (Automated Vehicles for Safety, NHTSA).
At the end of 2015, Denis Sverdlov, CEO of auto manufacturer Kinetik, announced a joint venture between Formula E and Kinetik to create the Roborace self-driving race series within a year. Roborace will feature 20 identical cars allocated to 10 teams. They will run on the same circuits as Formula E, except without drivers.
The cars won’t be remote-controlled, either; they’ll be fully autonomous, using the NVIDIA Drive PX 2 supercomputer to run the software. All cars will be mechanically identical, so the winning team’s success will depend on the best artificial intelligence (AI).
In creating the new series, Sverdlov hopes to showcase self-driving cars guided by AI and powered by electricity. While this is a race series that will probably only have 20 entrants, Kinetik believes the day is not far off when self-driving cars will be the norm, thereby improving the environment and road safety.
Nevertheless, the biggest challenge self-driving cars will have to overcome on the road is being able to react to the randomness of traffic flow, other drivers, and the fact that no two driving situations are ever the same.
AI will outmaneuver human drivers
According to Danny Shapiro, senior director of automotive at NVIDIA, the latest autonomous technology is adept at handling this type of diverse environment. By using deep learning and sensor fusion, it’s possible to build a complete three-dimensional map of everything that’s going on around the vehicle to empower the car to make better decisions than a human driver ever could.
However, this requires massive amounts of computing to interpret all the harvested data, because the sensors themselves are “dumb sensors” that merely capture information; before it can be acted on, the information has to be interpreted. For example, a video camera records 30 frames per second (fps), where each frame is an image made up of thousands of pixels, each carrying several color values.
There is a massive amount of computation required to take these pixels and figure out “is that a truck?”, “is that a stationary cyclist?” or “in which direction does the road curve?” It is this type of computer vision, coupled with deep neural network processing, that self-driving cars require.
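To put a rough number on the data volume, assume a 1920×1080 camera with 3 bytes of color per pixel (the resolution and color depth are assumptions; the text above specifies only 30 fps). Then the raw stream from a single camera is:

```python
# Back-of-the-envelope data rate for one camera.
fps = 30                 # frames per second (from the text)
width, height = 1920, 1080   # assumed 1080p resolution
bytes_per_pixel = 3      # assumed 8-bit RGB

bytes_per_second = fps * width * height * bytes_per_pixel
print(f"{bytes_per_second / 1e6:.0f} MB/s per camera")  # → 187 MB/s per camera
```

Multiply that by several cameras, plus radar and Lidar streams, and the need for the “extreme” on-board computing Shapiro describes becomes concrete.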
Deep learning adds context to AI
Moving toward true AI, deep learning is a set of algorithms in machine learning that attempt to model high-level data concepts by using architectures of multiple non-linear transformations. Various deep learning architectures such as deep neural networks (DNNs), convolutional neural networks (CNNs) and deep belief networks are being applied to fields such as computer vision, automatic speech recognition and natural language processing (How AI Is Making Self-Driving Cars Smarter: “The stage is set for artificial intelligence to dominate our roads. Here’s how artificial intelligence is improving self-driving cars.” Steve Crowe, Robotics Trends).
Dr. A. Jagadeesh, Nellore (AP), India.