
Ready, set, brake

While autonomous vehicles begin to appear on roadways, gaps in knowledge are blocking the way to their full integration. Carolina researchers are asking the tough questions to ensure that the driverless car picking you up will be safe for passengers, bicyclists and pedestrians alike.

A graphic of a person walking across the street.
Graphic by Corina Cudebec

Just as no one could’ve predicted the rise of social media in the last decade, no one today can predict precisely how driverless cars will look, operate or shape our lives. All we know is that autonomous vehicles are coming. But they’re coming slower than most people think.

For every problem solved on the long road to ubiquitous self-driving cars, there are dozens of questions putting on the brakes. “It sounds like it’s going to be this fantastic new world, but then you start picking apart all the different subtle questions in between,” said Michael Clamann, senior human factors engineer and autonomous vehicle expert with the UNC Highway Safety Research Center.

Those questions go beyond software and hardware. Making autonomous vehicles safe requires perfect alignment of computer science, engineering, psychology, sociology, and policy. The scope of societal impact goes further, reaching into a world that we can’t yet imagine.

“We’re starting to recognize that technologies are complicated and thinking about the pluses and minuses is important,” said Noreen McDonald, chair of the Department of City and Regional Planning. “It’s going to be a long time before our cities are a bunch of people driving around in autonomous vehicles.”

At Carolina, researchers from a variety of disciplines are thinking through challenges from pedestrian safety to city parking, bringing us closer to this “new world” one day at a time.

Building the brain

Autonomous vehicles use a variety of sensors to identify pedestrians and navigate safely. But going a step further and predicting their walking paths could reduce pedestrian accidents and fatalities even more, especially in urban areas. Carolina computer scientist Aniket Bera and his team have developed one of the leading algorithms in the world for doing just that.

“For autonomous vehicles, the biggest problem is how well they can learn and understand the surroundings,” he said. “If it can learn the surroundings, it can drive efficiently and safely.”

Computer scientists have to train autonomous vehicles to identify individuals and groups as they stroll along crowded sidewalks using data-driven artificial intelligence and mathematical models.

“People tend to train models based on just data, but what we have done is try to make a combination of data and physically-based simulation,” Bera said.
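The article doesn’t spell out how that blending works in practice, but the general idea of mixing recorded trajectories with physics-based simulated ones before fitting a predictor can be sketched roughly as follows. Everything here is illustrative: the toy simulator, the fake “real” data and the simple linear predictor stand in for the team’s far richer models.

```python
# Hypothetical sketch: blending recorded pedestrian trajectories with
# physics-based simulated ones before fitting a next-step predictor.
# All data here is synthetic and the models are toys, for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_trajectory(steps=20, dt=0.1):
    """Toy physics-based simulator: a pedestrian with a preferred velocity
    plus small random accelerations (a stand-in for a crowd simulation)."""
    pos = rng.uniform(-5, 5, size=2)
    vel = rng.uniform(-1.5, 1.5, size=2)
    traj = [pos.copy()]
    for _ in range(steps - 1):
        vel += rng.normal(0, 0.1, size=2) * dt   # small random jitter
        pos += vel * dt
        traj.append(pos.copy())
    return np.array(traj)

# "Real" trajectories would come from annotated video; here they are faked.
real_trajs = [simulate_trajectory() + rng.normal(0, 0.05, (20, 2)) for _ in range(50)]
sim_trajs = [simulate_trajectory() for _ in range(200)]

def make_pairs(trajs):
    """Build (previous two positions) -> (next position) training pairs."""
    X, y = [], []
    for t in trajs:
        for i in range(2, len(t)):
            X.append(np.concatenate([t[i - 2], t[i - 1]]))
            y.append(t[i])
    return np.array(X), np.array(y)

# Combine both data sources into one training set.
X_real, y_real = make_pairs(real_trajs)
X_sim, y_sim = make_pairs(sim_trajs)
X = np.vstack([X_real, X_sim])
y = np.vstack([y_real, y_sim])

# Fit a simple linear next-step predictor on the combined dataset.
W, *_ = np.linalg.lstsq(np.hstack([X, np.ones((len(X), 1))]), y, rcond=None)
print("learned weights shape:", W.shape)
```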

This combination of psychology, artificial intelligence, simulation models, and mathematics has evolved through three stages over the years: pedestrian tracking, prediction, and behavior modeling.

The first stage, pedestrian tracking, teaches the “brain” of the autonomous vehicle the basics, such as what a person looks like in a crowd where only a hand or shoulder may be visible in the camera sensors’ live video feed. Once the computer learns that, it can predict where a pedestrian will go next. That’s where pedestrian physics come into play.

Bera and his colleagues discovered that as a crowd gets denser, its movement becomes similar to that of fluids. Using a combination of fluid dynamics and collision avoidance models, Bera’s model can accurately predict where a pedestrian will walk several seconds into the future, with accuracy not dropping below 80 percent until after seven seconds. The system retrains itself with every new video frame, comparing the difference between its prediction and reality in real time.
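The predict-then-correct loop can be caricatured in a few lines of code. A minimal sketch appears below, assuming a simple constant-velocity motion model in place of the fluid-dynamics and collision-avoidance models the team actually uses; the class and parameter names are made up for illustration.

```python
# Minimal sketch of per-frame prediction and self-correction. A constant-velocity
# model plus a learned correction term stands in for the real fluid-dynamics and
# collision-avoidance models described in the article.
import numpy as np

class OnlinePedestrianPredictor:
    def __init__(self, horizon_s=7.0, dt=0.1):
        self.horizon_steps = int(horizon_s / dt)   # article: accuracy holds to ~7 s
        self.dt = dt
        self.prev_pos = None
        self.velocity = np.zeros(2)
        self.bias = np.zeros(2)                    # learned correction term

    def update(self, observed_pos):
        """Call once per video frame with the pedestrian's observed position."""
        observed_pos = np.asarray(observed_pos, dtype=float)
        if self.prev_pos is not None:
            # Compare the last one-step prediction with reality and nudge the correction.
            predicted = self.prev_pos + self.velocity * self.dt + self.bias
            error = observed_pos - predicted
            self.bias += 0.1 * error               # simple online adjustment
            self.velocity = (observed_pos - self.prev_pos) / self.dt
        self.prev_pos = observed_pos

    def predict(self, seconds_ahead):
        """Extrapolate the pedestrian's position a few seconds into the future."""
        steps = min(int(seconds_ahead / self.dt), self.horizon_steps)
        return self.prev_pos + (self.velocity * self.dt + self.bias) * steps

# Example: feed in a short track and ask where the pedestrian will be in 3 seconds.
predictor = OnlinePedestrianPredictor()
for pos in [(0.0, 0.0), (0.1, 0.02), (0.21, 0.03), (0.33, 0.05)]:
    predictor.update(pos)
print("predicted position in 3 s:", predictor.predict(3.0))
```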

Bera teamed up with the psychology department to take the algorithm a step further by adding behavior modeling. In this data-driven process, thousands of simulations train the artificial intelligence to assign each pedestrian a combination of personality traits and behaviors, increasing the prediction’s accuracy and influencing the car’s navigation decisions. For example, children typically walk more aggressively and are more likely to dart out into the road, so the car needs to slow down or stop in response.
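A rough sketch of how such per-pedestrian traits might feed into the vehicle’s response is shown below. The trait names, values and thresholds are hypothetical stand-ins, not Bera’s actual behavior model; they only illustrate the idea that a pedestrian judged more likely to dart into the road should prompt a more cautious reaction.

```python
# Hypothetical sketch of per-pedestrian behavior traits influencing the car's
# response. Trait names, values, and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class PedestrianProfile:
    walking_speed: float      # metres per second
    aggressiveness: float     # 0 (cautious) .. 1 (likely to dart into the road)

def classify_pedestrian(apparent_height_m: float) -> PedestrianProfile:
    """Crude stand-in for a learned classifier: small pedestrians (children)
    get a more aggressive profile, as in the article's example."""
    if apparent_height_m < 1.3:
        return PedestrianProfile(walking_speed=1.6, aggressiveness=0.8)
    return PedestrianProfile(walking_speed=1.3, aggressiveness=0.3)

def vehicle_response(profile: PedestrianProfile, distance_m: float) -> str:
    """Decide how the car should react, weighting closer and more
    'aggressive' pedestrians more heavily."""
    risk = profile.aggressiveness * max(0.0, 1.0 - distance_m / 30.0)
    if risk > 0.5:
        return "stop"
    if risk > 0.2:
        return "slow down"
    return "maintain speed"

child = classify_pedestrian(apparent_height_m=1.1)
print(vehicle_response(child, distance_m=8.0))    # prints "stop"
adult = classify_pedestrian(apparent_height_m=1.8)
print(vehicle_response(adult, distance_m=8.0))    # prints "slow down"
```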

The team is also looking for cultural differences in walking styles. Pedestrians in Western countries, for example, keep more personal space between one another than pedestrians in Eastern countries.

“The system is agnostic to the culture, but it can also learn culture,” Bera said. “We can train the algorithm to learn from hundreds of videos from multiple countries at the same time, and the artificial intelligence learns the patterns in just a few minutes. That would be almost impossible for humans.”

Through his research on human behavior modeling, Bera is showing future driverless cars how to see people as more than obstacles to avoid. “A tree is different from a human being,” Bera said. “There’s emotion involved. There are ethics involved. We’re trying to make autonomous vehicles smarter by giving them a brain that understands humans better — an effort I think few people in the world are working on.”

Continue reading on Endeavors’ website.