Gillings lab uses AI to predict local air quality
Unlike current models, the DeepCTM bot shows the impact of proposed development at the neighborhood level.
When a manufacturing plant or large store eyes your community for its next location, how will the new neighbor affect the air that you breathe?
Artificial intelligence models can predict the effects on a broad scale but not at the town or neighborhood level. These broad-scale tools also require supercomputers and technical knowledge.
A research team at UNC Gillings School of Global Public Health has produced a software-based AI bot called DeepCTM that can predict air quality using local data without the need for extensive computational infrastructure or high-level expertise. Plus, the bot accounts for how chemical reactions in the local atmosphere change what’s in the air.
Led by Will Vizuete, a professor in the school’s environmental sciences and engineering department, the team is testing the bot for use on laptops and phones so public health officials and other citizens can use the predictions to lessen the adverse effects of poor air quality.
“The models that we have with chemistry work for big cities, but they don’t work well at scaling to a neighborhood or single house,” said Vizuete.
Funded by a Gillings Innovation Lab grant, Carolina and George Mason University are developing the bot for first use by the Environmental Defense Fund. Carolina’s high-performance computing research servers are processing the data. The fund will use DeepCTM predictions about proposed and existing local natural gas power plants to gauge their impact on vulnerable communities in Washington, D.C., and Florida.
To improve DeepCTM’s accuracy, Vizuete trains the bot on datasets, then compares its results with those from the Environmental Protection Agency’s supercomputing model. Both employ computer code and mathematics to create three-dimensional simulations of the atmosphere’s behavior. Users input data that includes weather information and data on emissions from cars, power plants and other sources. The model is open to researchers worldwide.
“It’s a mass model. Mass in, mass out. Mass can’t be destroyed or created. We’re basically tracking mass. But the tricky bit about this is that mass can be emitted, be diluted and transported, and can be transformed into other things,” Vizuete said.
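The mass-balance idea Vizuete describes can be sketched in a few lines of code. This is an illustrative single-box toy, not DeepCTM’s actual code: the function name, rates and fractions are all invented for the example.

```python
# Illustrative mass-balance step for one pollutant in a well-mixed "box" of air.
# Mass is tracked, never created or destroyed: it is emitted in, transported
# or diluted out, and transformed into other chemical species.

def step_mass(mass_kg, emission_rate, outflow_fraction, reaction_fraction, dt_hours):
    """Advance pollutant mass in the box by one time step."""
    emitted = emission_rate * dt_hours          # mass added by local sources
    transported = mass_kg * outflow_fraction    # mass carried away by wind/dilution
    transformed = mass_kg * reaction_fraction   # mass converted by chemistry
    return mass_kg + emitted - transported - transformed

# Start with 100 kg in the box and simulate 24 hourly steps.
m = 100.0
for _ in range(24):
    m = step_mass(m, emission_rate=5.0, outflow_fraction=0.1,
                  reaction_fraction=0.05, dt_hours=1.0)
```

With these made-up rates, the box settles toward a steady state where emissions balance removal; real chemical transport models do the same accounting across millions of grid cells and many interacting species.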
Refining the bot’s prediction of atmospheric chemistry is Vizuete’s top priority. “The chemistry of the atmosphere, that’s the hardest part. It’s hard to do the meteorology, but there’s more discovery and research in the chemistry part of the model than anything else,” he said.
The team has refined the bot so that it can mimic the EPA’s supercomputing model for air quality management. Now they want it to evolve from mimicry to greater precision.
“We give the bot the same meteorologic conditions and the same emissions as the full model. Then we ask it, ‘What would you expect the model to predict?’” Vizuete said. “It’s the same idea as ChatGPT producing a probability of the next word. What’s the probability, the most likely outcome, based on the inputs and the bot’s training?”
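The workflow Vizuete outlines is what modelers call an emulator, or surrogate model: run the expensive full model on sample inputs, then train a cheap model to predict what the full model would say for new inputs. The sketch below is a conceptual toy under that assumption; the function names, the stand-in “full model” formula and the nearest-neighbor surrogate are invented for illustration and bear no relation to DeepCTM’s internals.

```python
# Conceptual emulator workflow: learn to predict "what the full model would say."

def full_model(wind_speed, emissions):
    # Stand-in for an expensive chemical transport model run:
    # higher wind dilutes pollution, higher emissions raise it.
    return emissions / (1.0 + wind_speed)

# 1. Build a training set from a handful of full-model runs.
training = [(w, e, full_model(w, e)) for w in (1, 2, 4, 8) for e in (10, 20, 40)]

# 2. "Train" a trivial surrogate: nearest-neighbor lookup over those runs.
def surrogate(wind_speed, emissions):
    nearest = min(training,
                  key=lambda t: (t[0] - wind_speed) ** 2 + (t[1] - emissions) ** 2)
    return nearest[2]

# 3. Ask the surrogate about conditions the full model never ran.
estimate = surrogate(3.0, 25.0)
```

A real emulator would use a neural network rather than a lookup, but the trade-off is the same: the surrogate answers in microseconds on a laptop, at the cost of approximating, rather than re-deriving, the full model’s physics and chemistry.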
Making the bot more precise and easier to operate, Vizuete said, will remove barriers to adoption: computational cost, the need for a supercomputing cluster, and the technical expertise required to run the model and analyze its output.
“What we’re doing through the Gillings Innovation Lab is proof of concept. We want to produce a tool that is accurate, then apply it with EDF in real-world situations to show that it works,” he said. “If we have that, then we have preliminary data and confidence. There’s a case study that we can point to and use to possibly help North Carolina communities.”