When we talk about artificial intelligence at CleanTechnica, we’re often focused on autonomous vehicles. Because Tesla has been chasing autonomy along with battery-electric vehicles, the topics seem to go hand-in-hand, with some people considering them inextricably linked. For some of us, the idea of a gas-powered self-driving car or an EV without any autonomy features is a sort of heresy.
But, as with any dogma, it’s always good to ask questions. Because we focus on clean technology and solutions to climate change, we need to look at both EV and autonomous vehicle technology and evaluate their individual environmental merits. While it’s pretty clear that EVs are a great alternative to ICE vehicles, we’ve still examined their impacts from time to time, especially when it comes to inefficient EVs that use more energy and claim an outsized share of the battery supply.
A recent article from Virginia Tech shows us that we must also do this for autonomous vehicle technology and other AI tech. While it has great potential to do a lot of good for humanity, we must take an objective look at the environmental costs so we can make fully informed decisions. The big issue: the energy consumption involved.
To provide context on the energy consumption of AI technology, VT’s PR team gave an example. It took nine days to train an early large language model called MegatronLM, consuming 27,648 kilowatt-hours of electricity in the process. That’s roughly what three average U.S. homes use over the course of an entire year.
The significant amount of energy required for training AI models highlights the importance of developing more energy-efficient methods for AI to reduce its carbon footprint and move towards a more sustainable future.
As you’d expect to see in a press release from a university, they want us to know that they’re part of the solution. Walid Saad, a professor at Virginia Tech and the Next-G Faculty Lead for Virginia Tech’s Innovation Campus, has teamed up with Amazon to investigate the development of sustainable AI through a concept called green federated learning, or green FL.
“As more and more people adopt these types of technologies at scale (ChatGPT and Large Language Models being a case-in-point), it is imperative that we find ways to make them more sustainable, energy-efficient, and friendly to the environment,” said Saad. “If not, we could reach a point where the benefits of AI become more of an ethical concern, particularly when we think about our carbon footprint.”
This innovative approach involves implementing collaborative AI algorithms using a distributed machine learning technique. By leveraging this technology, the research team aims to reduce the energy consumption associated with traditional AI, ultimately working towards a more sustainable future for the field. With the development of green FL, the hope is that AI can be utilized without compromising efforts towards environmental sustainability.
Walid Saad and his team are focused on finding ways to make distributed AI systems, such as federated learning systems, more energy-efficient and sustainable both in the training and inference phases. To achieve this goal, the research team is exploring methods of minimizing the energy consumption associated with AI algorithms and improving scalability for larger numbers of wirelessly interconnected devices, thereby reducing the overall environmental impact of these technologies. Through their efforts, Saad and his team aspire to pave the way towards utilizing AI in a more sustainable and eco-friendly manner.
Seeking Efficiency Doesn’t Mean Giving Up On The Benefits of AI Technology
When I first read the headline of the VT article, I have to admit that my eyes rolled a bit. As with the bitcoin and cryptocurrency environmental debate, sometimes people have legitimate environmental concerns, and sometimes people are just hunting for any excuse to justify their prejudices and political agendas.
But, after reading the VT article, it’s pretty clear that the researchers aren’t just trying to grind some sort of neo-Luddite or pro-state axe against artificial intelligence technology. Instead of merely identifying a problem, they are working hard to identify solutions to it.
Among these solutions is federated learning, a distributed machine learning approach where the model or algorithm is trained using data stored on multiple, decentralized devices instead of in a centralized location or cloud server. Rather than sending data to a central location, the data remains on the device, and the model is trained where the data was collected. This method improves privacy and security, since sensitive data never leaves the device, and it reduces the amount of data that must be transferred and stored, which can lower costs and increase efficiency.
By aggregating information learned across many different devices, federated learning is useful when individual devices have only sparse data but the collective pool can provide more insight, enabling the deployment of more powerful AI algorithms. It’s a technique that can improve collaboration while reducing energy consumption, making AI more sustainable.
Federated learning improves power efficiency by reducing the amount of data that must be transferred and stored, which can significantly lower energy expenditure. Additionally, since models are trained on decentralized devices rather than a central cloud server, it reduces the need for long-distance data transfers, thereby saving energy.

On top of this, because each device trains only its local portion of the work rather than the model in its entirety, the overall time required for training can shrink, resulting in more efficient energy use. By taking advantage of local data while using fewer resources and less energy, federated learning is an effective approach to making AI more sustainable and energy-efficient.
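For readers curious what this looks like in practice, here’s a minimal sketch of the core aggregation idea behind federated learning (often called federated averaging). The toy linear model, synthetic data, and three-device setup are my own illustrative assumptions, not anything from the VT research — real systems layer on client sampling, secure aggregation, and far larger models.

```python
# A minimal federated averaging (FedAvg) sketch, assuming a toy linear
# model y = w*x + b and synthetic per-device data. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def make_device_data(n_points):
    # Each device privately observes samples of y = 2x + 1 plus noise
    x = rng.uniform(-1, 1, size=n_points)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.05, size=n_points)
    return x, y

def local_update(weights, x, y, lr=0.1, epochs=5):
    """Train on one device's local data; only weights leave the device."""
    w, b = weights
    for _ in range(epochs):
        err = (w * x + b) - y
        w -= lr * np.mean(err * x)   # MSE gradient w.r.t. w
        b -= lr * np.mean(err)       # MSE gradient w.r.t. b
    return np.array([w, b])

# Three devices, each holding private data that never leaves it
devices = [make_device_data(50) for _ in range(3)]
global_weights = np.zeros(2)

for _ in range(50):
    # Each device trains locally, then sends back only its model weights
    local_weights = [local_update(global_weights, x, y) for x, y in devices]
    # The server aggregates by averaging -- no raw data is transferred
    global_weights = np.mean(local_weights, axis=0)

print(global_weights)  # converges toward [2.0, 1.0]
```

The energy-relevant point is visible in the loop: each round moves just two numbers per device instead of 50 raw data points, which is where the savings in data transfer and storage come from.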
For EVs, this could mean not only less energy consumption, but greater range when using computer hardware for autonomous driving in the future.
This basically means that we can get the benefits of machine learning and related AI technologies without unnecessary environmental impacts. And really, this is at the heart and soul of what clean technology is. Unlike far left extremist groups who set fire to SUVs on dealer lots or slash tires in a bid to punish consumption, the idea should instead be to help humanity keep the things it loves whenever possible while taking care of environmental needs at the same time.
If VT’s research (and the efforts of others working on Federated Learning) pans out, a real win-win solution for AI technology can be achieved.