This is a guest post written by Lucila Houttuijn Bloemendaal, who is part of the AGU Voices for Science program where she is one of 35 advocates for science.
Lucila Houttuijn Bloemendaal is a PhD student in Earth and Environment at Boston University studying sedimentology and coastal geology. Lucila works to understand how coastlines change with sea level rise, storms, and flooding, in order to inform coastal resiliency decisions. Previously, Lucila studied Earth and Ocean Sciences at Duke University, where her research in paleoceanography reconstructed the past thermocline of the Tropical North Atlantic and related it to changes in large-scale ocean circulation. Alongside mucking around in the marshes and beaches of Massachusetts, Lucila has been working on science outreach and communication through the American Geophysical Union’s Voices for Science program. Twitter: @lucilabradorite
Original article: Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and Policy Considerations for Deep Learning in NLP. Retrieved from http://arxiv.org/abs/1906.02243
Artificial intelligence is used to develop algorithms that can process human languages and even respond to you. Behind every voice assistant like Amazon’s Alexa is a network of algorithms that help the voice assistant understand and interact with us. Behind every voice assistant are also hundreds of thousands of pounds of CO2 emissions. Where do these emissions come from and what can we do about it?
Artificial Intelligence and Human Language
Can you guess how much carbon was emitted to get a fully functioning Alexa into your home? There are carbon emissions from the raw resources like rare earth elements for electronics, from manufacturing, and from shipping the device to you, but what about getting Alexa to understand you when you ask for the week’s weather? A team of researchers at the University of Massachusetts Amherst, including Emma Strubell, who studies natural language processing, set out to answer that question. Natural language processing is behind Alexa’s ability to interpret what you are saying and ultimately give you a response. It is also behind Google’s search engine and many other technologies that work with human languages. Natural language processing is an application of deep learning, a subset of artificial intelligence that learns and can predict patterns in data. Deep learning is challenging to develop with high accuracy, is incredibly important for modern technologies, and uses a shocking amount of energy.
With these deep learning techniques, the most accurate computer algorithms are usually the most energy-intensive to train and perfect. These algorithms can take hours or days to run and “learn” patterns. Newly developed algorithms, which require many iterations of training and experimentation, can even take half a year (depending on the number of graphics processing units used). All this time taken to train and develop algorithms can be directly translated to kilowatt hours used.
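To see how training time translates into energy and emissions, here is a back-of-envelope sketch in Python. The formula (GPU power draw × hours × data-center overhead × grid carbon intensity) mirrors the general approach of the original article, but all of the specific numbers below — the 250 W per-GPU draw, the 1.58 overhead factor, and the 0.954 lbs CO2 per kWh grid average — are illustrative assumptions for this sketch, not measurements from the study.

```python
# Hypothetical back-of-envelope estimate: training time -> kWh -> CO2.
# All constants below are illustrative assumptions, not figures from
# the Strubell et al. paper.

def training_emissions_lbs(num_gpus, hours, watts_per_gpu=250,
                           pue=1.58, lbs_co2_per_kwh=0.954):
    """Estimate CO2 emissions (in pounds) for one training run.

    watts_per_gpu   -- assumed average power draw of a single GPU
    pue             -- power usage effectiveness (data-center overhead,
                       e.g. cooling), multiplies the raw hardware draw
    lbs_co2_per_kwh -- assumed average carbon intensity of the grid
    """
    kwh = num_gpus * watts_per_gpu * hours * pue / 1000.0
    return kwh * lbs_co2_per_kwh

# Example: 8 GPUs running continuously for two weeks (336 hours)
print(f"{training_emissions_lbs(8, 336):.0f} lbs CO2")
```

Even this modest hypothetical run lands around a thousand pounds of CO2, and the newly developed models described above train for far longer, on far more hardware, across many experimental iterations.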
Since most energy does not yet come from carbon-neutral or renewable sources but from fossil fuels, most of the energy used by these deep learning models results in CO2 emissions. An extreme example of energy usage in deep learning is training one big natural language processing algorithm with neural architecture search. Neural architecture search essentially automates the design of artificial neural networks, which in turn are algorithms designed to recognize patterns. The research team found that training such a computer model contributes around 630,000 pounds of CO2 emissions — and remember, this is not yet the final product. For comparison, an average car emits around 130,000 pounds of CO2 over its whole lifetime. These computations can also cost on the order of hundreds of thousands of dollars, limiting who can develop these algorithms.
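Putting the two figures quoted above side by side makes the scale concrete; this is simple arithmetic on the article's rounded numbers, not new data:

```python
# Rough comparison using the rounded figures quoted above.
nas_training_lbs = 630_000   # training one large NLP model with
                             # neural architecture search
car_lifetime_lbs = 130_000   # an average car over its entire lifetime

print(f"{nas_training_lbs / car_lifetime_lbs:.1f} car lifetimes")
# -> roughly 4.8 car lifetimes of CO2, before the model is even final
```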
The Cost of Computer Science Carbon
Aside from the environmental and financial costs of deep learning techniques, there may also be social costs. Climate change does not affect everyone equally: its effects, and those of other practices that harm the environment, disproportionately impact disenfranchised communities that so far have had little say in the environmental regulation and management process. These neural networks contribute to climate change, and thus to these social inequities, through the daunting amount of CO2 emissions they produce. The energy used to train these algorithms might also be better spent powering a home or a hospital.
The Benefit in the Cost-Benefit
So, these abstract algorithms that we now rely on for everyday life release more carbon than several transcontinental flights. Are all neural networks hurting our efforts to mitigate climate change? These deep learning networks have become essential to how we work, and many applications of artificial intelligence are being used to help the environment. Neural networks can help us water crops while conserving water; others help us track the spread of diseases; some even map where humans live (check out this previous Envirobites article).
Deep learning and, more broadly, artificial intelligence have a place in climate mitigation and in our efforts to understand the world. These technologies help us track land use change, manage resource use, and predict future environmental change. The benefits are immense, and to make sure this research remains in line with our environmental goals, the algorithms need to be efficient. Requiring published models to detail the computing hours and energy used in developing them may be a first step. The energy consumption and carbon emissions of this technology need to be acknowledged and detailed, and minimizing energy use should be one of the developers’ goals. Next time, ask Alexa how much it has contributed to carbon emissions.
Cover image: A server farm. Photo by Laboratorio Linux. https://www.flickr.com/photos/[email protected]/33371413545