A report released earlier this year by the KTH Royal Institute of Technology in Sweden shows that the Internet accounts for more than 10 percent of global electricity use. Back in 2012, the Internet’s share of electricity consumption stood at 8 percent.
Gobbling up electricity
“There is a strong trend to push electricity consumption onto the network and data center infrastructure where energy costs are less transparent to consumers. Some challenges are identified for networking and data-center sectors. Of these the global roll-out of LTE (Long Term Evolution for mobile broadband) will be a crucial determinant of future electricity demand,” the report states (Inside Scandinavian Business).
The study estimates that the Internet’s electricity consumption will rise from 1,982 TWh to 3,422 TWh per year, which would be an annual increase of 10 to 13.5 percent. To put the figures into perspective, renewable energy sources like solar and wind add only about 2,151 TWh per year to the global electricity pool. The increasing use of smartphones, tablets, data centers, and IoT devices will only keep pushing up the Internet’s share of electricity consumption.
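A quick back-of-the-envelope check shows what those growth rates imply. This is a sketch, assuming the study’s range describes annual consumption growing from 1,982 TWh to 3,422 TWh at a compound rate (the exact time horizon is not stated in the article):

```python
import math

start_twh = 1_982   # estimated annual Internet electricity use (TWh)
end_twh = 3_422     # projected annual use (TWh)

# Years of compound growth needed to move from start to end
# at each of the report's annual growth rates.
for rate in (0.10, 0.135):
    years = math.log(end_twh / start_twh) / math.log(1 + rate)
    print(f"{rate:.1%} per year -> about {years:.1f} years")
```

At 10 percent a year the jump takes roughly five to six years; at 13.5 percent, roughly four to five.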
According to the International Energy Agency (IEA), global electricity demand rose by 4 percent in 2018, which is its fastest growth rate since 2010. Though renewable energy production was up, coal and gas-fired power plants continued to be the major source of electricity, driving up carbon emissions from the sector by around 2.5 percent.
The future of the Internet and electricity
Back in 2017, Swedish researcher Anders Andrae warned that the growth of Internet-connected devices could consume a substantial share of the world’s electricity in the future.
“Global computing power demand from Internet-connected devices (ICT), high-resolution video streaming, emails, surveillance cameras and a new generation of smart TVs is increasing 20 percent a year, consuming roughly 3-5 percent of the world’s electricity in 2015… Without dramatic increases in efficiency, the ICT industry could use 20 percent of all electricity and emit up to 5.5 percent of the world’s carbon emissions by 2025,” according to The Guardian.
Data centers in the U.S. consumed around 70 billion kilowatt-hours of electricity in 2014, equal to the annual consumption of 6.4 million American homes. As more devices connect to the Internet, from the watch on your wrist to the self-driving car on the road, the need for data centers will grow. Since most servers must be kept below 80°F to operate reliably, cooling accounts for almost 40 percent of a data center’s electricity consumption.
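The homes-equivalent figure can be sanity-checked with simple arithmetic, spreading the 70 billion kWh across the 6.4 million homes cited:

```python
total_kwh = 70e9   # U.S. data center consumption in 2014 (kWh)
homes = 6.4e6      # equivalent number of American homes

# Implied annual electricity use per home.
kwh_per_home = total_kwh / homes
print(f"{kwh_per_home:,.0f} kWh per home per year")  # about 10,938 kWh
```

That works out to roughly 11,000 kWh per household per year, in line with typical U.S. residential usage.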
There are about 3 million data centers in America, a large number of which are located in Loudoun County in northern Virginia. According to county officials, nearly 70 percent of the world’s Internet traffic runs through the area, which shouldn’t be surprising given that big tech companies like Google, Microsoft, Amazon, etc., have their data centers in the region.
Debra Chan from China Water Risk believes that learning to minimize electricity consumption of data centers will go a long way in cutting down carbon emissions in the future. “The ICT sector can definitely lead the world in aggressive decarbonization because they’re the sector that will add on the most power going forward… They have the capability [and] they have the scale,” she said to Fortune.
However, this would require that data centers shift away from non-renewable energy sources. Though Google and Amazon have promised a 100 percent shift to clean energy, only 4 percent and 12 percent of their respective data centers in Loudoun County are actually powered by renewables.