I recently came across a paper on the energy costs of video streaming, written in German by the Borderstep Institute. It repeats the myth of insanely high streaming energy costs, which seems to have started with a fatally flawed analysis by the Shift Project, since thoroughly refuted by the IEA.
The Borderstep paper takes the average data transfer volume and energy consumption of data centers and of the broadband network, derives an average efficiency of 30 kbit/Ws for broadband and 70 kbit/Ws for the data centers, and extrapolates a power consumption of roughly 1000 W for streaming a 25 Mbit/s 4K stream. This result is wrong, because it neglects the actual engineering:
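To make the flaw concrete, here is my reading of the paper's method as a short sketch: divide the stream's bit rate by each average efficiency and add the two terms. The exact arithmetic is my reconstruction, not taken verbatim from the paper.

```python
# Reproducing the Borderstep-style extrapolation (my reconstruction):
# power = bit rate / average efficiency, summed over broadband and data centers.
stream_mbit_s = 25.0            # 4K stream bit rate, Mbit/s
eff_broadband_kbit_ws = 30.0    # claimed average broadband efficiency, kbit/Ws
eff_datacenter_kbit_ws = 70.0   # claimed average data-center efficiency, kbit/Ws

stream_kbit_s = stream_mbit_s * 1000
watts = stream_kbit_s / eff_broadband_kbit_ws + stream_kbit_s / eff_datacenter_kbit_ws
print(f"{watts:.0f} W")  # ≈ 1190 W, the order of magnitude the paper reports
```

Averaging over the whole network like this bakes in all the idle and non-streaming load, which is exactly what the engineering arguments below avoid.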
- Streaming uses very little processing power on the server side
All popular videos are already stored fully encoded on the servers, in multiple bit rates, to improve responsiveness. Even a typical home NAS drawing 40 W can serve 40 parallel 4K streams over a 1 Gbit/s connection. In the data centers, we see units drawing 1000 W while serving 20 Gbit/s. Even with a generous allowance for overheads and reserve capacity, I'd be surprised if a 4K stream draws more than 5 W, for an efficiency of 5 Mbit/Ws. Given that a typical HDD delivers 50 to 100 Mbit/Ws, this feels reasonable.
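The server-side numbers above can be checked in a few lines. The 5 Mbit/Ws figure is the deliberately conservative allowance from the text, not a measurement:

```python
# Server-side efficiency, using the figures from the text:
nas_eff = 1_000 / 40       # home NAS: 1 Gbit/s at 40 W  -> 25 Mbit/Ws
dc_eff = 20_000 / 1_000    # data-center unit: 20 Gbit/s at 1000 W -> 20 Mbit/Ws

# Conservative allowance for overheads and reserve capacity:
assumed_eff = 5            # Mbit/Ws
server_watts = 25 / assumed_eff   # 25 Mbit/s 4K stream -> 5 W
print(nas_eff, dc_eff, server_watts)  # 25.0 20.0 5.0
```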
- Streaming is mostly via CDNs, and the backbone is now fiber
With a Content Delivery Network, the data only has to travel a few hops and at most a few hundred kilometers, and this can be done efficiently over fiber. DWDM-type links manage 160 km at 1 Gbit/s with the transceivers drawing 4 W per side, that is 125 Mbit/Ws. Assuming that the switch draws the same power, that we need at most 5 hops, and adding a 5x safety margin, we end up at 2.5 Mbit/Ws. For a 25 Mbit/s stream, this is 10 W for the backbone, likely even lower.
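The chain of derating steps above can be spelled out explicitly; each factor below corresponds to one assumption in the text:

```python
# Backbone efficiency estimate from the text's DWDM numbers (a sketch):
link_eff = 1_000 / (4 * 2)   # 1 Gbit/s, 4 W per transceiver side -> 125 Mbit/Ws
per_hop_eff = link_eff / 2   # assume the switch draws as much as the transceivers
hops = 5                     # at most 5 hops via the CDN
safety = 5                   # 5x safety margin

backbone_eff = per_hop_eff / hops / safety   # -> 2.5 Mbit/Ws
backbone_watts = 25 / backbone_eff           # 25 Mbit/s 4K stream -> 10 W
print(backbone_eff, backbone_watts)  # 2.5 10.0
```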
- The last mile for broadband has basically a fixed power consumption
While the backbone and the data centers aggregate demand and can therefore run at high load and efficiency, the last mile is essentially always on. Its energy consumption stays the same whether you use it sparingly for surfing or intensely for streaming: you draw 10 W with fiber, 20 W with DSL. Your WiFi router at home also draws a fixed 5 to 10 W.
A 4K stream with fiber to the home and an efficient router will draw up to 5 W (server) + 10 W (backbone) + 10 W (fiber last mile) + 5 W (router) = 30 W; a low-bit-rate stream over DSL with an old router comes to 0.5 W + 1 W + 20 W + 10 W = 31.5 W. This is similar to the 18 W the IEA assumes for data transmission, which does not take your home WiFi into account. It also shows that with current technology, 4K is roughly the tipping point where the backbone starts to consume more power than the last mile, and where we need to start monitoring our consumption. Still, this is roughly 300 GB/day.
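Putting the two scenarios side by side, with the daily data volume of a continuous 25 Mbit/s stream for reference (the text rounds the ~270 GB up to 300 GB):

```python
# End-to-end power for the two scenarios, with a fixed-power last mile:
fiber_4k = 5 + 10 + 10 + 5     # server + backbone + fiber last mile + router
dsl_low = 0.5 + 1 + 20 + 10    # server + backbone + DSL last mile + old router
print(fiber_4k, dsl_low)       # 30 31.5

# Daily data volume of a continuous 25 Mbit/s stream:
gb_per_day = 25 / 8 * 86_400 / 1_000   # Mbit/s -> MB/s -> MB/day -> GB/day
print(f"{gb_per_day:.0f} GB/day")      # 270 GB/day
```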
If you are worried about the climate impact of streaming, look at the TV instead. A 65″ HD display draws 200 W, over six times the power needed to move the data.