The finding that global data centers likely consumed around 205 terawatt-hours (TWh) in 2018, or 1 percent of global electricity use, stands in stark contrast to earlier extrapolation-based estimates that showed rapidly rising data center energy use over the past decade (Figure 2). Three primary efficiency effects explain this near-plateau in energy use. First, the energy efficiency of IT devices, particularly servers and storage drives, has improved substantially due to steady technological progress by IT manufacturers. Second, greater use of server virtualization software, which enables multiple applications to run on a single server, has significantly reduced the energy intensity of each hosted application. Third, most compute instances have migrated to large cloud- and hyperscale-class data centers, which use ultra-efficient cooling systems, among other important efficiency practices, to minimize energy use (Figure 2).
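The near-plateau can be understood as a multiplicative effect: total energy scales with the number of servers needed per hosted application (virtualization), the power drawn per server (device efficiency), and facility overhead such as cooling (often expressed as power usage effectiveness, PUE). The sketch below illustrates this reasoning with purely hypothetical numbers; the application counts, consolidation ratios, server power draws, and PUE values are illustrative assumptions, not figures from the study, and the point is only to show why demand growth and compounding efficiency gains can roughly cancel.

```python
# Illustrative bottom-up sketch (hypothetical numbers, not from the study):
# how the three efficiency effects combine multiplicatively to hold total
# energy use roughly flat even as the number of hosted applications grows.

def data_center_energy_twh(applications,      # hosted application instances
                           apps_per_server,   # consolidation via virtualization
                           watts_per_server,  # average IT power draw per server
                           pue):              # power usage effectiveness (total power / IT power)
    """Rough annual electricity use in TWh for a hypothetical installed base."""
    servers = applications / apps_per_server
    it_power_w = servers * watts_per_server
    total_power_w = it_power_w * pue          # facility overhead, mainly cooling
    hours_per_year = 8760
    return total_power_w * hours_per_year / 1e12  # watt-hours -> TWh

# Hypothetical "legacy" installed base: little virtualization, high cooling overhead.
legacy = data_center_energy_twh(applications=100e6, apps_per_server=2,
                                watts_per_server=250, pue=2.0)

# Hypothetical "modern" base: many more applications, but heavy virtualization,
# more efficient servers, and hyperscale-class cooling.
modern = data_center_energy_twh(applications=800e6, apps_per_server=10,
                                watts_per_server=220, pue=1.2)

print(f"legacy scenario: {legacy:.0f} TWh")   # ~219 TWh with these toy inputs
print(f"modern scenario: {modern:.0f} TWh")   # ~185 TWh with these toy inputs
# Hosted applications grow eightfold, yet estimated energy use stays roughly
# flat, because the efficiency gains compound across servers, virtualization,
# and facility cooling.
```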