The World’s Technological Capacity to Store, Communicate, and Compute Information

tracking the global capacity of 60 analog and digital technologies during the period from 1986 to 2007



We estimated the world’s technological capacity to store, communicate, and compute information, tracking 60 analog and digital technologies during the period from 1986 to 2007. In 2007, humankind was able to store 2.9 × 10^20 optimally compressed bytes, communicate almost 2 × 10^21 bytes, and carry out 6.4 × 10^18 instructions per second on general-purpose computers. General-purpose computing capacity grew at an annual rate of 58%. The world’s capacity for bidirectional telecommunication grew at 28% per year, closely followed by the increase in globally stored information (23%). Humankind’s capacity for unidirectional information diffusion through broadcasting channels has experienced comparatively modest annual growth (6%). Telecommunication has been dominated by digital technologies since 1990 (99.9% in digital format in 2007), and the majority of our technological memory has been in digital format since the early 2000s (94% digital in 2007).

Hilbert, M., & López, P. (2011). The World’s Technological Capacity to Store, Communicate, and Compute Information. Science, 332(6025), 60–65. doi:10.1126/science.1200970
Download the publication for free by clicking HERE
There is a SERIES OF RELATED STUDIES HERE, which analyzes complementary aspects, such as the international distribution, the content distribution, the technological drivers of the global information explosion, and historical and methodological questions.
Related PowerPoint Slides are HERE
Statistical Background (302 pages, citing all >1,100 sources): López, P., & Hilbert, M. (2012). Methodological and Statistical Background on The World’s Technological Capacity to Store, Communicate, and Compute Information


For a short video presentation, see: The Economist Ideas Economy opening speech.



Analogies & Orders of magnitude in perspective

How much is this? Prepare for some big numbers:

In 2007, humankind was able to store some 300 exabytes of optimally compressed information in its storage devices.

–        This is equivalent to 61 CD-ROMs per person. Piling up the imagined 404 billion CD-ROMs would create a stack from the earth to the moon, and a quarter of this distance beyond.

–        This is 80 times more information per person than in the historic Library of Alexandria (ca. 300 BC).

–        One could cover the entire area of the United States or China with thirteen layers of books.

–        Stored in newspapers, with each newspaper selling for US$ 1 (2007), buying this information would cost 17% more than the entire global Gross Domestic Product (GDP).

–        The average number of bits stored per person is roughly equal to the number of stars in our galaxy.

–        If we used one grain of sand to represent each bit, we would require 315 times the amount of sand on the world’s beaches.
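The storage analogies above can be checked with back-of-envelope arithmetic. In this sketch, the CD-ROM capacity, disc thickness, Earth–Moon distance, and beach-sand estimate are my own rough assumptions, not figures from the paper:

```python
# Rough sanity check of the storage analogies. All constants below except
# STORED_BYTES are assumed reference values, not taken from the paper.
STORED_BYTES = 2.9e20          # optimally compressed bytes stored in 2007
CD_BYTES = 730e6               # ~730 MB per CD-ROM (assumption)
CD_THICKNESS_MM = 1.2          # thickness of a bare disc (assumption)
EARTH_MOON_KM = 384_400        # mean Earth-Moon distance
BEACH_SAND_GRAINS = 7.5e18     # rough estimate of grains on all beaches

cds = STORED_BYTES / CD_BYTES                 # roughly 4e11 discs
stack_km = cds * CD_THICKNESS_MM / 1e6        # mm converted to km
sand_ratio = STORED_BYTES * 8 / BEACH_SAND_GRAINS

print(f"{cds:.2e} CDs, stack of {stack_km:,.0f} km "
      f"({stack_km / EARTH_MOON_KM:.2f}x the Earth-Moon distance)")
print(f"stored bits = {sand_ratio:.0f}x the grains of beach sand")
```

Under these assumptions the stack comes out at roughly 1.2 times the Earth–Moon distance and the sand ratio at roughly 300, consistent with the figures quoted above.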

In 2007, humankind received 1.9 zettabytes of optimally compressed information through its one-way broadcasting channels.

–        Every person receives the informational equivalent of 174 newspapers per day.

–        This is equivalent to all TV sets in the world running for almost 3 hours per day.


In 2007, humankind telecommunicated 65 exabytes of optimally compressed information through its two-way telecommunication channels.

–        Every person in the world telecommunicated the equivalent of 6 newspapers per day.

–        Sending one double-printed newspaper sheet per “pigeon post” bird would require an army of 4 times the entire global bird population (all kinds of birds) working for us every day.

–        Using word-based chat, one would need to chat nonstop for 2 months and 3 weeks to communicate the information that the average person telecommunicates through multimedia content in a single day.
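Both “newspapers per day” analogies (174 for broadcasting, 6 for telecommunication) rest on the same implied newspaper size, which we can recover. The world population figure is my assumption, not a number from the paper:

```python
# Derive the implied size of one "newspaper" from both analogies.
# Assumed input (not from the paper): world population of ~6.6e9 in 2007.
POP_2007 = 6.6e9
DAYS = 365

broadcast_bytes = 1.9e21   # received via broadcasting in 2007
telecom_bytes = 65e18      # telecommunicated in 2007

bc_per_person_day = broadcast_bytes / POP_2007 / DAYS
tc_per_person_day = telecom_bytes / POP_2007 / DAYS

# Implied newspaper size under each analogy (should roughly agree):
print(f"broadcast analogy: {bc_per_person_day / 174 / 1e6:.1f} MB/newspaper")
print(f"telecom analogy:   {tc_per_person_day / 6 / 1e6:.1f} MB/newspaper")
```

Both analogies imply a compressed newspaper of roughly 4–5 MB, so the two comparisons are internally consistent.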


In 2007, humankind was able to compute 6.4 × 10^18 instructions per second on its general-purpose computers.

This is more than 25,000 times more powerful than IBM’s supercomputer Watson, which competed on the game show Jeopardy! in 2011. Extrapolating our numbers to 2011, Watson represents some 0.001% of the world’s general-purpose computational capacity in 2011.

On a manual calculating machine, a master of the abacus would require 50 times the period since the big bang to carry out as many computations as our general-purpose computers can perform in one second. If you and I did it by hand, we would probably require 2,200 times the period since the big bang, and this without taking a single break…
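The arithmetic behind the computation comparisons can be sketched as follows. Watson’s capacity and the abacus speed are values *implied* by the ratios quoted above, not independent measurements, and the age of the universe is a standard rough figure:

```python
# Sketch of the extrapolations behind the Watson and abacus comparisons.
WORLD_2007 = 6.4e18             # instructions/s on general-purpose computers
GROWTH = 0.58                   # 58% annual growth rate
BIG_BANG_S = 13.75e9 * 3.156e7  # ~4.3e17 seconds since the big bang

world_2011 = WORLD_2007 * (1 + GROWTH) ** 4   # extrapolated 2011 capacity
watson = WORLD_2007 / 25_000                  # implied Watson capacity
abacus_rate = WORLD_2007 / (50 * BIG_BANG_S)  # implied abacus ops/s

print(f"world capacity 2011: {world_2011:.1e} instructions/s")
print(f"Watson's share in 2011: {watson / world_2011:.4%}")
print(f"implied abacus speed: one operation every {1 / abacus_rate:.1f} s")
```

Four years of 58% growth multiply capacity by about 6, and the implied abacus speed of one operation every few seconds is a plausible pace for sustained manual calculation.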



While these orders of magnitude are mind-boggling, they are still tiny compared to the orders of magnitude nature deals with:

–        If we wanted to give a distinct name to every star in the observable universe (these names would have to be 28 characters long), we would only be able to store the name of every thousandth star in our technological memories.

–        All the DNA molecules of one human adult can store more information than all of our technological storage devices combined.

–        There are 1,150 times more atoms in 12 grams of carbon-12 (one “mole”, i.e. Avogadro’s number of atoms) than bits that we telecommunicate in an entire year.

–        There are 330 million times more bacteria on earth than bits that the world broadcasts per year.

–        The maximal number of neural impulses of a group of 64 people is equal to the number of instructions that all our computational devices combined can execute (in 1999 it was equal to the nerve impulses executed by one human brain).
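Several of the comparisons with nature above can be verified numerically. In this sketch the star count is a rough standard figure (about 10^22 stars in the observable universe), and the bacteria estimate is derived from the quoted ratio rather than measured independently:

```python
# Checking the comparisons with nature. AVOGADRO is a physical constant;
# STARS_OBSERVABLE is a rough standard estimate, not from the paper.
AVOGADRO = 6.022e23
STARS_OBSERVABLE = 1e22          # rough star count, observable universe
STORED_BYTES = 2.9e20            # technological storage in 2007
TELECOM_BITS_YEAR = 65e18 * 8    # 65 EB/year expressed in bits

names_storable = STORED_BYTES / 28            # 28-byte names per star
print(f"one storable name per {STARS_OBSERVABLE / names_storable:.0f} stars")
print(f"mole-to-telecom ratio: {AVOGADRO / TELECOM_BITS_YEAR:.0f}")

bacteria = 330e6 * (1.9e21 * 8)  # implied number of bacteria on earth
print(f"implied bacteria on earth: {bacteria:.1e}")
```

The first ratio lands near 1,000 (every thousandth star), the second near 1,150, and the implied bacteria count near 5 × 10^30, which matches the commonly cited estimate of the global bacterial population.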


While natural and biological evolution is extremely powerful, it is also incredibly slow and its pace stays quite constant, whereas technological evolution grows exponentially. Comparing the growth rates of the world’s technological capacity to process information with the growth of the world’s economic power (GDP), we find that:

–  Broadcasting has grown at about the same speed as the world’s GDP;

–  Our information storage capacity has grown 4 times faster and telecommunication capacity has grown roughly 5 times faster than the world’s economic power;

–  General-purpose computation has expanded its capacity 9 times faster than the world economy.
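The multiples above follow from dividing each technology’s annual growth rate by GDP growth. The ~6% figure for nominal world GDP growth over 1986–2007 is my own rough assumption; the information growth rates are the paper’s:

```python
# Ratio of information growth rates to world GDP growth.
# GDP_GROWTH is an assumed ~6%/year nominal rate, not from the paper.
GDP_GROWTH = 0.06
rates = {"broadcasting": 0.06, "storage": 0.23,
         "telecommunication": 0.28, "computation": 0.58}

for name, rate in rates.items():
    print(f"{name}: {rate / GDP_GROWTH:.1f}x GDP growth")
```

With a ~6% GDP baseline, storage comes out near 4x, telecommunication near 5x, and computation near 9–10x, in line with the multiples above.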



Previous and related work:

ca. 200 B.C.: Library of Alexandria, Demetrius of Phaleron. How many books are there in the library?

1975: Japanese Ministry of Post and Telecommunications, Information Flow Census.

1983: Ithiel de Sola Pool, Tracking the Flow of Information, Science 221, 609-613.

1997: Michael Lesk, How Much Information Is There In the World?  

2000-2003: Peter Lyman, Hal Varian and others, How much Information?, 2000 and 2003, University of California, Berkeley. 

2003: David Bounie, The International Production and Dissemination of Information, École Nationale Supérieure des Télécommunications.

2007-2008: John Gantz and others, The Digital Universe, 2007 to 2014, IDC and EMC.

2009: Russell Neuman, Yong Jin Park and Elliot Panek, Tracking the Flow of Information Into the U.S. Home, University of Michigan.

2009: John Short, Roger Bohn, and others, How Much Information?, 2009 Report on American Consumers, 2010 Report on Enterprise Server Information, University of California, San Diego / USC Marshall School of Business: How Much Media, 2013.

2011: Intro to Special iJoC Section on "How to Measure ‘How Much Information’? Theoretical, methodological, and statistical challenges for the social sciences". International Journal of Communication 6, 1042–1055.