JOURNAL ARTICLES (peer reviewed)


I pursue a multi-disciplinary approach to the role of information and communication in society and have published in the fields of communication, public policy, economic development, psychology, women's studies, political science, telecommunications, and forecasting (see also my Google Scholar Website). The articles here are sorted according to:

  • ICT and Development
  • World's Info Capacity
  • Others (psychology, complexity, etc.)

ICT and Development

 Towards a Conceptual Framework for ICT for Development: Lessons Learned from the Latin American “Cube Framework”
December 2012, Martin Hilbert
Information Technologies & International Development, Volume 8, Issue 4; 243–259. Spanish version: Hacia un Marco Conceptual para las TIC para El Desarrollo: Lecciones Aprendidas del “Cubo” Latinoamericano; ITID, Vol. 8, issue 4; pp. 261–280.

ABSTRACT: The ICT for development community has long searched for comprehensive and adequate conceptual frameworks. In 2003, the United Nations Regional Commission for Latin America and the Caribbean (UN-ECLAC) proposed a three-dimensional conceptual framework that models the transition toward information societies as the interplay among technology, policy, and social change. It has its theoretical roots in Schumpeterian innovation theory. This so-called "cube framework" has been adopted on several occasions throughout the region at the local, national, and international levels. It has been employed in all stages of the policy cycle: to identify areas and priorities for research and hands-on policy making (planning), to coordinate actors and stakeholders (execution), and to monitor progress toward information societies (evaluation). This article presents the framework and its particularities, reviews some of the diverse applications it has found during recent years, provides concrete suggestions on how it could be used in the future, and discusses its strengths and limitations. The cube is not a dynamic model that can make predictions, but it turns out to be useful as a conceptual framework; it can be used to structure the often-confused discussion about what is involved in the ongoing social transformation.

 

Digital gender divide or technologically empowered women in developing countries? A typical case of lies, damned lies, and statistics
November 2011, Martin Hilbert
Women's Studies International Forum, Volume 34, issue 6; p.479-489, http://dx.doi.org/10.1016/j.wsif.2011.07.001
Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here. Also: related interview here

ABSTRACT: The discussion about women's access to and use of digital Information and Communication Technology (ICT) in developing countries has been inconclusive so far. Some claim that women are rather technophobic and that men are much better users of digital tools, while others argue that women enthusiastically embrace digital communication. This article puts this question to an empirical test. We analyze data sets from 12 Latin American and 13 African countries from 2005 to 2008. This is believed to be the most extensive empirical study in this field so far. The results are surprisingly consistent and revealing: the reason why fewer women access and use ICT is a direct result of their unfavorable conditions with respect to employment, education and income. When controlling for these variables, women turn out to be more active users of digital tools than men. This turns the alleged digital gender divide into an opportunity: given women's affinity for ICT, and given that digital technologies are tools that can improve living conditions, ICT represents a concrete and tangible opportunity to tackle longstanding challenges of gender inequalities in developing countries, including access to employment, income, education and health services.

 

The end justifies the definition: The manifold outlooks on the digital divide and their practical usefulness for policy-making
September 2011, Martin Hilbert
Telecommunications Policy, Volume 35, issue 8; p.715-736, http://dx.doi.org/10.1016/j.telpol.2011.06.012
Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here

ABSTRACT: Based on the theory of the diffusion of innovations through social networks, the article discusses the main approaches researchers have taken to conceptualize the digital divide. The result is a common framework that addresses the questions of who (e.g., divide between individuals, countries, etc.), with which kinds of characteristics (e.g., income, geography, age, etc.), connects how (mere access or effective adoption), to what (e.g., phones, Internet, digital TV, etc.). Different constellations of these four variables lead to a combinatorial array of choices to define the digital divide. This vast collection of theoretically justifiable definitions is contrasted with the question of how the digital divide is defined in practice by policy makers. The cases of the United States, South Korea, and Chile are used to show that many diverse actors with dissimilar goals are involved in confronting the digital divide. Each of them takes a different outlook on the challenge. This leads to the question of whether this heterogeneity is harmful and whether countries that have a coherent national strategy and common outlook on digital development do better than others. It is shown that the effect of a coherent vision is secondary to tailor-made sector-specific efforts; on the contrary, a one-size-fits-all outlook on a multifaceted challenge might rather be harmful. This leads to the conclusion that it is neither theoretically feasible nor empirically justifiable to aim for one single definition of the digital divide. The digital divide is best defined in terms of a desired impact. Since those impacts are diverse, so are the definitions of the challenge. The best that can be done is to come up with a comprehensive theoretical framework that allows for the systematic classification of different definitions, such as the one presented in this article.
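The four questions of the framework (who, with which characteristics, connects how, to what) span a combinatorial space of possible digital divide definitions. A minimal sketch of that combinatorial array, using a few illustrative category values rather than the article's exhaustive lists:

```python
from itertools import product

# Illustrative values for the framework's four questions; the real
# framework admits many more categories per dimension.
who = ["individuals", "households", "countries"]
characteristics = ["income", "geography", "age", "gender"]
how = ["access", "effective adoption"]
what = ["phones", "Internet", "digital TV"]

# Each combination is one theoretically justifiable definition of the divide.
definitions = list(product(who, characteristics, how, what))

print(len(definitions), "possible definitions, e.g.", definitions[0])
# → 72 possible definitions, e.g. ('individuals', 'income', 'access', 'phones')
```

Even with these toy lists, 72 distinct definitions emerge, which illustrates why the article argues against a single canonical definition.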

 

Information Societies or “ICT Equipment Societies?” Measuring the Digital Information-Processing Capacity of a Society in Bits and Bytes
May 2010, Martin Hilbert, Priscila Lopez, Cristian Vasquez
The Information Society: An International Journal, Volume 26, issue 3; informaworld, p.157-178.

ABSTRACT: The digital divide is conventionally measured in terms of information and communication technology (ICT) equipment diffusion, which comes down to counting the number of computers or phones, among other devices. This article fine-tunes these approximations by estimating the amount of digital information that is stored, communicated, and computed by these devices. The installed stock of ICT equipment in the consumer segment is multiplied with its corresponding technological performance, resulting in the “installed technological capacity” for storage (in bits), bandwidth (in bits per second), and computational power (in computations per second). This leads to new insights. Despite the rapidly decreasing digital equipment divide, there is an increasing gap in terms of information-processing capacity. It is shown that in 1996 the average inhabitant of the industrialized countries of the Organization for Economic Cooperation and Development (OECD) had a capacity of 49 kibps more than its counterpart from Latin America and the Caribbean. Ten years later, this gap widened to 577 kibps per inhabitant. This innovative approach toward the quantification of the digital divide leads to numerous new challenges for the research agenda.
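The core calculation the abstract describes, multiplying the installed stock of each device category by its informational performance, can be sketched as follows. All device counts, performance figures, and the population are invented placeholders, not the study's data:

```python
# Sketch of "installed technological capacity": installed stock of each
# ICT category multiplied by its per-device performance.
# Hypothetical figures for illustration only.
devices = {
    # category: (installed units, bandwidth per unit in kbps)
    "dial_up_modems": (1_000_000, 56),
    "broadband_lines": (200_000, 2_000),
    "mobile_phones": (3_000_000, 10),
}

def installed_capacity_kbps(stock: dict) -> int:
    """Total installed telecommunication capacity in kbps."""
    return sum(units * kbps for units, kbps in stock.values())

total = installed_capacity_kbps(devices)
per_capita = total / 10_000_000  # hypothetical population of 10 million

print(f"total: {total:,} kbps; per capita: {per_capita:.1f} kbps")
```

The same stock-times-performance logic applies to storage (bits) and computation (instructions per second); only the performance column changes.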

 

When is cheap, cheap enough to bridge the digital divide? Modeling Income Related Structural Challenges of Technology Diffusion in Latin America
May 2010, Martin Hilbert
World Development, Volume 38, issue 5; Elsevier Inc., p. 756-770, http://dx.doi.org/10.1016/j.worlddev.2009.11.019. Download the pre-print version (pdf) for FREE here.

ABSTRACT: The article presents a model that shows how income structures create diffusion patterns of Information and Communication Technologies (ICTs). The model allows the creation of scenarios for potential cuts in access prices and/or required subsidies for household spending in Mexico, Uruguay, Brazil, and Costa Rica. One analyzed scenario would require the reduction of ICT prices to as low as 4% of the current price levels (to US$ 0.75 per month), or alternatively, a subsidy as high as 6.2% of GDP (a figure comparable to public spending on education plus health). This is the income reality of the poor. Neither existing technological solutions nor existing financial mechanisms are sufficient to cope with this economic reality. The alternatives, such as a prolonged period of public access, are discussed.

 

Foresight tools for participative policy-making in inter-governmental processes in developing countries: Lessons learned from the eLAC Policy Priorities Delphi
September 2009, Martin Hilbert, Ian Miles, Julia Othmer
Technological Forecasting & Social Change, Volume 76, issue 2; Elsevier Inc., p. 880-896. Download the accepted author's manuscript (pdf) for FREE here.

ABSTRACT: The paper shows how international foresight exercises, through online and offline tools, can make policy-making in developing countries more participatory, fostering transparency and accountability of public decision-making. A five-round Delphi exercise (with 1454 contributions), based on the priorities of the 2005–2007 Latin American and Caribbean Action Plan for the Information Society (eLAC2007), was implemented. This exercise aimed at identifying future priorities that offered input into the inter-governmental negotiation of a 2008–2010 Action Plan (eLAC2010). It is believed to be the most extensive online participatory policy-making foresight exercise in the history of intergovernmental processes in the developing world to date. In addition to the specific policy guidance provided, the major lessons learned include (1) the potential of Policy Delphi methods to introduce transparency and accountability into public decision-making, especially in developing countries; (2) the utility of foresight exercises to foster multi-agency networking in the development community; (3) the usefulness of embedding foresight exercises into established mechanisms of representative democracy and international multilateralism, such as the United Nations; (4) the potential of online tools to facilitate participation in resource-scarce developing countries; and (5) the resource-efficiency stemming from the scale of international foresight exercises, and therefore its adequacy for resource-scarce regions. Two different types of practical implications have been observed. One is the governments' acknowledgement of the value of collective intelligence from civil society, academic and private sector participants of the Delphi and the ensuing appreciation of participative policy-making. 
The other is the demonstration of the role that can be played by the United Nations (and potentially by other inter-governmental agencies) in international participatory policy-making in the digital age, especially if they modernize the way they assist member countries in developing public policy agendas.


The Maturing Concept of e-democracy: From e-Voting and Online Consultations, to Democratic Value Out of Jumbled Online Chatter
September 2009, Martin Hilbert
Journal of Information Technology & Politics (JITP), Volume 6, issue 2; American Political Science Association, p. 87-110, http://www.jitp.net. Download an electronic version (pdf) of this article for FREE here.

ABSTRACT: Early literature on e-democracy was dominated by euphoric claims about the benefits of e-voting (digital direct democracy) or continuous online citizen consultations (digital representative democracy). High expectations have gradually been replaced with more genuine approaches that aim to break with the dichotomy of traditional notions of direct and representative democracy. The ensuing question relates to the adequate design of information and communication technology (ICT) applications to foster such visions. This article contributes to this search and discusses issues concerning the adequate institutional framework. Recently, so-called Web 2.0 applications, such as social networking and Wikipedia, have proven that it is possible for millions of users to collectively create meaningful content online. While these recent developments are not necessarily labeled e-democracy in the literature, this article argues that they and related applications have the potential to fulfill the promise of breaking with the longstanding democratic trade-off between group size (direct mass voting on predefined issues) and depth of argument (deliberation and discourse in a small group). Complementary information-structuring techniques are at hand to facilitate large-scale deliberations and the negotiation of interests between members of a group. This article presents three of these techniques in more depth: weighted preference voting, argument visualization, and the Semantic Web initiative. Notwithstanding these developments, the maturing concept of e-democracy still faces serious challenges. Questions remain in political and computer science disciplines that ask about adequate institutional frameworks, the omnipresent democratic challenges of equal access and free participation, and the appropriate technological design.

 

Estrategias Nacionales para el desarrollo digital
February 2008
Política digital: innovación gubernamental, nexos publications, Number 42, ISSN 1665-1669, pp 18-26

ABSTRACT: All countries in the region have formulated some type of public policy instrument to foster their digital development. Where such policy involves coordination at the national level, one can speak of national strategies. These take shape as processes that unfold in different stages and are subject to exogenous and endogenous factors, which define distinct models for implementing a national strategy. This article focuses on the design of the institutions that countries have created to coordinate efforts toward the creation and implementation of a national digital agenda.

 

Municipios digitales: la influencia de los proveedores
October 2005
Política digital: innovación gubernamental, nexos publications, Number 26, ISSN 1665-1669, pp 56-58; http://www.politicadigital.com.mx/IMG/pdf/PD-26-2.pdf

ABSTRACT: The introduction of electronic government (e-government) is one of the major driving forces behind the development of information societies in Latin America. This is most evident at the local level, where the relationship between government and citizen appears most intense. This article presents the models of online services offered by the so-called "digital municipalities" in Chile and Peru. For the study, 106 questionnaires were administered to officials of Chile's 341 municipalities, while in Peru 77 questionnaires were obtained from its 194 provincial municipalities and 1,634 district municipalities.

 

Comment on the Financing Aspect of the Information Society for Developing Countries
Mid-2004
Information Technologies and International Development (ITID), Spring/Summer 2004, Vol. 1, No. 3-4, pp 79-80; MIT Press Journals, The World Summit on the Information Society in Reflection

ABSTRACT: The issue of financing has been the most difficult subject of the WSIS negotiations. Only hours before the Summit opening session, delegations agreed on consensus language. Finally, the Declaration of Principles recognizes "the will expressed by some to create an international voluntary 'Digital Solidarity Fund'..." The urgent questions now are: Are existing mechanisms enough to ensure the creation of a universal Information Society? How much money will be needed to close the digital divide? Statistical data available for such calculations is scarce. However, in order to put the dimensions into perspective, we can make some rough estimates to provide a first insight into the magnitude of the financing challenge posed by the digital divide. The estimates in our studies show that while in high-income countries the average per capita ICT expenditure is around US$2,500 per year, half of the population in Latin America has less than US$100 per capita per year, or US$2 per week, to spend on the technology....

 

In the Midst of the Transition: Challenges and Chances for Latin American Economies
October 2000
Connect-World and Telecom, Latin America Fourth Quarter Issue 2000: Consolidating the Gains and Delivering on the Promise, IV 2000, http://connect-world.com/

 

Book Review: Eric Schmidt & Jared Cohen, The New Digital Age: Reshaping the Future of People, Nations and Business
August 2013
International Journal of Communication, Volume 7, 1571-1575.

 

 

>>> The author's versions of six of the articles from 2009-2011 are openly published by USC in the thesis of my second doctorate:

Mapping out the Transition toward Information Societies: Social Nature, Growth, and Policies

(please cite the respective journal articles directly, as indicated in this document)

 

 

WORLD INFO CAPACITY:

More can be found on this dedicated page: http://www.martinhilbert.net/WorldInfoCapacity.html


How much of the global information and communication explosion is driven by more, and how much by better technology?
April 2014, Martin Hilbert
Journal of the American Society for Information Science and Technology, Vol. 65; issue 4, pp. 856-861
Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here 

ABSTRACT: Technological change in the digital age is a combination of both more and better technology. This work quantifies how much of the technologically-mediated information and communication explosion during the period of digitization (1986–2007) was driven by the deployment of additional technological devices, and how much by technological progress in hardware and software. We find that technological progress has contributed between two and six times more than additional technological infrastructure. While infrastructure seems to reach a certain level of saturation at roughly 20 storage devices per capita and 2 to 3 telecommunication subscriptions per capita, informational capacities are still expanding greatly. Besides progress in better hardware, software for information compression turns out to be an important and often neglected driver of the global growth of technologically-mediated information and communication capacities.
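The "more versus better" decomposition works because capacity growth factors multiplicatively into device growth times per-device performance growth, so in logarithms the two contributions add. A sketch with invented growth factors (not the study's estimates):

```python
import math

# Hypothetical relative levels in 1986 and 2007.
devices_1986, devices_2007 = 1.0, 4.0    # device count grew 4x ("more")
perf_1986, perf_2007 = 1.0, 250.0        # per-device performance grew 250x ("better")

# Total capacity growth is the product of the two factors.
capacity_growth = (devices_2007 / devices_1986) * (perf_2007 / perf_1986)

# In log terms the contributions add, so each share is well defined.
share_more = math.log(devices_2007 / devices_1986) / math.log(capacity_growth)
share_better = math.log(perf_2007 / perf_1986) / math.log(capacity_growth)

print(f"capacity grew {capacity_growth:.0f}x; "
      f"more: {share_more:.0%}, better: {share_better:.0%}")
```

With these placeholder numbers, "better" dominates "more" by roughly four to one, which is the kind of comparison the article makes with real inventory data.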

 

Technological information inequality as an incessantly moving target: The redistribution of information and communication capacities between 1986 and 2010
April 2014, Martin Hilbert
Journal of the American Society for Information Science and Technology, Vol. 65; issue 4, pp. 821–835
Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here 

ABSTRACT: The article provides first-time empirical evidence that the digital age has first increased and then (only very recently) decreased global, international and national inequalities of information and communication capacities among and within societies. Previous studies on the digital divide were unable to capture the detected trends appropriately, since they worked with proxies, such as the number of subscriptions or related investments, without considering the vast heterogeneity in informational performance among technological devices. We created a comprehensive dataset (based on over 1,100 sources) that allows measuring the information capacity directly, in bits per second, bits, and instructions per second. The newly proposed indicators provide insights into inequalities in access to, usage of, and impact of digitized information flows. The analysis shows that the digital divide has gone into a second stage, which is based on a relative universalization of technological devices and a continuously evolving divide in terms of communication capacity.

 

What Is the Content of the World's Technologically Mediated Information and Communication Capacity: How Much Text, Image, Audio, and Video?
March 2014, Martin Hilbert
The Information Society: An International Journal, Volume 30, issue 2; p.127–143. Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here 

ABSTRACT: This article asks whether the global process of digitization has led to noteworthy changes in the shares of the amount of text, images, audio, and video in worldwide technologically stored and communicated information content. We empirically quantify the amount of information that is globally broadcast, telecommunicated, and stored (1986–2007) and assess the evolution of the respective content shares. Somewhat unexpectedly, it turns out that the transfer from analog to digital has not led toward increasing shares of media-rich audio and video content, despite vastly increased bandwidth. First, there is a certain inertia in the evolution of content, which seems to stick to stable proportions independent of its technological medium (be it analog vinyl and VHS tapes, or digital CDs and hard disks). Second, the relative share of text and still images actually captures a larger portion of the total amount than before the digital age. Text merely represented 0.3% of the (optimally compressed) bits that flowed through global information channels in 1986 but grew to almost 30% in 2007. On another level, we are seeing an increasing transition of text and images from one-way information diffusion networks (like newspapers) to digital storage and two-way telecommunications networks, where they are more socially embedded. Both tendencies are good news for big-data analysts who extract intelligence from easily analyzable text and image data.

 

How to measure "How Much Information"? Theoretical, methodological, and statistical challenges for the social sciences
April 2012, Martin Hilbert
International Journal of Communication, Volume 6, 1042-1055. Guest Editor introduction to iJoC Special Section How to Measure “How Much Information”?

ABSTRACT: The question of “how much information” there is in the world goes back at least to the time when Aristotle’s student Demetrius (367 BC–ca. 283 BC) was asked to organize the Library of Alexandria in order to quantify “how many thousand books are there” (Aristeas, ca. 200 BC). Pressed by the exploding number of information and communication technologies (ICTs) in recent decades, several research projects have taken up this question more systematically since the 1960s. In the eight articles of this Special Section, authors of some of the most extensive of those inventories discuss findings, research priorities, advantages, and limitations, as well as methodological and measurement differences in their approaches. As guest editor of this Special Section, I start by providing some of the main conclusions that I draw from this exercise. The goal of these conclusions is to offer the reader a quick overview about the current state of the art, as well as some of the recurrently mentioned challenges (a much more detailed and balanced description of the challenges will be found within the different articles). I also review the historical context of the most well-known and extensive of these inventories, which will provide the reader with the necessary background in the art and science of information quantification.

 

How to Measure the World’s Technological Capacity to Communicate, Store and Compute Information?
Part I: results and scope &
Part II: measurement unit and conclusions

April 2012, Martin Hilbert and Priscila López
International Journal of Communication, Volume 6, 936-955; 955-979

ABSTRACT: Part I of this two-part article reviews methodological and statistical challenges involved in the estimation of humanity’s technological capacity to communicate, store, and compute information. It is written from the perspective of the results of our recent inventory of 60 technological categories between 1986 and 2007 (measured in bits and MIPS [million-instructions-per-second]). In Part I, we summarize the results of our inventory, and explore a series of basic choices that must be made in the course of measuring information and communication capacities. The most basic underlying assumptions behind our estimates include—among others—decisions about what is counted as (1) communication, (2) storage, and (3) computation; if technological capacities or consumption of information is measured; and if unique information is distinguished from duplicate information. We compare our methodological choices with different approaches taken in similar studies. The article shows how the particular question on the researcher’s mind, as well as the availability of source data has and will influence most of the methodological choices in different exercises.

Part II focuses on the adequate unit of measurement for quantifying information. We propose an information theoretic measure that approximates the entropy of the source (which we call “optimally compressed bits”). We explain the interpretation, creation, usage, benefits, and limitations of this unit of measurement. A more coherent understanding of information volumes and magnitudes starts with a thorough understanding of the methodological choices involved in related inventories. We also discuss statistical lessons learned in our exercise (which is informed by more than 1,100 sources) in the roughly 300-page supporting online Appendix that is available at http://www.martinhilbert.net/WorldInfoCapacity.html.
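The "optimally compressed bits" unit rests on Shannon's source coding result: the entropy of the source is a lower bound on the bits needed per symbol. A toy sketch of that calculation (the message is an invented example, not the study's data):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy of the empirical symbol distribution, in bits."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = "aaaabbcd"  # toy source: 'a' x4, 'b' x2, 'c' x1, 'd' x1
h = entropy_bits_per_symbol(msg)

raw_bits = len(msg) * 2      # naive fixed-length code: 2 bits per symbol
optimal_bits = len(msg) * h  # entropy-based lower bound

print(f"H = {h:.2f} bits/symbol; raw {raw_bits} bits vs "
      f"~{optimal_bits:.0f} optimally compressed bits")
```

For this toy source H = 1.75 bits/symbol, so 16 raw bits compress to about 14; normalizing heterogeneous media to such entropy-based bits is what makes storage, communication, and broadcast volumes comparable.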

 

The World’s Technological Capacity to Store, Communicate, and Compute Information
April 2011, Martin Hilbert and Priscila López
Science, Volume 332, no. 6025; p. 60-65,
FREE ACCESS to the article through this site: http://www.martinhilbert.net/WorldInfoCapacity.html

ABSTRACT: We estimated the world’s technological capacity to store, communicate, and compute information, tracking 60 analog and digital technologies during the period from 1986 to 2007. In 2007, humankind was able to store 2.9 × 10^20 optimally compressed bytes, communicate almost 2 × 10^21 bytes, and carry out 6.4 × 10^18 instructions per second on general-purpose computers. General-purpose computing capacity grew at an annual rate of 58%. The world’s capacity for bidirectional telecommunication grew at 28% per year, closely followed by the increase in globally stored information (23%). Humankind’s capacity for unidirectional information diffusion through broadcasting channels has experienced comparatively modest annual growth (6%). Telecommunication has been dominated by digital technologies since 1990 (99.9% in digital format in 2007), and the majority of our technological memory has been in digital format since the early 2000s (94% digital in 2007).
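The annual growth rates in the abstract translate directly into doubling times via t = ln 2 / ln(1 + r), a quick way to compare the four capacities:

```python
import math

def doubling_time_years(annual_rate: float) -> float:
    """Years for a quantity growing at `annual_rate` per year to double."""
    return math.log(2) / math.log(1 + annual_rate)

# Annual growth rates as reported in the abstract.
rates = {"computation": 0.58, "telecom": 0.28, "storage": 0.23, "broadcast": 0.06}
doubling = {name: doubling_time_years(r) for name, r in rates.items()}

for name, t in doubling.items():
    print(f"{name}: doubles every {t:.1f} years")
```

So computing capacity doubled roughly every 1.5 years over the period, while broadcast capacity took about 12 years, which is the contrast the abstract draws between bidirectional and unidirectional channels.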

 

Information Societies or “ICT Equipment Societies?” Measuring the Digital Information-Processing Capacity of a Society in Bits and Bytes
May 2010, Martin Hilbert, Priscila Lopez, Cristian Vasquez
The Information Society: An International Journal, Volume 26, issue 3; informaworld, p.157-178.

ABSTRACT: The digital divide is conventionally measured in terms of information and communication technology (ICT) equipment diffusion, which comes down to counting the number of computers or phones, among other devices. This article fine-tunes these approximations by estimating the amount of digital information that is stored, communicated, and computed by these devices. The installed stock of ICT equipment in the consumer segment is multiplied with its corresponding technological performance, resulting in the “installed technological capacity” for storage (in bits), bandwidth (in bits per second), and computational power (in computations per second). This leads to new insights. Despite the rapidly decreasing digital equipment divide, there is an increasing gap in terms of information-processing capacity. It is shown that in 1996 the average inhabitant of the industrialized countries of the Organization for Economic Cooperation and Development (OECD) had a capacity of 49 kibps more than its counterpart from Latin America and the Caribbean. Ten years later, this gap widened to 577 kibps per inhabitant. This innovative approach toward the quantification of the digital divide leads to numerous new challenges for the research agenda.

 


OTHERS (psychology, complexity, etc.):

Scale-free power-laws as interaction between progress and diffusion
2013, Martin Hilbert
Complexity, Volume 19, issue 4, pp. 56–65;
Download an electronic version (pdf) of the Author Manuscript for FREE here

ABSTRACT: While scale-free power-laws are frequently found in social and technological systems, their authenticity, origin, and gained insights are often questioned, and rightfully so. The article presents a newly found rank-frequency power-law that aligns the top-500 supercomputers according to their performance. Pursuing a cautious approach in a systematic way, we check for authenticity, evaluate several potential generative mechanisms, and ask the “so what” question. We evaluate and finally reject the applicability of well-known potential generative mechanisms such as preferential attachment, self-organized criticality, optimization, and random observation. Instead, the microdata suggest that an inverse relationship between exponential technological progress and exponential technology diffusion through social networks results in the identified fat-tail distribution. This newly identified generative mechanism suggests that the supply and demand of technology (“technology push” and “demand pull”) align in exponential synchronicity, providing predictive insights into the evolution of highly uncertain technology markets.
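A rank-frequency power law means performance ≈ C · rank^(-alpha), i.e., a straight line in log-log space whose slope gives the exponent. A minimal sketch of estimating alpha by least squares on synthetic data (an exact power law, so the fit recovers the exponent; real supercomputer data would of course be noisier):

```python
import math

# Synthetic rank-performance data following an exact power law.
alpha_true = 0.8
ranks = list(range(1, 501))                       # ranks 1..500
perf = [1000.0 * r ** (-alpha_true) for r in ranks]

# Ordinary least squares on the log-log transformed data.
xs = [math.log(r) for r in ranks]
ys = [math.log(p) for p in perf]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

alpha_fit = -slope
print(f"fitted exponent: {alpha_fit:.3f}")
```

Log-log regression is the simplest estimator and fine for illustration; careful power-law analyses (as the cautious approach in the article implies) typically also use maximum-likelihood fits and goodness-of-fit tests.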

 

Book Review: "Sociology and Complexity Science", by B. Castellani & F. William Hafferty; Springer, 2009
December 2012, Martin Hilbert
Automatica, Vol. 48, issue 12, pp. 3187–3188; http://dx.doi.org/10.1016/j.automatica.2012.08.002

ABSTRACT: Sociology and complexity science have long evolved together. In the early 1900s, the economist Vilfredo Pareto popularized the scale-free power law, in the 1930s Jacob Moreno drew the first applied network graphs (the so-called sociograms), in the 1960s Stanley Milgram discovered the famous six degrees of separation, in the 1970s Mark Granovetter taught us about the strength of weak ties, and in the 1980s Robert Axelrod demonstrated the complexity of cooperation. These and many other contributions to the social sciences happened before complexity science was established as a field of inquiry by itself (the Santa Fe Institute, which is exclusively dedicated to the study of complex systems, was founded in 1984, for example). Castellani and Hafferty have taken up the long overdue task to examine how both fields, sociology and complexity, have evolved in parallel, and how they are related nowadays.

 

Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making
March 2012, Martin Hilbert
Psychological Bulletin, Volume 138, issue 2, Mar 2012, 211-237. Download an electronic version (pdf) of the Accepted Author Manuscript for FREE here. Also: related interview

ABSTRACT: A single coherent framework is proposed to synthesize long-standing research on 8 seemingly unrelated cognitive decision-making biases. During the past 6 decades, hundreds of empirical studies have resulted in a variety of rules of thumb that specify how humans systematically deviate from what is normatively expected from their decisions. Several complementary generative mechanisms have been proposed to explain those cognitive biases. Here it is suggested that (at least) 8 of these empirically detected decision-making biases can be produced by simply assuming noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions). An integrative framework is presented to show how similar noise-based mechanisms can lead to conservatism, the Bayesian likelihood bias, illusory correlations, biased self– other placement, subadditivity, exaggerated expectation, the confidence bias, and the hard–easy effect. Analytical tools from information theory are used to explore the nature and limitations that characterize such information processes for binary and multiary decision-making exercises. The ensuing synthesis offers formal mathematical definitions of the biases and their underlying generative mechanism, which permits a consolidated analysis of how they are related. This synthesis contributes to the larger goal of creating a coherent picture that explains the relations among the myriad of seemingly unrelated biases and their potential psychological generative mechanisms. Limitations and research questions are discussed.
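The noise-based mechanism for one of the listed biases, conservatism, can be sketched deterministically: if each stored binary observation is flipped with some probability before retrieval, the expected subjective estimate is pulled toward 0.5. The numbers are illustrative, not taken from the article:

```python
# Sketch of noisy memory-based information processing producing conservatism:
# each stored binary observation is flipped with probability eps, so the
# expected retrieved frequency is a mixture of signal and noise.
def subjective_estimate(p_true: float, eps: float) -> float:
    """Expected retrieved frequency after symmetric storage/retrieval noise."""
    return p_true * (1 - eps) + (1 - p_true) * eps

p_true, eps = 0.9, 0.1
p_subj = subjective_estimate(p_true, eps)

print(f"objective {p_true} -> subjective {p_subj}")  # 0.9 -> 0.82
assert abs(p_subj - 0.5) < abs(p_true - 0.5)  # regressed toward 0.5: conservatism
```

This is the binary-symmetric-channel intuition behind the framework: one noise parameter regresses extreme objective frequencies toward the midpoint, yielding systematically "conservative" subjective estimates.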