By Guy Hochman
For decades, psychologists have warned us about the availability heuristic – our tendency to judge how likely something is based on how easily examples come to mind. Dramatic events feel more common because they are memorable; familiar patterns feel more probable; and what we can quickly recall or imagine substitutes for statistical data.
The availability heuristic was formulated in the previous century, an era of informational scarcity, when information was expensive, filtered, or difficult to access. Under those conditions, availability served as a useful, albeit imperfect, proxy for likelihood: people tended to encounter frequent events more often than rare ones.
That world no longer exists.
We now live in an era of informational abundance. The problem is no longer insufficient information, but rather too much of it. Anyone with a keyboard can publish anything, and that content can reach millions within minutes. What becomes available is determined not by accuracy or truth, but by algorithms, social reinforcement, and repetition. As a result, the already-weak connection between availability and likelihood has collapsed entirely.
Yet heuristics do not disappear. They evolve.
From Availability to UnAvailability
In today’s information ecosystem, we are influenced not only by what we see, but also by what is missing – especially when we expect to see it.
When information is assumed to be unlimited, what should be visible but isn’t begins to feel suspicious. Silence is no longer neutral. It is interpreted as evidence of malfeasance.
At first glance, the logic seems sound. If we wake up in the morning and discover that our car is missing, it is reasonable to assume it has been stolen. But extending this logic to assume that what we do not see does not exist – merely because we expected to see it – is not really “logical”.
This is what I call ‘UnAvailability Bias’: the tendency to treat the absence of expected information as evidence that a phenomenon does not exist, while failing to consider alternative explanations rooted in institutional, legal, or cognitive constraints.
We still believe that what is more available is more plausible. But humans have added a new rule: what is unavailable is impossible. A reversed availability bias – on steroids.
A Case Study in UnAvailability
The conspiracy theories surrounding the arrest of Charlie Kirk’s killer offer an illustrative case study. Despite a named suspect, an ongoing legal process, court appearances, images, and documented judicial proceedings, online discourse has increasingly focused on what is missing: photographs of the suspect in custody or being escorted to court in restraints.
Because such visuals are expected in the digital age, their absence is interpreted by some not as the result of legal restrictions, defense motions, institutional caution, or even plain-old political interests – but as proof of conspiracy.
The logic is inverted yet psychologically compelling: If this were real, we would have seen it.
But modern legal systems are explicitly designed to suppress precisely the kinds of images that social media now treats as epistemic currency. In high-profile cases – especially those involving political violence or potential capital punishment – courts routinely restrict media access to protect due process, jury impartiality, and the integrity of the trial.
The absence of images, in this case, is not evidence of deception. It is evidence of restraint.
Yet under the UnAvailability Bias, restraint itself becomes suspicious.
When Absence Becomes Evidence Everywhere
This phenomenon is not limited to politics or social media.
In medicine, physicians routinely misdiagnose rare conditions, not because those conditions are unlikely, but because they are cognitively unavailable. What doctors have not encountered, studied recently, or seen in practice often fails to enter consideration at the time of diagnosis. Familiar illnesses crowd out unfamiliar alternatives.
In consulting, finance, intelligence analysis, and law, experts similarly default to patterns they recognize. Gatekeepers make decisions based on what aligns with their expectations rather than on what may actually be present.
What is missing from their mental model is treated as irrelevant rather than unknown.
This is a pseudo-Pygmalion effect. Our perception may bend to false beliefs; reality does not. Expectations shape what we notice and how we interpret it, not what is real. But when unnoticed realities are treated as nonexistent, decisions become systematically flawed.
No One Is Exempt
The same mechanism applies broadly.
Editors, for example, operate under severe constraints: limited attention, overwhelming volume, and strong priors about what a “good piece” looks like. As a result, even strong ideas that fail to fit familiar narratives often never register. Not because they are rejected. Because they are unseen.
Journalists, writers, and scholars experience this constantly. Well-argued, empirically grounded pieces often never see the light of day. Not because they are wrong, but because they lack narrative fit or perceived public interest. Over time, the absence of such ideas is misread as the absence of merit.
Once again, what is unavailable is treated as nonexistent.
The Real Risk
The danger of the UnAvailability Bias is not ignorance. It is a lack of epistemic humility – our unwillingness to admit what we do not know and to seek out evidence.
In a world of information overload, we have not become more skeptical. We have become more confident that what we know is right and what we do not see does not exist. Institutions designed to limit exposure, protect fairness, or filter noise now appear suspicious simply because they fail to provide visibility on demand.
Conveniently, this logic rarely applies when we evaluate information that aligns with our existing beliefs. There, evidence is optional.
The comfort this provides is illusory. It may feel reassuring, but it does not spare us from consequences.
When every truth associated with “the other side” is expected to arrive with footage, leaks, and virality, reality itself begins to resemble a conspiracy against the truth.