We come into this world as babies awash in a flood of sensory input, aware of sensations but not of ourselves. At first, we don’t even perceive something as simple as the boundary between our selves (our physical bodies) and the rest of the universe. There is no theory of mind, no object permanence, no ability to form durable memories. With time, we start to "know" things. We accumulate knowledge based on the immediate physical properties of objects (eg. "sharp things cut", "dropped objects fall"). As we grow, we become capable of communication, symbolic manipulation, and abstract thought. We come to know things that we may not be able to directly see with our senses or reliably verify (eg. "the world is round", "inflation is 8%"). Some of the things we think we know aren’t actually true (eg. "Santa is an immortal omnipotent being who delivers presents to basically every child in the world on Christmas Eve") and we later learn them to have been false. But as we progress through our education there is a sense of gradual illumination, of coming out into the light, acquiring an ever more accurate and extensive knowledge of the world we live in as we rid ourselves of successively more and more falsehoods and misconceptions.
I’m no philosopher, but like most people, I’ve had plenty of time since attaining adulthood to reflect on the nature and limits of knowledge. My formal education is mostly behind me now, but I continue to learn things, mostly by "self-teaching" or simply doing things that require me to develop my skills. The sum of human knowledge accessible to me is effectively boundless because it is accumulating at a rate which far outstrips my ability to assimilate it. So, paradoxically, even as the human race collectively creates an ever greater body of knowledge, the predominant sensation I experience as an individual is that of relative ignorance. I come to share in an ever smaller percentage of humanity’s achievements, even if I commit myself to lifelong learning. And that is to say nothing of the difficulty of distinguishing "good" information from "bad", or correctly identifying the bits of knowledge that are actually worth having.
That’s the baseline situation: once you get beyond the trivially verifiable results (eg. "sharp things cut"), you find yourself dealing with an unmanageably large repository of claims in the realms of science, technology, history, and countless other fields. The way we deal with this is to work in terms of abstractions, symbols, and generalizations. We delegate the work of sorting information according to varying degrees of "true" and "false", "useful" and "useless", "interesting" and "unremarkable" to other people. We simplify things (or more accurately, we rely on other people to simplify things), reduce them, distill them to their essence. This always involves leaving something out, but the apparent loss of fidelity seems to be a pretty good trade-off in practical terms. Rather than getting stuck trying to deal with irreconcilable complexity, incomplete data, and crippling uncertainty, we come up with an imperfect approximation and use that instead. Based on what we observe in society around us, a lot of this simplifying, delegating behavior seems to have a pretty clear evolutionary basis; at least, it sure looks like it comes naturally to us. And trying to look at things as objectively as possible, it’s hard to see any other way that we could do it, given the limitations imposed on us in the forms of cognitive resources, memory, and lifespan. An individual can figure some things out from first principles, but far from all things.
With all that in mind, there are a number of prompts that might lead you to think about how you know what you know, what you know, and indeed, what you even mean when you speak of knowing something at all.
In my field, there are a couple of tropes that circulate involving seniority, experience, and knowledge, but they both basically boil down to the same thing.
The first is the Dunning-Kruger effect, commonly described as a cognitive bias in which beginners tend to overestimate their ability, while more experienced folk know enough to "know what they don’t know" (and may in fact underestimate their ability).
The second is the notion that, the more senior an engineer is, the more likely they are to answer "it depends" in response to any question. One imagines an enthusiastic junior developer confidently stating that the solution to all our problems will be found in a microservice architecture built using a hyper-modern technology stack that just came out six months before, while a wizened, gray-haired one won’t even commit to answering a simple question like "do we have enough unit tests?".
Having written my first computer program about 38 years ago now, I can definitely relate to both of these tropes. From an epistemological perspective, I feel like the amount of "actually useful" stuff that I know only built up slowly over time, painstakingly accumulated with great and long effort. At least for me, gains have only compounded slowly; breaking my career up into five-year chunks, I probably got twice as good every chunk for the first few stages, but after that progress slowed to an at-best linear climb. I’ve forgotten at least as many things as I now remember, and I can only hope that there aren’t too many "actually useful" things among the lost bits. Much of the most valuable "knowledge" I now wield has less to do with my corpus of concrete "facts" that I can dredge up out of my memory (or find quickly with the aid of a search), and more to do with my general facilities for pattern recognition, and identifying when problems and solutions viewed in different contexts may actually be fruitfully compared and end up being close enough to the same thing such that the analysis can provide a way forward. But, if this is "knowledge", it is a very intangible and almost mystical form of it. I hesitate to call it "wisdom" because that implies a value judgment that it is invariably good, when I think it is actually closer to instinct or intuition: sometimes a remarkable and helpful shortcut, and at others woefully inadequate.
By 2016 when Trump got elected, it seemed dismayingly clear that society was divided into two barely-interacting bubbles that didn’t seem to occupy the same reality any more. Fed by contradictory news sources (whether they be cable TV channels, media conglomerates, independent journalists, social media influencers, or bots), the world seemed to be notably more polarized. The two main political camps no longer seemed to differ merely in how they thought things ought best be done, but rather, they no longer agreed upon even the simplest set of common facts that only years before had appeared to be undisputed.
I actually think the level of polarization here may be a bit overstated. 50 years ago, you also would get a different view of the world depending on which newspaper you bought. The difference is that today, the internet provides immediate access for anybody to see what "the other side" is saying. And there’s a meta-current going on that means that people can’t stop talking about what "the other side" is saying. As such, I don’t think we’re more divided now than we were, or at least, not much more divided. Rather, we’re more aware of how divided we are. Unsurprisingly, one of the reasons why we’re so keenly aware of how divided we are, is that people can’t stop talking about how divided we are… Evidently, panic sells. Algorithms that foster engagement only exacerbate the underlying tendencies.
Apparently, we’re supposed to be scared of how strongly the internet amplifies these effects. That may be true, but I’m not convinced. As much as the internet might sow division, it also provides people with an endless stream of distracting entertainment that keeps most of them totally absorbed, ever looking at their devices’ screens, to the exclusion of everything else in the outside world around them. Not all that inclined to go outside and start a violent uprising. And here’s another thing: where I live (Spain), there actually was a civil war not all that long ago where people were literally killing each other for political reasons without any help from the internet. And the country I lived in immediately prior to coming here (the USA), also had a civil war without the internet. Remarkably, in both cases, these civil wars happened even though both sides had the same color skin and believed in the same god.
To partly inoculate myself from the effects of these bubbles, I started building a habit a few years ago of drawing my news from multiple sources: authors spanning the political spectrum (ie. I read coverage of major events in a "left" newspaper and then a "right" one), working in multiple forms (newspapers, podcasts, blogs, tweets etc), operating at different scales (independents, corporations etc). The idea isn’t to take multiple samples and somehow average them, but rather, to maintain an awareness that knowledge is contested, and to not lose sight of the fact that everything is provisional. So, I might read a left-wing newspaper that reports on an event and interprets it as a warning of in-progress climate catastrophe, and a right-wing one reporting on the same event as nothing to worry about. The "truth" in all this is probably not to be found by constructing a mid-point between the two poles ("this event is evidence of climate change that will be 50% catastrophic"). I find the practice useful even if the work of settling upon "the actual truth" is left as an "exercise for the reader". Sometimes it is a reminder to go to primary sources (and if you do, to subject those to the same level of critical reading that you subject the newspaper to). Sometimes it is just a reminder that beliefs tend to cluster in ways that have more to do with ideology than evidence. And evidence remains the fundamental bedrock where all these discussions must eventually find a basis.
Climate change is a good example because it’s one of those things that seemed to be "just science" back in the 1990s, but since then has become a matter of dispute in a way that caught me off guard. I mean, I get why the religious right and the feminist left will argue about abortion. I get why liberals and conservatives will differ on how to tax (and who, and how much), what to spend money on, and how big government should be. But I didn’t expect to see a political divide spring up around climate change, given that its reality appears to be so overwhelmingly obvious. You know, I can imagine an alternate reality where the left says, "Humans are warming the climate and the consequences are going to be catastrophic unless we take - and perhaps even if we do take - drastic action immediately", and where the right responds, "Yeah, it’s true that climate change is concerning, but we’ve got to keep the economy running because if we don’t then that will be even worse; in any case, we believe that the free market will naturally converge on solutions to climate change before the problem gets really bad". We could then have a debate about how bad the problem really is, and what should be done about it.
Instead, what we have is the left behaving as predicted, and a highly visible segment of the right (I have no idea as to their actual numerical weight) going far beyond "climate change is bad, but…" all the way to "climate change related to human activity isn’t even a thing". Seeing that was a real double-take moment for me, as somebody who came of age soon after the fall of the Berlin Wall and who had unwittingly internalized a narrative of human progress as an unstoppable upwards march to ever greater heights of unity, justice, technology, and so on.
So climate change is curiously and particularly fraught when it comes to establishing truth. But even when we keep politics mostly out of it, there are lots of areas in science where it is remarkably hard to sort out wrong from right. Take nutrition for instance. For pretty much any claim you might wish to evaluate (eg. "eating meat is bad for you") you can find ample evidence both in favor and against. Real evidence, too, not just random Instagram influencer posts but supposedly vetted scientific studies in peer-reviewed academic journals. Politics, too, comes into the science of nutrition, but not so obviously along party lines. That is, large industries pump money into political, educational, and research systems to clear the way for their profit-making activities, but you tend not to find that much of a red-blue correlation when it comes to questions such as "what causes diabetes?"; I’m not saying there is no correlation, but rather that it’s not significant enough to matter, because these industries shower both sides of the political aisle with funds in order to ensure their interests are adequately protected.
Overall, surprisingly large fields within science are battlegrounds in which agreement is less broad than you might think. I expect there’s probably overwhelming consensus about, say, orbital mechanics at the scale of our solar system, but as you venture farther afield into the realms of genetics, biology, medicine, and so on, things get complicated real fast.
At the intersection of epistemology, politics, the role of the internet, science, and information bubbles, we find fascinating material for study in the form of the pandemic. What was known about the virus, about the efficacy of masking, new vaccines, and other countermeasures, has been hotly contested and rapidly changing, sometimes on the scale of weeks or even days, in ways that have played out differently from country to country, and fascinatingly mediated by political factors and immensely powerful pharmaceutical sectors, lobby groups, and organizations.
Despite the way the pandemic has pervaded, even dominated, numerous aspects of my life in very palpable ways (for example, via home confinement, being forced into remote work, home schooling, mask mandates, obligatory vaccines, and so on), it’s remarkable to me how indirect my experience of and knowledge about the actual virus has been through all this. The virus itself is invisible to the naked eye, so I’m basically taking it on trust that the thing actually exists, having bought into the germ theory of disease (something which I have never personally verified, but which is effectively an article of faith that I profess along with a host of other post-enlightenment scientific discoveries). Nobody in my family had COVID, at least not knowingly. I never went to a hospital nor saw any patients. While the news media and the government reported the death toll, nobody I personally knew passed away. To me COVID was a set of numbers published on a website, a collection of photos and videos, a stream of talking heads and tweets. Of course, I’m not actually questioning that any of this happened; I’m just pointing out that, like so many other things I claim to "know" in this life, all the evidence I have for it is indirect, a copy of a copy of a copy, passed along a chain of trust that spans individuals, institutions, businesses, politicians, and media organizations.
An interesting parallel can be drawn with the war in Ukraine. Just like the virus, I don’t have any direct sensory confirmation that the war is happening; almost my entire "experience" of it is mediated by evidence transmitted from journalists and government sources. But one might be forgiven for asking: might this war have anything in common with the perpetual war portrayed in Orwell’s 1984? That is, somehow exaggerated, manufactured, or perhaps even totally fabricated for some nefarious manipulative ends? Like lots of things you see online nowadays, some of the footage has been shown not to be of Ukraine at all (having been posted or re-posted by well-intentioned but mistaken social media users, or maliciously by mischief makers), although the overwhelming majority of it does appear to be accurate and repeatedly confirmed. Again, needless to say, I am not seriously questioning the war’s reality; I’m merely pointing out the limits of certainty we can have about anything that isn’t the result of direct first-hand experience. Just because we haven’t seen something ourselves doesn’t mean it isn’t true; we construct beliefs on this basis all the time. Sometimes we’re wrong about them, sometimes right.
It is my local interactions with refugees that have provided me with the closest thing to direct confirmation of the war obtainable without jumping on an aircraft and seeing it with my own eyes. Otherwise, even the most committed tin-foil-hat conspiracy theorist would find it hard to believe in the quality of the child actors that would have to have been recruited to play the role of refugees, unceasingly over many weeks, never falling out of character, feigning limited knowledge of Spanish and sporting thick eastern European accents in the schools and playgrounds of Madrid of late.
While I think that seeing conspiracy lurking in every shadow is clearly a maladjustment, it does seem that maintaining an open mind and a willingness to question the basis for our beliefs from time to time is something healthy. As an example, prior to the Snowden revelations I would have dismissed a lot of the speculation about NSA data collection as at best wildly overestimating the capacity and ambitions of the organization, and at worst downright infeasible crackpot conspiracy theorizing. What do you mean, I scoffed, that the NSA would compromise entire hardware supply chains to further their eavesdropping capability at a global scale? The NSA are just a bunch of crypto nerds! The documents, repeatedly authenticated since then, showed my initial position to be rather naive. They did all that and much more.
If we can’t be sure of what’s happening right now, what, then, can we say about things that happened decades, centuries, or millennia ago? If we look back far enough, we don’t have many options beyond digging things out of the ground, or looking at things above ground that are big enough and enduring enough to tell us something about the distant past. By these means, we know that humans built pyramids in Egypt thousands of years ago, and that others made flint arrowheads or clay pots in other parts of the world. That kind of knowledge is probably more interesting than "useful" as such.
But looking much more recently, even in the last century, it can be surprisingly difficult to get a handle on what happened. There is no one "history" waiting in the library for any reader whose interest is piqued to go and access. Rather, multiple histories compete to establish both the factual record and the interpretation of that record in so many dimensions (in terms of morality, rights, power, economics, and grand narratives of "progress", "struggle", "meaning" and so on). Consider how a history of Europe written in the Soviet Union might differ from one penned in England, in Italy, or across the pond in the USA. Our own internal biases might lead us to feel that there is one "real" history that we can pick out from among the rest just by looking at them. What might an alien visitor make of it all when offered a bunch of contradictory explanations of the same time period? I guess aliens have historiographers too.
In between the earliest civilizations just out of prehistory and our most recent past, there are some rather significant historical periods somewhere in the middle, such as the time when Jesus, or Muhammad, or the Buddha supposedly walked the Earth. We have ancient sacred texts that purport to tell us what happened concerning these figures, and a lot of blood has been spilled over the years (and continues to be spilled today) in the name of carrying out the divine purposes spelled out in those tomes. It fascinates me that a human might bend their life, or even sacrifice it, based on something they believe to be true from these books. You know, 2,000 years ago there was apparently this guy, and there was this one time he turned water into wine (etc… I’m leaving some details out here), and you should therefore literally structure your entire life around that narrative, despite the absence of any confirmatory evidence in the intervening millennia, and despite the much more plausible explanation that these texts were cooked up by pre-scientific societies desperately trying to construct a philosophical edifice that would imbue their brief existences with something approaching meaning. But I digress…
From religion, let’s jump straight to physics, an arena in which religious apologists don’t hesitate to point out that our modern beliefs often come to resemble articles of faith. And yes, I can see the point they’re making there. There are a lot of experiments that I actually can perform using high-school mathematics and materials available in my household (or neighborhood), but beyond those, I am indeed taking it on faith that our best and most advanced models have been worked out, mostly correctly, by a bunch of smart people who have dedicated their lives to studying this stuff, using mathematics that is presently far beyond my reach, and experimental equipment obtainable only through staggeringly large investments that only the most affluent governments can afford.
So, I "know", or at least believe, that we’re all made out of atoms, although I’ve never seen one with my naked eye, and I haven’t even seen much larger assemblies of atoms (atoms into molecules, and molecules into cellular building blocks, and then cells). My most fundamental, concrete knowledge of the world only begins with the larger structures that I actually can see, like a human hair, or skin and so on. I’m basically trusting that a lot of science that I haven’t personally verified has been carried out more or less correctly, in good faith, to give us a reasonable understanding of how a lot of things work in the universe, from the tiniest scales right up to the largest (cosmological) ones. So, I think the earth is round because scientists tell me so, and what I see when I look out a plane window when flying at high altitude seems more compatible with that than with a flat-earth explanation. I could try and confirm this further via experiment, but I’m already satisfied by the weight of the informal evidence. Likewise, I think the earth spins on its axis and goes around the sun because — wait for it — scientists tell me so, and when I look at the way objects move through the sky, the way the seasons turn, and so on, all of that seems to be quite neatly explained by that model of the solar system.
The funny part is, no technological innovation awaiting us in the future will ever allow us to pierce the veil that separates us from the smallest subatomic scales. We can see "pictures" of lattice-like crystalline structures in which you purportedly can "see" individual atoms, but these are really just tricks of visualization. The atoms surely are there, but they’re not composed of little round billiard balls of hard stuff constituting their nuclei, with zippy little electrons neatly orbiting around them in orderly shells. At that scale, stuff starts to get really weird. Our best quantum theories tell us that reality at that scale is fundamentally different; it’s fruitful to dwell on the particle-like aspects of matter and energy for some purposes, and for others it’s useful to focus on the wave-like aspects. But what is really going on down there? The answer, it seems, is that it is somehow both.
And as such, we’ll never have a "photo" of an electron. Not only won’t you ever see such a thing with your naked eye, but it’s meaningless to even think about what it would look like if you could actually "zoom in" on one. I see macroscopic objects because my sensory apparatus is sensitive to the aggregate effect of quadrillions upon quadrillions of photons bouncing off the objects (or being otherwise emitted from them) and impinging on my sight organs. But what would I hope to see by sending a bunch of photons to interact with an electron? It would be like trying to divine the structure of an egg by "touching" it with a vast host of bullets fired out of a machine gun.
Likewise for things like quarks, which figure in theories of immense predictive power, theories whose predictions have been experimentally confirmed over and over again, but which we can only "observe" indirectly by way of their side effects because they can’t exist in isolation. For these subatomic particles (again, thinking of them as "particles" is at best a kind of analogy that provides us with a label for a vibration in a quantum field) it doesn’t even make sense to produce an image by bouncing visible light off them. At this level, a layperson like me claiming "knowledge" of what is happening requires me to operate at such a low precision, with such an enormous loss of fidelity and accuracy compared to that wielded by physicists who actually deeply understand the theories, the maths behind them, the nature of the experimental mechanisms, and so forth, that it’s all really quite ridiculous. And even for them, their understanding often rests on such towering pillars of abstractions, deductions, and inferences that it’s difficult to really assign the label of "knowing" to the thing that they do with their subject matter.
Grant me the hypothetical in which I somehow found the time and had the ability to actually arrive at a much greater degree of certainty about these questions of fundamental physics. I study for years and years. I repeat experiments. I immerse myself in the material until it is as intuitive and natural to me as is the parabolic arc of a baseball sweeping through the air towards the glove of a seasoned player.
Even after all this, there would remain an inaccessible realm of unknowable but possibly true claims about existence and the universe, forever beyond all possibility of verification by me or anybody else inhabiting this universe. For example, do we live in a multiverse? Eliding some of the nuances here, let’s take the "many worlds" interpretation of quantum mechanics as a version of the multiverse idea. In this interpretation, measurement effects are explained by positing that the apparent collapse of the wave function in response to measurement events is actually just the universe splitting into multiple universes that are distinguished by different measurement outcomes. This sounds weird, right (because "measurements" happen with mind-boggling frequency, and before you know it, you have "10 to the power of some number with a ridiculous number of digits" universes)? The math, however, checks out: energy is conserved because each new universe is not a full "copy" of the original universe created out of "nothing"; rather, the average energy across all the "copies" ends up being the same as it was prior to the split.
This sounds a bit "counter-intuitive", to say the least, because by definition, any such splitting would have no discernible effect on us; the universe splits, and "we" continue on in our universe without any access to the universes cleaved away from us (the "we" in the other universes is distinct from and unconnected to us, other than having shared a common origin with us). There are no experimental predictions that the "many worlds" interpretation makes that we could use to affirm the existence of these other universes; about the only thing we can say is that the interpretation is at least falsifiable, because if the quantum mechanics it is founded on is ever falsified experimentally, then the interpretation will have to go with it.
Likewise, there’s that fun old question of whether all of this might just be a big simulation. Again, this is effectively unknowable, because all we can do is make intrinsic measurements inside the universe that we live in, looking to see whether they are consistent with our understanding of the laws that describe that universe. What we can’t do is step outside of the universe to observe it in operation, extrinsically, because the universe is, you know, "everything that exists" (and which we have access to).
So after all this, the conclusion looks pretty grim. We can’t trust what we’re told about reality as it exists today, nor any state in which it has existed in the past. The human race produces "knowledge" at a rate that exceeds our ability to absorb and assimilate it, and even if it stopped producing new knowledge right now, all but the purest mathematical constructs are embedded in social systems that make establishing their actual truth a difficult prospect, characterized as these systems are by political, financial, and ideological interests. Oh, and let’s not forget that even in a hypothetical system utterly sterilized of all such conflicts and concerns, there’s the baseline fact that humans aren’t perfect, our cognition is limited, and information gets lost or corrupted; even in a system populated only by the sincerest actors, working together with homogeneous intent and in good faith, obtaining certain knowledge would be a fraught process.
Our scientific theories are only ever provisional, a good chunk of the world is still in disagreement with other chunks of the world about which monotheistic religion is The One True Religion, and a lot of the stuff that seems like it should be a matter of fact is actually about stuff that we can’t perceive with our sense organs and which we wouldn’t personally verify even if we could.
Finally, we might actually be computer programs, or somehow even less impressively, we might be higher-order emergent phenomena growing out of an absurdly simple set of automata.
The weird part is, none of that actually bothers me. I feel the ground beneath my feet, the taste of coffee on my tongue, and the bass notes coming out of my headphones somehow please me in a way that makes me suspect some signalling hormone is being secreted somewhere in my body to say "this feels good". This is enough.
I work as a software engineer at a large technology company called GitHub. ↩︎
In the interests of balance, you can find examples of the opposite kind of thing happening in other areas. That is, instances where the right has stayed more or less where it was, and the left has run way off into the fringes. A recent example that comes to mind is the right making perfectly reasonable claims in defense of free speech, which is one of those rare areas where we actually had rather broad agreement across the political spectrum, and the left pivoting hard towards censorship and compelled speech. ↩︎
The standard rejoinder to the argument that science is, in fact, no less dependent on faith than religion is that people who actually understand the scientific method make no such demands for faith at all. Religion demands faith on the basis of divine authority: "this is so and true because I, the ultimate power in the universe beyond all knowing and human ken, have decreed it to be so". Science asks for no faith; it seeks only to develop an increasingly accurate explanation for observable phenomena based on mathematical formalisms that can make predictions. Even the most ardent defender of something like Quantum Field Theory would abandon it — and will abandon it — as soon as something comes along that can explain things more accurately (ie. more in line with experimental observations) or more completely. ↩︎
This analogy isn’t perfect, I know, because the bullets evidently obliterate most of the structure of the egg, and most of what makes an egg an egg. The point is, though, that firing photons at an electron doesn’t give you a "picture" of the electron any more than the bullets do of the egg. Based on what I remember from high-school physics, the electron might absorb energy from the photon (moving to a higher energy level), or it might scatter it, but bombarding the system with photons clearly has more to do with modifying the system than with capturing a meaningful visual representation of it. "Seeing" simply doesn’t make a lot of sense at this scale. ↩︎
Accounting for friction and other aerodynamic and local meteorological effects. ↩︎
Naturally, I haven’t verified the calculations implied by this claim. But I believe that people who are better at math than me have done so. ↩︎
Shout-out to my homie, S. Wolfram! ↩︎