Command-T v6.0 — the Lua rewrite

For a while now I’ve wanted to do a ground-up rewrite of Command-T in Lua. After the idea sat on the back-burner for many months, I finally got around to doing some work on it. While the rewrite isn’t done yet, it is so close to being an "MVP"[1] now that I can talk about the new version without worrying too much about the risk of it being vaporware. So, let’s start.

History

This isn’t the first time I’ve written about Command-T on this blog. Back in 2016 I wrote about how I’d been optimizing the project over many years. As that post recounts, ever since I created Command-T in 2010, its primary goal has been to be the fastest fuzzy finder out there. Over the years, I’ve found many wins, both small and large, which have had a compounding effect. If you make something 10% faster, then 10% more, then you find a way to make it 2x faster than that, and then you find a way to make it 10x faster than that, the end result winds up being "ludicrously" fast. At the time I wrote the optimization post, some of the major wins included:

  • Writing the performance-critical sections (the matching and scoring code) in C.
  • Improving perceived performance, somewhat counterintuitively, by spending extra cycles exhaustively computing possible match scores, so that the results the user is searching for are more likely to appear at the top.
  • Memoizing intermediate results, to make the aforementioned "exhaustive computation" actually feasible.
  • Parallelizing the search across multiple threads.
  • Debouncing user input to improve UI responsiveness by avoiding wasteful computation.
  • Improving scanning speed (ie. finding candidate items to be searched) by delegating it to fast native executables (like find or git).
  • Avoiding scanning costs by querying an always up-to-date index provided by Watchman.
  • Reducing cost of talking to Watchman by implementing support for its BSER (Binary Serialization Protocol) in C, rather than dealing with JSON.
  • Prescanning candidates to quickly eliminate non-matches; during this pre-scan, record the rightmost possible location for each character in the search term, which allows us to bail out early during the real matching process when a candidate can’t possibly match.
  • Recording bitmasks for both candidates and search terms so that we can quickly discard non-matches as users extend their search terms (see the sketch after this list).
  • Using smaller/faster data types (eg. float instead of double) where the additional precision isn’t beneficial.
  • Using small, size-limited heap data structures for each thread, keeping small partial result sets ordered as we go rather than needing a big and expensive sort over the entire result set at the end.
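
To make the bitmask item above concrete, here’s a hedged Lua sketch of the idea (the real implementation lives in C and differs in its details):

    local bit = require('bit') -- LuaJIT's bitwise operations module

    -- Map the letters a-z onto bits 0-25; other bytes are ignored for simplicity.
    local function letter_mask(s)
      local lowered = s:lower()
      local mask = 0
      for i = 1, #lowered do
        local b = string.byte(lowered, i)
        if b >= 97 and b <= 122 then -- 'a' through 'z'
          mask = bit.bor(mask, bit.lshift(1, b - 97))
        end
      end
      return mask
    end

    -- If the query needs a letter the candidate doesn't contain, no subsequence
    -- match is possible. Candidate masks are computed once; extending the query
    -- only adds bits, so previously eliminated candidates stay eliminated.
    local function can_match(candidate_mask, query_mask)
      return bit.band(candidate_mask, query_mask) == query_mask
    end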

After all that, I was running out of ideas, short of porting bits of the C code selectively into assembly (and even then, I was doubtful I could hand-craft assembly that would be better than what the compiler would produce). There was one PR proposing switching to a trie data structure, which would allow the search space to be pruned much more aggressively, but at the cost of having to set up the structure in the first place; in the end that one remained forever in limbo because it wasn’t clear whether it actually would be a win across the board.
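
To illustrate the proposal, the structure in question looks something like the following sketch (mine, not the PR’s actual code); because candidates share nodes for their common prefixes, an entire subtree can be pruned as soon as it’s clear that nothing below a given prefix can match, at the cost of building the structure up front:

    -- Build a trie where each node is a table keyed by single characters.
    local function trie_insert(root, candidate)
      local node = root
      for i = 1, #candidate do
        local c = candidate:sub(i, i)
        node[c] = node[c] or {}
        node = node[c]
      end
      node.terminal = true -- marks the end of a complete candidate
    end

    local root = {}
    trie_insert(root, 'src/main.c')
    trie_insert(root, 'src/main.h') -- shares every node up to 'src/main.'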

Why rewrite in Lua?

Neovim comes with Lua (or more precisely, LuaJIT), which is well known for being speedy. It’s an extremely minimal language that optimizes well. I previously saw huge wins from porting the Corpus plug-in from Vimscript to Lua (GIF demo). While I wasn’t planning on throwing away my C code and rewriting it in Lua, I could throw out a bunch of Ruby code — mostly responsible for managing the UI — and rewrite that. This, combined with the fact that Neovim now offers neat APIs for doing things like floating windows, means that a Lua-powered rewrite could be expected to have a much snappier UI.

The reason Command-T had Ruby code in it is that, in 2010, it was the easiest way to package C code in a form that could be accessed from Vim. You build a C extension that integrates with the Ruby VM (ie. you can call C functions to do things like create and manipulate arrays, access hashes, raise exceptions, call Ruby methods, and so on), and then you can call into the Ruby layer from Vimscript. There is overhead in moving from Vimscript through Ruby into C and back again, but because most of the heavy lifting is done in C-land — the actual work of trawling through thousands or even millions of string bytes and producing scores for them — it ends up being blazingly fast compared to a native Vimscript or pure Ruby implementation.

The other nice thing about Ruby is that it is a "real" programming language, unlike Vimscript, which is a bespoke and idiosyncratic beast that you can use in exactly one place[2]. If you need a working Ruby layer in Vim just to get the C code, you may as well leverage that Ruby layer once you have it. That gives you access to niceties like object-orientation, modules, and a relatively flexible and extensible programming model that allows you to write expressive, readable code.

The downside to all this is that Ruby installations are notoriously fragile inside Vim as soon as you start involving C code. You must compile the Command-T C extension with exactly the same version of Ruby as Vim itself uses. The slightest discrepancy will crash the program. In a world where people are on an eternal operating system upgrade train, are constantly updating their editor with tools like Homebrew, and are playing endlessly with Ruby versions via tools like RVM, rbenv, and chruby — not even a complete list, by the way — you wind up with an incredibly fragile and unstable platform upon which to build. Over the years I have received countless reports about "bugs" in Command-T that were actually failures to install it correctly. A glance through the closed issues on the Command-T issue tracker reveals dozens of reports of this kind; command-t#341 is a representative example. The basic formula is:

I can’t get Command-T to work (or it stopped working)…

(Various installation commands are run or re-run…)

I got it working in the end.

(Issue gets closed with no code changes being committed.)

This alone is probably the main reason why I have never heavily promoted Command-T. Over the years there have been other fuzzy finders that have more features, or are more popular, but none with performance that scales to working on repositories with millions of files, and none which provide such a robust and intuitive ranking of match results. Those are the features that I still care about the most to this day, and that’s why I keep on using Command-T. But I don’t want to actually promote it, nor do I want to keep adding on features to attract new users, because I know that the bigger the user base, the more support tickets related to version mismatches, and the more hair ripped out from frustrated scalps across the globe. So, I continue on, quietly using Neovim and Command-T to get my job done, and I don’t twiddle my editor versions or my Ruby version unless there’s good reason to.

At one point, I was considering a way out from this in the form of running the Ruby code outside of the Vim process itself. The idea was to run a commandtd daemon process, and communicate with it using Vim’s job APIs. This would totally decouple the version of Ruby used by Vim from the version used by the daemon, and "solve" the installation woes once and for all. Users would still need to run a make command to build the daemon, but they could forget about versions at least. In the end, I didn’t pursue this idea to its conclusion. I didn’t like the complexity of having to manage a separate process, and I worried about the overhead of sending data back and forth via IPC. Finally, I figured that if I could just access the C code from Lua instead of Ruby, then I might be able to side-step my Ruby headaches.

So, I thought, let’s make a clean break. I’ll drop the Ruby requirement, and move wholesale over to Lua and Neovim (I’ve been using Neovim myself full-time now for about 5 years, if the first traces of evidence in my dotfiles repo are to be believed). Forget about Vim support, forget about Windows, and just go all-in on modern APIs. The nature of Git branches means that anybody wanting to continue using Vim or Windows or Ruby can do so just by pointing their plug-in manager or their Git submodule at the right branch; in the meantime, I’m going to ride off into a brave new world.

A huge amount of the Ruby code in Command-T is about managing windows, splits, buffers, and settings. Back in 2010 nobody had dreamed of putting floating windows inside Vim, so if you wanted to present a "UI" to the user you had to fake it. Command-T did this, basically, by:

  • Recording the position of all windows and splits.
  • Remembering the values of global settings that need to be manipulated in order to get the "UI" to behave as desired.
  • Creating a new buffer and window for showing the match listing.
  • Setting up global overrides as needed, along with other local settings.
  • Setting up mappings to intercept key presses; the "prompt" was actually just text rendered in Vim’s command line.
  • Cleaning up the prompt area after a file is selected, removing the match listing, restoring the global settings, and reestablishing the former geometry of windows and splits.

The code worked remarkably well because it was the product of extreme attention to detail and relentless refinement over the years. But it was an enormous hack, and it was incredibly ugly and annoying to maintain. In comparison, throwing up a floating window with the new APIs is an absolute breeze. No need to think about window geometry, no need to set up mappings, no need to construct an elaborate fake prompt. The importance of having a real prompt can’t be overstated: with the old approach, Command-T couldn’t even support extremely natural things like the ability to paste a search query in a uniform and reliable way; with a real prompt, we get that "for free", along with all of the standard Vim motions and editing bindings.
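
To give a feel for how much lighter the new approach is, here’s a minimal sketch (illustrative, not Command-T’s actual code) of a real prompt buffer inside a floating window, using nothing but stock Neovim APIs:

    local buf = vim.api.nvim_create_buf(false, true) -- unlisted scratch buffer
    vim.api.nvim_buf_set_option(buf, 'buftype', 'prompt')
    vim.fn.prompt_setprompt(buf, '> ')
    vim.fn.prompt_setcallback(buf, function(query)
      print('query submitted: ' .. query) -- a real finder would act on this
    end)
    vim.api.nvim_open_win(buf, true, {
      relative = 'editor',
      width = 60,
      height = 1,
      row = 5,
      col = 10,
      style = 'minimal',
      border = 'single',
    })
    vim.cmd('startinsert') -- it's a real prompt: paste and edit bindings just work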

Other wins

One thing about a clean rewrite is it gives you a chance to reevaluate technical decisions. There are two examples that I’d like to highlight here.

The first is that I turned the C library from a piece of "Ruby-infested" C (that is, C code littered with calls to Ruby VM functions and using Ruby-defined data structures; example matcher.c) to a pure POSIX one (matcher.c). There is no mention of Lua in the C library, which means that any Ruby-VM-related overhead is gone now, replaced by nothing, and the library can be cleanly used from more places in the future, should people wish to do so. In the past, I extracted Command-T’s fast scoring algorithm into a Python package (still C, but adapted for the Python runtime instead of the Ruby one). Doing that was fiddly. With the new, pure POSIX library, grabbing the code and wrapping it up for any language would be a whole lot easier. Pleasingly, this new version is about 2x faster in benchmarks than the old one, which is pretty amazing considering how fast the old one was; maybe the Ruby-related overhead was more than I’d thought, or perhaps the LuaJIT FFI is unexpectedly awesome… And naturally, on revisiting code that had been iterated on for over a decade, and reworking it profoundly, I took advantage of the opportunity to improve readability, naming, structure, and a bunch of other things that you might classify under "spring cleaning". I also implemented some fast C-powered scanning functionality that had been proposed for the old version but never merged due to some doubts about performance. Overall, the C code is in much better shape.
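
As an aside, part of what makes the Lua side so pleasant is the LuaJIT FFI. Here’s a hedged sketch of what binding a pure-C function can look like; the function name, signature, and library name are invented for the example, and are not Command-T’s real API:

    local ffi = require('ffi')

    -- Declare the C signature we want to call (hypothetical, for illustration).
    ffi.cdef[[
      float example_score(const char *candidate, const char *query);
    ]]

    -- Load the compiled shared library (eg. libexample.so or libexample.dylib).
    local lib = ffi.load('example')

    -- Strings cross the boundary directly; no Ruby-VM-style marshaling required.
    print(lib.example_score('src/matcher.c', 'smc'))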

The other aspect that I noticed was the effect of moving from heavily object-oriented Ruby idioms to lightweight Lua ones. Lua mostly favors a functional style, but it does provide patterns for doing a form of object-oriented programming. Nevertheless, because OOP is not the default, I’ve found myself using it only when the use case for it is strong; that basically means places where you want to encapsulate some data and some methods for acting on it, but you don’t need complex inheritance relationships or "mixins" or any other such fanciness. The Ruby code is probably more legible — Ruby is famously readable, after all, if you don’t go crazy with your metaprogramming — but there is so much less Lua code than there was Ruby code that I think the overall result is more intelligible. The other thing is that when I wrote Command-T in 2010, I was coming from Apple’s Objective-C ecosystem, and Rails too, both of which had spins on the "MVC" (Model-View-Controller) pattern, and which influenced the architecture. In 2022, however, it’s the influence of React and its notion of "unidirectional data flow" that guides me whenever I have a question about where a particular piece of data should live, who should own it, and how updates to it should be propagated to other interested parties within the system. Overall, things seem clearer. My work-in-progress is still at a very "pre-alpha" stage, but I’m confident that the end result will be more robust than ever.
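
For the curious, the OOP pattern I’m referring to is the standard metatable-based one. A sketch (illustrative, not Command-T’s actual code):

    -- A table of methods doubles as the "class"; __index wires up method lookup.
    local Finder = {}
    Finder.__index = Finder

    function Finder.new(candidates)
      return setmetatable({candidates = candidates}, Finder)
    end

    function Finder:count()
      return #self.candidates
    end

    local finder = Finder.new({'a.lua', 'b.lua'})
    print(finder:count()) -- 2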

It’s sometimes tempting to look at a rewrite and marvel, prematurely, at how much better and lighter it is. Think of it as the "Bucket Full of Stones v1.0", long creaking under the weight of all the stones inside it. You start afresh with "Bucket Full of Stones v2.0" and are amazed at how light and manoeuvrable the whole thing feels without any stones in it. As you add back stone after stone, it still feels pretty light, but eventually, you discover that your bucket is as full as ever, and maybe it’s time to start thinking about "Bucket Full of Stones v3.0". Nevertheless, I still feel pretty good about the rewrite so far. It is much smaller in part because it only has a subset of the features, but the foundations really do look to be more solid this time around.

The upgrade path

This is where things get tricky. The Vim ecosystem encourages people to install plug-ins using plug-in managers that clone plug-in source from repositories. Users tend to track the main or master branch, so version numbers, SemVer, and the very concept of "releases" lose significance. You can maintain a changelog, but users might not even see it. In this scenario, how do you communicate breaking changes to users? Sadly, the most common answer seems to be, "You break their shit and let them figure it out for themselves". The other answer, and I think the right one, is that you simply don’t make breaking changes at all, ever, if you can help it. Put another way, as a maintainer, ya gotta do some hoop-jumping to avoid user pain[3]. Command-T is not the Linux kernel, but it stands to learn a lesson from it, about not breaking "user space".

My current plans for how to do this release with a minimum of pain are as follows:

  • The new version, version 6.0, will effectively include both the old Ruby and the new Lua implementations.
  • If the user opts in to continuing with the Ruby version, everything continues as before. It may be that I never remove the Ruby implementation from the source tree, as the cost of keeping it there isn’t really significant in any way.
  • If the user opts in to using the Lua version, they get that instead. For example, a command like :CommandT will map to the Lua implementation. A command that is not yet implemented in the Lua version, like :CommandTMRU, continues to map onto the Ruby implementation, for now. If you ever need to fall back and use the Ruby implementation, you can do that by spelling the command with a K instead of a C; that is, :CommandTBuffer will open the Lua-powered buffer finder, but :KommandTBuffer can be used to open the Ruby one (see the sketch after this list).
  • If the user doesn’t explicitly opt in one way or another, the system will use the Ruby implementation and show a message prompting the user to make a decision; technically this is the breaking change (a new message that will bother the user at startup until they take the step of configuring a preference) that requires the bump in version number to v6. As far as breaking changes go, this is about as innocuous as they come, but it is still one that I make reluctantly.
  • In version 7.0, this default will flip over in the opposite direction: if you haven’t specified an explicit preference, you’ll get the Lua version. By this time, however, I expect pretty much everybody actively using Command-T will already have set their preference. In 7.0 the aliased versions of the commands (eg. :KommandT) will go away.
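
In Lua terms, the command dispatch could wind up looking something like the following sketch (the setting name, module name, and function names here are illustrative, not necessarily what will ship):

    -- Route :CommandT to whichever backend the user opted into; when no
    -- preference has been expressed, stick with Ruby (the 6.x default).
    local preferred = vim.g.CommandTPreferredImplementation or 'ruby'

    vim.api.nvim_create_user_command('CommandT', function(info)
      if preferred == 'lua' then
        require('commandt').file_finder(info.args) -- hypothetical Lua entry point
      else
        vim.cmd('KommandT ' .. info.args) -- the Ruby implementation, via its alias
      end
    end, {nargs = '?'})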

A couple of things to note about this plan:

  1. All of the above applies on Neovim; if you’re running Vim you aren’t eligible to use the Lua backend, so you won’t see the deprecation prompt, and you’ll continue to use the Ruby version transparently.
  2. Maintaining two parallel implementations "forever" is only feasible because this is a hard fork. That is, there is no commitment to having an equal feature set in both implementations, having or fixing the same bugs, or even having the same settings. The Ruby backend, as a mature 12-year-old project, is mostly "done" at this point and I doubt I’ll do much more than fix critical bugs from here on. People who don’t want any part of this can point their checkout at the 5-x-release branch and pretend none of it is happening. As an open source project, people are free to contribute pull requests, make a fork, or do whatever they see fit within the terms of the license.

How will all of this work? We’ll see. Last night I published v5.0.5, which may be the last release on that branch for a long while. As I write this, main doesn’t have any of the new stuff yet (currently, 81dba1e274) — the new stuff is all still sitting out on the pu (proposed updates) branch (currently, 9a4cbf954c). My plan is to keep baking that for a little while longer — a timespan probably measured in hours or days, but probably not in weeks or months — and then pull the trigger and merge it into main, at which point we’ll call it the "6.0.0-a.0" release. As I said above, this feels real close to being an MVP now, so it hopefully won’t be long.


  1. I’m defining "MVP" (Minimal Viable Product) here as having the subset of features that I use heavily on a daily basis: a file finder, a buffer finder, and a "help" finder (for searching the built-in Neovim :help). ↩︎

  2. In Vim, that is. Two places, if you count Neovim. ↩︎

  3. To make this more precise: users come first, so you default to hoop-jumping if necessary to avoid user pain; the only reason you might relent and actually break something is if the cost of hoop-jumping becomes so prohibitively high that it dissuades you from working on the project at all. ↩︎

Loneliness in a modern world

Ok, not sure exactly where this one is going to go so I’m just going to start writing and see what comes out, stream of consciousness style. The other day, I tweeted about how modern life seems to be pushing us in the direction of loneliness. Twitter is not a very good forum for exploring any ideas in depth, really, much less complex ones with personal dimensions, so I didn’t get into the details, but the reason why I didn’t start off with a blog post in the first place is that this also doesn’t feel like the best place to discuss something intimate and emotional — it’s literally just "uploading into the void". I also wondered whether a Facebook post might be the right vehicle, but that didn’t feel right either: FB is too ephemeral, too filled with superficial "my wonderful life" posts from acquaintances I once knew to host a serious discussion.

So after a night sleeping on it, I decided to hash it out here after all. One of the reasons it’s hard to select a venue for this topic is that I’m not sure how much I should say, and about what. There is part of me that wants to do the rational/analytic thing, talk about societal trends, and try to analyze their impact. And then there’s another part of me that wants to just relate my personal experience. And I guess there’s a part that wants to do both. Even two paragraphs into this, I’m still not sure where I want to go with it.

Let’s start with the broad societal stuff then. I can’t be bothered digging up references to actual research, and it’s my blog, not a peer-reviewed academic journal, so I feel ok about just tossing out a series of impressions and leaving as an exercise for the reader the whole business of seeing how this stacks up against "the data". With that disclaimer out of the way, it seems incontrovertible that a number of technology-fueled innovations have set us up to feel more lonely today.

These include the rise of remote work, enabled by technology and accelerated by the pandemic. Then there’s the transformation of almost every aspect of life by the internet, which seemed pretty fun and novel at the beginning, but with smartphones in every pocket and smartwatches on many wrists, it has taken on a new role as an ever-present distraction. Combine that with the effects of social media, engineered to maximize engagement, and we find ourselves in a dystopia where even when we’re physically together, we’re never really fully present. I live in Madrid, a city that apparently has one of the highest counts of "bars per square mile" of any metropolis, yet here — like everywhere else, I imagine — it’s hard to go to a bar or restaurant without seeing full groups, all heads bowed, every gaze fixed on a glowing, personal screen, as they scroll endlessly through shiny baubles and exquisitely captured "portrayals" of perfect lives on Instagram and the like. All of this, instead of talking to one another. And the truth is we don’t even need internet-enabled pocket computers to make this happen: I can’t help but notice when I talk to my parents on FaceTime, 16,600 kilometers (10,000 miles) away, that they always have at least one eye and sometimes two on the television set playing endlessly in the background. With the younger generations, touchscreens have replaced the passive rectangle of the TV screen, but the state of divided attention is the same.

Whether it be the naturally emerging "organic" properties of these platforms, the actions of hostile intelligence agencies seeking to disrupt our societies through them, the role of powerful corporate entities and media conglomerates intricately interrelated with the centers of governmental power, or a decidedly not-new marketing machinery finding new avenues for exploitation, all of this adds up to a state of constant agitation, wanting, conflict, and distress. Society seems more divided than ever, but I suspect that’s an illusion; we’ve always inhabited different worlds: what is new is that the differences between us are continually being made visible in a way seldom seen outside of wartime.

At a personal level, these global circumstances intersect with my own due to a series of choices that I made: choices like moving to Spain, where I don’t have a network of friends, and taking up remote work with a company where people in my timezone constitute a minority. And obviously, I’m on the internet, which is a choice too. I’ve done what I can to minimize the deleterious effects of social media without disengaging from it entirely[1], but there’s no denying that social media probably harms my life at least as much as it helps it.

Growing up in Australia when I did, I internalized — apparently permanently — enough of the traditional masculine ethos to ensure that I rarely talk about emotions and feelings. Given all of the above, you could say that I feel quite lonely, but I was raised (not intentionally, of course) to not talk about feelings. Instead of dwelling on my loneliness, I distract myself so as not to feel it. For example, I read a lot. I walk a fair bit. I work on software projects in my spare time. I keep occupied. And all of that stolid activity stops what is objectively a "lonely" existence from turning into depression. I simply exist, or perhaps you could say I exist simply, and I generally try to lose myself — my "I" — from subjective experience, not because I am some kind of enlightened Buddha type, but because I’ve found that if I can enter a "flow" state wherein awareness of myself recedes and my full attention is occupied by some other object, some problem, some focus, then that’s the closest I can find to being "happy".

I put that in inverted commas because I’m mostly defining happiness in the negative sense: as an absence of suffering, as opposed to the presence of something positive, like euphoria. This latter category of states, in my experience, is at best enjoyed only fleetingly. It’s nice work when you can get it, but I’m not going to organize my entire life around the pursuit of it. My definition of happiness, the thing I actually seek, is probably closer to "contentment", or "satisfactoriness" (and here I am starting to sound a bit Buddhist, I think). Basically, a state in which suffering is minimized, or at least one doesn’t dwell on it, just as one doesn’t long for things that one doesn’t have.

But despite all that, reality does occasionally impinge on my carefully constructed local environment. I can’t be in a flow state all the time (quite obviously, I am probably only in it a very minor part of my time). I live with small children, and every interaction with them is an uncomfortable reminder of my own inadequacies as a parent, as someone who feels they should be the best possible parent, partner, worker, and in general, human being, but feels like they are at best doing a half-assed job of all of the above. When I go out into society, as I must, I am jolted with the occasional harsh reminder of just how unpleasant society can be. Whether it be being on the receiving end of a bout of homicidal road rage, or experiencing rudeness in a supermarket checkout line, or just seeing evidence of people in general being assholes and douchebags, I can’t escape the sensation that our "communities" are weaker than they seemed to me in the past[2]. How much of this is reality and how much mere perception, I don’t know, but I find myself increasingly despairing of how bad things are and wishing to retreat into my home to read a book until I’ve been distracted from thinking about how things are. One of the reasons I like books is that they generally have to pass through a great editing and publishing filter that ostensibly increases their quality, but also adds a bunch of latency between the moment of their authorship and my consumption of them. That delay may be months, years, or even decades or centuries. The hope is that, if something endures that long, then maybe it has some deep, residual value that makes it worthy of your attention in a way that a viral tweet or a flippant hot take may not be. Generally, I don’t find that things written in books have the same ability to raise my blood pressure that a current newspaper or a tweet does.

And maybe I am more sensitive to all of this of late because I’ve been participating in a clinical trial for a new drug. And no, I’m not talking about side effects of the drug[3]. I’m referring to the fact that the trial takes several hours out of my work week, hours which I feel obliged to make up for, and which in turn means that I have ended up canceling a video chat that I used to have every two weeks with one of my former colleagues. It wasn’t much, but that one hour of video calling every two weeks was the only regular contact I had with anybody that I could call a friend. This is the end of an arc of ever-increasing isolation that I have felt over a number of years now: in 2005 I entered my current relationship which increasingly demanded that I prioritize it absolutely above all things in order for it to prosper[4]; in 2011 I worked at a start-up which was all-consuming and where the boundaries between "work" and "life" were blurred, but it did mean that I had a lot of friends there; in 2013 I became a parent, which meant that I had less time for socializing with those work friends; in 2014 I switched to a new job at a much bigger company, one involving a long commute, which massively curtailed the amount of work-enabled socializing that I could do (or hope to effectively integrate with my family life); in 2018 I moved to Spain, where I effectively began with a completely blank slate as far as friendships were concerned (and which remains blank to this day); in 2020 my meagre work-based socialization was curtailed by the switch to remote work due to the pandemic; in 2021 I switched to a fully remote role at a company based in a far-off time zone. Without getting into the fraught territory of comparing subjective experiences, I do spend about 23 hours a day either sleeping or alone.

Last May, I started visiting a psychologist. My working life hadn’t been affected by the pandemic, but my social life had: I had some surplus cash to spend, and therapy sessions seemed like a good way to convert money into "self care" (pre-pandemic, self care might have consisted of going out to a restaurant, and we barely did that at all in 2020 and 2021). I’m thinking of stopping it though, as I don’t seem to be making any progress. I’m not even sure what I would define "progress" as in this context. It can be hard to change much in yourself once you’ve been on this planet for a few decades, so I wasn’t necessarily expecting anything dramatic, but I can’t shake off the feeling that I am going nowhere with this, even though I don’t exactly know where I would realistically wish to go. In a nutshell, it comes down to this: I think I avoid talking about anything truly painful or risky in these sessions, and they effectively turn into superficial "rent a friend" meetings in which I get to bore somebody for 50 minutes — talking about whatever the heck I feel like — in a way that I would not feel comfortable doing in a real social context. If that’s really all I am going to use the sessions for, then perhaps I’d be better off just writing blog posts instead.

The thing I don’t really want to talk about with my therapist is that, at a very low level that I am internally quite aware of, but which I rarely make any explicit acknowledgement of, I don’t really like myself very much. That’s why I’m so good at seeking out flow states and getting into them; because they’re the best tool I’ve ever found for making myself "disappear" from my own consideration. And I haven’t been able to bring myself to talk to my therapist about this because any such discussion would seem to have, as an implicit goal, a desire to reprogram my internal worldview or interrupt the related internal dialogs that go with it. But here’s the thing: I don’t want to replace my reality with another one, even if that reality is in some way "healthier" for me[5]. The thing about reality is that it seems, er, real, even if it is just your own reality (and the truth is, unless you’re completely unhinged, there’s got to be at least considerable overlap between how you perceive yourself and the world to be and how other, similarly "well-hinged" people see these things to be). I like reality. I like "truth". I like "facts". I know these are only ever contingent and provisional, but it’s very deeply ingrained in me to seek these things out. Maybe you can make the argument that "ignorance is bliss", and once you’ve reprogrammed yourself to see yourself more positively, you won’t be troubled by the pesky fact that you had to effectively brainwash yourself in order to talk yourself into believing that you’re a good person. But it just seems so much easier to do nothing at all, and continue being really good at distracting yourself with flow states, books, YouTube videos, and other fleeting nonsense that has literally zero significance at a cosmic scale, just like you, me, and everything else that has ever existed. If you can do all of this in a state of relative contentment, keeping overt suffering at bay, then that seems like quite a reasonable outcome. And in the meantime, I’m just going to try to be the least of an asshole I can, to minimize the downside on others that any of my own choices might have.


  1. On Twitter, for example, I follow almost nobody, and instead add people to private lists that I can dip into as mood and appetite permits. On Facebook, I mostly passively consume, preferring not to volunteer too much content of my own, nor engage on anything that might be controversial or conflictive. ↩︎

  2. And you could be forgiven for remembering here, and wondering if it might apply to me, the old adage: "if everybody around you is an asshole, then guess what: you are the asshole". ↩︎

  3. I’ve got a 50% chance that I’m in the placebo/control group anyway, but regardless of the group I’m in, I’ve had all manner of blood draws and other samples and measurements taken, none of which seem to be showing any significant difference at this stage. ↩︎

  4. For complicated reasons which it doesn’t feel appropriate to go into on a public place like this blog post. The only reason I mentioned even this much is that I don’t think anybody is going to read it, or if they do, they won’t do anything with the information. ↩︎

  5. Obviously, if you don’t like yourself much, then it’s hard to make a priority of looking after yourself. ↩︎

Bitcoin

One of the reasons I write blog posts is that I find it interesting to take snapshots of my thinking as a way of documenting how my viewpoints evolve over time. Sometimes they stay relatively stable, and at other times they shift quite dramatically. For example, in July 2020 I wrote my thoughts on coronavirus, at a time when vaccines still hadn’t been rolled out, and many countries (including my own) were in some form of hard lockdown. Questions of waves, variants, masks, and so on were going through their first oscillations with public, scientific, and official sentiment wavering back and forth. The debates were just starting to take on political hues. Arguments about censorship, misinformation, disinformation, and so on, were only just beginning to warm up.

When I look back at that post now I find it all to be pretty reasonable, given what we knew at the time (generally, not as much as we would have liked). The tone is one of prudence and caution, of provisionality born out of ignorance. If I were to summarize my feelings about COVID and the pandemic now, I’d tell you how I’m quite a bit more relaxed about the virus these days, and how my most significant concerns are actually about what we have learned (or failed to learn) about how to respond to extreme incidents. As a society, I think we’ve come out of this traumatic collective experience with our sensibilities, and our institutions, somewhat deranged. I think we now face the twin risks of having primed ourselves to overreact when something happens in the future, while paradoxically also having attained such levels of fatigue that we may not react enough the next time we face a threat.

But this isn’t a post about COVID. It’s about Bitcoin and the topic of cryptocurrency[1] in general. This feels like another one of those highly contentious topics where interested parties, whether they be "crypto maximalists" or "crypto sceptics", are debating an incredibly complex subject with an essentially unknowable future — only time will tell. As I have only engaged with this whole domain in the most superficial manner over the years, I haven’t felt like I had anything noteworthy to write down until recently, and even now, I’ve only just dipped my toes in the water. But of late, Elon Musk has been in the news a lot, and where Elon treads, discussion of crypto usually follows (and precedes) him. Likewise, Lex Fridman has had a couple of prominent Bitcoin intellectuals on his show for multi-hour discussions (examples: Michael Saylor, Saifedean Ammous) that were pretty interesting, and so I’ve been thinking about crypto in a deeper way than I had until now.

Bitcoin first came onto my radar around 2011. Back then, a single Bitcoin cost about $1. I actually tried to buy a couple, just because I found the idea of a recently invented, purely digital, cryptographically-based currency to be nerdily amusing. Tried, I say, because I couldn’t muster the will to actually overcome the various sources of friction and inconvenience that purchasing Bitcoin actually entailed back then; I can’t even remember what the concrete obstacle was that stopped me from pursuing it all the way to the end. I wasn’t going to buy these coins to spend, or as an investment, but simply to have them, because I thought it was cool.

Since then, a lot has changed. Bitcoin’s price shot through the roof, increasing by thousands (or tens of thousands) of percent from its early days, depending on where you start measuring from. It became an object of speculation. The idea was ripped off, copied, modified by — again, literally — thousands of hucksters, would-be innovators, and crypto nerds. Many a fortune was made or lost; both individual and institutional investors made profits or got fleeced as coins and financial instruments flashed in and out of existence. Heavy hitters bought huge amounts of digital currency — the most prominent example I’m aware of being Tesla, which currently holds over $1B in Bitcoin — and governments and regulators started to show more interest in monitoring and taxing cryptocurrencies and securities. In a way that is mostly invisible to the average citizen, crypto mining started consuming an "interesting" proportion of energy production, and in a way that was very definitely visible to anybody trying to build a computer during the last few years, it dramatically impacted the availability of GPUs.

As I watched all this play out, my evolving stance was basically that of a crypto sceptic. As an investor, I’m a "buy-and-hold" type: I believe in holding productive assets (like shares in valuable companies), and I’ve never had any interest in speculation. I don’t like complicated financial vehicles; anything more complex than an index fund isn’t really my cup of tea. I couldn’t care less about puts and calls, synthetic instruments, leverage, and so on. To somebody who’s not that interested in the world of cryptocurrencies, Bitcoin itself seemed unpalatably volatile and not that useful for any of the Big Three reasons that are generally cited as "what money is" (that is, a medium of exchange, a store of value, and a unit of accounting). All of the other coins just seemed like absurd Ponzi schemes, mostly developed by people with far too recently acquired knowledge of programming, cryptography, and finance. I was utterly unimpressed by all the companies small and large scrambling to find a way to cram "blockchain" into their product offerings; never did the phrase "a solution looking for a problem" apply so readily. And obviously, the fact that human activity is driving the planet into a climate crisis sure makes the energy consumption of crypto activity look rather dubious. To my eyes, it seems at best frivolous, and at worst downright deleterious.

Anyway, I’ve done a bit of reading of late trying to understand if there are any intelligent arguments out there for why Bitcoin actually is a good idea for humanity, as opposed to just being an elegant technological construct. There sure as heck are intelligent arguments for it not being a good idea. And when you start talking about stuff like NFTs, which to me look patently absurd, you don’t have to look all that far to find some absolute demolitions penned by smarter and more informed people than I. Nevertheless, I’ve been wrong about lots of things in my life — and I’m adding to the list of things I’ve gotten wrong all the time — so it makes sense that I should at least try to find some of the best arguments in favor of crypto, so that I can develop a more nuanced point of view.

I’m not going to list all the things I’ve picked up here in this blog post. If you want to hear some articulate and well-informed advocacy for Bitcoin just start with the two podcast episodes I linked to earlier and go from there. But I do want to describe a couple of interesting points that they made which I don’t think I would have arrived at on my own. These points aren’t necessarily facts as such because this whole terrain is contested territory: economists, intellectuals, and other "experts" don’t even necessarily agree on how economies work and how they are best run, even in the present moment. And the area of dispute here is even harder than that — it’s not just about what is but what will be. The future is clouded with uncertainty, and while we’re talking about "crypto" here, all of this is embedded in complex social and economic systems whose future is yet to be seen. We make predictions, we can even try to glimpse the future by modeling systems in more or less rigorous ways, but this business of predicting the future is fundamentally speculative by nature.

So, with that proviso out of the way, these are the things that I’ve found interesting in my recent explorations.

The first is about Bitcoin as money and not as an investment. Saylor would phrase this as the distinction between "property" (a thing you buy, like a house) and a "security" (like a stock). Ammous would make an argument about the Big Three: it’s not yet true that Bitcoin indisputably checks all the boxes, but he would say that, of all the cryptocurrencies that have existed so far, it is the one that will. In fact, he seems so certain that it will that, if your time horizon is long enough (ie. you’re planning to hold the coins for long enough, on the scale of years), you may as well consider it to be true. For both of them, the fact that the quantity of Bitcoins in existence is only growing slowly and, ultimately, is capped, is key here, because it makes Bitcoin extremely resilient against inflationary forces. In the short term, the market is still small and volatility is high, but the thinking is that in the long term, Bitcoins will be much better at holding their value than any fiat-based currency (or even a gold-backed one or similar) can ever be, because the supply is categorically constrained.

An additional point they make, and which I am inclined to agree with, is that describing inflation in terms of a single "scalar" value that is regularly redefined to suit the whims of the agencies responsible for measuring it, is woefully inadequate, and that the real inflation that we all experience is quite a bit higher than what this number tells us. As an investor in the stock market, I’ve long felt like I’m barely treading water with respect to inflation, despite the reportedly low figures quoted by official entities; there is indeed the sense that one’s money is evaporating before one’s eyes, and it’s only by exposing it to significant risk that one can have any hope of actually storing that value for later consumption. I haven’t held any bonds for a while, and in the current panorama I don’t see anything which would make it compelling to do so. People most often hold bonds in their portfolios as a way of dampening volatility. Right now, crypto swings up and down so violently that one couldn’t really expect it to serve a dampening role like that in a portfolio; it behaves much more like a risky tech stock than a reserve currency. If folks like Ammous are right, though, that may not remain the case: Bitcoin may one day make sense as the "cash" component in a portfolio. That would be cool because existing forms of cash kind of, well, suck when it comes to being a stable store of value.

The second argument that I found to be rather surprising is the one about proof of work (ie. the thing which drives the energy usage in the Bitcoin algorithm) as actually being a good thing. When asked what he thought about Bitcoin using more energy than some countries, Ammous said, "It’s worth it". That one floored me. From my point of view, all those kilowatt-hours had struck me as egregiously profligate. For Ammous, Bitcoin mining is intrinsically valuable, because it is what makes Bitcoin work (ie. it is what makes it a truly distributed system without any privileged actors able to impose arbitrary modifications or do nefarious things like change the rules of the game or abscond with the funds of the beguiled), and Bitcoin working is a good thing in the world because it is the most perfect form of money (again, referencing the Big Three) yet invented, standing to free us all from the maladies of inflation and crude Keynesian economics. Since the advent of electricity we’ve been using it for all sorts of things that aren’t strictly necessary for survival, but which make our lives better. Ammous claims that Bitcoin is just one more life-bettering technology whose benefits justify its costs; you just have to know a bit about economics and monetary theory to connect the dots and see why.

I wasn’t expecting that. I was expecting something along the lines of proof of stake being the solution; like, just as John the Baptist paved the way for the coming of Jesus H. Christ, Bitcoin with its proof of work is a forerunner of some improved cryptocurrency with proof of stake at its heart instead. Maybe a layer on top of Bitcoin, or maybe some kind of successor. But no, that’s not the argument at all. In fact, Ammous thinks proof of stake is totally bogus. It might solve the energy problem, but it does so at the cost of jettisoning the single most important property of Bitcoin and proof of work; namely, that nobody controls Bitcoin — there is no way for any single entity to fuck things up because it really is distributed (ie. no central authority) and there’s no way of changing the rules. You can hard-fork, but that’s not changing the rules, it’s just changing a copy of them, and network effects make Bitcoin’s preeminent status effectively unassailable. As thousands of failed coins demonstrate, none of these copies have gotten critical mass behind them.

The other defense he proffered was that most energy powering Bitcoin mining is cheap (ie. plentiful, abundant) energy, precisely because competitive forces will naturally drive miners towards the cheapest energy. As such, a lot of it ends up being green. Now, I’m aware that this is a contested point, kind of like "vaccines are worth the downside risks", but it is a reasonable point. I couldn’t help but think of the Neal Stephenson novel, Fall; or, Dodge in Hell, where, spoiler alert, humans end up building massive solar panel arrays in orbit (with the eventual trajectory leading humankind in the direction of building a Dyson sphere, completely enclosing the sun and capturing most of its energy output). In the novel, this energy ends up being used to power a computationally expensive simulated reality in which the scanned-in souls of the dead end up running in the cloud, enjoying a kind of "life" after physical death. Now, that irreal existence seems rather pointless on some level — not unlike endlessly computing block hashes to keep a virtual currency running — but the dividing line between biological and computational processes is rather blurry in this book. Just say you figure it’s worth burning all that energy on eternal digital life, then maybe it’s not unreasonable to also consider it worth burning a huge quantity to enable "eternal" digital currency. Fiat currencies are already pretty divorced from actual circulating specie, as most of the monetary supply is created out of nothing via credit; it’s not clear that that is any more "real" than Bitcoin is. At least Bitcoin is the product of mathematics, a well-defined algorithm, and an element of randomness. Fiat currencies, on the other hand, are driven by utterly arbitrary and unpredictable dynamic systems composed of incomprehensibly complicated configurations of individual agents, institutions, governments, and markets.

In conclusion, I find these arguments to be quite reasonable, but not compelling, because they’re contingent on the future unfolding in a particular way in which more and more participants in the system converge on Bitcoin as being the ideal hard currency and store of value in the long term. All the frothy shitcoins, Ponzi schemes, and "smart" contracts[2] need to stay where they belong, on the fringes, such that they don’t destabilize the rest of the system. There can be no crisis of confidence, or at least, no fatal one. This is a system that requires solid psychological support as much as it does utterly robust technical foundations. Bitcoin is already a reasonably good store of value, and even a not-bad medium of exchange; for any more complex demands other structures can be layered on top — the bit that really matters is that it needs to be a durable store of value. If we can really liberate ourselves from the ever-present menace of wealth-destroying inflation, that would be pretty darn neat. Companies need to stop trying to ram blockchain-shaped pegs into round holes where they don’t belong; they’re just creating a distraction. More and more agents need to start treating Bitcoin as a long-term, inflation-resistant alternative to fiat cash, enough of them such that Bitcoin starts to take over from bonds as a means to dampen volatility; note that to do this, participants in the system need to treat Bitcoin less like a high-risk speculative investment and more like ballast — unglamorous, uninteresting, predictable. We need to create more and cheaper sources of clean energy, and while we’re at it, find ways to meet consumer demand without crypto hardware purchases drastically distorting the market. Lots of things can and may go wrong on the way.

Overall, if everything turns out the way the proponents proclaim, I think that would be great. Right now I’d put the odds at 50% (plus or minus 25%) that things do head in this direction over the next couple of decades. But those aren’t very certain odds, and not the kind that I would want in order to sink my life savings into this stuff. I’m pretty conservative when it comes to my investment choices, so I wouldn’t put anything but "play money" into Bitcoin, and nothing into any other crypto. In any case, I’ve never had any play money in my investing life, so this is all hypothetical. And even if things do go according to the rosiest of predictions, there remains the fact that securing crypto assets feels like a step backwards into some kind of dark age prior to modern banking and federally insured deposits; keeping your private keys safe from theft, damage, or loss feels not unlike securing physical gold. I don’t relish the idea of having many thousands of dollars’ worth of digital assets under my physical custodianship any more than I like the idea of trying to somehow safely stash a bunch of gold coins under my bed. There is something comforting about the various checks and balances implemented by old banks to make sure that the numbers tallied against your name in their ledgers stay securely and exactly where you put them, without tampering. I couldn’t care less about crypto’s touted benefits of anonymity and its illusory freedom from government vigilance and expropriation. Those risks are real in the current fiat-based system, but seem so utterly distant and unreal to me in my cozy European life in the bosom of the EU. In short, the idea of having personally to safeguard any significant amount of crypto assets scares the crap out of me, and I can’t begin to imagine how elderly or non-tech-savvy folks might navigate these waters if crypto assets end up expanding to occupy a significant chunk of the monetary and financial pie in the future.

So there you have it — my mid-2022 snapshot of what I think about Bitcoin. I’m still very early on in my research of both the for and against arguments in this arena, but I intend to keep studying. It beats reading about COVID, or Ukraine, or the Depp-Heard trial, and I’ll be interested to revisit this post some years from now and see if the world, or I, have moved on at all.


  1. And for economy’s sake, I’ll be using the shorthand "crypto" a bunch of times in this post, even though it always irked me that it’s an appropriation of a word that was previously used to designate the broader field of "cryptography" itself. Sometimes you’ve just got to accept when these linguistic battles are won and lost, and go with the flow. I still remember how I used to call blogs "weblogs" for years after everyone else stopped doing so, simply because I stubbornly hated the laziness of the contraction… ↩︎

  2. In my experience, few things that people tack the prefix of "smart" onto end up being so. ↩︎