+++
title = "A modern CLI renaissance?"
date = 2024-03-04T12:20:02-06:00
draft = true
+++

Take a look at the table at the bottom of the page. I'll wait. Notice the relative scarcity between ~1995 and ~2015? I'd like to talk about a trend I've seen these past few years, where people are rewriting and rethinking staples of the command line interface, why I think this trend might be happening, and why I think it is a good thing.

## History

The terminal and the command line interface have been staples of computer user interfaces since before computer monitors were even available, with some of the first computers offering an interactive mode in the late 1950s. The recognizable Linux terminal traces its lineage to the very first version of Unix in 1971. Many utilities that a Linux user interacts with every day, commands like `rm`, `cat`, `cd`, `cp`, `man`, and a host of other core commands, trace their initial versions to this first version of Unix. Other tools are a bit newer, such as `sed` (1974), `diff` (1974), `bc` (1975), `make` (1976), or `vi` (1976). A few more tools were introduced in the 90s, such as `vim` (1991) and `ssh` (1995), but you get the picture. The majority of the foundational CLI tools on a Linux PC, even one installed yesterday, are older than Linux itself.

## Ok, so?

Now, there's nothing wrong with this. The tools still work fine, but in the half-century since they were first written, terminals and the broader Linux ecosystem have changed. Terminals now have the capacity to display more colours, Unicode symbols, and even inline images. Terminal programs now coexist with graphical user interfaces, and only a small subset of computer users even know they exist, whereas in the past, the terminal was the only way one interacted with a computer.

Perhaps more importantly, our knowledge has expanded: our knowledge of user interfaces, of what works and what doesn't, of which use cases are common and which are niche, of the way that error messages can teach, of the value of a good out of the box experience, and of the value of documentation that is easy to find and digest.

These changes to the environment surrounding CLI apps have led to a resurgence in the development of command line utilities in recent years. Instead of just developing completely new tools or cloning old ones, I've noticed that people are rethinking and reinventing tools that have existed since the early days of Unix.

This isn't just some compulsive need to rewrite every tool out there in your favorite language. People are looking at the problems these tools set out to solve and coming up with their own solutions to them, exploring the space of possible solutions and taking new approaches.

It's this exploration of the solution space that I'd like to take a look at: the ways that tools are changing, why people are changing them, and what kicked off this phenomenon.

## The lessons learned from the past

A large amount of the innovation in this area, I think, can be attributed to lessons learned over 50 years of using software: sharp edges we have repeatedly cut ourselves on, unintuitive interfaces that repeatedly trip us up, and growing frustration at the limitations that maintaining decades of backwards compatibility imposes on our tools.

These lessons have been gathering in the collective consciousness through cheatsheets, guides, and FAQs: resources to guide us through esoteric error messages, complex configurations, and dozens upon dozens of flags.

I'd like to go over a couple of the more prominent lessons that I feel terminal tools have learned over the past several decades.

### A good out of the box experience

While configurability is great, one should not need to learn a new configuration language and dozens or hundreds of options to get a usable piece of software. Configuration should be for customization, not setup.

One of the earliest examples of this principle may be the fish shell. Both zsh and fish have powerful prompt and autocompletion engines, but zsh requires you to set up a custom prompt and enable completions in order to use the features that set it apart from the competition. With no config file, zsh is no better than bash. When starting fish for the first time, however, its powerful autocompletion and information-rich prompt are front and center with no configuration required. Of course, fish still has the same level of configurability as zsh; it just also has sensible defaults.
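To make this concrete, here is a rough sketch of the sort of lines a new zsh user ends up putting in a `.zshrc` just to switch on the completion system and a nicer built-in prompt; the theme name is only an example, and fish needs no equivalent at all:

```zsh
# ~/.zshrc -- illustrative sketch, not a complete or recommended config
autoload -Uz compinit && compinit       # enable the completion system
autoload -Uz promptinit && promptinit   # load the bundled prompt themes
prompt adam1                            # pick one of zsh's built-in prompts
```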

To demonstrate my point, here is the default prompt for zsh with no configuration. It shows only the hostname, with none of the advanced features you can get out of a zsh prompt even without plugins.

*zsh prompt: only shows the hostname*

Here is bash's prompt. It actually gives more information than zsh's, even though zsh can do more when properly configured.

*bash prompt: shows the hostname and current directory*

And here is fish's default prompt. It has a few colours, shows everything the bash prompt does, and additionally shows the git branch we are on.

*fish prompt: has colours, shows the hostname, current directory, and git info*

Text editors are another great example of the evolution of out of the box defaults. Vim and Neovim both improved on their predecessors, but much of that improvement is locked behind extremely complex configuration and plugins. Here are four different terminal text editors with no configuration applied:

*vi, vim, neovim, and helix editors in their default configurations*

Vi (top left) is our baseline and, as far as I can tell, doesn't actually support much in the way of configuration. What you see out of the box is more or less what's there.

Vim (top right) greatly improved on Vi, adding things such as syntax highlighting, line numbers, spellchecking, split windows, folding, and even basic autocompletion. However, everything but syntax highlighting is either extremely clunky or outright disabled without configuration. (For example, the earliest things I did when I first made a .vimrc were to enable indent folding, set up better keybinds for navigating windows, and add a line number ruler to the side.)

Neovim (bottom left) further improved on Vim, adding support for Tree-sitter and the Language Server Protocol, but the out of the box experience is exactly the same as Vim's! In order to take advantage of the LSP and Tree-sitter support, you have to install plugins, which means learning a Neovim package manager, learning how to configure LSPs, and configuring a new LSP for every language you want to use it with (or finding out about Mason and being OK with having multiple levels of package management in your Neovim install alone). Don't get me wrong: Neovim is a great editor once you get over the hump. I still use it as my daily driver, but so much of its functionality is simply hidden.

Then we have the Helix editor (bottom right). Slightly glaring default colour scheme aside, everything is just there. Helix doesn't have plugin support yet, but it has so much in core that, looking through my Neovim plugins, pretty much all of them are covered by the core editor! (Ironically, the one feature I feel Helix is missing, folding, is a core part of Neovim, albeit one that requires some configuration to get good use out of.) Helix does have a config file where you can change a huge number of settings, but it is an extremely usable IDE out of the box thanks to having all of its features enabled by default.

### Friendly error messages

before

### Concise and discoverable documentation

### Common use cases should be easy

Where possible, documentation should not even be required for the most common use cases. Whenever I want to use `find`, I almost always have to look at the man page first, as I don't use it quite often enough to memorize it. But that's totally unneeded! 90% of my uses of find take the form of `find ./ -name "*foo*"`. With `fd`, the exact same invocation is simply `fd foo`: dead simple, no man page needed. Of course, 10% of the time I'm doing something else and have to look at the manual even with fd, but the point is that manuals are for when you want to do something with the tool that is not the most common use case.
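To illustrate, here is that everyday case with both tools, plus a couple of the short spellings fd offers for other common needs (the extra flags are meant as examples of the interface, not an exhaustive survey):

```sh
# the everyday case: find files whose names contain "foo"
find ./ -name "*foo*"    # classic find: even the simple case needs a flag and a glob
fd foo                   # fd: the pattern is all you type

# the common variations stay short too
fd -e md foo             # limit matches to a file extension
fd foo /some/other/dir   # search somewhere other than the current directory
```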

There are many other examples as well. How many of your grep invocations take the form `grep -R 'foo' ./`? Most of mine do. Ripgrep shortens that to `rg foo` while still having all the power of grep when I need it, and it is faster to boot!
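The same side-by-side, again as a sketch rather than a tutorial (the extra ripgrep flags are just examples of how the less common cases stay available):

```sh
# recursive search for "foo" under the current directory
grep -R 'foo' ./          # grep: recursion and the search root spelled out every time
rg foo                    # ripgrep: recursive and .gitignore-aware by default

# the power is still there when you want it
rg -t py foo              # only search Python files
rg -g '!target/**' foo    # skip a directory with a glob
```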

This isn't to say that tools should 'dumb themselves down' or hobble themselves to make them easier to use. Tools absolutely should not shy away from being powerful; however, they should keep in mind the first-time user experience and what the first-time user is likely to want the tool for, as that is likely also what power users will want to use the tool for 90% of the time.

## Shedding historical baggage

## The trendsetter

## The languages

## Appendix: the tools

This is an extremely unscientific table of command line tools that I have tried, have used, or currently use. It is assuredly incomplete, but should be broadly representative. The dates have been gathered from the first git commit where available and from Wikipedia otherwise, and sorting is by year first, then alphabetical.

| tool | year | language |
| --- | --- | --- |
| ls | 1961 | c |
| cat | 1971 | c |
| cd | 1971 | c |
| cp | 1971 | c |
| man | 1971 | c |
| rm | 1971 | c |
| grep | 1973 | c |
| diff | 1974 | c |
| sed | 1974 | c |
| bc | 1975 | c |
| make | 1976 | c |
| vi | 1976 | c |
| bourne shell | 1979 | c |
| awk | 1985 | c |
| screen | 1987 | c |
| bash | 1989 | c |
| zsh | 1990 | c |
| vim | 1991 | c |
| midnight commander | 1994 | c |
| ssh | 1995 | c |
| curl | 1996 | c |
| fish | 2005 | c (currently being rewritten in rust) |
| fossil | 2006 | c |
| tmux | 2007 | c |
| git | 2008 | c |
| go 1.0 | 2012 | go |
| fzf | 2013 | go |
| eza/exa | 2014 | rust |
| neovim | 2015 | c |
| pueue | 2015 | rust |
| rust 1.0 | 2015 | rust |
| just | 2016 | rust |
| micro | 2016 | go |
| nnn | 2016 | c |
| ripgrep | 2016 | rust |
| fd | 2017 | rust |
| bat | 2018 | rust |
| broot | 2018 | rust |
| difftastic | 2018 | rust |
| hyperfine | 2018 | rust |
| lazygit | 2018 | go |
| lsd | 2018 | rust |
| nushell | 2018 | rust |
| scc | 2018 | go |
| sd | 2018 | rust |
| git-delta | 2019 | rust |
| grex | 2019 | rust |
| starship | 2019 | rust |
| tre | 2019 | rust |
| typst | 2019 | rust |
| diskonaut | 2020 | rust |
| helix | 2020 | rust |
| pijul | 2020 | rust |
| zellij | 2020 | rust |
| zoxide | 2020 | rust |
| btop | 2021 | c++ |
| ast-grep | 2022 | rust |
| yazi | 2024 | rust |