made outline for nushell and cli renaissance posts.

This commit is contained in:
Gabe Venberg 2024-03-05 00:54:51 -06:00
parent 48c42f11db
commit 2224ba8c20
3 changed files with 112 additions and 10 deletions


@ -1,5 +1,5 @@
+++
title = "A modern CLI renaissance?"
date = 2024-03-04T12:20:02-06:00
draft = true
+++
@ -49,39 +49,87 @@ draft = true
// vi was made in 1976
// sed in 1974
// awk in 1985
// grep in 1973
// bc in 1975
// diff in 1974
// make in 1976
// vim in 1991
// ssh in 1995
// midnight commander in 1994
// screen in 1987
// tmux in 2007
// rust 1.0 in 2015
I'd like to talk about a trend I've seen these past few years, where people are rewriting core CLI tools,
why I think this trend is a good thing, and why I think it might be happening.
== History
The terminal has been a staple of computer user interfaces since before computer monitors were available,
with some of the first computers offering an interactive mode in the late 1950s.
The 'modern' Linux terminal traces its lineage to the very first version of Unix, in 1971.
Many utilities that a Linux user interacts with every day,
commands like `rm`, `cat`, `cd`, `cp`, `man` and a host of other core commands trace their initial versions to this first version of Unix.
Other tools are a bit newer, such as `sed` (1974), `diff` (1974), `bc` (1975), `make` (1976), or `vi` (1976).
There were a few more tools introduced in the 90s, such as `vim` (1991) and `ssh` (1995), but you get the picture.
The majority of the foundational CLI tools on a Linux PC, even one installed yesterday, are older than Linux itself.
== Ok, so?
Now, there's nothing wrong with this; the tools still work fine.
But in the half-century since they were first written,
terminals and the broader Linux ecosystem have both changed.
Terminals now have the capacity to display more colours, Unicode symbols, and even inline images.
TODO: how have things changed?
Terminal programs now coexist with graphical user interfaces,
and only a small subset of computer users even know they exist,
whereas in the past, terminals were the only way one interacted with the computer.
Additionally, and perhaps more importantly, our knowledge has expanded:
our knowledge of user interfaces,
of what works and what doesn't,
of which use cases are common and which are niche,
of the way that error messages can teach,
of the value of a good out-of-the-box experience,
and of the value of documentation that is easy to find and digest.
== The new tools
These changes to the environment surrounding CLI apps have, in recent years,
led to a resurgence in development of command line utilities.
Instead of just developing tools that don't yet exist,
I've noticed that people are rethinking and reinventing tools that have existed since the early days of Unix.
== The lessons learned from the past
=== A good out of the box experience
// look at helix compared to (neo)vim
=== Friendly error messages
// look at nushells error messages
=== Concise and discoverable documentation
// look at zellij and helix and their built in keymap cheatsheets
=== Common use cases should be easy
// look at sd, rg, and fd
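To make the comparison concrete, here is a sketch (from memory, so exact flags may be slightly off) of how the newer tools compress common invocations of their predecessors:

[source,sh]
----
# recursive search: grep needs flags for what rg does by default
grep -rn 'pattern' .
rg 'pattern'

# find by name: fd assumes the current directory and smart-case matching
find . -iname '*report*'
fd report

# search and replace, without sed's escaping quirks
find . -name '*.txt' -exec sed -i 's/foo/bar/g' {} +
fd -e txt -x sd 'foo' 'bar'
----

In each pair, the common case is the default rather than something you opt into with flags.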
== Shedding historical baggage
// look at just command runner, simplifying the common use case of make
== The trendsetter
// did neovim kick this all off?
== The languages
// most of the new tools are written in rust and go.
// rusts clap and gos cobra


@ -54,6 +54,13 @@ I put esc on one of the thumb keys for usage in vim.
I moved the numpad layer to my right hand side, swapping its position with the function key layer.
I also put the meta key as a hold-mod on the lower pinky keys, as my window manager uses it for all its keybinds.
The mod-tap home row layer changes actually feel really natural,
and the extra space afforded by layers allows me to organize things in a more natural feeling way,
such as putting the numbers in a numpad layout, rather than along the top.
I'm not quite happy with my modifiers being mod-taps on the bottom row,
they can feel slightly awkward to reach,
and I may experiment with moving them around, potentially to the top row.
== Learning
Of course, the board takes some getting used to.
@ -66,7 +73,7 @@ and I still have to look at my keymap printout for symbols sometimes.
However, all things considered, it was easier to learn than I had expected!
Perhaps it's because I was already used to split keyboards,
or because I forced myself to use this instead of my 'normal' keyboard at work,
but I am now at the point where it feels natural to type on.
== Case
@ -96,4 +103,5 @@ and if they do, I did socket the microcontrollers for easy replacement.
It took me all of a week to fall in love with the Sweep's form factor,
and, one month later, I'm convinced I will never let myself work on a regular keyboard for a long period of time again,
that's how much I've come to appreciate split keyboards.
The fact that the board has no pesky diodes or other surface-mount parts makes it a very accessible first build,
and one Id recommend to anyone interested in improving their typing ergonomics.


@ -19,3 +19,49 @@ Today, I'd like to focus on my experiments with my shell.
Before this, I had been using a minimal zsh setup for a long time,
with only built in features and a handmade prompt.
Zsh is a good shell, probably one of the best POSIX shells out there,
and I still use it when a POSIX shell is needed.
However, I got tired of the endless footguns that POSIX shell scripting imposes:
easy-to-make errors around quoting, word splitting, and escaping,
the sort of thing that makes https://www.shellcheck.net/[shellcheck] necessary.
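As a tiny, hypothetical example of the sort of footgun I mean: an unquoted variable expansion is word-split on whitespace, so a filename with a space in it silently becomes two arguments.

[source,sh]
----
file="my notes.txt"
touch "$file"

# unquoted: ls receives TWO arguments, 'my' and 'notes.txt', and fails
ls $file 2>/dev/null || echo "unquoted expansion broke"

# quoted: ls receives the single intended filename
ls "$file" >/dev/null && echo "quoted expansion works"

rm "$file"
----

Shellcheck will flag the unquoted expansion, but the deeper problem is that the language makes the wrong behaviour the default.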
I played around with fish for a few days,
but it had many of the same fundamental design choices, mainly, being 'stringly typed',
that made posix shells such a displeasure to work with.
== A Nu shell
While googling around for alternative shells, I stumbled across https://www.nushell.sh/[nushell],
a shell that claimed to be built around structured data instead of just strings.
This was *exactly* what I was looking for, and I installed it immediately.
I decided to work with it for around a month,
giving myself enough time to really use it:
to see not only how it felt in ordinary usage,
but also to construct a few pipelines and scripts in it.
In any case, the month is up, and I've been collecting examples,
thoughts, and some criticisms along the way.
== Piping structured data
// show some examples of grouping, sorting, etc without endless invocations of `cut`.
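As a sketch of what I mean (written from memory, so column names may differ slightly between nushell releases): because `ls` emits a real table, filtering, sorting, and counting are single verbs rather than `awk`/`sort`/`uniq` incantations.

[source,nu]
----
# the five largest files in the current directory, as a table
ls | where type == file | sort-by size --reverse | first 5 | select name size

# count files per extension (the classic `sort | uniq -c` pipeline)
ls | get name | path parse | get extension | uniq --count | sort-by count --reverse
----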
== Parsing non-nu tools
// show parsing initcall_debug logs, and how it then lets one do analysis on it
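For instance, assuming a boot log with lines shaped like `initcall foo_init+0x0/0x30 returned 0 after 512 usecs` (the filename `boot.log` here is hypothetical), `parse` can lift the flat text into a table, after which finding the slowest initcalls is a one-liner:

[source,nu]
----
open boot.log
| lines
| parse "initcall {func} returned {ret} after {usecs} usecs"
| into int usecs
| sort-by usecs --reverse
| first 10
----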
== Defining custom commands
// show the basic syntax for custom commands
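The basic shape, as I remember it, is `def name [params] { body }`, with optional type annotations on the parameters:

[source,nu]
----
# a custom command with a typed positional argument
def greet [name: string] {
    $"Hello, ($name)!"
}

greet "world"   # Hello, world!
----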
=== Built in arg parsing?
// show syntax for custom args, and how it leads to auto completion and help generation.
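From memory (the syntax has shifted a little between releases), flags are declared in the signature, and the comments next to each parameter end up in the generated `help greet` output along with tab completion:

[source,nu]
----
def greet [
    name: string    # who to greet
    --caps (-c)     # shout the greeting
] {
    let msg = $"Hello, ($name)!"
    if $caps { $msg | str upcase } else { $msg }
}

greet --caps "world"   # HELLO, WORLD!
----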
== Error messages
== Whats not there yet
// explain some limitations, tools that assume the existence of a posix shell (esp files one is instructed to source)
// also explain the limitations where nushell scripts cannot pass structured data, but are treated as external commands, therefore their usefulness in a pipeline is limited.