September 9th, 2024

The Modern CLI Renaissance

The resurgence of command line interface tools since 2015 emphasizes user-friendly experiences, clear error messages, and accessible documentation, addressing past shortcomings and evolving with user needs and technology.


In recent years, there has been a notable resurgence in the development of command line interface (CLI) tools, reversing a period of stagnation from 1995 to 2015. This revival is attributed to the evolution of user needs and technological advancements, prompting developers to rethink and reinvent traditional CLI tools. The article discusses several key lessons learned from decades of software use, emphasizing the importance of a good out-of-the-box experience, helpful error messages, and concise documentation. Modern CLI tools aim to provide intuitive interfaces that minimize the need for extensive configuration, as seen in examples like the fish shell, which offers powerful features without requiring user setup.

Additionally, the article highlights the significance of clear error messages that guide users in troubleshooting, contrasting the vague messages often found in traditional tools with the more informative feedback provided by newer alternatives like Nushell. Documentation should also be easily accessible and focused on common use cases, allowing users to perform tasks without needing to consult manuals frequently. Overall, the article advocates for a thoughtful approach to CLI tool development that prioritizes user experience and addresses historical shortcomings.
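To make the error-message contrast concrete: a classic coreutils command fails tersely, while newer tools tend to point at what went wrong and how to fix it. The first transcript below is real GNU cp output; the second is an illustrative paraphrase of the modern style, not verbatim output from Nushell or any particular shell.

  $ cp
  cp: missing file operand
  Try 'cp --help' for more information.

  # the newer style (paraphrased for illustration):
  # Error: missing required argument <source>
  #   usage: cp <source> <target>
  #   help: run `cp --help` for details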

- The development of CLI tools has accelerated significantly since 2015.

- Modern tools focus on user-friendly experiences with minimal configuration.

- Clear and informative error messages are essential for effective troubleshooting.

- Documentation should prioritize common use cases to enhance usability.

- The evolution of terminals has influenced the redesign of traditional CLI tools.

15 comments
By @llm_trw - 8 months
To summarize: The people who ruined native GUIs moved to HTML pages. After ruining HTML pages they are now moving to terminals.

In this very thread we're seeing people say "Well sure, but why not add ...".

The reason why the CLI is good is because it _can't_ do most things people want it to do. Which means you have to think when making an application.

Please, if you're one of the people 'modernizing' the terminal, stop and think about why the terminal is valuable. Don't make it another in the long line of UIs which have been destroyed by modern developers.

By @cyberax - 8 months
The stale field of terminals is also getting new developments. My particular favorite is the Kitty input protocol, which allows terminals to offer such 21st-century functionality as accurate key-press reporting: https://sw.kovidgoyal.net/kitty/keyboard-protocol/
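As a minimal sketch of how the protocol is driven (escape sequences are from the linked spec; only kitty and a handful of other terminals honor them, so a real program should first query support with CSI ? u):

  # bash sketch: push kitty keyboard enhancement flag 1, read a key, pop it
  printf '\033[>1u'   # push flag 1: "disambiguate escape codes"
  read -rsn1 key      # keys now arrive as unambiguous CSI ... u reports
  printf '\033[<u'    # pop our flag, restoring the previous keyboard mode
  printf 'got: %q\n' "$key"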
By @terminaltrove - 8 months
Excellent article on what is going on in the terminal space. Agree on the TUI section: we are seeing lots of terminal tools being built in Rust and Go, and libraries such as Ratatui [1] and Bubble Tea [2] becoming modern alternatives to ncurses for building TUIs.

Python has Textualize, which is also very popular for building terminal user interfaces [3].

And we've noticed this renaissance as well, in the new CLI and TUI tools that we list on Terminal Trove [4].

[1] https://ratatui.rs/

[2] https://github.com/charmbracelet/bubbletea

[3] https://textual.textualize.io/

[4] https://terminaltrove.com/

By @JimDabell - 8 months
I would be very happy to see shells shed the idea that everything needs to be done in the context of emulating a terminal from the 70s. But even though new shells exist, I always end up writing shell scripts in ancient Bourne because I want them to run out of the box instead of requiring a third-party shell to be installed. Is there no appetite for a new default shell that could be adopted by Linux, macOS, and BSDs?
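Until such a default exists, the portable answer stays the same: write to POSIX sh and avoid bashisms. A trivial sketch that should run unchanged under dash, bash, BusyBox ash, and the BSD /bin/sh:

  #!/bin/sh
  # POSIX-only: no arrays, no [[ ]], no process substitution
  set -eu
  for f in "$@"; do
      if [ -f "$f" ]; then
          printf '%s: %s bytes\n' "$f" "$(wc -c < "$f")"
      else
          printf 'skipping %s: not a regular file\n' "$f" >&2
      fi
  done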
By @apitman - 8 months
As far as I know, TUI is the only way to build a statically linked app that can be controlled with a mouse. If you're running in a terminal that supports sixel, that covers a huge class of apps. And you get lots of bonuses like excellent cross platform support and running over a network. It's honestly a compelling platform even without the nostalgia factor.
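For reference, mouse support in a TUI boils down to a pair of xterm escape sequences that any TUI library toggles for you. A hand-rolled bash sketch (mode 1000 enables button reporting, 1006 the SGR coordinate encoding):

  printf '\033[?1000;1006h'       # enable mouse reporting (SGR encoding)
  read -rsn32 -t 5 seq || true    # a click arrives as e.g. ESC [ < 0;12;4 M
  printf '\033[?1000;1006l'       # disable it again before exiting
  printf 'raw report: %q\n' "$seq"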
By @godelski - 8 months
I'm loving how coreutils is getting improved and with how TUIs are exploding. It's just been a breath of fresh air. I hate to reach for my mouse.

Side note: in vim you can press K to see the help page for a specific function. The default is the man page, but some plugins will fix this, though I'm not aware of one that's let me get away from documents and help files completely. (I'd love to pull up docs for the codebase I'm working on. Anyone know of something that will?)

But one thing I want to stress, the defaults of these tools should be the same as the originals as much as possible. This makes them more effective and adoption seamless.

To give two examples: fd and ripgrep default to respecting your .gitignore file. If you're not expecting this you can easily land yourself in trouble. Sure, it's in the readme, but people are installing through their package managers, so it doesn't matter. I'm glad the option exists, but don't make it the fucking default! We have aliases for that stuff. It's always better to err on the side of too much output than too little. Output can be parsed and searched, but you can't search what isn't there.
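For what it's worth, both tools ship escape hatches, so the alias trick works in the reverse direction too (flag names per each tool's --help):

  # make fd and ripgrep search everything, like find and grep would
  alias fd='fd --no-ignore --hidden'
  alias rg='rg --no-ignore --hidden'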

Btw, here's what pacman says

  extra/fd 10.2.0-1
    Simple, fast and user-friendly alternative to find
  extra/ripgrep 14.1.0-1
    A search tool that combines the usability of ag with the raw speed of grep
So I'm surprised people are surprised that people think you can just alias find or grep, since they are advertised as such and this is also what's said through word of mouth. (Yes, people should open man pages and visit the GitHub, but let's be real about our expectations)
By @solatic - 8 months
One interesting idea, in the Platform Engineering space (inside companies), is using TUIs to take advantage of credentials that may already be available on the developer's laptop. If you serve an internal app as a webapp, then you either need for the webapp to have a service user (icky audit logs) or waste lots of time setting up OAuth-style login flows so the webapp can authenticate as the user (and maybe IT doesn't like the idea anyway). Or, you write something that runs on the dev's laptop, and make use of the credentials that are already available locally, easy-peasy. Auth is simple, use the audit mechanisms that are already in place, easy.
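A sketch of the pattern, assuming the developer is already logged in with the standard cloud CLIs (the exact commands depend on your stack; kubectl auth whoami needs a recent kubectl):

  # who does this laptop authenticate as? no service user, no OAuth dance
  aws sts get-caller-identity --query Arn --output text
  kubectl auth whoami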
By @sandreas - 8 months
Great article, especially the awesome tools collection at the bottom. I'm barely missing any of my daily drivers:

  # dra - automatically download release assets from github
  # example: dra download -a "dundee/gdu" -I "gdu" --output "$HOME/bin"
  devmatteini/dra
  
  # gdu - disk usage analyzer similar to ncdu but faster
  dundee/gdu
  
  # glow - terminal markdown reader
  charmbracelet/glow
  
  # jless - json viewer
  PaulJuliusMartinez/jless
  
  # lazydocker - terminal docker management ui
  jesseduffield/lazydocker
  
  # lazygit - terminal git management ui
  jesseduffield/lazygit
  
  # rga - ripgrep-all, grep for PDF
  phiresky/ripgrep-all
By @bradgessler - 8 months
I’ve been working on https://terminalwire.com to scratch my own itch making it easier to add command-line interfaces into my own web apps.

If I pull this off, building out a CLI that’s as high quality as GitHub & Stripe’s should be trivial since it won’t require building out a web API and it can be dropped into existing web frameworks.

It won’t be as fast as a CLI that runs locally, but that’s kinda not the point of terminal apps that primarily interact with web services.

I have a private beta for folks working on commercial SaaS products that want to deploy a CLI, but don’t want to deal with building out an API.

By @surfingdino - 8 months
The reason why CLI commands are written in C is that the OS is written in C. There is a lot of inertia, because it is more efficient to have the operating system and its tools written in the same language. We may be entering a transitional phase, with new tools being written in Rust and old tools rewritten in it. Some of the innovation or simplification highlighted in the article reminds me of the chaos of the Unix wars, which led to POSIX. I wouldn't chuck out the tools we have, though. Modernise them, but keep their functionality. What many critics of CLI tools do not realise is that there is a lot of power hiding in that complexity. Familiarise yourself with xargs or parallel to see that quite often learning them gives you results faster than reimplementing them.
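For example, the parallelism people often rewrite tools to get is already a flag away in xargs (-P is a GNU extension, so check your platform):

  # compress every .log file four at a time; -print0/-0 survive odd filenames
  find . -name '*.log' -print0 | xargs -0 -P4 -n1 gzip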
By @kazinator - 8 months
My Bash prompt is now just $, and I have a status line at the bottom that is protected from scrolling.

https://www.kylheku.com/cgit/basta/about/

Something called cdlog for directory navigation:

https://www.kylheku.com/cgit/cdlog/about/

Both of the above things are new [2024].

By @yonisto - 8 months
In my case I found it 20x easier to hack together a CLI tool that I can easily move around an organization that has a mixture of Windows, Macs, and Linux. Installation is just a .zst file away.
By @anthk - 8 months
Meh. If anything, for Make, clone this and read the files:

    git clone git://bitreich.org/english_knight
On the rest, nvi/vim and vis (vis is nice, with Sam-like structural regexen) are more than enough. Nvi as the uberfast vi clone with Unicode support, and vim for Common Lisp with Slimv.

On TUI tools, there's mc, ncdu... those are useful. A lot of them aren't. Take Finch vs an IRC client plus Bitlbee, for instance: swirc + Bitlbee is far more usable; with Finch I always had to do voodoo with the panes.

By @camgunz - 8 months
I understand where TFA is coming from here. A lot of these tools are built to handle complex tasks and thus while powerful are complex themselves. It's also true that computing has changed a lot, and we've learned a lot since someone defined how `find` would work; maybe we'd do it the same or maybe we wouldn't, but it's definitely true that things are a lot different now.

But I think we should be careful before dismissing the existing CLI/TUI landscape. It's a huge achievement that code written in over a dozen scripting languages will just run across different architectures and platforms. A #!/bin/sh script running on a Raspberry Pi, a WiFi router, a $20k server, a $300 Chromebook, or a $200 Pinephone will run exactly the same. That's because the standards and technologies that script relies on are ubiquitous. They don't cater to the top 1% of computer users, and they don't change every 5 years when average screen resolution increases.

Which is to say, be careful what you wish for. It's pretty easy to modify your CLI or your semantics when you're not installed on millions of routers across the world [0]. With success comes backwards compatibility concerns, and it's not too long before people start writing blog posts about how your tool needs to "shed historical baggage".

CLI tools are typically respectful of your resources, resources like:

- network bandwidth

- screen size

- attention

- CPU/RAM/disk

We should keep this in mind when we're talking about what defaults make sense. Does enabling LSPs by default mean you have to download a bunch of LSPs you won't use? Does it mean you've gotta maintain a database of LSPs you use/installed? Does it mean I can't use it on a Pinebook Pro without taking 10% off my battery life? Is this core computing infrastructure like grep, find and xargs or something a little more niche like ripgrep or fzf? Does making this interface colorful respect a user's color configuration on their machine (colorblind users, users avoiding blue light, users who set up a desktop theme, etc.)? If this tool generates an error, will it dump 1 error line in my logs or 13? If it generates 100,000 errors because I was processing 100,000 things, will it dump 100,000 error lines in my logs or 1.3 million?

I'm not saying there are clear answers here. My point is that while TFA argues there are clear answers, I'm saying there aren't. You have to target a use case. Andrew Gallant (ripgrep author, among other bonkers things) says he deliberately made ripgrep skip gitignored files by default because that's the use case he was targeting. That's great, and I can totally understand where he's coming from. I could also understand a different tool not doing it for different reasons. Neither is correct or incorrect (aside: as engineers, I think we could be taken more seriously if we stopped trying to argue our aesthetic preferences are correct or optimal or whatever--it's OK to just prefer things). Pick a use case. Pick an aesthetic. Pick a mental model.

So, yeah write Helix, write new CLI and TUI tools. But don't do it because existing tools are old and busted or fundamentally incorrect (according to you). Do it because you have a different aesthetic preference (you like colors, you like emoji, you like autocomplete, you like WASD as cursor movement). You don't need the backing of righteous engineering gods before you build something you like. Let me have Vim and I'll let you have PyCharm. Let me have Gleam and I'll let you have Go. There's room enough on this disk for both of us.

[0]: https://daniel.haxx.se/blog/2020/04/15/curl-is-not-removing-...

By @throwaway984393 - 8 months
CLI renaissance, or new dark age? The advent of the Web as the modern application platform of choice destroyed the advancement of graphical user interfaces. We now live in a bizarre world of CLIs when we should be using GUIs.

Before the Web, and during its rise, there was a vast array of productivity tools designed to allow users to do more work, faster, and better, through graphical interfaces. It would have been ridiculous to release a program to users with only a command line interface. We left the dark ages of terminals behind, and pushed into new territory, advancing what users could do with computers.

But once the Web began to develop the capabilities of browsers further, Web programming began to teach young programmers that the web was the only place that needed a graphical interface, because the web was a "universal" graphical application interface (lol, if you don't count the browser wars).

This delighted programmers, as they never really liked making graphical interfaces. Logic and functions were more fun to write than user interfaces (which only made the users - not developers - happier).

This was then hammered home when Markdown was widely adopted for its simplicity, inspiring a sort of text-based Stockholm syndrome. People started to claim bizarre things, like that the command line and Markdown were preferable (or even superior) to GUIs and WYSIWYGs in almost all cases. More languages were adopted that had no inherent graphics capabilities, and the devs moved ever further towards text.

So the web has unintentionally set back computer science and user productivity by decades. Until browsers lose the spotlight, this will probably continue, and non-web GUIs will continue to be that ugly thing you only write if you have to. Users will continue to languish in these half-baked solutions, slaves to the solutions that are presented to them. And devs will continue to create text interfaces that only they enjoy.

Rather than rethinking old ideas and creating new ones, we are simply doubling down on the past.