July 21st, 2024

"Maxwell's equations of software" examined

Ken Shirriff's blog post analyzes a historic Lisp code snippet, showcasing Lisp's core principles. It highlights code-data interchangeability and the essence of Lisp programming, referencing Alan Kay's "Maxwell's Equations of Software."

The blog post on Ken Shirriff's website examines the "Maxwell's Equations of Software," the name Alan Kay gave to a half-page code listing in the Lisp 1.5 Manual, written in 1961 by John McCarthy et al. and often regarded as the embodiment of Lisp itself. The listing defines a universal Lisp function, evalquote, written in the meta-language of M-expressions, which can be translated mechanically into S-expressions. It shows how an interpreter is built from a handful of primitives (CAR, CDR, CONS, ATOM, EQ) together with forms such as LAMBDA and LABEL, illustrating the central Lisp idea that code and data are interchangeable: a concise piece of code is enough to define a basic Lisp interpreter.

The post also notes the limitations of the listing, such as the absence of arithmetic and the need for auxiliary functions like equal and cadr. Overall, it offers an insightful analysis of the foundational principles of Lisp programming and of why the "Maxwell's Equations of Software" analogy is apt.
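To make that idea concrete, here is a minimal sketch of such an interpreter, written in modern Scheme rather than the manual's M-expressions. The names my-eval and my-apply, the association-list environments, and the use of Scheme booleans in place of T/NIL are illustrative choices rather than the original code; the point is only that car, cdr, cons, atom, and eq, plus quote, cond, and lambda, already suffice.

    ;; A minimal sketch, not the Lisp 1.5 listing itself: a tiny Lisp
    ;; evaluator in Scheme, using little beyond pairs, symbols, and the
    ;; primitives the manual relies on.  Environments are association
    ;; lists of (symbol . value) pairs.
    (define (my-eval e env)
      (cond ((symbol? e) (cdr (assq e env)))                 ; variable lookup
            ((not (pair? e)) e)                              ; self-evaluating
            ((eq? (car e) 'quote) (cadr e))                  ; (quote x) -> x
            ((eq? (car e) 'cond)  (eval-cond (cdr e) env))   ; (cond (p e) ...)
            (else (my-apply (car e)                          ; (f a1 a2 ...)
                            (map (lambda (a) (my-eval a env)) (cdr e))
                            env))))

    (define (my-apply f args env)
      (cond ((eq? f 'car)  (car  (car args)))
            ((eq? f 'cdr)  (cdr  (car args)))
            ((eq? f 'cons) (cons (car args) (cadr args)))
            ((eq? f 'atom) (not (pair? (car args))))
            ((eq? f 'eq)   (eq? (car args) (cadr args)))
            ((symbol? f)   (my-apply (cdr (assq f env)) args env)) ; named fn
            ((eq? (car f) 'lambda)                           ; ((lambda (vars) body) ...)
             (my-eval (caddr f)
                      (append (map cons (cadr f) args) env)))
            (else (error "cannot apply" f))))

    (define (eval-cond clauses env)
      (if (my-eval (caar clauses) env)
          (my-eval (cadar clauses) env)
          (eval-cond (cdr clauses) env)))

    ;; Example:
    ;;   (my-eval '((lambda (x) (cons x (quote ()))) (quote a)) '())  =>  (a)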

7 comments
By @pkal - 4 months
I don't get the impression that anyone is going into the "Maxwell's equations" part of the analogy. I didn't understand/think about it either until watching Sussman's "Programming is (should be) fun!"[0], where he explains that the dualism and interaction between `eval' and `apply' are supposed to be analogous to the magnetic and electric fields in Maxwell's equations. The well-known yin-yang picture containing `eval'/`apply' can (and does in his presentation) also be drawn with \vec{B} and \vec{E}.

[0] https://www.youtube.com/watch?v=2MYzvQ1v8Ww, around minute 13

By @tromp - 4 months
> a half-page of code is sufficient to define a basic Lisp interpreter in Lisp given a few primitives (car, cdr, cons, eq, atom).

One line of code is sufficient to define the 206-bit BLC interpreter in BLC given no primitives whatsoever [1]:

    (λ11)(λλλ1(λλλλ3(λ5(3(λ2(3(λλ3(λ123)))(4(λ4(λ31(21))))))(1(2(λ12))(λ4(λ4(λ2(14)))5))))(33)2)
This code tokenizes and parses (something the Lisp code skips) the binary encoding of a lambda term from an input bitstream and applies that term to the remainder of the input.

[1] https://tromp.github.io/cl/cl.html
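For readers unfamiliar with the encoding the comment above refers to: BLC writes an abstraction λM as 00 followed by the encoding of M, an application M N as 01 followed by the encodings of M and N, and a de Bruijn variable n as n ones followed by a zero. Below is a rough sketch of a decoder for that format in Scheme, operating on a list of 0/1 bits; the representation and the names parse/lam/app/var are assumptions for illustration, not tromp's actual interpreter, and the sketch covers only the decoding step, not evaluation.

    ;; Sketch only: decode one BLC-encoded term from a list of bits,
    ;; returning (term . remaining-bits).
    (define (parse bits)
      (cond ((= (car bits) 1)                    ; 1^n 0 -> de Bruijn variable n
             (let count ((bs bits) (n 0))
               (if (= (car bs) 1)
                   (count (cdr bs) (+ n 1))
                   (cons (list 'var n) (cdr bs)))))
            ((= (cadr bits) 0)                   ; 00 <body> -> abstraction
             (let ((body (parse (cddr bits))))
               (cons (list 'lam (car body)) (cdr body))))
            (else                                ; 01 <fun> <arg> -> application
             (let* ((f (parse (cddr bits)))
                    (a (parse (cdr f))))
               (cons (list 'app (car f) (car a)) (cdr a))))))

    ;; Example: the identity function, lambda 1, is encoded as 0010:
    ;;   (parse '(0 0 1 0))  =>  ((lam (var 1)))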

By @ashton314 - 4 months
I think there's good reason why Lisp's `apply` and `eval` functions are the "Maxwell Equations" of CS. Going back to the Lambda Calculus, `apply` is the β-reduction rule, which is the basis for everything, and `eval` is how you cross the data-code boundary. Between the two you get a system that can (1) perform any computable function and (2) easily work with its own structure.
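A two-line illustration of both halves, in Scheme (the use of the host's eval with interaction-environment is a Scheme-specific detail, not something from the article):

    ;; apply as beta-reduction: applying a lambda substitutes the argument
    ((lambda (x) (* x x)) 3)                      ; => 9

    ;; eval crossing the data/code boundary: build a list, then run it as code
    (define expr (list '* 3 3))                   ; just a list: (* 3 3)
    (eval expr (interaction-environment))         ; => 9
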
By @sfvisser - 4 months
This one feels more apt to me: the derivation of a single lambda calculus term in terms of which you can express all other expressions.

> The systematic construction of a one-combinator basis

https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d...

By @ChrisArchitect - 4 months

By @taneq - 4 months
I’m always a little confused when people are blown away by the “the program IS data” revelation. What did they think the program was? Why is this seen as transcendental LISP enlightenment when it’s the main feature of a von Neumann architecture?