June 19th, 2024

The demise of the mildly dynamic website (2022)

Websites evolved from hand-crafted HTML to PHP-powered dynamic apps with a simple deployment model. PHP's decline and the rise of static site generators displaced mildly dynamic sites, pushing features like comments out to JavaScript and third-party services.


The article discusses the evolution of websites from hand-crafted HTML to the rise of PHP, which enabled dynamic web applications with a simple deployment model. It highlights how PHP allowed for easy prototyping and a "hackish state" of development. The concept of "mildly dynamic" websites emerged: sites with minor dynamic elements like style selectors or random quotes. With the shift to static site generators, however, the era of PHP websites declined, and mildly dynamic websites went with it. The article argues that the minor dynamic functionality of these sites is not worth the complexity of a web application framework, so such features migrated to client-side JavaScript, as with comment sections. The shift away from PHP has cost websites of that era much of their unique character and individuality, as many functionalities have been outsourced to third-party services like Disqus.

Related

The hacking of culture and the creation of socio-technical debt

Algorithms shape culture, dividing it into niche groups. "A Hacker Manifesto" by McKenzie Wark discusses hackers' influence on power dynamics, emphasizing free information. Tech giants like Facebook and TikTok wield immense cultural influence, blurring propaganda and personalization boundaries. Corporate dominance in culture hacking alters global power structures, challenging governments' regulatory capacity.

Show HN: Eidos – Offline alternative to Notion

The Eidos project on GitHub offers a personal data management framework as a Progressive Web App with AI features. Customizable with extensions and scripting, it leverages sqlite-wasm technology for chromium-based browsers.

Exposition of Front End Build Systems

Frontend build systems are crucial in web development, involving transpilation, bundling, and minification steps. Tools like Babel and Webpack optimize code for performance and developer experience. Various bundlers like Webpack, Rollup, Parcel, esbuild, and Turbopack are compared for features and performance.

Software Engineering Practices (2022)

Gergely Orosz sparked a Twitter discussion on software engineering practices. Simon Willison elaborated on key practices in a blog post, emphasizing documentation, test data creation, database migrations, templates, code formatting, environment setup automation, and preview environments. Willison highlights the productivity and quality benefits of investing in these practices and recommends tools like Docker, Gitpod, and Codespaces for implementation.

My weekend project turned into a 3 years journey

Anthony's note-taking app journey spans 3 years, evolving from a secure Markdown tool to a complex Electron/React project with code execution capabilities. Facing challenges in store publishing, he prioritizes user feedback and simplicity, opting for a custom online deployment solution.

33 comments
By @decasia - 4 months
I think the spirit of this article is correct, although some of the digs at modern web tech and SPAs seem to be beside the point.

I used to have a "mildly dynamic website." It was a $5 digital ocean box. It ran nginx with php-fpm, mostly so it could have a Wordpress install in a subdirectory, and it had a unicorn setup for an experimental Rails app somewhere in there.

Given that environment, the "mildly dynamic website" experience that TFA talks about was absolutely true. If I wanted a simple script to accept form input, or some little tiny dynamic experimental website, I could trivially deploy it. I could write PHP (ugh) or whatever other backend service I felt like writing. I ported the Rails app to golang after a while. It was fun. It made for a low cost of entry for experimental, hackish things. It's a nice workshop if you have it.

The thing is — if you are running this setup on your own linux virtual machine — it requires endless system maintenance. Otherwise all the PHP stuff becomes vulnerable to random hacks. And the base OS needs endless security updates. And maybe you want backups, because you got lazy about maintaining your ansible scripts for system setup. And the price of the $5 virtual linux box tends to go up over the years. And the "personal website" model of the web has kind of declined (not that it's altogether dead, just marginalized by twitter/facebook).

So I got exhausted by having to maintain the environment (I already do enough system maintenance at work) and decided to switch to static HTML sites on S3. You can't hack it anymore. But so far — I can live with it.

By @politelemon - 4 months
> What captured people's imaginations about AWS Lambda is that it lets you a) give any piece of code an URL, and b) that code doesn't consume resources when it's not being used. Yet these are also exactly the attributes possessed by PHP or CGI scripts. In fact, it's far easier for me to write a PHP script and rsync it to a web server of mine than for me to figure out the extensive and complex tooling for creating, maintaining and deploying AWS Lambda functions — and it comes without the lock-in to boot. Moreover, the former allows me to give an URL to a piece of code instantly, whereas with the latter I have to figure out how to setup AWS API Gateway plumbing correctly. I'm genuinely curious how many people find AWS Lambda interesting because they've never encountered, or never properly looked at, CGI.

Well, assuming you are genuinely curious and not just using an expression!

The difference is that the 'web server' is still consuming resources when the code is not in use. They aren't equivalent at all. The web server is hosted on an OS and both require ongoing maintenance.

Further, the appeal of Lambda is in its ease of onboarding for newcomers; I can run a piece of .NET or JS or Python locally and directly without a lambda 'layer' to host it, just invoke the handler method.

I'm not sure what complex tooling the author is referring to, though; it's a zip and push.

By @snovymgodym - 4 months
I like this article except for the part about Lambda. The author doesn't seem to get that for some use cases there are serious benefits to having bits of code that run only when you need it, and only getting billed for those runs instead of getting billed for a VM or container runtime that's always present.

Obviously, if your application involves processing a predictably high volume of requests, then you're probably better off running it on your own server/container, but depending on your use case there are times where Functions-as-a-Service are the perfect solution.

The part about "why use lambda when cgi-bin exists" reminds me of the HN comment on the DropBox announcement from 2007 where the guy says something like "this is cool but why would anyone use it when you can just whip together ftp and svn on a debian box and have the same thing?"

By @jimbokun - 4 months
I sometimes wonder what the hell AWS Lambda is and whether or not I should care. Now I have a succinct answer:

> What captured people's imaginations about AWS Lambda is that it lets you a) give any piece of code an URL, and b) that code doesn't consume resources when it's not being used. Yet these are also exactly the attributes possessed by PHP or CGI scripts.

From now on, when anyone mentions "AWS Lambda" I'm going to replace with "CGI" in my head.

By @dang - 4 months
Discussed at the time:

The Demise of the Mildly Dynamic Website - https://news.ycombinator.com/item?id=31236290 - May 2022 (91 comments)

By @amluto - 4 months
I think the real ability that got lost is the ability to easily mix and match multiple logically separate things. Once upon a time, if you have some slightly dynamic material and you also wanted to add some PDFs from the technical writers, the webmaster would do it in five minutes. Want a video? Just add a file. Need a form? Fire up some PHP or whatever. Want a support contact or FAQ? No big deal.

Now even big companies outsource their PDF hosting and viewing to a third party; they outsource their FAQ and support contact to a different fancy startup, surveys and similar forms to yet another company, and the list goes on. The all-in-one website seems to be dead.

By @Dwedit - 4 months
I think CDNs led to the demise of "mildly dynamic" websites. They made static websites get served super fast, so you got the most benefit from completely static sites.

You could still include a small amount of JS to make a mostly-static site partially dynamic if you were logged in. Limited mostly to things like including the user's username on the page when logged in.
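That "show the username if logged in" sprinkle can be sketched in a few lines of vanilla JS. The cookie name and element id here are invented for illustration; the parsing works on a plain cookie string so it runs outside a browser too:

```javascript
// Sketch: read a hypothetical "username" cookie and show it on a static page.
function usernameFromCookie(cookieString) {
  for (const part of cookieString.split(";")) {
    const [key, ...rest] = part.trim().split("=");
    if (key === "username") return decodeURIComponent(rest.join("="));
  }
  return null; // not logged in
}

// In a browser you would then do something like:
//   const name = usernameFromCookie(document.cookie);
//   if (name) document.getElementById("greeting").textContent = "Hi, " + name;
```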

By @syrusakbary - 4 months
I'm amazed at how well this article fits with a new product we have been working on at Wasmer. AWS Lambda is great, but it doesn't really solve the cold-start problem of dynamic languages. Nor does FastCGI.

We are very close to launching Instaboot, a new feature of Wasmer Edge that, thanks to WebAssembly, brings incredibly fast cold starts to dynamic languages: 90 ms cold-start times for WordPress, compared to more than a second on state-of-the-art cloud providers.

By @1vuio0pswjnm7 - 4 months
"Or, suppose a company makes a webpage for looking up products by their model number. If this page were made in 2005, it would probably be a single PHP page. It doesn't need a framework - it's one SELECT query, that's it. If this page were made in 2022, a conundrum will be faced: the company probably chose to use a statically generated website. The total number of products isn't too large, so instead their developers stuff a gigantic JSON file of model numbers for every product made by the company on the website and add some client-side JavaScript to download and query it.... This example is fictitious but I believe it to be representative."

As an end user, I have seen this perplexing design pattern quite often. As soon as I see it, I just get the URL for the JSON file and never look at the web page again. It is as if there is plenty of bandwidth, memory, and CPU available, but the developer is not letting the user take advantage of it; instead the developer is usurping it for themselves. Maybe a user wants to download data, but the developer wants to run Javascript and keep _all_ users staring at a web page.
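The pattern the quoted passage describes looks roughly like this on the client side; the URL and record shape here are made up for illustration:

```javascript
// The static-site version of a product lookup: ship the whole catalog as
// one big JSON file to every visitor, then filter it in the browser.
function findByModel(products, modelNumber) {
  return products.find((p) => p.model === modelNumber) || null;
}

// Browser-side usage (the hypothetical JSON file is what the commenter
// fetches directly, bypassing the page entirely):
async function lookup(modelNumber) {
  const res = await fetch("/data/products.json");
  const products = await res.json();
  return findByModel(products, modelNumber);
}
```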

Why not just provide a hyperlink on the rendered search results page pointing to the JSON file, as an alternative to (not a replacement for) running Javascript? What are the reasons for not providing it?

On some US government websites, for example, a hyperlink to the CSV/JSON file is provided in the rendered search results page.^1

That is why a non-commercial www is so useful, IMHO: the best non-commercial websites do not try to "hide the ball".

Perhaps they have no incentive to try to force users to enable Javascript, which is a practical prerequisite for advertising and tracking.

1. ecfr.gov and federalregister.gov are two examples

By @kolme - 4 months
Fun fact: Next.js was inspired by PHP and can generate static pages that you can throw onto a web server.

Another technology that kind of covers the use cases of the article (the mildly dynamic pages) would be htmx and friends.

By @superkuh - 4 months
Server-side includes are still the perfect amount of power if you want to do templating, like pulling `comments.html`, `footer.html`, or `right_menu.html` includes into all your site's pages. And the attack surface is so minimal, and the code so stable, that there's basically no increased risk in using SSI over plain HTML with nginx and similar webservers.

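For anyone who hasn't seen them: with `ssi on;` set in the nginx config, an otherwise static HTML page can pull in shared fragments at request time. A minimal sketch (file names are illustrative):

```html
<!-- page.html: served as-is, except the server expands the include -->
<h1>My page</h1>
<p>Some static content.</p>
<!--#include virtual="/footer.html" -->
```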
By @ggm - 4 months
Looking stuff up in an SQL DB and showing it to the masses (wordpress) is not very dynamic.

But, neither is deploying neko the cat to sleep on your cursor, or the dancing turtle of Kame-IPv6.

Intruding PHP into the actual FQDN of the web, and making 3x4.ARITHMETIC.EXAMPLE.COM work as a calculator, now that's dynamic.

REST is pretty dynamic. If I GET it and then POST it back with changes, I feel like I've had a good day.

I personally hate the cursor implicit in tabular data websites. There should be a normative "no cursor" mode to just get all the damn data not the page management requirement.

Riddle me this: how does The Independent newspaper create a visually heavy web page which on my tablet appears to render INSTANTLY and yet has 100 points of image and text? It loads faster than almost any other site. It also kills my CPU and is a terrible design, but my goodness it's fast.

By @rkozik1989 - 4 months
Did everyone just forget about Varnish HTTP Cache or something? Or are we just using the new shiny ball because it's new?

By @leobg - 4 months
Makes me think of Pieter Levels who used to run a 60k/mo SaaS with hundreds of paying users from a single index.php on a bare metal server.

(I don’t know if he still does it that way.)

By @Legion - 4 months
PHP deployment was indeed easy.

But it turns out "dump everything in docroot and let mod_php interpret and execute whatever it finds there" had security implications...

By @Terretta - 4 months
Why should we “not speak of” Allaire's ColdFusion also released in 1995?

PHP made web development accessible with its simplicity, but ColdFusion was arguably more influential in the '90s. It led the way with features like built-in database connectivity and templating, setting the stage for how dynamic web pages would work as implemented by Microsoft and others, and even shaping what PHP itself became.

Separately, I think projects like Caddy carry on the spirit of the blog post.

By @lovasoa - 4 months
This article resonates with me. I do love "mildly dynamic websites", and have fond memories of my days hacking together PHP websites 15 years ago.

And what I am working on today might be called a bridge for the "dynamicity gap". I'm making an open source server to write web apps entirely in SQL ( https://sql.ophir.dev/ ). It has the "one file per page" logic of PHP, and makes it easy to add bits of dynamic behavior with just a normal INSERT statement and a SELECT over dynamic data.

By @spacebuffer - 4 months
Semi-related: what's the best place to learn the old-school style of working with PHP? I already know Laravel, but it feels so far removed from plain PHP that I am not confident working with it on its own.

By @101008 - 4 months
I identify with the first part of the article. I remember the problem of navigation and headers (and right sidebars!).

The first solution was of course framesets, but they were kind of ugly. Then iframes came, and they were an almost perfect solution (at least for me). With no borders, they looked like an include. The only problem (depending on your website) was that the height was fixed (for headers that may not matter, but it did for left and right sidebars).

Of course, with PHP and includes everything became trivial. I kind of miss the old `index.php?cont=page1`...
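For anyone who never saw it: the `index.php?cont=page1` pattern was a one-file router that included a content file chosen by a query parameter. The naive `include($_GET['cont'] . '.php')` version was a classic local-file-inclusion hole, so any sketch should whitelist. A hypothetical version of the lookup, here in JavaScript with invented file names:

```javascript
// One entry point, a query parameter picking which content file to include.
// A whitelist avoids the classic local-file-inclusion bug.
const PAGES = {
  page1: "content/page1.html",
  about: "content/about.html",
};

function resolveContent(cont) {
  // Object.hasOwn guards against prototype keys like "constructor";
  // unknown or missing values fall back to the home page.
  return Object.hasOwn(PAGES, cont) ? PAGES[cont] : "content/home.html";
}
```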

By @ArneBab - 4 months
I know why I no longer have a mildly dynamic website: with PHP, the security risk doesn't grow linearly with the functionality you add.

That’s where Javascript shines: if you avoid comments (those are actually a hard problem: a social one) and server-side data (security risk), it actually has this linear increase in effort without the jump in security risk.

And this risk has increased a lot since the early mildly dynamic websites.

By @solardev - 4 months
This perspective isn't really making an apples-to-apples comparison. The author is comparing modern framework bloat to the simplicity of a standalone PHP script, but disregarding the underlying stack that it takes to serve those scripts (i.e., the Linux, Apache/Nginx, MySQL/Postgres in LAMP).

Back in those days, it was never really as simple as "sftp my .php file into a folder and call it a day". If you were on a shared host, you may or may not have had access to the PHP config, which governed things such as memory limits (or your page might not render), which particular PHP version was available (limiting your available std-lib functions), and which modules were installed (and which versions of them, and whether they were built for FastCGI or not). Scaling was in its infancy in those days, and shared hosts were extremely slow, especially those without caching, and would frequently crash whenever one tenant on that machine got significant traffic. If you were hosting your own in a VM or on bare metal, things were even worse, since then you had to manage the database on your own, the firewall, the SSH daemon, Apache config files in every directory or Nginx rules and restarts, OS package updates, and of course hardware/VM resource constraints.

Yes, the resulting 100-line PHP script sitting on top of it all might be very simple, but maintaining that stack never was (and still isn't). Web work back then was like 25% coding the PHP and 75% sys-admining the stack beneath it. And it was really hard to do that in a way that didn't result in customer-facing downtime, with no easy way to containerize, scale, hot-standby, rollover, rollback, etc.

=====================

I'd probably break down this comparison (of LAMP vs modern JS frameworks) into questions like this, instead:

1) "What do I have to maintain? What do I WANT to maintain?"

IMHO this is the crux of it. Teams (and individual devs) are choosing JS frameworks + heavy frontends because even though there are still servers and configurations (of course), they're managed by someone else. That abstraction and separation of concerns is what makes it so much easier to work on a web app these days than in the PHP days, IMO.

Any modern framework now is a one-command `create whatever app` in the terminal, and there, you have a functioning app waiting for your content and business logic. That's even easier than spinning up a local PHP stack with MAMP or XAMPP, especially when you have more than one app on the same disk/computer. And when it comes time to deploy, a single `git push` will get you a highly-available website automagically deployed in a couple minutes, with a preconfigured global CDN, HTTPS, asset caching, etc. If something went wrong, it's a one-click rollback to the previous version. And it's probably going to be free, or under $20/mo, on Vercel, Cloudflare Pages, Netlify, etc. Maybe AWS Amplify Hosting too, but like Lambda, that's a lot more setup (AWS tends to be lower-level and offers nitty-gritty enterprise-y configs that simpler sites don't need or want).

By contrast, to actually set up something like that in the PHP world (where most of the stack is managed by someone else), you'd either have to find a similar PHP-script-hosting-as-a-service like Google App Engine (there aren't many similar services that I know of; it's different from a regular shared host because it's a higher level of abstraction) or else use something like Docker or Lando or Forge or GridPane to manage your own VM fleet. In the latter cases you would often still have to manage much of the underlying stack and deal with various configs and updates all the time. It's very different from the hosted JS world.

The benefit of going with a managed approach is that you're really only needing to touch your own application code. The framework code is updated by someone else (not that different from using Laravel or Symfony or Wordpress or Drupal). The rest of the stack is entirely out of your sphere of responsibility. For "jamming" as an individual or producing small sites as a team, this is a good thing. It frees up your devs to focus on business needs rather than infrastructure management.

Of course, some teams want entirely in-house control of everything. In that case they can still manage to their own low-level VMs (an EC2 or similar) and maintain the whole LEMP or Node stack. That's a lot more work, but also more power and control.

A serverless func, whether in JS (anywhere) or PHP (like via Google Cloud Run), is just a continuation of this same abstraction. It's not necessarily just about high availability, but low maintenance. You and your team (and the one after them, and the one after that) only ever have to touch the function code itself, freeing you from the rest of the stack. It's useful the same way that being able to upload a video to YouTube is: You can focus on the content instead of the delivery mechanism.

2) Serverside resource consumption

It's not really true that "PHP scripts don't consume any resources (persistent processes, etc.) when they're not being used", any more than a JS site or serverless func isn't consuming resources when they're not being used. Both still require an active server on the backend (or some server-like technology, like a Varnish or Redis cache or similar).

Neither is really an app author's concern, since they are both hosting concerns. But the advantage of the JS stuff is that it's easier and cheaper for hosts to containerize and run independently, like in a V8 isolate (for Cloudflare Workers). It's harder to do that with a PHP script and still ensure safety across shared tenants. Most shared PHP environments I know of end up virtualizing/dockerizing much of the LAMP stack.

3) Serverside rendering vs static builds vs clientside rendering

As for serverside rendering vs static builds, the article doesn't really do a fair comparison of that either. This is a tradeoff between delivery speed and dynamicness, not between PHP and JS.

Even in the PHP world, the PHP processor itself offered caching, then frameworks like Wordpress would offer their own caching on top of that, then you would cache even the result of that in Varnish or similar. That essentially turns a serverside-rendered page into a static build that can then be served over a CDN. This is how big PHP hosts like Pantheon or Acquia work. No medium- or large-size site would make every request hit the PHP process directly for write-rarely, read-often content.

In the JS world, you can also do serverside rendering, static builds, clientside renders, and (realistically) some combination of all of those. The difference is that it's a lot more deliberate and explicit (but also confusing at first). But this is by design. It makes use of the strength of each part of that stack, as intended. If you're writing a blog post, chances are you're not going to edit that more than once every few weeks/months (if ever again). That part of it can be statically built and served as flat HTML and easily cached on the CDN. But the comments might trickle in every few minutes. That part can be serverside rendered in real time and then cached, either at the HTTP level with invalidations, or incrementally regenerated at will. And some things need to be even faster than that, like maybe being able to preview the image upload in your WYSIWYG editor, in which case you'd optimistically update the clientside editor with a skeleton and then verify upload/insertion success via AJAX. The server can do what it does best (query/collate data from multiple sources and render a single page out of it for all users to see), the cache can do what it does best (quickly copy and serve static content across the world), and the client can do what it does best (ensure freshness for an individual user, where needed).

It is of course possible (and often too easy) to mis-use the different parts of that stack, but you can say the same thing about the PHP world, with misconfigured caches and invalidations causing staleness issues or security lapses like accidentally shared secrets between users' cached versions.

By @foundart - 4 months
+1 for the article's link to the "CADT Model" https://www.jwz.org/doc/cadt.html
By @jak2k - 4 months
I like the idea of just renaming an `html` file to `php` and adding a bit of dynamic stuff.

A webserver that could do this with JavaScript/TypeScript would be cool! (Or maybe I should learn php…)
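A tiny sketch of that idea in Node.js: treat a mostly-HTML file as a JS template literal and evaluate it per request. The function name and page are invented, and evaluating templates like this executes arbitrary code, so it's only safe for your own files, never for user input:

```javascript
// PHP-ish templating in Node: the page is plain HTML plus ${...} holes.
// WARNING: this runs the template as code -- only use on trusted files.
function renderTemplate(source, vars) {
  const keys = Object.keys(vars);
  // Treat the file contents as the body of a JS template literal.
  const fn = new Function(...keys, "return `" + source + "`;");
  return fn(...keys.map((k) => vars[k]));
}

// A "mildly dynamic" page: static markup with one dynamic bit.
const page = "<h1>Hi</h1><p>The year is ${year}.</p>";
// e.g. renderTemplate(page, { year: new Date().getFullYear() })
```

A webserver built on this would map `/foo.html` to a file read plus one `renderTemplate` call, which is roughly the mod_php deployment model.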

By @alexanderscott - 4 months
I feel like every admin panel I've built over the years, including recently, has been "mildly dynamic": only enough jQuery to be usable by other staff.

By @ss64 - 4 months
This is missing any discussion of the dynamic functionality you can add with a bit of vanilla JavaScript.
By @lwhi - 4 months
Once upon a time there was a technology called dHTML.

Waaaay before Web 2.0 and full interactivity on the client.

By @Reason077 - 4 months
I dunno. Isn't HN an example of a "mildly dynamic website"?
By @simonbw - 4 months
I think some of this article resonates with me, but I also think a big part of it rubs me the wrong way. It seems to assume that everyone has a webserver running a LAMP stack. If you have a static site on a webserver running PHP then of course PHP is going to require the least amount of effort to make your site mildly dynamic.

On the other hand, if you have nothing, I don't think that the fastest/easiest way to a mildly dynamic website is to use PHP.

I got curious about the most minimalist setup I could get for running a static site that would also easily transition to becoming dynamic piece-by-piece. Here's what I came up with using NextJS:

1. Create a `package.json` containing:

    {
      "scripts": {
        "dev": "next dev",
        "build": "next build",
        "start": "next start"
      },
      "dependencies": {
        "react": "^18",
        "react-dom": "^18",
        "next": "14.2.4"
      }
    }
2. Create `app/layout.js` containing

    export default ({ children }) => children;

3. Create your home page at `app/page.js`. This is pretty much just what your index.html would have been before except it's wrapped with `export default () => (/.../)`:

    export default () => (
      <html>
        <head>
          <title>Home</title>
        </head>
        <body>
          <h1>Home</h1>
          <p>Welcome to our website!</p>
        </body>
      </html>
    );

4. Put the rest of your static files in `/public`

Now, when you want to make one of your html files dynamic:

1. Move it from `public/your-page.html` to `app/your-page/page.js`

2. Wrap it with `export default () => (/.../)`

And there you go, you can start using JavaScript and JSX in your page.

Here's a summary of the differences between this approach and creating a PHP-based site:

1. You have to create 2 more files containing 13 more lines of code than you would with PHP

2. You need to find a NextJS web host rather than a PHP web host

3. Your dynamic pages are html-inside-javascript rather than php-inside-html. You are also technically writing JSX and not HTML, which has slightly different syntax.

4. You have a much easier-to-set-up local dev server (install `node`, then `npm install` and `npm run dev`)

5. This framework can scale pretty seamlessly to "highly dynamic", whereas with PHP you'd probably need to introduce a frontend framework if you wanted to create something actually "highly dynamic".

6. You only need to learn one programming language (JavaScript) instead of two (JavaScript and PHP).

By @flobosg - 4 months
(2022)