Hot Take: Low Code/No Code platforms die as LLMs get better
The advancement of large language models reduces the need for low-code and no-code platforms: the lack of source code in these platforms complicates LLM training, pushing developers to focus on technologies with abundant online training data.
As large language models (LLMs) improve in their ability to generate code, the necessity for low-code and no-code platforms diminishes. These platforms are attractive due to their lack of source code, but this absence complicates the training of LLMs to manage them effectively. Consequently, it is argued that utilizing proprietary systems will become increasingly challenging as LLMs advance. Instead, developers should focus on building with technologies that have abundant training data available online, which the most advanced models can leverage. Additionally, the software industry often operates under a winner-takes-all model, where network effects create dominant players. LLMs are extending this dynamic to software development, influencing how builders approach their projects.
- The rise of LLMs reduces the need for low-code and no-code platforms.
- Lack of source code in these platforms complicates LLM training.
- Proprietary systems may become harder to use as LLMs evolve.
- Developers should prioritize technologies with extensive online training data.
- LLMs are influencing the winner-takes-all dynamics in software development.
In 2015, I thought React was low code. It came along and eliminated the need for DOM update functions. Imagine all the jQuery or vanilla JS functions to sync the data/DOM for "add to cart", "increment item", "clear cart" across the product display and the top bar, and all the subtle out-of-sync bugs.
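A minimal sketch of that difference, using a hypothetical cart (all names here are illustrative, not from any real codebase): instead of every event handler updating every piece of the DOM by hand, a single render function derives every view from one state object, React-style.

```javascript
// Hypothetical cart state: the single source of truth.
const state = { items: [] };

// jQuery-era approach (not shown): addToCart() would have to remember to
// call updateProductDisplay(), updateTopBar(), etc., and forgetting one
// call is exactly the out-of-sync bug described above.

// React-style approach: one pure function derives all views from state.
function render(state) {
  const count = state.items.length;
  return {
    topBar: `Cart (${count})`,
    productDisplay: state.items.map(i => `${i.name} x${i.qty}`).join(", "),
  };
}

function addToCart(state, item) {
  // Change state once; every view is re-derived consistently.
  state.items.push(item);
  return render(state);
}

const views = addToCart(state, { name: "Widget", qty: 1 });
console.log(views.topBar);         // "Cart (1)"
console.log(views.productDisplay); // "Widget x1"
```

The point is not the framework machinery but the shape: views become a function of state, so "increment item" and "clear cart" cannot drift apart.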
Looking at the React ecosystem over the years, I felt the pain of the people who in 1954 believed the authoritative claim that "Since FORTRAN should virtually eliminate coding and debugging...".
https://www.softwarepreservation.org/projects/FORTRAN/Backus...
Accomplishing the same thing in a regular programming language and in a low-code platform takes the exact same learning curve, assuming the low-code platform is good, which they often aren't (many are half-arsed). And learning a regular programming language is much more useful down the line too.
What will be interesting is what chosen languages/frameworks are the most effective in LLM-based no/low-code platforms. I've wanted forever for developers to choose/use better languages and tools but that's always too varied to execute at a scale beyond early startup. I could see the low-code AIs/LLMs having an advantage using better communication/action interfaces to make apps. Geez this could all happen fast.
* through multiple iterations of back and forth with an LLM assistant I can get a DB schema, but there is no way to import a whole schema with relations, indexes, and views described as SQL statements into that no-code app - extremely annoying to have to manually, tediously click around and recreate the thing
* same with their drag & drop "functions" stack builder - very annoying to fiddle around and click, click, click when in the back of your head you're thinking "damn, an LLM could generate that block of code in 1 minute and be done"
* don't even get me started on their custom DSL "expressions" - yet another thing not usable anywhere outside their platform; despite its potential usefulness, I have absolutely no incentive to learn it
At this point I really think I would be faster with code and an LLM than with this no-code DBaaS.
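As a sketch of the gap being described, here is a hypothetical LLM-generated schema (tables, a relation, an index, a view — all names invented for illustration) that plain code could import with a single `db.exec()` call, but which a click-driven no-code builder forces you to recreate widget by widget:

```javascript
// Hypothetical schema an LLM might generate in one shot: two tables,
// a foreign-key relation, an index, and a view, all as plain SQL.
const schemaSql = `
CREATE TABLE users (
  id INTEGER PRIMARY KEY,
  email TEXT NOT NULL UNIQUE
);
CREATE TABLE orders (
  id INTEGER PRIMARY KEY,
  user_id INTEGER NOT NULL REFERENCES users(id),
  total_cents INTEGER NOT NULL
);
CREATE INDEX idx_orders_user ON orders(user_id);
CREATE VIEW user_totals AS
  SELECT user_id, SUM(total_cents) AS spent
  FROM orders GROUP BY user_id
`;

// With code, importing this is one call against any SQL database.
// Here we just split it into statements to show the whole schema is
// a handful of lines, not dozens of clicks per table/index/view.
const statements = schemaSql
  .split(";")
  .map(s => s.trim())
  .filter(s => s.length > 0);

console.log(statements.length); // 4 statements: 2 tables, 1 index, 1 view
```

That's the whole "manual clicking" session expressed as one importable artifact, which is exactly what the no-code tool in question can't ingest.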
Low/no-code tools don't need to be like that. The next generation will be the one beating the current incumbents. Since LLMs can generate code, LLM-powered no-code tools will deliver high-level products with a fraction of the effort needed today. OpenAI, Anthropic, and the like won't be capable of creating the best UX for consumer apps built on LLMs. That's a great opportunity.
I do see the merit in the thinking that LLMs can eat LCNC platforms for breakfast. However, I am excited to see how, with time, near-perfect applications can be churned out with about 80-90% of the work done, and engineers can figure out the last few miles and reach the finish line, but 50x faster.
No-code platforms can easily abstract this out one more level. The constrained outputs of these platforms are still expressive enough to build lots of varied use cases.
What makes you think LLMs will get better? The more we've seen AI develop, the more everyone's senses have become hyper-sensitised to what's AI, and it just reeks of the uncanny, rendering it ultimately useless.
They aren't getting eaten by LLMs; they just suck, and that's already a low bar to die from for whatever reason du jour. Cool-looking toys to experiment with in a free-money economy.