October 11th, 2024

AI Winter Is Coming

The AI ecosystem shows a disparity between producers and promoters, with academia criticized for superficial research and industry withholding valuable findings, potentially leading to another downturn in AI development.

The article discusses the current state of the AI ecosystem, highlighting a disparity between producers and promoters in academia and industry. It criticizes academia for becoming a "paper mill," where the pressure to publish leads to a proliferation of superficial research, often with catchy titles but lacking substance. Issues such as citation rings and reproducibility crises are prevalent, exemplified by a scandal involving Stanford students who faked results. In industry, valuable research is often unpublished to maintain competitive advantages, while published work tends to be non-critical or serves marketing purposes. This environment has fostered an "AI echo chamber," where influencers misinterpret and misrepresent research, leading to inflated perceptions of AI capabilities among non-technical audiences. The article warns that this trend could lead to another "AI winter," similar to past downturns in data science and crypto, suggesting that while promoters may chase the next trend, genuine producers will continue to advance the field.

- The AI ecosystem is experiencing a surplus of promoters compared to producers.

- Academia faces issues like superficial research and a lack of reproducibility.

- Industry research often remains unpublished to protect competitive advantages.

- Misrepresentation of AI capabilities is prevalent among influencers.

- The current trends may lead to another downturn in AI development.

23 comments
By @jerf - 7 months
I don't know about "Winter". The original "AI Winter" was near-total devastation. But it's probably reasonable to think that after the hype train of the last year or two we're due to be headed into the Trough of Disillusionment for LLM-based AI technologies on the standard Gartner hype cycle: https://en.wikipedia.org/wiki/Gartner_hype_cycle
By @janalsncm - 7 months
I like the distinction between producers and promoters. This is why I am naturally skeptical of polished demos and people posting in their real name. If you post in your real name, you are at a minimum promoting yourself (generally boils down to “I am a smart, employable person”).

I wish I had a better heuristic, but the best I’ve found on Twitter is pseudonymous users with anime profile pics. These are people who don’t care about boosting a product. They’re possibly core contributors to a lesser-known but essential python library. They deeply understand a single thing very well. They don’t post all day because they are busy producing.

By @i-cjw - 7 months
> Or take Ed Thorp, who invented the option pricing model and quietly traded on it for years until Black-Scholes published a similar formula and won a Nobel Prize

Hardly quietly. Thorp published "Beat the Market" in 1967, detailing his formulae, six years before Black and Scholes published their model and three decades before the Nobel.

By @zcw100 - 7 months
People warning about a coming AI winter are almost as annoying as people doomsaying about AGI. It’s going to be somewhere in between. It can be disappointing and revolutionary at the same time. We had the dot-com crash, and yet out of that grew some of the largest corporations in the world: Microsoft, Facebook, Apple, Amazon, etc.
By @aiforecastthway - 7 months
The original "AI Winter" was primarily a government funding phenomenon [1]. There was no "bubble" in the private sector. I.e., the winter was the result of responsible people in government realizing the hype was over-extended and standing up for the taxpayer. Progress would be made, eventually, but not in that moment. (Those people were correct, btw.)

> But beneath the surface, there are rampant issues: citation rings, reproducibility crises, and even outright cheating. Just look at the Stanford students who claimed to fine-tune LLaMA3 to be multimodal with vision at the level of GPT-4v, only to be exposed for faking their results. This incident is just the tip of the iceberg, with arXiv increasingly resembling BuzzFeed more than a serious academic repository.

Completely agreed. Academia is terminally broken. The citation rings don't bother me. Bibliometrics are the OG karma -- basically, fake internet points. Who cares?

The much bigger problem is that those totally corrupt circular influence rings extend into program director positions and grant review committees at federal funding agencies. Most of those people are themselves academics (on leave, visiting, etc.) who depend on money from the exact sources they are reviewing for. So this time it's their friends' turn, and next time it's their turn. And don't dare tell me that this isn't how it works. I've been in too many of those rooms.

It's gotten incredibly bad in ML in particular. Our government needs to cut these people off. I am sick of my tax money going to these assholes (via the NSF, DARPA, etc.). Just stop funding the entire subfield for a few years, tbh. It's that bad.

On the private sector side, I think that the speculative AI bubble will deflate, but also that some real value is being created and many large institutions are actually behaving quite reasonably compared to previous nonsense cycles. You just have to realize we're mid-to-late cycle, and companies/groups that aren't finding PMF with LLM tech in the next 2-3 years are probably not great bets.

--

[1] https://en.wikipedia.org/wiki/Lighthill_report

By @mindcrime - 7 months
An "AI Fall" maybe. But "AI Winter"? I really doubt it. And the author of this piece presents very little in the way of compelling arguments for the advent of said AI Winter.

For all the valid criticisms of "AI"[1] today, it's creating too much value to disappear completely and there's no particular reason[2] to expect progress to halt.

[1]: scare quotes because a lot of people today are misusing the term "AI" to exclusively mean "LLMs," and that's just wrong. There's a lot more to AI than LLMs.

[2]: yes, I'm aware of neural scaling laws and some related charts showing a slow-down in progress, and the arguments around not having enough (energy|data|whatever) to continue to scale LLMs. But see [1] above - there is more to AI than LLMs.

By @pinkmuffinere - 7 months
> This is how we’re headed for another AI winter, just as we saw with the fall of data science, crypto, and the modern data stack.

The fall of data science??? When did that happen? I’m not squarely in the field, but I thought I would have heard about it.

By @hu3 - 7 months
A similar phenomenon, on a smaller scale, is happening with what I call meta-cloud PaaS, which facilitates web app deployments/provisioning. These platforms usually run on top of AWS or other large clouds, hence meta-cloud.

It started with Heroku, but the space has now gained VC attention in the form of Next/Vercel, Laravel Cloud, Void(0), Deno Deploy, and Bun's yet-to-be-announced solution. I'm probably forgetting one or two.

Don't get me wrong, they are legit solutions. But the VC money currently being poured into influencers to push these solutions makes them seem much more appealing than they would be otherwise.

By @incognito124 - 7 months
> That leading edge research paper is most probably someone’s production code.

Very powerful, albeit sad, statement.

By @Agentus - 7 months
So the argument is that during a gold rush there are scammers selling pyrite and misleading prospectors toward quarries where there is no gold, and because all of this is happening, the gold rush must therefore be nearly over. Okay. Good article otherwise. But Geoffrey Hinton takes the opposite stance (as does Eric Schmidt), recently stating that the last 10 years of AI development have been unexpected and that the trend will continue over the next 10 years. But perhaps that could be handwaved off as coming from cheerleaders/promoters.
By @hackable_sand - 7 months
Here is the thesis at the end:

> the real producers will keep moving forward, building a more capable future for AI.

This is one of many signal flares going up.

Do something or cash out of the AI space. Engineers are tired.

By @navaed01 - 7 months
I appreciate a good original perspective, but much of this seems overblown…

“Meanwhile, data scientists and statisticians who oftentimes lack engineering skills are now being pushed to write Python and “do AI,” often producing nothing more than unscalable Jupyter Notebooks”

Most data scientists are already well versed in Python. There are so many platforms emerging that abstract away a lot of the infra required to build semi-scalable applications.

By @swyx - 7 months
our take on this from the industry pov: https://www.latent.space/p/mar-jun-2024 (there is a podcast version too if u click thru)

broadly agree but i think predicting ai winter isn't as useful as handicapping how deep it will be, and still building useful things regardless.

By @teddyh - 7 months
At least we got a new keyboard Super modifier key out of it. Or maybe we should make it the Compose key?
By @tim333 - 7 months
I don't think there'll be much of a winter for a while. The winters were mostly economic effects where the funding dried up, and current AI seems to be getting near human levels, which is a big economic incentive to plow ahead.
By @mizzao - 7 months
Interesting that this article is right next to one making the opposite point:

https://news.ycombinator.com/item?id=41813268

By @8note - 7 months
> as we saw with the fall of data science, crypto, and the modern data stack.

Has data science or the modern data stack fallen? And what relevance does crypto (I assume cryptocurrency) have to an AI winter?

By @RobRivera - 7 months
Again?

If anyone had this knowledge, they wouldn't tell us; they'd keep their market edge and make a bet for their own selfish greed.

Anything else is PR.

Discuss amongst yourselves: Rhode Island, neither a road nor an island.

By @woopwoop - 7 months
I am a pure mathematician by training. I _hate_ machine learning. The entire field seems to me like a bunch of unprincipled recipes and random empirics. The fact that it works is infuriating, and genuinely seems like a tragedy to me. The bitter lesson is very bitter indeed.

But I've been hearing the refrain of this article for a decade now. I just don't believe it anymore.

By @synapsomorphy - 7 months
Just like with most other criticisms I've seen of AI, this seems to be criticizing the hype around AI, not the technology itself. It isn't clear if the author conflates those, but a lot of people wrongly do. AI isn't one-to-one with NFTs; there being a lot of grift around something doesn't make it useless or mean it won't change the world.
By @bluesounddirect - 7 months
Awesome.
By @ebabchick - 7 months
who's going to tell him #feeltheagi
By @pajeets - 7 months
chatgpt wrapper startups are ngmi