January 13th, 2025

Nvidia might do for desktop AI what it did for desktop gaming

NVIDIA's CES keynote introduced 'Project Digits,' a $3,000 home AI supercomputer for running advanced models locally. It targets data scientists, researchers, and students, and its success hinges on the arrival of user-friendly software.

NVIDIA's recent CES keynote highlighted its ambitious plans for desktop AI, paralleling its transformative impact on desktop gaming. CEO Jensen Huang introduced 'Project Digits,' a home AI supercomputer designed to run AI models with up to 200 billion parameters locally, priced around $3,000. The initiative aims to put advanced AI capabilities in the hands of data scientists, researchers, and students rather than casual users. Two units can be paired to support even larger models, enhancing the system's appeal. Project Digits runs a Linux-based DGX OS and can be accessed from both Windows and Mac machines, giving it flexibility in how it is deployed. NVIDIA's strategy appears to mirror its successful GPU playbook, which offers multiple performance tiers to cater to different user needs. However, the success of Project Digits hinges on user-friendly software to drive demand, as the current AI software landscape lacks the maturity seen in gaming. If NVIDIA can close that software gap, it may catalyze significant investment and interest in desktop AI, positioning the company as a leader in this emerging market.
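A rough way to see why 200 billion parameters is the advertised ceiling, and why pairing two units matters, is to estimate weight-memory footprints under common quantization levels. The sketch below is a back-of-envelope illustration only; it assumes the widely reported 128 GB of unified memory per unit and ignores KV cache and activation overhead.

    # Back-of-envelope memory estimate for hosting an LLM's weights locally.
    # Assumption (not an official spec sheet): 128 GB unified memory per unit,
    # and that weights dominate memory use (KV cache and activations ignored).
    UNIT_MEMORY_GB = 128
    BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "fp4": 0.5}

    def weights_gb(params_billions, precision):
        """Approximate weight storage in GB for a model of the given size."""
        # params_billions * 1e9 params * bytes-per-param / 1e9 bytes-per-GB
        return params_billions * BYTES_PER_PARAM[precision]

    for precision in BYTES_PER_PARAM:
        for size_b in (70, 200, 405):
            gb = weights_gb(size_b, precision)
            print(f"{size_b}B @ {precision}: {gb:.0f} GB "
                  f"(fits 1 unit: {gb <= UNIT_MEMORY_GB}, 2 units: {gb <= 2 * UNIT_MEMORY_GB})")

At 4-bit precision a 200B model needs roughly 100 GB for weights alone, consistent with the single-unit claim, while a 405B model at the same precision only fits once two units are paired.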

- NVIDIA aims to replicate its desktop gaming success in the AI sector.

- 'Project Digits' offers local AI processing for advanced models at a competitive price.

- The target audience includes data scientists, researchers, and students.

- The product's success depends on the development of accessible AI software.

- NVIDIA's strategy may lead to a diverse range of AI products similar to its GPU offerings.

16 comments
By @airstrike - 3 months
400B parameters in a "plug-and-play" form factor for $6k is wild.

Meanwhile all the usual "desktop" players are still trying to find a way to make good on their promises to develop their own competitive chips for AI inference and training workloads in the cloud.

I'm betting on Nvidia to continue to outperform them. The talent, culture, and capabilities gap just feels insurmountable for at least the next decade, barring major fumbles from Nvidia.

By @richardw - 3 months
I don’t get the need. With gaming there’s a real benefit to having the card close to a display. There’s enough benefit that you don’t mind it being unused 20 hours a day. There’s relatively little benefit to having training happen a few feet away rather than a data center. Solid chance it sits unused most of the time, and when you really need it you run into capacity issues, so you’d need to predict your future needs carefully or be happy waiting for a job to finish.

AI training feels like transport. You rent the capacity/vehicle you need on demand, benefit from yearly upgrades. Very few people are doing so much training that they need a local powerhouse, upgraded every year or so.

Even sharing the hardware in a pool seems more rational. Pay $200/month for access to a semi-private cluster rather than having it sit on your desk.
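For what it's worth, here is a minimal own-versus-rent break-even sketch using only the figures in this thread (the reported $3,000 unit price and the $200/month pool suggested above); it deliberately ignores electricity, resale value, and how cloud or pooled pricing actually varies.

    # Hypothetical break-even: buying one unit vs. paying for pooled access.
    # Figures are the ones quoted in this thread, not vendor or market pricing.
    UNIT_PRICE_USD = 3_000      # reported Project Digits launch price
    POOL_MONTHLY_USD = 200      # commenter's suggested shared-cluster fee

    break_even_months = UNIT_PRICE_USD / POOL_MONTHLY_USD
    print(f"Break-even vs. a ${POOL_MONTHLY_USD}/mo pool: {break_even_months:.0f} months")
    # -> 15 months, before accounting for power, depreciation, or utilization.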

By @divbzero - 3 months
The article is focused on Nvidia, but note that Apple [1][2] and Google [3] have also been working in this area and will undoubtedly continue to do so.

[1]: https://developer.apple.com/machine-learning/core-ml/

[2]: https://machinelearning.apple.com/research/neural-engine-tra...

[3]: https://research.google/blog/improved-on-device-ml-on-pixel-...

By @gnabgib - 3 months
Related: Nvidia's Project Digits is a 'personal AI supercomputer' (622 points, 8 days ago, 501 comments & jeans) https://news.ycombinator.com/item?id=42619139
By @aquietlife - 3 months
What is Nvidia's track record with releasing/supporting its own Linux-based OS? Can I easily switch to a different OS?
By @walterbell - 3 months
https://www.tomshardware.com/pc-components/cpus/nvidia-arm-s...

> Nvidia will be introducing two new chips, the N1X at the end of this year and the N1 in 2026. Nvidia is expected to ship 3 million N1X chips in Q4 this year and 13 million vanilla N1 units next year. Nvidia will be partnering with MediaTek to build these chips, with MediaTek receiving $2 billion in revenue... Nvidia will show off its upcoming ARM-based SoCs at Computex in May.

By @ekianjo - 3 months
From what I could gather on related communities, Project Digits will run 200B models very slowly, so there is no breakthrough there.
By @rsanek - 3 months
Pretty unconvinced. When desktop gaming started you didn't have low-latency, high-bandwidth, reliable internet. If you did, you probably wouldn't have people buying cards at all, and instead GeForce Now would have been the whole market.

We're already at that stage now with AI / LLMs. This type of physical product will remain niche.

By @ekianjo - 3 months
The track record of standalone Nvidia appliances is pretty poor. The Shield console and its portable version disappeared fairly quickly, and the Jetson dev board is a joke since software support is awful, so I am not holding my breath for this one.

It will take more than jeans and leather jackets to sell those.

By @prashp - 3 months
Still no word on where and how to buy these?
By @ls612 - 3 months
These things look pretty small. I wonder if someone will make a 2U rack tray to hold a few of them.
By @paxys - 3 months
Is this just an ad for nvidia's new box or is the author actually making a point?
By @xyst - 3 months
I’ll wait for the benchmarks. NVDA marketing is known to oversell.
By @enasterosophes - 3 months
This is just an ad.
By @skywhopper - 3 months
Expensive and rare?
By @MetroWind - 3 months
They did stuff to desktop AI?