What Is Analog Computing?
Analog computing uses continuous physical systems, in contrast with digital computing's binary approach. Amid rising computational demand and energy consumption, researchers are revisiting it for energy efficiency, especially in AI.
Analog computing, which uses continuous physical systems to perform calculations, contrasts sharply with the prevalent digital computing that relies on binary digits (0s and 1s). Historically, analog devices such as the Antikythera mechanism and the slide rule were used to model natural phenomena, demonstrating their effectiveness at handling continuous data. The differential analyzer, developed by Vannevar Bush in 1931, exemplified the peak of analog computing: it could solve complex differential equations but required manual reconfiguration for each new equation. Digital computing displaced analog as a more flexible and accurate alternative, but it has brought significant energy consumption, particularly with the rise of artificial intelligence; a planned $100 billion data center by Microsoft and OpenAI would consume about 5 gigawatts of power. In light of these challenges, researchers are revisiting analog computing as a path to more energy-efficient processing, especially for AI, where analog circuits could carry out operations with continuous electrical signals rather than binary logic. This line of work may offer a more sustainable way to meet growing computational demand.
- Analog computing uses continuous physical systems, unlike digital computing's binary approach.
- Historical analog devices effectively modeled natural phenomena and complex equations.
- Digital computing, while user-friendly, has led to high energy consumption, especially in AI.
- Researchers are revisiting analog computing for its potential energy efficiency in modern applications.
- The exploration of analog methods may offer sustainable solutions to current computational challenges.
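To make the "electrical signals rather than traditional binary methods" point concrete: the usual proposal for analog AI hardware is a resistor crossbar that computes a matrix-vector product in a single physical step, with Ohm's law doing the multiplications and Kirchhoff's current law doing the additions. The sketch below is a minimal digital simulation of that scheme; the conductance matrix, input voltages, and 1% noise figure are illustrative assumptions, not numbers from the article.

```python
# Minimal sketch (not from the article): simulating how a resistor
# crossbar computes y = G @ v in "one shot".
#   - Each weight is stored as a conductance G[i][j] (1/ohms).
#   - Input voltages v[j] are applied to the columns.
#   - Ohm's law does each multiply: I[i][j] = G[i][j] * v[j].
#   - Kirchhoff's current law does the add: row current y[i] = sum_j I[i][j].
# The noise term models the limited precision of real analog hardware.
import random

def crossbar_matvec(G, v, noise=0.01):
    """Ideal crossbar output currents, with illustrative 1% read noise."""
    out = []
    for row in G:
        current = sum(g * volts for g, volts in zip(row, v))  # KCL sum
        current += random.gauss(0.0, noise * abs(current))    # assumed analog noise
        out.append(current)
    return out

# Hypothetical 2x3 conductance matrix (weights) and input voltage vector.
G = [[0.5, 1.2, 0.3],
     [0.9, 0.1, 1.5]]
v = [1.0, 0.5, 2.0]

print(crossbar_matvec(G, v))                               # noisy analog estimate
print([sum(g * x for g, x in zip(row, v)) for row in G])   # exact digital result
```

The noise term encodes the tradeoff: the crossbar gets the whole product "for free" in one physical step, but every readout is only as precise as the hardware's noise floor.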
Related
Taking a closer look at AI's supposed energy apocalypse
Artificial intelligence's energy impact, particularly in data centers, is debated. AI's energy demands are significant but only a fraction of overall data center consumption. Efforts to enhance AI cost efficiency could mitigate energy use concerns.
An Analog Network of Resistors Promises Machine Learning Without a Processor
Researchers at the University of Pennsylvania created an analog resistor network for machine learning, offering energy efficiency and enhanced computational capabilities. The network, supervised by an Arduino Due, shows promise in diverse tasks.
Ask HN: Weirdest Computer Architecture?
The traditional computing stack includes hardware and software layers, but alternatives like quantum and neuromorphic computing introduce new models, changing components and architectures beyond classical Turing machines.
Aging US grid can't handle power and water requirements of generative AI
The rise of generative AI is increasing electricity and water demand, straining the U.S. power grid. Data centers' energy consumption may reach 16% of total electricity by 2030, prompting sustainability efforts.
What Is Analog Computing?
Analog computing, historically significant, offers potential energy-efficient alternatives to digital systems. As demand for computing power rises, revisiting analog methods may provide sustainable solutions to modern energy challenges.
The article seems to put a bit of a silly spin on an interesting subject by wanting to present analog computers as a potential alternative to digital ones, when in fact that is only rarely going to be possible.
Digital computers support programming: you solve a problem by describing the process by which it is to be solved. Analog computers instead solve (limited types of) problems by building an analog of the problem, which you could think of as "describing" the problem rather than how to solve it, either with analog electronics (a patch-panel analog computer) or as a physical analog such as the Antikythera mechanism, which models the motions of the planets.
The article oddly omits any mention of quantum computers, which (unless I'm totally misguided!) are essentially a type of analog computer: no replacement for digital computers and general problem solving, since they can't be programmed in the usual sense, but useful when one can configure them to be an analog of the problem to be solved. As with any other analog computer, the problem is then solved by letting the system run and the dynamics play out.
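A toy illustration of that distinction (mine, not from the comment): to solve a damped oscillator on a patch-panel machine you wire two integrators and a feedback path so that the circuit *is* the equation, then let it run. A digital computer instead needs a step-by-step procedure, such as Euler integration. All constants below are assumed for illustration.

```python
# Minimal sketch, assuming the classic differential-analyzer demo problem:
# a damped oscillator  x'' = -2*zeta*omega*x' - omega**2 * x.
# An analog machine *is* this system (two integrators plus feedback);
# a digital machine must be told a procedure, e.g. explicit Euler steps.
omega, zeta = 2.0, 0.1   # illustrative constants, not from the article
x, v = 1.0, 0.0          # initial position and velocity
dt = 0.001               # digital computers pay for continuity with a step size

for step in range(10_000):
    a = -2 * zeta * omega * v - omega**2 * x  # what the feedback wiring computes
    v += a * dt                               # first "integrator"
    x += v * dt                               # second "integrator"

print(x, v)  # state after 10 simulated seconds
```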
https://museum.syssrc.com/artifact/exhibits/251/
I’ll note that there’s been plenty of work on general-purpose analog computing. Analog’s competitiveness also lives on in mixed-signal ASICs. The problem with new products, though, is that analog requires fully custom circuit design. Outside some niche uses, you can’t just synthesize analog circuits from higher-level specs the way you can digital designs with standard cells.
That puts analog at a severe disadvantage on NRE (non-recurring engineering) cost, especially as both design rules and mask costs increase with every node shrink.
"But it’s not obvious why a system that operates using discrete chunks of information would be good at modeling our continuous, analog world."
OK, fine. But if my "discrete" modeling uses 64 bits of precision (about 19 decimal digits) and your analog system on a good day has about 3, how could your analog system even "tell" that mine is discrete? It can't "see" that finely in the first place.
Oh, you don't like that it's excessively precise? Carrying precision through calculations (e.g., with interval arithmetic) is off-the-shelf numerical tech. It isn't something every programmer knows, but it's been in books for years.
Oh, by the way, it is also off-the-shelf numerical tech to estimate errors using digital computers. That's going to be a hard trick to match in your analog system. Also, digital systems offer many tradeoffs with regard to error levels and computation time. Another thing your analog system is going to have a hard time with.
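For readers who haven't met it, interval arithmetic carries a rigorous [lo, hi] bound through every operation, so the result certifies its own error, which is exactly the kind of self-reported error bound an analog system can't give you. Below is a minimal hand-rolled sketch (my illustration; a production version would use a library and directed rounding to keep the bounds rigorous):

```python
# Minimal interval-arithmetic sketch (illustrative; a real implementation
# would also use directed rounding so the bounds stay strictly rigorous).
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The true product lies between the min and max of the corner products.
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# A measurement known only to 3 "analog" digits still yields certified bounds:
x = Interval(2.71, 2.72)
y = Interval(3.14, 3.15)
print(x * y + x)  # every value consistent with the inputs lies inside this
```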
"But floating point numbers have lots of problems with instability and such." Most of those problems will manifest in your analog computers, too. Many of those aren't so much about floating point per se as just doing inadvisable things with numbers in general, e.g., if you set up an analog computer to subtract two quantities that come out "close to zero" and then divide by that quantity, your analog computer is going to behave in much the same way that a digital computer would. Ye Olde Classic "add a whole bunch of small numbers together only for them all to disappear because of the large number" is also the sort of problem that you can only get because digital is good enough to reveal it in the first place; accurate adding together millions of very tiny numbers is something your analog computer almost certainly just plain can't do at all.
A carefully crafted analog system might be able to outpace a digital system, but it's not going to be the norm. Normally the digital system will outpace the analog system by absurd amounts.
Now, are there niche cases where analog may be interesting? Yes, absolutely. Neural nets are an interesting possibility, obviously a big one, and there's a handful of others. But it's a niche within a niche. There's no problem with that; there are plenty of interesting niches within niches, and you can make a career in one. But I don't know what it is about this one that convinces so many people that there's some sort of revolution budding and we're on the cusp of an amazing new era of analog computing that will do amazing things for everyone, as opposed to a niche within a niche.
We had that revolution. It was digital computing. It is the single largest revolution in the history of man as measured by orders of magnitude of improvement from start to now, and the digital revolution is still not done. Analog is not going to overtake that. It doesn't have the headroom; it doesn't have the orders of magnitude of improvement available. That's why it was left behind in the first place. The delta between my kid 3D printing a clock mechanism and the finest Swiss watchmaker who ever lived is still a small single-digit number of orders of magnitude in mechanical precision, and in the digital world we still expect that degree of progress about every 5-10 years, and that's slower than it was for many decades.
So they don't have to do everything at once.
(the joke is that analog computers don't have clocks... Do they?)
And yet not even one paragraph in this article?