AI method rapidly speeds predictions of materials' thermal properties
Researchers developed VGNN, a machine-learning framework that predicts materials' thermal properties up to 1,000 times faster than other AI techniques. It could help design energy-conversion systems and microelectronics that lose less energy as waste heat, and delivers accurate predictions across a range of material properties.
Researchers from MIT and other institutions have developed a new machine-learning framework that can predict materials' thermal properties, such as phonon dispersion relations, up to 1,000 times faster than other AI-based techniques. This advancement could significantly aid in designing more efficient energy-conversion systems and faster microelectronic devices by reducing waste heat. The method, known as Virtual Node Graph Neural Network (VGNN), introduces virtual nodes to represent phonons in crystal structures, allowing for more efficient and accurate predictions. VGNNs can estimate phonon dispersion relations in alloy systems and predict heat capacity with improved accuracy compared to traditional methods. This approach not only accelerates the prediction of complex phonon properties but also has the potential to be applied to other material properties like electronic, optical, and magnetic spectra. The research, published in Nature Computational Science, received support from various organizations including the U.S. Department of Energy and the National Science Foundation.
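The core idea, as described above, is to augment the atomic graph with extra "virtual" nodes so that a single forward pass can emit a spectrum of phonon-related outputs rather than one scalar per structure. The published VGNN code is not reproduced here; the following is a minimal PyTorch sketch of that virtual-node idea under simplifying assumptions, and every name in it (VirtualNodeGNN, n_virtual, the toy features and edges) is illustrative rather than taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): append learned
# virtual nodes to an atomic graph, run message passing, and read one
# phonon-energy prediction off each virtual node.
import torch
import torch.nn as nn

class VirtualNodeGNN(nn.Module):
    def __init__(self, node_dim=16, hidden=64, n_virtual=12, n_layers=3):
        super().__init__()
        self.n_virtual = n_virtual
        # One learned embedding per virtual node (hypothetical phonon "slot").
        self.virtual_embed = nn.Parameter(torch.randn(n_virtual, hidden) * 0.1)
        self.encode = nn.Linear(node_dim, hidden)
        self.msg = nn.ModuleList([nn.Linear(2 * hidden, hidden) for _ in range(n_layers)])
        self.upd = nn.ModuleList([nn.GRUCell(hidden, hidden) for _ in range(n_layers)])
        self.readout = nn.Linear(hidden, 1)  # scalar output per virtual node

    def forward(self, atom_feats, edge_index):
        # atom_feats: (n_atoms, node_dim); edge_index: (2, n_edges) between atoms.
        n_atoms = atom_feats.shape[0]
        h = torch.cat([self.encode(atom_feats), self.virtual_embed], dim=0)
        # Connect every virtual node to every atom so each one can aggregate
        # structure-wide information.
        a_idx = torch.arange(n_atoms)
        v_idx = torch.arange(n_atoms, n_atoms + self.n_virtual)
        src = torch.cat([edge_index[0], a_idx.repeat(self.n_virtual)])
        dst = torch.cat([edge_index[1], v_idx.repeat_interleave(n_atoms)])
        for msg, upd in zip(self.msg, self.upd):
            m = msg(torch.cat([h[src], h[dst]], dim=-1))       # per-edge messages
            agg = torch.zeros_like(h).index_add_(0, dst, m)    # sum messages at targets
            h = upd(agg, h)                                     # gated node update
        # Each virtual node's scalar output is one predicted phonon energy.
        return self.readout(h[n_atoms:]).squeeze(-1)

# Toy usage on a fake 4-atom structure with arbitrary features and edges.
feats = torch.randn(4, 16)
edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
model = VirtualNodeGNN()
print(model(feats, edges).shape)  # torch.Size([12]) -> 12 phonon-energy predictions
```

The design point the sketch is meant to show is simply that adding nodes to the graph gives the network a spectrum-sized output in one pass; how the real VGNN places and couples its virtual nodes to the crystal structure is more involved than this toy version.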
Related
AI discovers new rare-earth-free magnet at 200 times the speed of man
Materials Nexus and the University of Sheffield collaborated to create MagNex, a rare-earth-free permanent magnet using AI, significantly faster than traditional methods. MagNex offers a sustainable, cost-effective alternative for powerful magnets.
Researchers run high-performing LLM on the energy needed to power a lightbulb
Researchers at UC Santa Cruz developed an energy-efficient method for large language models. By using custom hardware and ternary numbers, they achieved high performance with minimal power consumption, potentially revolutionizing model power efficiency.
Researchers upend AI status quo by eliminating matrix multiplication in LLMs
Researchers innovate AI language models by eliminating matrix multiplication, enhancing efficiency. A MatMul-free method reduces power consumption, costs, and challenges the necessity of matrix multiplication in high-performing models.
An Analog Network of Resistors Promises Machine Learning Without a Processor
Researchers at the University of Pennsylvania created an analog resistor network for machine learning, offering energy efficiency and enhanced computational capabilities. The network, supervised by Arduino Due, shows promise in diverse tasks.
Training of Physical Neural Networks
Physical Neural Networks (PNNs) leverage physical systems for computation, offering potential in AI. Research explores training larger models for local inference on edge devices. Various training methods are investigated, aiming to revolutionize AI systems by considering hardware physics constraints.
Another MIT study: if AI-augmented R&D were adopted widely in the US, our productivity-driven economic growth rate would double. The authors argue this would be permanent, forever increasing the rate of technological progress:
https://www.sciencedirect.com/science/article/pii/S004873332...
I bet a true accounting of all the economic impacts of semiconductor technologies will show these have far more value than Meta, Instagram, Snapchat, TikTok all combined. The stuff built on top of deep technology has higher PR value -- but the deeper tech has far more economic value. Same with AI.
> It is estimated that about 70 percent of the energy generated worldwide ends up as waste heat.
This has me a bit confused and I’m curious if someone has more insight. Doesn’t all energy generated end up as heat somewhere? Like every watt going into a bitcoin mining rig gets released as heat. I’m unsure how you’d determine what percentage of that is waste heat.
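For what it's worth, the usual distinction is that the "waste heat" statistic counts energy rejected at the point of conversion, not the useful work that eventually dissipates as heat later. A back-of-envelope illustration with assumed round numbers (the 30 percent efficiency figure is a typical-order value, not from the article):

```python
# Rough illustration of the accounting behind a "waste heat" figure:
# heat rejected at conversion is counted; work done before dissipating is not.
primary_energy = 100.0        # units of primary energy in
conversion_efficiency = 0.3   # assumed typical thermal-plant / engine efficiency

useful_work = primary_energy * conversion_efficiency
rejected_at_conversion = primary_energy - useful_work  # the "waste heat" tally

print(f"useful work delivered:       {useful_work:.0f}")
print(f"heat rejected at conversion: {rejected_at_conversion:.0f}")
# The 30 units of useful work also end up as heat eventually (friction,
# resistive losses), but only the 70 rejected directly count as waste heat.
```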