Microsoft shelves its underwater data center
Microsoft has ended its underwater data center experiment, despite finding that servers lasted longer underwater. The company is shifting its focus to other projects, such as AI supercomputer data centers and modular nuclear reactors, and does not plan further underwater deployments.
Microsoft has ended its Project Natick underwater data center experiment, citing a shift in focus. The project, which started in 2013, explored the feasibility of underwater data centers. Microsoft reported fewer server failures in the underwater setup than in comparable land-based servers, attributing the difference to the temperature stability of seawater and to the inert nitrogen atmosphere protecting the hardware. Despite this success in server longevity, Microsoft has decided not to pursue further underwater data center projects. The company plans to apply the lessons from the experiment to other use cases, focusing on energy efficiency and operational improvements. While China has started its own submerged server project, Microsoft has not indicated any future underwater endeavors of its own. Instead, it is redirecting its efforts toward collaborations such as a planned $100 billion AI supercomputer data center with OpenAI and its ambitions for modular nuclear reactors. The decision to discontinue the underwater data center project reflects a strategic realignment toward more focused research and development initiatives.
It doesn't sound like they really discontinued the project. It sounds like it got moved to another team.
That raises the question: what are the narrowest points of failure in a system like this? For example, it would be naive to have 140 fungible compute nodes if they all went through a single networking switch. Node workloads can be moved, but a switch failure makes the entire container useless.
This works physically as well. Marine vessels have sealed bulkheads so that a hull breach at one end doesn't flood the other.
How do these vessels fan out all the other points of failure, and what are the error rates of those components?
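To put rough numbers on that single-switch concern, here is a minimal back-of-the-envelope sketch in Python. Everything in it is an illustrative assumption rather than a Project Natick figure: the 2% annualized switch failure rate, the five-year service-free deployment, the exponential survival model, and the independence of the two switches in the redundant case. It only shows how much a redundant pair changes the odds of keeping a sealed, unserviceable container reachable.

```python
import math

# Back-of-the-envelope availability model for a sealed, unserviceable
# container. All numbers below are illustrative assumptions, not
# Project Natick figures.

DEPLOYMENT_YEARS = 5.0   # assumed service-free lifetime of the container
SWITCH_AFR = 0.02        # assumed annualized failure rate of one switch

def p_survives(afr: float, years: float) -> float:
    """Survival probability over `years` for a component with annualized
    failure rate `afr`, assuming a constant (exponential) failure rate."""
    return math.exp(-afr * years)

# Single shared switch: one failure severs the whole container.
single = p_survives(SWITCH_AFR, DEPLOYMENT_YEARS)

# Redundant pair: the container stays reachable unless both switches
# fail, assuming independent failures (optimistic in a shared enclosure).
pair = 1.0 - (1.0 - single) ** 2

print(f"single switch, survives {DEPLOYMENT_YEARS:g}y: {single:.3f}")  # ~0.905
print(f"redundant pair, survives {DEPLOYMENT_YEARS:g}y: {pair:.3f}")   # ~0.991
```

Under these assumed rates, duplicating the switch cuts the chance of losing the whole container's connectivity over the deployment from roughly 10% to under 1%, which is why an unrepairable design pushes redundancy into exactly the components this comment identifies.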
I wonder what factors led them to discontinue the project.
Related
20x Faster Background Removal in the Browser Using ONNX Runtime with WebGPU
Using ONNX Runtime with WebGPU and WebAssembly in browsers achieves 20x speedup for background removal, reducing server load, enhancing scalability, and improving data security. ONNX models run efficiently with WebGPU support, offering near real-time performance. Leveraging modern technology, IMG.LY aims to enhance design tools' accessibility and efficiency.
Creating New Installation Media for MS-DOS 4.0
Microsoft released MS-DOS 4.00 source code in 2024, prompting the community to develop MS-DOS 4.01 due to missing official media. Challenges arose in recreating installation media, including creating stub executables and addressing missing files. Installation from floppies had mixed results, with various disk downloads provided for users. Feedback is welcomed on fabulous.community.
My weekend project turned into a 3 years journey
Anthony's note-taking app journey spans 3 years, evolving from a secure Markdown tool to a complex Electron/React project with code execution capabilities. Facing challenges in store publishing, he prioritizes user feedback and simplicity, opting for a custom online deployment solution.
Merlin Labs aims to test an AI-powered KC-135 within a year
Two startups, Merlin Labs and EpiSci, collaborate to develop self-flying tankers and advanced AI for dogfighting. They aim to create AI pilots to reduce human pilot dependency in Air Force missions.
Homegrown Rendering with Rust
Embark Studios develops a creative platform for user-generated content, emphasizing gameplay over graphics. They leverage Rust for 3D rendering, introducing the experimental "kajiya" renderer for learning purposes. The team aims to simplify rendering for user-generated content, utilizing Vulkan API and Rust's versatility for GPU programming. They seek to enhance Rust's ecosystem for GPU programming.