Tag: Sustainability

  • Mirova Invests $30.5M in Regenerative Farming in India

    The air in Delhi felt thick with the usual November haze, but the news coming out of the Indian agricultural sector offered a breath of something fresher.

    Mirova, a fund with backing from the luxury group Kering, just announced a $30.5 million investment in Varaha. The goal? To boost regenerative farming across northern India. It’s an ambitious project, aiming to support around 337,000 farmers, covering some 675,000 hectares, as reported on November 12, 2025.

    The tricky part is what that actually means on the ground. Regenerative farming, in essence, is about working with nature, not against it. It’s about soil health, biodiversity, and trying to create a more sustainable model, something badly needed in the face of climate change.

    I spoke with an official from Mirova, who emphasized the long-term vision. “This isn’t just about immediate yields,” she said. “It’s about building resilient systems for the future.” It sounded good, honestly, but the proof will be in the planting, as they say.

    The investment is meant to provide Varaha with the resources to expand its work with farmers, helping them transition to these new practices. This includes training, access to better inputs, and, crucially, financial support. It’s a complex undertaking.

    It’s still early days, of course. But the scale of the project is what’s striking. Hundreds of thousands of farmers. Hundreds of thousands of hectares. The potential impact is significant, but it all hinges on the execution.

    And, well, we’ll see.

  • Mirova Invests $30.5M in Varaha’s Regenerative Farming in India

    The air in Delhi, November 12, 2025, felt thick with the usual haze, but today, there was a different kind of buzz. News had broken earlier about a significant investment in India’s agricultural future.

    Mirova, the investment fund backed by Kering, is putting $30.5 million into Varaha. The goal? Supporting regenerative farming practices across a vast swath of northern India. It’s the kind of project that feels huge, even before you dig into the details.

    The plan, as per reports, is to reach around 337,000 farmers, spanning 675,000 hectares. That’s a lot of land. It’s also a lot of lives, potentially changed. The tricky part is always the execution, of course.

    I spoke with an official from Mirova earlier today, who said the investment aligns with the fund’s broader sustainability goals and that Mirova sees Varaha’s work as critical to promoting climate-resilient agriculture. Or, at least, that’s what I understood.

    The specifics are still emerging, but the core idea is clear: supporting farmers in adopting practices that improve soil health, conserve water, and boost biodiversity. The hope is that this will lead to more sustainable and productive farming, which, honestly, is something everyone can get behind.

    The move is interesting, especially given the ongoing conversations around climate change and food security. India, with its massive agricultural sector, is a key player in this. This investment, in a way, is a bet on a more sustainable future for the country’s farmers.

    It’s a step, anyway. A significant one, maybe. The details will matter, as always. But the initial impression is positive. And that, in itself, is something.

  • Carbon Credit Market: Consolidation & Uncertainty

    So, the carbon credit market is changing, isn’t it? It seems like just yesterday everyone was talking about the gold rush, and now we’re seeing some serious consolidation. Carbon Direct is buying Pachama, and honestly, it feels like a turning point.

    It’s not exactly a surprise, though. The voluntary carbon markets have been, you know, a bit of a wild west. Lots of players, lots of different standards, and a whole lot of questions about the actual impact of it all. This move by Carbon Direct, though… it’s different. It’s like a signal that the big players are starting to really dig in, ready to shape the future.

    And what does that future look like? That’s the million-dollar question, isn’t it? The TechCrunch article, published November 10, 2025, points to a period of uncertainty. You can feel it, too. There’s a lot of scrutiny on carbon credits right now, with folks wondering if they’re actually doing what they claim to do. Are we really offsetting emissions? Or are we just, well, shuffling numbers around?

    The Players and the Stakes

    Carbon Direct, for those who don’t know, is a climate solutions company. Pachama? They’re all about using tech to verify and manage carbon offset projects. So, in a way, the deal makes sense: a company that works with the credits, merging with one that helps validate them.

    But it’s bigger than that, I think. This whole thing is about trust. The voluntary carbon markets need it. They need it badly. If companies can’t trust the credits, they won’t buy them. If investors aren’t confident, they’ll pull back. And that would be a problem, wouldn’t it? Because these markets, in theory, are supposed to be a key part of the fight against climate change.

    What Does This Mean for the Future?

    So, what happens next? Well, we’ll probably see more of this. More mergers, more acquisitions. The market is maturing, and that means some players will inevitably get squeezed out. The stronger, more established companies, like Carbon Direct, will likely swallow up the smaller ones, or at least partner up.

    This consolidation could be a good thing, you know? It could lead to more standardization, more transparency. Maybe it’ll help to weed out some of the, let’s say, less credible projects. It could also mean that the cost of carbon credits goes up, as the market becomes more concentrated. That’s something to watch.

    And then there’s the whole issue of demand. Will companies continue to buy carbon credits? Will they be willing to pay more? It all depends on the regulations, the public perception, and, of course, the actual effectiveness of these projects. It’s a complex web, for sure.

    A Changing Landscape

    The TechCrunch piece mentions this shift, and I think it’s spot on. The article really captures that feeling of a market in flux. It’s a bit like watching a storm gather. You can see the clouds rolling in, the wind picking up. You know something big is about to happen, but you can’t quite predict where the lightning will strike.

    So, yeah, the carbon credit market. It’s a story that’s still being written. And right now, it feels like a chapter is closing, and a new one is just beginning. For now, we wait and see what the future holds.

  • Mazama Energy: Geothermal Power for Data Centers 24/7

    Mazama Energy’s Hot Rock Strategy: Powering Data Centers 24/7

    The energy landscape is undergoing a significant transformation, with renewable sources taking center stage. Among these, geothermal energy stands out for its potential to provide a consistent, 24/7 power supply. In this context, Mazama Energy, a geothermal startup, is making waves. The company, backed by Khosla Ventures, is innovating in the field of geothermal energy, and its recent achievements could reshape how data centers are powered.

    The Promise of Superhot Rocks

    At the heart of Mazama Energy’s strategy lies the exploitation of superhot rocks deep beneath the Earth’s surface. The company’s recent drilling efforts have reportedly yielded record-breaking temperatures, a critical step in harnessing geothermal energy efficiently. These ultra-high temperatures open the door to highly efficient power generation, making geothermal a potentially key player in meeting the increasing energy demands of data centers.
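
    Why do a few hundred extra degrees matter so much? Thermodynamics: the hotter the resource relative to ambient, the larger the fraction of its heat that can be converted to electricity. Here is a rough back-of-the-envelope calculation, using illustrative temperatures rather than Mazama’s actual well data:

    ```python
    # Illustrative only: the Carnot limit is the thermodynamic ceiling for
    # converting heat to electricity. Hotter resource -> higher ceiling.

    def carnot_efficiency(t_hot_c: float, t_cold_c: float = 30.0) -> float:
        """Maximum theoretical heat-engine efficiency, temperatures in Celsius."""
        t_hot_k = t_hot_c + 273.15   # convert to Kelvin
        t_cold_k = t_cold_c + 273.15
        return 1.0 - t_cold_k / t_hot_k

    # Conventional geothermal brine vs. a hypothetical superhot-rock resource.
    for label, temp_c in [("conventional (~180 C)", 180.0),
                          ("superhot (~400 C)", 400.0)]:
        print(f"{label}: Carnot limit ~ {carnot_efficiency(temp_c):.0%}")
    # conventional (~180 C): Carnot limit ~ 33%
    # superhot (~400 C): Carnot limit ~ 55%
    ```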

    How Mazama Energy is Innovating

    Mazama Energy’s approach involves drilling boreholes to access these superhot resources. The company’s success in achieving record temperatures demonstrates its technological prowess and commitment to pushing the boundaries of geothermal energy. This innovation is crucial, as it could unlock a more sustainable and reliable power source for data centers, which are known for their high energy consumption.

    The innovation by Mazama Energy has the potential to make geothermal energy competitive with traditional power sources. By providing a constant power supply, the company is also addressing a critical need for data centers, which require uninterrupted power to operate effectively. This could be a game-changer for the industry.

    Why Geothermal for Data Centers?

    The motivation behind Mazama Energy’s efforts is clear: to deliver 24/7 power. The company’s focus on data centers is strategic. Data centers consume vast amounts of electricity, and the demand is only increasing. By providing a reliable and sustainable power source, Mazama Energy aims to address the growing energy needs of this sector. This is a critical step towards a more sustainable future.

    Data centers are essential for today’s digital world, supporting everything from cloud computing to online services. However, their high energy consumption has raised environmental concerns. By utilizing geothermal energy, Mazama Energy not only aims to provide a reliable power source but also to reduce the carbon footprint of these critical facilities. This dual benefit makes geothermal an appealing option for data center operators.

    The Khosla Ventures Factor

    The backing of Khosla Ventures provides Mazama Energy with the financial support and industry expertise needed to succeed. Khosla Ventures has a strong track record of investing in innovative and sustainable technologies. Their involvement underscores the potential of Mazama Energy’s approach and its ability to disrupt the energy market. This partnership is a testament to the viability of geothermal energy as a sustainable power source.

    The Future of Geothermal Energy

    Mazama Energy’s work is a step towards a future where geothermal energy plays a more significant role in the global energy mix. With its focus on superhot rocks and data center power, the company is well-positioned to capitalize on the growing demand for sustainable energy solutions. While the geothermal industry faces challenges, the recent innovations by Mazama Energy offer a promising outlook.

    As the world moves towards renewable energy sources, the potential of geothermal energy is becoming increasingly clear. Mazama Energy’s innovative approach and strategic focus on data centers could pave the way for a more sustainable and reliable energy future. The company’s efforts are a testament to the power of innovation in the face of climate change.

  • COI Energy: Energy Sharing Revolution at TechCrunch Disrupt 2025

    COI Energy: Pioneering Energy Sharing at TechCrunch Disrupt 2025

    In the dynamic landscape of modern business, energy efficiency and sustainability are no longer just buzzwords; they are critical components of operational success. COI Energy is at the forefront of this transformation, offering a groundbreaking solution to a persistent challenge: the underutilization of electricity by large enterprises. This innovative platform is set to be showcased at TechCrunch Disrupt 2025, promising a paradigm shift in how businesses manage their energy resources.

    The Energy Conundrum: A Problem and a Solution

    Many large businesses routinely purchase more electricity than they actually consume. This excess capacity represents a significant financial inefficiency and a missed opportunity for greater sustainability. COI Energy addresses this problem head-on by providing a patented platform that empowers businesses to sell and share their unused electricity. This not only optimizes energy usage but also fosters a more sustainable and collaborative energy ecosystem.

    How COI Energy Works: A Technological Marvel

    The core of COI Energy’s innovation lies in its proprietary technology. The platform allows businesses to monitor their energy consumption in real-time. It then facilitates the secure and efficient selling of surplus energy to other businesses or the grid. This process is streamlined, transparent, and compliant with all relevant regulations, ensuring a seamless experience for all participants. The platform’s sophisticated algorithms optimize pricing and distribution, maximizing the value of the shared energy.
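
    To make the idea concrete, here is a deliberately simplified sketch of surplus matching. This is not COI Energy’s patented platform logic; the site names, numbers, and greedy allocation strategy are all hypothetical:

    ```python
    # Toy sketch of surplus-energy matching, NOT COI Energy's actual platform.
    # Assumes each business reports purchased vs. metered consumption in kWh.
    from dataclasses import dataclass

    @dataclass
    class Site:
        name: str
        purchased_kwh: float
        consumed_kwh: float

        @property
        def surplus_kwh(self) -> float:
            return max(0.0, self.purchased_kwh - self.consumed_kwh)

    def match_surplus(sellers: list[Site], demand_kwh: float) -> list[tuple[str, float]]:
        """Greedily allocate surplus from the largest sellers until demand is met."""
        trades = []
        for site in sorted(sellers, key=lambda s: s.surplus_kwh, reverse=True):
            if demand_kwh <= 0:
                break
            take = min(site.surplus_kwh, demand_kwh)
            if take > 0:
                trades.append((site.name, take))
                demand_kwh -= take
        return trades

    sellers = [Site("plant_a", 1000.0, 820.0), Site("warehouse_b", 500.0, 495.0)]
    print(match_surplus(sellers, demand_kwh=150.0))  # [('plant_a', 150.0)]
    ```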

    Key Features and Benefits:

    • Real-time Monitoring: Provides businesses with detailed insights into their energy consumption patterns.
    • Automated Trading: Simplifies the process of selling and buying excess energy.
    • Secure Transactions: Ensures safe and compliant energy trading.
    • Sustainability: Reduces waste and promotes the efficient use of energy resources.
    • Cost Savings: Offers businesses a new revenue stream by monetizing unused electricity.

    The Significance of TechCrunch Disrupt 2025

    TechCrunch Disrupt is renowned for showcasing the most innovative and disruptive technologies. COI Energy’s presence at TechCrunch Disrupt 2025 underscores the significance of its solution in the rapidly evolving energy sector. This event provides a crucial platform for COI Energy to connect with investors, potential partners, and industry leaders, accelerating its mission to transform the energy landscape.

    Why COI Energy Matters: The Future of Energy

    The motivation behind COI Energy’s mission is clear: to create a more sustainable and efficient energy future. By enabling businesses to actively participate in the energy market, COI Energy is fostering a sharing economy that benefits both the environment and the bottom line. This approach not only reduces carbon footprints but also promotes a more resilient and decentralized energy infrastructure. This aligns with the global shift towards renewable energy sources and sustainable business practices.

    Looking Ahead: The Impact of COI Energy

    COI Energy is poised to make a significant impact on the energy sector. By providing a practical and efficient solution for managing unused electricity, the company is empowering businesses to become active participants in the energy transition. As the world moves towards a more sustainable future, COI Energy’s innovative platform is set to play a pivotal role in shaping a more efficient, resilient, and environmentally friendly energy ecosystem.

    For more information, visit the COI Energy website or catch them at TechCrunch Disrupt 2025.

    Sources:

    1. TechCrunch. (2025, October 27). COI Energy solves a conundrum: Letting businesses sell unused electricity — catch it at TechCrunch Disrupt 2025.

  • Strong by Form to Launch Ultralight Wood at TechCrunch Disrupt

    Strong by Form to Showcase Ultralight Engineered Wood at TechCrunch Disrupt 2025

    The construction industry is on the cusp of a significant transformation, with sustainability and carbon footprint reduction at the forefront. Strong by Form is poised to make a substantial contribution to this shift. The company will be showcasing its innovative ultralight engineered wood at TechCrunch Disrupt 2025, an event scheduled for October 27, 2025.

    Addressing Carbon Emissions in Construction

    The environmental impact of building materials and construction processes is substantial. According to the World Green Building Council, building materials and construction account for around 11% of global carbon emissions. This figure underscores the urgent need for sustainable alternatives to traditional building materials. The motivation behind this innovation is clear: to mitigate the construction industry’s considerable contribution to climate change.

    The Innovation: Ultralight Engineered Wood

    What makes Strong by Form’s offering unique is its ultralight engineered wood. This material presents a significant advantage over conventional building materials, such as concrete and steel, which carry high carbon footprints. The innovation lies in advanced engineering processes that create a strong yet lightweight material. This not only reduces the carbon footprint associated with manufacturing but also simplifies transportation and construction, further minimizing environmental impact.
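
    To see what “strong yet lightweight” buys you in practice, compare specific strength, i.e., strength per unit of density. The figures below are ballpark textbook values for generic materials, not Strong by Form’s measured data:

    ```python
    # Back-of-the-envelope comparison of specific strength (strength / density).
    # Ballpark textbook values; NOT Strong by Form's measured material data.

    materials = {
        # name: (strength in MPa, density in kg/m^3)
        "structural steel": (250, 7850),
        "concrete (compressive)": (40, 2400),
        "softwood timber (along grain)": (40, 500),
    }

    for name, (strength_mpa, density_kg_m3) in materials.items():
        specific = strength_mpa * 1e6 / density_kg_m3  # N*m/kg
        print(f"{name}: ~{specific / 1000:.0f} kN*m/kg")

    # structural steel: ~32 kN*m/kg
    # concrete (compressive): ~17 kN*m/kg
    # softwood timber (along grain): ~80 kN*m/kg
    # Timber's strength-to-weight ratio rivals or exceeds steel's, which is
    # why engineered wood can stand in for heavier structural materials.
    ```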

    TechCrunch Disrupt 2025: A Platform for Innovation

    TechCrunch Disrupt provides an ideal platform for Strong by Form to unveil its technology. The event attracts industry leaders, investors, and media from around the globe. This exposure will allow Strong by Form to gain crucial visibility and potentially secure partnerships that can accelerate the adoption of its sustainable building materials. The event date, October 27, 2025, marks a significant milestone for the company and the broader sustainable construction movement.

    Benefits and Impact

    The advantages of using ultralight engineered wood are manifold. Besides its reduced carbon footprint, the material offers potential benefits in terms of construction speed and cost-effectiveness. The lightweight nature of the wood simplifies handling and installation, reducing labor costs and construction timelines. Furthermore, the use of timber in construction can promote carbon sequestration, as trees absorb carbon dioxide from the atmosphere during their growth. This contributes to a more sustainable and environmentally friendly building process.

    The Future of Sustainable Construction

    The emergence of innovative materials like Strong by Form’s ultralight engineered wood points to a promising future for sustainable construction. As the industry continues to seek ways to reduce its environmental impact, the demand for eco-friendly alternatives is expected to grow. Strong by Form, with its forward-thinking approach, is well-positioned to become a leader in this evolving landscape.

    Conclusion

    Strong by Form’s presence at TechCrunch Disrupt 2025 signifies a notable advancement in sustainable building practices. By introducing ultralight engineered wood, the company is offering a compelling solution to address the carbon footprint of the construction sector. This innovation has the potential to reshape how we build, creating a more sustainable future for the industry and the environment.

    Source: TechCrunch

  • Google Data Protection: Cryptographic Erasure Explained

    Google’s Future of Data Protection: Cryptographic Erasure Explained

    Protecting user data is a top priority at Google. To bolster this commitment, Google is transitioning to a more advanced method of media sanitization: cryptographic erasure. Starting in November 2025, Google will move away from traditional “brute force disk erase” methods, embracing a layered encryption strategy to safeguard user information.

    The Limitations of Traditional Data Erasure

    For nearly two decades, Google has relied on overwriting data as a primary means of media sanitization. While effective, this approach is becoming increasingly unsustainable. The sheer size and complexity of modern storage media make the traditional method slow and resource-intensive. As storage technology evolves, Google recognized the need for a more efficient and environmentally conscious solution.

    Enter Cryptographic Erasure: A Smarter Approach

    Cryptographic erasure offers a modern and efficient alternative. Since all user data within Google’s services is already protected by multiple layers of encryption, this method leverages existing security infrastructure. Instead of overwriting the entire drive, Google will securely delete the cryptographic keys used to encrypt the data. Once these keys are gone, the data becomes unreadable and unrecoverable.
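
    The mechanism is easy to demonstrate in miniature. The sketch below is purely illustrative, not Google’s production implementation: it encrypts some data, then “sanitizes” it by destroying the key rather than overwriting the bytes:

    ```python
    # Minimal sketch of cryptographic erasure; NOT Google's implementation.
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # the data-encryption key
    ciphertext = Fernet(key).encrypt(b"user data written to disk")

    # Normal operation: whoever holds the key can read the data.
    assert Fernet(key).decrypt(ciphertext) == b"user data written to disk"

    # Cryptographic erasure: destroy the key instead of overwriting the media.
    del key
    # The ciphertext may still sit on the drive, but without the key it is
    # computationally unrecoverable -- the media is sanitized in place.
    ```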

    This approach offers several key advantages:

    • Speed and Efficiency: Cryptographic erasure is significantly faster than traditional overwriting methods.
    • Industry Best Practices: The National Institute of Standards and Technology (NIST) recognizes cryptographic erasure as a valid sanitization technique.
    • Enhanced Security: Google implements cryptographic erasure with multiple layers of security, employing a defense-in-depth strategy.

    Enhanced Security Through Innovation

    Google’s implementation of cryptographic erasure includes a “trust-but-verify” model. This involves independent verification mechanisms to ensure the permanent deletion of media encryption keys. Furthermore, secrets involved in this process, such as storage device keys, are protected with industry-leading security measures. Multiple key rotations further enhance the security of customer data through independent layers of trusted encryption.

    Sustainability and the Circular Economy

    The older “brute force disk erase” method had a significant environmental impact. Storage devices that failed verification were physically destroyed, leading to the disposal of a large number of devices annually. Cryptographic erasure promotes a more sustainable, circular economy by eliminating the need for physical destruction. This enables Google to reuse more hardware and recover valuable rare earth materials, such as neodymium magnets, from end-of-life media. This innovative magnet recovery process marks a significant step forward in sustainable manufacturing.

    Google’s Commitment to Data Protection and Sustainability

    Google has consistently advocated for practices that benefit users, the industry, and the environment. The transition to cryptographic erasure reflects this commitment. It allows Google to enhance security, align with the highest industry standards set forth by organizations such as the National Institute of Standards and Technology (NIST), and build a more sustainable future for its infrastructure. Cryptographic erasure ensures data protection while minimizing environmental impact and promoting responsible growth.

    For more detailed information about encryption at rest, including encryption key management, refer to Google’s default encryption at rest security whitepaper. This document provides a comprehensive overview of Google’s data protection strategies.

    Source: Cloud Blog

  • Agile AI Data Centers: Fungible Architectures for the AI Era

    Agile AI Architectures: Building Fungible Data Centers for the AI Era

    Artificial Intelligence (AI) is rapidly transforming every aspect of our lives, from healthcare to software engineering. Innovations like Google’s Magic Cue on the Pixel 10, Nano Banana Gemini 2.5 Flash image generation, Code Assist, and DeepMind’s AlphaFold highlight the advancements made in just the past year. These breakthroughs are powered by equally impressive developments in computing infrastructure.

    The exponential growth in AI adoption presents significant challenges for data center design and management. At Google I/O, it was revealed that Gemini models process nearly a quadrillion tokens monthly, with AI accelerator consumption increasing 15-fold in the last 24 months. This explosive growth necessitates a new approach to data center architecture, emphasizing agility and fungibility to manage volatility and heterogeneity effectively.

    Addressing the Challenges of AI Growth

    Traditional data center planning involves long lead times that struggle to keep pace with the dynamic demands of AI. Each new generation of AI hardware, such as TPUs and GPUs, introduces unique power, cooling, and networking requirements. This rapid evolution increases the complexity of designing, deploying, and maintaining data centers. Furthermore, the need to support various data center facilities, from hyperscale environments to colocation providers across multiple regions, adds another layer of complexity.

    To address these challenges, Google, in collaboration with the Open Compute Project (OCP), advocates for designing data centers with fungibility and agility as core principles. Modular architectures, interoperable components, and the ability to late-bind facilities and systems are essential. Standard interfaces across all data center components—power delivery, cooling, compute, storage, and networking—are also crucial.
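
    A software analogy helps here: fungibility is what you get when the facility codes against a standard interface and binds the concrete component late. The interface and vendor classes below are hypothetical illustrations, not an OCP specification:

    ```python
    # Toy analogy for standard interfaces and late binding in the data center.
    # Names are hypothetical; no actual OCP interface is being modeled.
    from typing import Protocol

    class CoolingModule(Protocol):
        def dissipate(self, watts: float) -> None: ...

    class AirCooling:
        def dissipate(self, watts: float) -> None:
            print(f"air-cooling {watts:.0f} W")

    class LiquidCooling:
        def dissipate(self, watts: float) -> None:
            print(f"liquid-cooling {watts:.0f} W")

    def commission_rack(cooling: CoolingModule, rack_load_w: float) -> None:
        """The facility codes against the interface; the concrete module is
        bound late, once the hardware generation is actually known."""
        cooling.dissipate(rack_load_w)

    commission_rack(AirCooling(), 12_000)     # last generation
    commission_rack(LiquidCooling(), 90_000)  # next generation, same interface
    ```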

    Power and Cooling Innovations

    Achieving agility in power management requires standardizing power delivery and building a resilient ecosystem with common interfaces at the rack level. The Open Compute Project (OCP) is developing technologies like +/-400Vdc designs and disaggregated solutions using side-car power. Emerging technologies such as low-voltage DC power and solid-state transformers promise fully integrated data center solutions in the future.

    Data centers are also being reimagined as potential suppliers to the grid, utilizing battery-operated storage and microgrids. These solutions help manage the “spikiness” of AI training workloads and improve power efficiency. Cooling solutions are also evolving, with Google contributing Project Deschutes, a state-of-the-art liquid cooling solution, to the OCP community. Companies like Boyd, CoolerMaster, Delta, Envicool, Nidec, nVent, and Vertiv are showcasing liquid cooling demos, highlighting the industry’s enthusiasm.

    Standardization and Open Standards

    Integrating compute, networking, and storage in the server hall requires standardization of physical attributes like rack height, width, and weight, as well as aisle layouts and network interfaces. Standards for telemetry and mechatronics are also necessary for building and maintaining future data centers. The Open Compute Project (OCP) is standardizing telemetry integration for third-party data centers, establishing best practices, and developing common naming conventions and security protocols.

    Beyond physical infrastructure, collaborations are focusing on open standards for scalable and secure systems:

    • Resilience: Expanding manageability, reliability, and serviceability efforts from GPUs to include CPU firmware updates.
    • Security: Caliptra 2.0, an open-source hardware root of trust, defends against threats with post-quantum cryptography, while OCP S.A.F.E. streamlines security audits.
    • Storage: OCP L.O.C.K. provides an open-source key management solution for storage devices, building on Caliptra’s foundation.
    • Networking: Congestion Signaling (CSIG) has been standardized, improving load balancing. Advancements in SONiC and efforts to standardize Optical Circuit Switching are also underway.

    Sustainability Initiatives

    Sustainability is a key focus. Google has developed a methodology for measuring the environmental impact of AI workloads, demonstrating that a typical Gemini Apps text prompt consumes minimal water and energy. This data-driven approach informs collaborations within the Open Compute Project (OCP) on embodied carbon disclosure, green concrete, clean backup power, and reduced manufacturing emissions.
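
    The published methodology is more involved, but the core accounting is fleet-level arithmetic: facility energy divided by prompts served. The sketch below uses entirely hypothetical numbers, not Google’s measured figures:

    ```python
    # Simplified per-prompt energy accounting. The structure is the obvious
    # fleet-level arithmetic; all inputs are hypothetical, not Google's data.

    def energy_per_prompt_wh(avg_it_power_kw: float, pue: float,
                             prompts_per_hour: float) -> float:
        """Facility energy attributed to one prompt, in watt-hours.

        avg_it_power_kw: average IT power of the serving fleet
        pue: power usage effectiveness (facility overhead multiplier)
        prompts_per_hour: prompts served by that fleet per hour
        """
        facility_wh_per_hour = avg_it_power_kw * pue * 1000
        return facility_wh_per_hour / prompts_per_hour

    # Hypothetical fleet: 2 MW of IT load, PUE 1.1, 10M prompts/hour.
    print(f"{energy_per_prompt_wh(2000, 1.1, 10_000_000):.2f} Wh/prompt")  # 0.22
    ```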

    Community-Driven Innovation

    Google emphasizes the power of community collaborations and invites participation in the new OCP Open Data Center for AI Strategic Initiative. This initiative focuses on common standards and optimizations for agile and fungible data centers.

    Looking ahead, leveraging AI to optimize data center design and operations is crucial. DeepMind’s AlphaChip, which uses AI to accelerate chip design, exemplifies this approach. AI-enhanced optimizations across hardware, firmware, software, and testing will drive the next wave of improvements in data center performance, agility, reliability, and sustainability.

    The future of data centers in the AI era depends on community-driven innovation and the adoption of agile, fungible architectures. By standardizing interfaces, promoting open collaboration, and prioritizing sustainability, the industry can meet the growing demands of AI while minimizing environmental impact. These efforts will unlock new possibilities and drive further advancements in AI and computing infrastructure.

    Source: Cloud Blog

  • Google’s Encryption-Based Data Erasure: Future of Sanitization

    Google’s Future of Data Sanitization: Encryption-Based Erasure

    Protecting user data is a top priority for Google. To bolster this commitment, Google has announced a significant shift in its approach to media sanitization. Starting in November 2025, the company will transition to a fully encryption-based strategy, moving away from traditional disk erasure methods. This change addresses the evolving challenges of modern storage technology while enhancing data security and promoting sustainability.

    The Limitations of Traditional Disk Erasure

    For nearly two decades, Google has relied on the “brute force disk erase” process. While effective in the past, this method is becoming increasingly unsustainable due to the sheer size and complexity of today’s storage media. Overwriting entire drives is time-consuming and resource-intensive, prompting the need for a more efficient and modern solution.

    Cryptographic Erasure: A Smarter Approach

    To overcome these limitations, Google is adopting cryptographic erasure, a method recognized by the National Institute of Standards and Technology (NIST) as a valid sanitization technique. This approach leverages Google’s existing multi-layered encryption to sanitize media. Instead of overwriting the entire drive, the cryptographic keys used to encrypt the data are securely deleted. Once these keys are gone, the data becomes unreadable and unrecoverable.

    This method offers several advantages:

    • Enhanced Speed and Efficiency: Cryptographic erasure is significantly faster than traditional overwriting methods.
    • Alignment with Industry Best Practices: It aligns with standards set by organizations like NIST. [Source: Google Cloud Blog]
    • Improved Security: By focusing on key deletion, it adds another layer of security to data sanitization.

    Defense in Depth: Multiple Layers of Security

    Google implements cryptographic erasure with a “defense in depth” strategy, incorporating multiple layers of security. This includes independent verification mechanisms to ensure the permanent deletion of media encryption keys. Secrets involved in the process, such as storage device keys, are protected with industry-leading measures. Multiple key rotations further enhance the security of customer data through independent layers of trusted encryption.
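
    The layering is easier to see in miniature. Below is an illustrative envelope-encryption sketch, not Google’s actual key hierarchy: a key-encryption key (KEK) wraps the data-encryption key (DEK), so rotation and erasure can happen at either layer independently:

    ```python
    # Illustrative envelope encryption; NOT Google's actual key hierarchy.
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    kek = Fernet.generate_key()             # key-encryption key
    dek = Fernet.generate_key()             # data-encryption key
    wrapped_dek = Fernet(kek).encrypt(dek)  # DEK is stored only in wrapped form

    ciphertext = Fernet(dek).encrypt(b"customer data")

    # Rotation: re-wrap the DEK under a new KEK without touching the data.
    new_kek = Fernet.generate_key()
    wrapped_dek = Fernet(new_kek).encrypt(Fernet(kek).decrypt(wrapped_dek))

    # Erasure: destroying the KEK (or the wrapped DEK) leaves the ciphertext
    # permanently unreadable, even though its bytes remain on the device.
    ```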

    Sustainability and the Circular Economy

    The transition to cryptographic erasure also addresses environmental concerns. Previously, storage devices that failed verification were physically destroyed, leading to the disposal of a significant number of devices annually. Cryptographic erasure allows Google to reuse more of its hardware, promoting a more sustainable, circular economy.

    Furthermore, this approach enables the recovery of valuable rare earth materials, such as neodymium magnets, from end-of-life media. This innovative magnet recovery process marks a significant achievement in sustainable manufacturing, demonstrating Google’s commitment to responsible growth.

    Google’s Commitment

    Google has consistently advocated for practices that benefit its users, the broader industry, and the environment. This transition to cryptographic erasure reflects that commitment. It allows Google to enhance security, align with the highest industry standards, and build a more sustainable future for its infrastructure.

    For more detailed information about encryption at rest, including encryption key management, refer to Google’s default encryption at rest security whitepaper. [Source: Google Cloud Blog]

    Conclusion

    By embracing cryptographic erasure, Google is taking a proactive step towards a more secure, efficient, and sustainable future for data sanitization. This innovative approach not only enhances data protection but also contributes to a circular economy by reducing electronic waste and enabling the recovery of valuable resources. This transition underscores Google’s ongoing commitment to responsible data management and environmental stewardship.

  • Agile AI: Google’s Fungible Data Centers for the AI Era

    Agile AI Architectures: A Fungible Data Center for the Intelligent Era

    Artificial intelligence (AI) is rapidly transforming every aspect of our lives, from healthcare to software engineering. Google has been at the forefront of these advancements, showcasing developments like Magic Cue on the Pixel 10, Nano Banana Gemini 2.5 Flash image generation, Code Assist, and AlphaFold. These breakthroughs are powered by equally impressive advancements in computing infrastructure. However, the increasing demands of AI services require a new approach to data center design.

    The Challenge of Dynamic Growth and Heterogeneity

    The growth in AI is staggering. Google reported a nearly 50X annual growth in monthly tokens processed by Gemini models, reaching 480 trillion tokens per month, and has since seen an additional 2X growth, hitting nearly a quadrillion monthly tokens. AI accelerator consumption has grown 15X in the last 24 months, and Hyperdisk ML data has grown 37X since GA. Moreover, there are more than 5 billion AI-powered retail search queries per month. This rapid growth presents significant challenges for data center planning and system design.

    Traditional data center planning involves long lead times, but AI demand projections are now changing dynamically and dramatically, creating a mismatch between supply and demand. Furthermore, each generation of AI hardware, such as TPUs and GPUs, introduces new features, functionalities, and requirements for power, rack space, networking, and cooling. The increasing rate of introduction of these new generations complicates the creation of a coherent end-to-end system. Changes in form factors, board densities, networking topologies, power architectures, and liquid cooling solutions further compound heterogeneity, increasing the complexity of designing, deploying, and maintaining systems and data centers. This also includes designing for a spectrum of data center facilities, from hyperscale to colocation providers, across multiple geographical regions.

    The Solution: Agility and Fungibility

    To address these challenges, Google proposes designing data centers with fungibility and agility as primary considerations. Architectures need to be modular, allowing components to be designed and deployed independently and be interoperable across different vendors or generations. They should support the ability to late-bind the facility and systems to handle dynamically changing requirements. Data centers should be built on agreed-upon standard interfaces, so investments can be reused across multiple customer segments. These principles need to be applied holistically across all components of the data center, including power delivery, cooling, server hall design, compute, storage, and networking.

    Power Management

    To achieve agility and fungibility in power, Google emphasizes standardizing power delivery and management to build a resilient end-to-end power ecosystem, including common interfaces at the rack power level. Collaborating with the Open Compute Project (OCP), Google introduced new technologies around +/-400Vdc designs and an approach for transitioning from monolithic to disaggregated solutions using side-car power (Mt. Diablo). Promising technologies like low-voltage DC power combined with solid-state transformers will enable these systems to transition to future fully integrated data center solutions.

    Google is also evaluating solutions for data centers to become suppliers to the grid, not just consumers, with corresponding standardization around battery-operated storage and microgrids. These solutions are already used to manage the “spikiness” of AI training workloads and for additional savings around power efficiency and grid power usage.
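
    The mechanics of that smoothing are straightforward to sketch. The toy simulation below, with entirely hypothetical capacities, shows a battery absorbing training-load spikes so the grid sees a flat draw:

    ```python
    # Toy peak-shaving simulation: a battery absorbs the "spikiness" of an AI
    # training load so the grid sees a flatter draw. All numbers hypothetical.

    def smooth_with_battery(load_mw: list[float], grid_cap_mw: float,
                            batt_mwh: float, step_h: float = 0.25) -> list[float]:
        soc = 0.0  # battery state of charge, MWh
        grid_draw = []
        for load in load_mw:
            if load > grid_cap_mw:  # spike: discharge the battery
                discharge = min(load - grid_cap_mw, soc / step_h)
                soc -= discharge * step_h
                grid_draw.append(load - discharge)
            else:                   # lull: recharge from grid headroom
                charge = min(grid_cap_mw - load, (batt_mwh - soc) / step_h)
                soc += charge * step_h
                grid_draw.append(load + charge)
        return grid_draw

    spiky_load = [20.0, 90.0, 30.0, 95.0, 25.0, 88.0, 35.0]  # MW, 15-min steps
    print(smooth_with_battery(spiky_load, grid_cap_mw=60, batt_mwh=40))
    # [60.0, 60.0, 60.0, 60.0, 60.0, 60.0, 60.0] -- flat draw from the grid
    ```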

    Data Center Cooling

    Data center cooling is also being reimagined for the AI era. Google announced Project Deschutes, a state-of-the-art liquid cooling solution contributed to the Open Compute community. Liquid cooling suppliers like Boyd, CoolerMaster, Delta, Envicool, Nidec, nVent, and Vertiv are showcasing demos at major events. Further collaboration is needed on industry-standard cooling interfaces, new components like rear-door-heat exchangers, and reliability. Standardizing layouts and fit-out scopes across colocation facilities and third-party data centers is particularly important to enable more fungibility.

    Server Hall Design

    Bringing together compute, networking, and storage in the server hall requires standardization of physical attributes such as rack height, width, depth, weight, aisle widths, layouts, rack and network interfaces, and standards for telemetry and mechatronics. Google and its OCP partners are standardizing telemetry integration for third-party data centers, including establishing best practices, developing common naming and implementations, and creating standard security protocols.
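
    What might a common telemetry record look like? The schema below is purely a hypothetical illustration; the OCP working groups define the real naming conventions:

    ```python
    # Hypothetical common telemetry record -- illustrative only. The actual
    # naming conventions and schemas come from the OCP working groups.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class RackTelemetry:
        site_id: str          # standard facility identifier
        rack_id: str          # standard rack identifier
        ts_unix_s: int        # sample time, seconds since the epoch
        power_w: float        # instantaneous rack power draw
        inlet_temp_c: float   # coolant/air inlet temperature
        outlet_temp_c: float  # coolant/air outlet temperature

    sample = RackTelemetry("us-west-colo-3", "r42", 1767225600, 87_400.0, 18.5, 34.2)
    print(sample)  # identical shape regardless of which vendor emitted it
    ```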

    Open Standards for Scalable and Secure Systems

    Beyond physical infrastructure, Google is collaborating with partners to deliver open standards for more scalable and secure systems. Key highlights include:

    • Resilience: Expanding efforts on manageability, reliability, and serviceability from GPUs to include CPU firmware updates and debuggability.
    • Security: Caliptra 2.0, the open-source hardware root of trust, now defends against future threats with post-quantum cryptography, while OCP S.A.F.E. makes security audits routine and cost-effective.
    • Storage: OCP L.O.C.K. builds on Caliptra’s foundation to provide a robust, open-source key management solution for any storage device.
    • Networking: Congestion Signaling (CSIG) has been standardized and is delivering measured improvements in load balancing; a toy sketch follows this list. Alongside continued advancements in SONiC, a new effort is underway to standardize Optical Circuit Switching.
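
    CSIG itself specifies how congestion information is carried in packets; the sketch below is only a toy illustration of congestion-aware load balancing in general, with hypothetical paths and weights, not the CSIG wire protocol:

    ```python
    # Toy congestion-aware load balancing: weight paths inversely to their
    # signaled congestion. Purely illustrative; NOT the CSIG wire protocol.
    import random

    def pick_path(congestion: dict[str, float]) -> str:
        """congestion maps path name -> signaled congestion level in [0, 1)."""
        weights = {path: 1.0 - level for path, level in congestion.items()}
        total = sum(weights.values())
        r = random.uniform(0, total)
        for path, weight in weights.items():
            r -= weight
            if r <= 0:
                return path
        return path  # floating-point edge case: fall back to the last path

    # path_b is signaling heavy congestion, so it receives far less traffic.
    signals = {"path_a": 0.1, "path_b": 0.8, "path_c": 0.2}
    picks = [pick_path(signals) for _ in range(10_000)]
    print({p: picks.count(p) for p in signals})  # roughly 47% / 11% / 42%
    ```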

    Sustainability

    Sustainability is embedded in Google’s work. The company developed a new methodology for measuring the energy, emissions, and water impact of emerging AI workloads. This data-driven approach is applied to other collaborations across the OCP community, focusing on an embodied carbon disclosure specification, green concrete, clean backup power, and reduced manufacturing emissions.

    AI-for-AI

    Looking ahead, Google plans to leverage AI advances in its own work to amplify productivity and innovation. DeepMind’s AlphaChip, which uses AI to accelerate and optimize chip design, is an early example. Google sees more promising uses of AI for systems across hardware, firmware, software, and testing; for performance, agility, reliability, and sustainability; and across design, deployment, maintenance, and security. These AI-enhanced optimizations and workflows will bring the next order-of-magnitude improvements to the data center.

    Conclusion

    Google’s vision for agile and fungible data centers is crucial for meeting the dynamic demands of AI. By focusing on modular architectures, standardized interfaces, power management, liquid cooling, and open compute standards, Google aims to create data centers that can adapt to rapid changes and support the next wave of AI innovation. Collaboration within the OCP community is essential to driving these advancements forward.

    Source: Cloud Blog