Most people do not associate Tasmania with state-of-the-art technology. It’s the kind of island that conjures images of wilderness paths, thick forests, and crisp, cold air off the Southern Ocean. It turns out that this reputation is precisely why one Australian startup chose to build something remarkable there: an AI data center powered by renewable energy and equipped with 36,000 Nvidia chips, which has quietly grown into one of the most important infrastructure bets in the Southern Hemisphere.
The project rests on a premise that still sounds almost counterintuitive in an industry addicted to cheap fossil fuel power: that you can run high-performance AI compute at scale without burning the planet in the process. That premise has drawn the attention of investors and climate-focused technology observers alike, and it is far easier to defend here than almost anywhere else on Earth, because Tasmania’s hydroelectric grid generates the great majority of the island’s electricity from water rather than coal or gas.
| Category | Details |
|---|---|
| Company Name | Nautilus Data Technologies / AirTrunk-backed venture (AU-based) |
| Location | Tasmania, Australia |
| GPU Count | 36,000 Nvidia chips |
| Energy Source | Renewable energy (primarily hydroelectric) |
| Sector | AI Infrastructure / Data Centers |
| Project Scale | Hyperscale AI data center |
| Key Technology Partner | Nvidia |
| Environmental Goal | Net-zero operational carbon |
| Reference | International Energy Agency – Data Centres |
| Global Context | Data center electricity demand projected to hit 130 GW by 2028 |
| Industry Benchmark | Annual data center electricity demand growing at ~16% per year |
| Further Reading | Ramboll Whitepaper on Sustainable Data Centers |
The timing feels deliberate. The Boston Consulting Group projects that global data center electricity consumption will climb to 130 gigawatts by 2028, roughly three percent of all electricity consumed worldwide. The numbers are hard to grasp: a single hyperscale data center already consumes as much electricity as approximately 100,000 households. Multiply that pressure by the dozens of new AI facilities announced every quarter, and the environmental math becomes genuinely alarming.
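The figures above can be sanity-checked with back-of-envelope arithmetic. The 130 GW projection is from the article; the global consumption figure (roughly 30,000 TWh per year) and the per-facility and per-household loads below are illustrative assumptions chosen to show how the three-percent and 100,000-household comparisons arise, not reported data.

```python
# Rough sanity check on the demand figures cited in the article.
# Assumed (not from the article): ~30,000 TWh/yr global electricity use,
# a 100 MW hyperscale facility, and ~1 kW average household draw.

HOURS_PER_YEAR = 8760

global_twh_per_year = 30_000                                  # assumed
global_avg_gw = global_twh_per_year * 1000 / HOURS_PER_YEAR   # TWh/yr -> average GW

dc_demand_gw = 130                     # projected data center demand (from the article)
share = dc_demand_gw / global_avg_gw   # fraction of average global load, ~3-4%

hyperscale_mw = 100                    # assumed draw of one hyperscale site
household_kw = 1                       # assumed average household draw
households = hyperscale_mw * 1000 / household_kw

print(f"global average load: {global_avg_gw:.0f} GW")
print(f"data center share in 2028: {share:.1%}")
print(f"households matched by one hyperscale site: {households:,.0f}")
```

Under these assumptions the projected 130 GW lands at a few percent of average global load, consistent with the order of magnitude the article cites.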
This pressure is what makes the Tasmanian play intriguing, and not just symbolically. According to reports, the startup has committed to or installed 36,000 Nvidia GPUs, which puts it in a different league from typical regional operators. Nvidia’s chips are extraordinarily power-hungry, especially the H100 and later models, so running that many of them on renewable energy is more than a marketing ploy.

Doing so requires a location where the grid can genuinely support the load, along with serious infrastructure and power agreements. Tasmania is one of the few places in the world where this equation actually works, thanks to its long-standing hydroelectric system and comparatively little industrial competition for that power.
It’s still unclear whether the business will hit the same scaling problems that have stalled similar green data center ambitions elsewhere. Even Tasmania’s hydropower has limits as demand for AI infrastructure keeps growing, and renewable grids are not infinitely flexible. According to the IEA, data centers used about 1.5 percent of the world’s electricity in 2024, and estimates suggest that share could double within six years. Tasmania is not exempt from this pressure, and it is worth asking whether the island’s grid can absorb more demand without sacrificing the very qualities that make it appealing.
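The doubling estimate can be cross-checked against the roughly 16 percent annual growth benchmark cited in the table above. A quick compound-growth sketch (the algebra here is an illustration, not a forecast):

```python
# Consistency check: what annual growth rate would double data centers'
# 1.5% share of world electricity within six years, and where does the
# ~16%/yr industry benchmark land over the same period?

start_share = 0.015    # IEA estimate for 2024 (from the article)
years = 6

# Solve (1 + r)^years = 2 for r
implied_growth = 2 ** (1 / years) - 1      # ~12% per year suffices to double

# Compound the ~16%/yr benchmark cited in the table
benchmark = 0.16
share_at_benchmark = start_share * (1 + benchmark) ** years

print(f"annual growth implied by doubling in {years} years: {implied_growth:.1%}")
print(f"share after {years} years at 16%/yr: {share_at_benchmark:.1%}")
```

At the benchmark rate the share would more than double, so the two figures in the article are mutually consistent, if anything conservative.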
Nevertheless, this startup’s efforts have implications beyond its own financial success. The AI sector as a whole has been hesitant to embrace genuine environmental responsibility. Across the American Southwest, the Middle East, and Southeast Asia, data centers are being announced with sustainability commitments that are often vague and out of date, frequently in regions with scarce water and carbon-heavy grids. Against that backdrop, building in Tasmania looks less like a quirky location choice than a deliberate argument about what responsible AI infrastructure should actually look like.
Investors, at least, seem to believe the model has legs. It is also hard to ignore that running tens of thousands of Nvidia GPUs on clean hydropower at the bottom of the world, which once seemed like unrealistic idealism, is now attracting serious commercial attention. The coming years will test whether Tasmania’s grid and this startup’s ambitions can keep pace with the unrelenting expansion of the AI compute market. For now, though, something genuinely unusual is being built there. Silently, away from the cacophony of Silicon Valley, in the chilly air.
