Australia’s first sovereign AI inferencing node has been launched in Sydney, marking a major milestone in the country’s AI infrastructure build-out. Deployed by SCX.ai, the system delivers AI inferencing that is up to ten times more energy-efficient than conventional GPU-based platforms, while operating without water-based cooling.
The node is now live at Equinix’s SY5 International Business Exchange data center, giving Australian enterprises and government agencies access to onshore AI processing that meets strict data residency requirements while lowering environmental impact.
Sovereign AI Arrives at a Critical Moment
The launch comes as the global expansion of AI places mounting pressure on power grids and water supplies. Meanwhile, Australian organizations have had few local options for running advanced AI workloads.
The deployment addresses that gap: AI inferencing is delivered domestically, with no data routed offshore, and keeping processing onshore cuts latency for real-time applications running within Australia.
According to SCX.ai, the facility achieves the lowest carbon footprint per AI token among operating AI sites in the Asia-Pacific region.
Efficiency Gains Without Water Cooling
Unlike hyperscale AI facilities that rely on water-intensive cooling, the Sydney node requires no water for cooling, a design choice that supports sustained AI workloads without straining local water supplies.
The platform also uses an ASIC-accelerated architecture developed through SCX.ai’s partnership with SambaNova Systems. It delivers higher inference throughput per watt, enabling dense production workloads that traditional GPU systems struggle to support. The result is lower energy costs with consistent performance.
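To put the efficiency claim in perspective, the sketch below shows the standard energy-per-token arithmetic behind “throughput per watt” comparisons. The power and throughput figures are hypothetical placeholders chosen only to illustrate a tenfold gap; they are not published SCX.ai, SambaNova, or GPU vendor specifications.

```python
# Illustrative only: back-of-envelope energy-per-token comparison.
# All power and throughput figures below are hypothetical placeholders,
# not measured or published specifications for any platform.

def energy_per_token_joules(power_watts: float, tokens_per_second: float) -> float:
    """Energy consumed per generated token, in joules (watts divided by tokens per second)."""
    return power_watts / tokens_per_second

# Hypothetical example: same power envelope, 10x the inference throughput.
gpu_joules = energy_per_token_joules(power_watts=10_000, tokens_per_second=5_000)
asic_joules = energy_per_token_joules(power_watts=10_000, tokens_per_second=50_000)

print(f"GPU baseline:  {gpu_joules:.2f} J/token")
print(f"ASIC node:     {asic_joules:.2f} J/token")
print(f"Relative efficiency: {gpu_joules / asic_joules:.0f}x")
```

Under these assumed figures, the same “up to ten times” efficiency gap would show up directly as ten times less energy drawn per token served.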
Enterprise and Government Use Cases Targeted
The infrastructure is aimed at sectors that handle sensitive data, including financial services, healthcare, and government. All AI processing is performed within Australia, supporting compliance with national data protection rules.
David Keane, founder and CEO of SCX.ai, said the launch changes how AI can be deployed at scale across the country. He noted that advanced AI workloads can now be run locally without tradeoffs in cost, sustainability, or control.
Equinix Supports High-Performance AI Deployment
The node operates within Equinix SY5, a facility designed for next-generation compute workloads. The site provides carrier-dense connectivity, private interconnection, and security infrastructure to support enterprise-grade AI operations.
Equinix Australia Managing Director Guy Danskine said sovereignty has become a central consideration for organizations deploying AI. He added that customers can now develop AI locally while staying connected to global cloud and partner ecosystems.
Project MAGPiE and Local Partnerships
Alongside customer workloads, the infrastructure will support Project MAGPiE, SCX.ai’s Australia-focused large language model. The model has shown strong performance in localized benchmarks and was designed to reflect Australian terminology and context.
Servers Australia, an Australian-owned infrastructure provider, delivers local operations and managed services for the node. The partnership strengthens domestic technical support and skills development.
National Rollout Planned
The Sydney deployment marks the first phase of a broader national strategy. Additional sovereign AI inferencing nodes are scheduled to come online during 2026.
As capacity expands, latency for regional users will fall, and demand for trusted, domestically hosted AI infrastructure is expected to rise.
Overall, the rollout positions Australia to scale AI adoption while retaining control over data governance, system performance, and environmental impact. Those factors are increasingly shaping enterprise AI strategies worldwide.
