STOCK TITAN

AZIO AI and Envirotech Vehicles (NASDAQ: EVTV) Execute Scalable AI Infrastructure Strategy Designed for Multi-Megawatt Expansion

Rhea-AI Impact
(Moderate)
Rhea-AI Sentiment
(Neutral)
Tags
AI

Envirotech Vehicles (NASDAQ: EVTV) and AZIO AI announced a scalable AI infrastructure program focused on integrated power, cooling, and modular compute to support sustained, production-level AI workloads.

The Texas reference deployment uses behind-the-meter natural gas generation and purpose-built cooling to run continuous operations and gather live performance, efficiency, uptime, and economics data. Initial modules support ~500 kW of compute (≈1,000–1,250 units), with designed expansion paths to ~5 MW (10,000+ units) and larger multi‑megawatt campuses. Under the commercial framework, AZIO AI sells the hardware while EVTV owns the hosted assets.


Positive

  • Initial module supports ~500 kW compute (~1,000–1,250 units)
  • Design pathway to ~5 MW supporting 10,000+ compute units
  • On-site power (behind-the-meter) enables faster deployment timelines
  • Live operational data from Texas reference deployment to refine design

Negative

  • None.

News Market Reaction – NVDA

+2.95% News Effect

On the day this news was published, NVDA gained 2.95%, reflecting a moderate positive market reaction.

Data tracked by StockTitan Argus on the day of publication.

Integrated Power, Cooling, and Compute Architecture Addresses a Rapidly Expanding Global AI Infrastructure Market Measured in the Tens of Billions of Dollars Annually

LOS ANGELES, Jan. 21, 2026 /PRNewswire/ -- AZIO AI, a next-generation artificial intelligence and high-performance computing infrastructure platform, and Envirotech Vehicles, Inc. (NASDAQ: EVTV) ("EVTV"), today announced continued execution of a scalable AI infrastructure program designed to support sustained, real-world compute workloads using purpose-built cooling systems, self-owned behind-the-meter power, and modular deployment architecture.

The infrastructure is engineered to operate under continuous, production-level demand, generating live operational data that directly informs system design, efficiency, and economics as AZIO AI advances toward larger-scale multi-megawatt deployments.

Purpose-Built Infrastructure Designed for Scale
AZIO AI is deploying proprietary cooling and power systems engineered specifically for high-density AI computing. Unlike legacy data-center designs adapted for AI workloads, AZIO AI's architecture is optimized from inception for sustained thermal performance, power efficiency, and operational reliability as compute density increases.

By integrating cooling and power at the infrastructure level, AZIO AI aims to improve performance predictability while reducing dependency on third-party utility constraints that increasingly limit large-scale AI deployments.

Self-Owned, Behind-the-Meter Power Strategy
The infrastructure utilizes on-site, behind-the-meter power generation, providing AZIO AI with increased control over cost structure, uptime reliability, and expansion timelines. This approach is designed to mitigate grid congestion and long interconnection lead times that have become a growing challenge for AI-driven compute facilities.

The power strategy is intended to be repeatable across future AZIO AI sites, supporting faster deployment of additional capacity as demand scales.

Texas-Based Reference Deployment Supporting Global Infrastructure Development
AZIO AI's initial deployment is located in Texas, at operating oil field sites where on-site natural gas from underground reserves fuels dedicated generation equipment to produce power on a continuous, 24/7 basis.

The Texas deployment is structured as a reference operating environment, enabling AZIO AI to evaluate power generation, cooling performance, system reliability, and operational economics under sustained, real-world conditions. Rather than being site-specific, the infrastructure is designed to generate data and operational insights that can be applied across AZIO AI's broader platform.

Learnings derived from this deployment—including power management, thermal efficiency, uptime characteristics, and modular deployment practices—are intended to inform the design, deployment, and operation of future AZIO AI infrastructure across international markets, including locations where grid capacity, energy reliability, and deployment timelines present similar constraints.

By utilizing behind-the-meter natural gas generation and modular infrastructure in a controlled operating environment, AZIO AI is developing repeatable deployment frameworks intended to support future domestic and overseas data center and AI infrastructure projects.

Built for Continuous, Real-World AI Workloads
The system is designed to operate under sustained, full-time computing demand, enabling AZIO AI to measure real-time performance across power utilization, cooling efficiency, uptime, and system economics. Data gathered from live operations is used to refine infrastructure design, improve cost efficiency, and accelerate deployment of larger-scale facilities.

This operational data forms a core component of AZIO AI's broader infrastructure roadmap, supporting expansion from initial deployments to multi-megawatt and, over time, larger-scale AI compute campuses.

Modular Architecture Enables Incremental Expansion
The modular design allows compute and power capacity to be added incrementally while maintaining consistent performance standards. Initial configurations are capable of supporting approximately 500 kilowatts (kW) of compute load, with expansion pathways designed to scale to multiple megawatts as deployment milestones are achieved.

Based on typical high-density AI configurations:

  • Approximately 500 kW can support 1,000–1,250 compute units
  • Approximately 5 megawatts (MW) can support 10,000+ compute units
  • Larger deployments in the tens of megawatts can support 100,000+ compute units, subject to final configuration and commercial arrangements
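The scaling figures above imply a per-unit power draw of roughly 400–500 W (500 kW supporting 1,000–1,250 units). As an illustrative sketch only, and assuming that per-unit draw holds at larger scales (the release does not state it explicitly), the arithmetic works out as follows:

```python
# Capacity arithmetic implied by the release's figures.
# Assumption (not stated explicitly): each compute unit draws
# roughly 400-500 W, consistent with 500 kW -> 1,000-1,250 units.

def units_supported(power_kw: float, unit_draw_w: float) -> int:
    """Number of compute units a given power envelope can support."""
    return int(power_kw * 1000 // unit_draw_w)

for power_kw in (500, 5_000, 50_000):
    lo = units_supported(power_kw, 500)   # at 500 W per unit
    hi = units_supported(power_kw, 400)   # at 400 W per unit
    print(f"{power_kw / 1000:>5.1f} MW -> {lo:,}-{hi:,} units")
```

At 500 W per unit, 5 MW yields exactly the 10,000-unit figure cited, and tens of megawatts reach the 100,000+ range, though actual counts would depend on final hardware configuration and overhead for cooling and power conversion.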

Commercial Framework with EVTV
Under the current structure:

  • AZIO AI supplies and sells compute hardware
  • EVTV owns the deployed compute assets
  • Compute units are hosted within a managed hosting environment
  • AZIO AI participates through hardware sales and platform-level infrastructure economics
  • EVTV participates through infrastructure ownership and hosting economics

This framework is designed to align long-term incentives while supporting scalable deployment across future facilities.

Market Opportunity and Industry Context
Global demand for artificial intelligence compute and supporting infrastructure continues to accelerate, driven by the rapid adoption of large language models, enterprise AI deployment, sovereign AI initiatives, and increasing power constraints across traditional data center markets. According to publicly available research and commentary from leading global institutions including McKinsey & Company, Bloomberg, Goldman Sachs, NVIDIA (NASDAQ: NVDA), and the International Energy Agency (IEA), AI-driven data center expansion, power infrastructure, and advanced cooling requirements represent a rapidly expanding global market measured in the tens of billions of dollars annually. Continued growth is expected as AI workloads scale and energy-efficient, behind-the-meter architectures become increasingly critical.

AZIO AI's vertically integrated approach—combining purpose-built cooling, self-owned power, modular infrastructure, and high-density compute—positions the Company to participate in this expanding market as deployments scale across future facilities.

Forward-Looking Statements
This press release contains statements that do not relate to historical facts but are "forward-looking statements" within the meaning of the safe harbor provisions of the U.S. Private Securities Litigation Reform Act of 1995. These statements can generally (although not always) be identified by their use of terms and phrases such as anticipate, appear, believe, continue, could, estimate, expect, indicate, intend, may, plan, possible, predict, project, pursue, will, would and other similar terms and phrases, as well as the use of the future tense. Forward-looking statements are neither historical facts nor assurances of future performance. Instead, they are based only on current beliefs, expectations and assumptions regarding the future of the business of the Company, future plans and strategies, projections, anticipated events and trends, the economy and other future conditions. Because forward-looking statements relate to the future, they are subject to inherent uncertainties, risks and changes in circumstances that are difficult to predict and many of which are outside of our control. Actual results and financial condition may differ materially from those indicated in the forward-looking statements. Therefore, you should not rely on any of these forward-looking statements. Forward-looking statements in this press release speak only as of the date hereof. Unless otherwise required by law, we undertake no obligation to publicly update or revise these forward-looking statements, whether because of new information, future events or otherwise.

Media & Investor Relations Contact

Phoenix Management Consulting
Email: press@phoenixmgmtconsulting.com

Cision: View original content to download multimedia: https://www.prnewswire.com/news-releases/azio-ai-and-envirotech-vehicles-nasdaq-evtv-execute-scalable-ai-infrastructure-strategy-designed-for-multi-megawatt-expansion-302666182.html

SOURCE Azio AI Corporation

FAQ

What capacity does the AZIO AI reference deployment for EVTV start with?

The initial configuration supports approximately 500 kW of compute load (about 1,000–1,250 compute units).

How far can AZIO AI's modular architecture scale for EVTV projects?

Expansion pathways are designed to scale to roughly 5 MW (10,000+ units) and to multi‑megawatt campuses beyond that.

What power strategy is AZIO AI using at the EVTV Texas site?

AZIO AI uses behind-the-meter on-site natural gas generation to provide continuous 24/7 power for compute operations.

How do AZIO AI and EVTV split commercial roles in the deployment?

AZIO AI supplies and sells compute hardware while EVTV owns the deployed assets and hosts them in a managed environment.

What operational data will the Texas deployment produce for EVTV and AZIO AI?

It will generate live metrics on power management, thermal efficiency, uptime, and system economics to inform future deployments.

Will AZIO AI's design reduce dependence on local utilities for EVTV deployments?

Yes. The integrated cooling and self-owned power approach is intended to reduce dependency on third-party utilities and mitigate interconnection delays.