STOCK TITAN

Wetour Robotics (NASDAQ: WETO) sets May 28 launch for Orchestra OS

Filing Impact: Neutral
Filing Sentiment: Neutral
Form Type: 6-K

Rhea-AI Filing Summary

Wetour Robotics Limited furnished a Form 6-K that incorporates two press releases about its Orchestra Physical AI platform by reference into its existing Form S-8 and Form F-3 registration statements. The first release describes four development milestone demonstrations across the VisionLink and Conductor perception modules, showing gesture control, environmental haptic feedback, sEMG-based real-time hand tracking, and in-development spatial localization, all processed on an edge AI hub without cloud dependency.

The second release announces a May 28, 2026 launch event in Austin, where Wetour will debut Orchestra as a Physical AI operating system for real-time human‑machine interaction. The event will feature a live demonstration of “Spatial Intent Fusion,” combining visual context, EMG gesture signals, and spatial localization to control connected devices like lighting, audio, drones, and other hardware.

Positive

  • None.

Negative

  • None.

Insights

Wetour showcases edge-based Physical AI platform and schedules Orchestra launch.

Wetour Robotics is positioning Orchestra as an operating system for Physical AI, rather than a single device. The VisionLink and Conductor modules demonstrate real-time perception from cameras and sEMG wearables running directly on an edge AI hub, avoiding cloud latency.

The company frames “Spatial Intent Fusion” as combining visual context, neural gesture input, and spatial localization into unified commands for devices such as exoskeletons, drones, smart wheelchairs, and smart home systems. It emphasizes being device-agnostic and focusing on the platform layer instead of manufacturing wearables or end devices.

The May 28, 2026 Austin event will be the first public live demonstration of this integrated stack, including the Orchestra Connect Protocol and a strategic roadmap. Commercial traction, developer adoption, and real-world use cases are not detailed here and would need to be evaluated through subsequent disclosures and customer or partner announcements.
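The filings do not specify how the Orchestra Connect Protocol actually works. As a purely illustrative sketch of what a device-agnostic command envelope could look like in general (every name and field below is invented for illustration, not taken from Wetour's disclosures):

```python
# Hypothetical sketch only: the Orchestra Connect Protocol is not publicly
# specified in these filings. This shows the general idea of a
# device-agnostic command envelope, where one message shape can address
# very different hardware. All field names are invented.
import json

def make_command(device_id: str, capability: str, action: str, params: dict) -> str:
    """Serialize one command in the same shape regardless of device type."""
    envelope = {
        "v": 1,                    # envelope version
        "device": device_id,       # target: a light, a drone, an exoskeleton, ...
        "capability": capability,  # abstract capability, not a vendor-specific API
        "action": action,
        "params": params,
    }
    return json.dumps(envelope, sort_keys=True)

# The same envelope shape can address very different devices:
print(make_command("light-01", "illumination", "set_level", {"level": 0.4}))
print(make_command("drone-07", "motion", "hover", {"altitude_m": 2.0}))
```

The design point such a sketch illustrates is the one the company claims for itself: the value sits in a shared message layer, not in any particular device.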

Key facts:

  • Development milestones: four milestones (Orchestra VisionLink and Conductor modules)
  • Launch event date: May 28, 2026 (Orchestra public debut in Austin, Texas)
  • Press release dates: April 29 and May 1, 2026 (Exhibits 99.1 and 99.2 attached to Form 6-K)
Spatial Intent Fusion (technical)
"illustrate Spatial Intent Fusion: the simultaneous, real-time understanding of where a user is"
Spatial intent fusion is the process of combining location data (where things or people are) with behavioral cues (what they appear to want or do) from multiple sensors or data sources to produce a single, clearer picture of likely movement or purpose. For investors, it matters because products or systems that use this fusion—like mapping, targeted services, delivery logistics, or autonomous navigation—can make more accurate decisions, improve user experience, reduce errors, and therefore increase revenue potential or lower operational risk; think of it as assembling several clues to predict where someone is headed.

Edge AI hub (technical)
"all processed at the edge with no cloud dependency"

Physical AI (technical)
"a Physical AI infrastructure and wearable robotics company headquartered in Austin"
Physical AI combines artificial intelligence with physical devices or environments, enabling machines to interact with and adapt to the real world in a human-like way. It matters to investors because it can lead to smarter robots, autonomous vehicles, or advanced sensors that improve efficiency and open new markets, potentially creating significant business opportunities and competitive advantages.

Electromyography (technical)
"continuous EMG (Electromyography) gesture signals (through the Conductor)"
Electromyography (EMG) is a medical test that records the tiny electrical signals muscles make when they contract, using sensors placed on or in the muscle—think of it as checking the wiring and signals between nerves and muscles. Investors care because EMG is a key tool for diagnosing neuromuscular conditions and evaluating treatments or devices; clear EMG results can influence clinical adoption, regulatory decisions and the commercial prospects of related healthcare products.

Orchestra Connect Protocol (technical)
"The event will also include an introduction to Orchestra’s technical architecture, the Orchestra Connect Protocol"
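As a rough, non-authoritative illustration of the fusion idea described above (the `Reading` and `fuse_intent` names below are invented for this sketch and do not come from Wetour's filings), combining a visual gesture estimate, an EMG gesture estimate, and a position into one command might look like:

```python
# Illustrative sketch only: a toy "spatial intent fusion" step that merges
# three hypothetical sensor readings into one device command. All names
# (Reading, fuse_intent, the thresholds) are invented for illustration
# and are not taken from Wetour's disclosures.
from dataclasses import dataclass

@dataclass
class Reading:
    source: str        # "vision", "emg", or "localization"
    value: str         # e.g. a gesture label or a room zone
    confidence: float  # 0.0 - 1.0

def fuse_intent(vision: Reading, emg: Reading, location: Reading) -> dict:
    """Combine visual context, EMG gesture, and position into one command."""
    # Trust the gesture only when both modalities agree, and actuate only
    # when the weaker of the two agreeing signals clears a threshold.
    agree = vision.value == emg.value
    confidence = min(vision.confidence, emg.confidence) if agree else 0.0
    return {
        "gesture": emg.value if agree else "none",
        "zone": location.value,  # where the command should apply
        "confidence": round(confidence, 2),
        "actuate": agree and confidence >= 0.8,
    }

cmd = fuse_intent(
    Reading("vision", "point_left", 0.92),
    Reading("emg", "point_left", 0.88),
    Reading("localization", "living_room", 0.95),
)
print(cmd)
# → {'gesture': 'point_left', 'zone': 'living_room', 'confidence': 0.88, 'actuate': True}
```

Orchestra's actual fusion logic is not disclosed; the sketch only shows why combining modalities can help, since disagreement between sensors can veto an actuation that any single sensor would have triggered.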

UNITED STATES

SECURITIES AND EXCHANGE COMMISSION

WASHINGTON, D.C. 20549

 

FORM 6-K

 

REPORT OF FOREIGN PRIVATE ISSUER

PURSUANT TO RULE 13a-16 OR 15d-16 UNDER

THE SECURITIES EXCHANGE ACT OF 1934

 

For the month of May 2026

 

Commission File Number: 001-42536

 

Wetour Robotics Limited

(Translation of registrant’s name into English)

 

Room 7003

3300 N Interstate 35 Ste 700

Austin, TX 78705

(Address of principal executive offices)

 

Indicate by check mark whether the registrant files or will file annual reports under cover of Form 20-F or Form 40-F.

 

Form 20-F ☒       Form 40-F ☐

Incorporation by Reference

 

This report on Form 6-K (the “Report”) shall be deemed to be incorporated by reference into the registration statements on Form S-8 (File No. 333-291960) and Form F-3 (File Nos. 333-294373 and 333-295457) of the Company, including any prospectuses forming a part of such registration statements, and to be a part thereof from the date on which this Report is filed with the U.S. Securities and Exchange Commission (the “SEC”), to the extent not superseded by documents or reports subsequently filed or furnished.

 

 EXHIBITS

 

Exhibit No.   Description
99.1   Press Release dated April 29, 2026
99.2   Press Release dated May 1, 2026

SIGNATURES

 

Pursuant to the requirements of the Securities Exchange Act of 1934, the registrant has duly caused this report to be signed on its behalf by the undersigned, thereunto duly authorized.

 

  Wetour Robotics Limited
     
  By: /s/ Nan Zheng
  Name:  Nan Zheng
  Title: Chief Executive Officer

 

Date: May 1, 2026

Exhibit 99.1

 

Wetour Robotics Demonstrates Real-Time Multi-Modal Edge AI Across VisionLink and Conductor Modules of the Orchestra Physical AI Platform

 

Four development milestones released: camera-based gesture control, environmental haptic feedback, sEMG real-time hand tracking, and in-development spatial localization — all processed at the edge with no cloud dependency

 

Austin, TX, April 29, 2026 (GLOBE NEWSWIRE) — Wetour Robotics Limited (NASDAQ: WETO) (“Wetour Robotics” or the “Company”), a Physical AI infrastructure and wearable robotics company headquartered in Austin, Texas, today released four development milestone demonstrations of its Orchestra platform. The demonstrations span VisionLink and Conductor — the first two perception modules built on Orchestra — and together illustrate Spatial Intent Fusion: the simultaneous, real-time understanding of where a user is, what they are looking at, and what their hand intends, processed entirely at the edge.

 

VisionLink Module — Vision-Based Bidirectional Interaction

 

Human → Machine. A chest-mounted camera captures the user’s hand gestures. VisionLink recognizes each gesture; Orchestra classifies intent and sends commands to a connected exoskeleton operating at three speed levels plus rest. The same pipeline can command robotic arms, mobility devices, or any actuator on the Orchestra Connect Protocol.

 

World → Human. VisionLink detects a person approaching, measures distance and direction in real time, and Orchestra translates this into directional haptic feedback. Closer means stronger; left-side approach triggers left-side response. No user input required.

 

Conductor Module — EMG-Based Neural Gesture Recognition

 

Real-Time Hand Tracking. A developer wears an sEMG wristband. Conductor processes the muscle signals in real time, and a 3D virtual hand on a connected display mirrors the wearer’s physical movement — vertical, lateral, and rotational — without the hand needing to be in any camera’s field of view.

 

Spatial Localization — In Active Development

 

A foundational layer for Spatial Intent Fusion is currently in active development. Wearable sensor data is processed in real time to produce precise positional information within a 3D environment, displayed as the wearer moves through a room. Once integrated, this capability will combine with visual context (VisionLink) and gestural intent (Conductor) to enable unified commands for connected physical devices.

 

Demonstration videos for all four milestones are available at www.wetourrobotics.com and on the Company’s LinkedIn (Wetour Robotics) and X (@WETO_IR_TEAM) channels.

 

Why Spatial Intent Fusion Matters

 

Today’s wearable devices and physical machines are fragmented. A smartwatch cannot command a robotic arm. A camera cannot coordinate with a wheelchair. A wristband cannot direct a drone. Spatial Intent Fusion is Orchestra’s answer: an application layer that lets a wearable on the wrist, a camera in the room, and a connected device across the space act as one coordinated system.

 

Both VisionLink and Conductor are software pipelines running on the Orchestra edge hub — hardware-agnostic and compatible with any conforming wearable or camera device. Orchestra is designed to be device-agnostic, sensor-flexible, and open to builders. The Company does not manufacture wearable devices or physical end-devices. It develops the platform layer that makes them work together.

 

CEO Statement

 

“Before Orchestra, your wearables and the machines around you lived in separate worlds. Smart, but alone,” said Nan Zheng, Chief Executive Officer of Wetour Robotics. “With VisionLink and Conductor now generating real-time signals on the same edge hub, we are building the unified perception layer the Physical AI era requires — not a single feature, but the integration that makes the entire category usable. We call this capability Spatial Intent Fusion, and it is the foundation of what we will debut on the product launch day.”

 

About Wetour Robotics

 

Wetour Robotics Limited (NASDAQ: WETO), formerly known as Webus International Limited, is a technology company operating AI-driven premium travel and mobility services under its Wetour brand. Building on its foundation in AI and intelligent mobility, the Company is expanding into Physical AI infrastructure, developing Orchestra — a next-generation operating system that coordinates human intent with intelligent physical devices including wearable robotics. Wetour Robotics is headquartered in Austin, Texas. For more information, visit www.wetourrobotics.com.

 

Forward-Looking Statements

 

This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Words such as “plans to,” “designed to,” “believes,” “expects,” “in development,” and similar expressions identify forward-looking statements. These statements involve risks and uncertainties that could cause actual results to differ materially, including risks related to technology development progress, the timing and outcome of demonstrations, market acceptance, competition, capital requirements, and regulatory changes. The Company’s characterization of its competitive positioning and the term “Spatial Intent Fusion” reflect management’s current belief based on its review of publicly available information and are not based on a comprehensive market survey. The Company undertakes no obligation to update forward-looking statements. For additional risks, see the Company’s filings with the SEC.

 

Investor Relations Contact:

 

Annabelle Li
Investor Relations — Wetour Robotics Limited
Email: ir.annabelle@webus.vip 

Exhibit 99.2

 

Wetour Robotics to Debut Orchestra, a Physical AI Operating System for Human-Machine Interaction, on May 28 in Austin

 

Austin debut event will introduce Orchestra as a new platform for real-time human-machine interaction in the Physical AI era

 

Austin, TX, May 01, 2026 (GLOBE NEWSWIRE) — Wetour Robotics Limited (NASDAQ: WETO) (“Wetour Robotics” or the “Company”), a Physical AI infrastructure and wearable robotics company headquartered in Austin, Texas, today announced that it will host its inaugural product launch event on May 28, 2026 in Austin, Texas, to publicly launch Orchestra — the Company’s Physical AI operating system and a new paradigm for how humans interact with the physical world.

 

Why This Matters

 

The world is full of intelligent devices that often do not talk to each other. Smart glasses, wristbands, and motion sensors frequently operate within their own ecosystems while robotic arms, drones, smart wheelchairs, and industrial equipment typically rely on distinct control systems. In many cases, a smartwatch cannot command a robotic arm, a camera cannot coordinate with a wheelchair, and a wristband cannot directly interact with a drone. Every new device may add another remote, another app, another silo — and the human becomes the manual integration layer.

 

The Company believes this fragmentation is the central bottleneck holding back the Physical AI era — and that Orchestra is the operating system that can resolve it.

 

What Will Be Shown on May 28

 

The launch event will feature the first public live demonstration of Spatial Intent Fusion: Orchestra simultaneously processing visual scene context (through VisionLink), continuous EMG (Electromyography) gesture signals (through Conductor), and spatial localization — and translating all three into coordinated, real-time commands for everyday connected devices, including smart lighting, audio, drones, and other connected hardware.

 

The event will also include an introduction to Orchestra’s technical architecture, the Orchestra Connect Protocol, and the Company’s strategic roadmap following the May 28 debut.

 

From Smartphones to Physical AI

 

The Company believes the transition from the smartphone era to the Physical AI era requires a fundamentally new approach to human-machine interaction. The smartphone era connected people to information through screens and apps. The Physical AI era will connect people to the physical world through wearable sensors that perceive and physical devices that act — coordinated by an operating system purpose-built for this function. Interaction moves from the screen to the body. Commands move from taps to gestures, gaze, and presence.

 

Orchestra is designed to be that operating system: device-agnostic, sensor-flexible, and open to builders. The Company does not manufacture wearables or end-devices. It develops the platform layer — including the VisionLink and Conductor perception modules — and the application capabilities such as Spatial Intent Fusion that will make these devices work together.

 

Potential applications are expected to span assistive mobility, industrial safety, spatial navigation, visually impaired guidance, warehouse logistics, sports training, smart home control, and consumer drones — any domain where real-time coordination between wearable perception and physical actuation can augment human capability.

 

CEO Statement

 

“For a decade, every new wearable and every new connected device has made the world more intelligent and the user more confused,” said Nan Zheng, Chief Executive Officer of Wetour Robotics. “Orchestra is our answer. On May 28, we will present a demonstration of our approach that enables the wearables on your body and the devices in your environment to speak the same language. We view this approach not as a new product category, but as a new application layer for the Physical AI era, and we call it Spatial Intent Fusion.”

 

Event Information

 

Date: Thursday, May 28, 2026

 

Location: Austin, Texas

 

Registration: launch.wetourrobotics.com

 

Open to partners, developers, media, and institutional investors and analysts. Capital markets participants may register through the event page or contact Investor Relations directly for coordinated access.

 

About Wetour Robotics

 

Wetour Robotics Limited (NASDAQ: WETO), formerly known as Webus International Limited, is a technology company operating AI-driven premium travel and mobility services under its Wetour brand. Building on its foundation in AI and intelligent mobility, the Company is expanding into Physical AI infrastructure, developing Orchestra — a next-generation operating system that coordinates human intent with intelligent physical devices including wearable robotics. Orchestra’s sensory modules include VisionLink (computer vision) and Conductor (sEMG-based neural gesture recognition), both processed in real time on a portable edge AI hub. Wetour Robotics is headquartered in Austin, Texas. For more information, visit www.wetourrobotics.com.

 

Forward-Looking Statements

 

This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Words such as “will,” “may,” “to launch,” “plans to,” “intends to,” “designed to,” “believes,” “expects,” and similar expressions identify forward-looking statements. These statements involve risks and uncertainties that could cause actual results to differ materially, including risks related to the successful execution of the planned product launch event, technology development, market acceptance, competition, the Company’s ability to develop and grow a developer ecosystem, capital requirements, and regulatory changes including continued listing requirements of the Nasdaq Capital Market. The Company’s characterization of its competitive positioning and the term “Spatial Intent Fusion” reflect management’s current belief based on its review of publicly available information and are not based on a comprehensive market survey. The Company undertakes no obligation to update forward-looking statements. For additional risks, see the Company’s filings with the SEC.

 

Investor and Media Contact:

 

Annabelle Li

 

Investor Relations — Wetour Robotics Limited

 

Email: ir.annabelle@webus.vip

 

Event Registration: launch.wetourrobotics.com

FAQ

What did Wetour Robotics (WETO) disclose in this Form 6-K?

Wetour Robotics furnished a Form 6-K incorporating two press releases about its Orchestra Physical AI platform. The releases describe new development milestone demonstrations for VisionLink and Conductor modules and announce a May 28, 2026 Austin launch event for the Orchestra operating system.

What is Wetour Robotics (WETO) Orchestra Physical AI platform?

Orchestra is described as a Physical AI operating system that coordinates human intent with intelligent physical devices. It uses modules like VisionLink for computer vision and Conductor for sEMG-based neural gesture recognition, running on an edge AI hub to control wearables, robotics, and other connected hardware.

What development milestones did Wetour Robotics (WETO) announce for Orchestra?

Wetour reported four milestones: camera-based gesture control, environmental haptic feedback, sEMG real-time hand tracking, and an in-development spatial localization capability. All are processed at the edge without cloud dependency, supporting the company’s concept of real-time Spatial Intent Fusion across multiple perception sources.

When and where will Wetour Robotics (WETO) launch Orchestra publicly?

Wetour plans to host its inaugural product launch event on May 28, 2026 in Austin, Texas. The event will feature the first public live demonstration of Spatial Intent Fusion and presentations on Orchestra’s technical architecture, connect protocol, and strategic roadmap for the Physical AI operating system.

What is Spatial Intent Fusion in Wetour Robotics (WETO) Orchestra platform?

Spatial Intent Fusion is Wetour’s term for simultaneously processing visual scene context, continuous EMG gesture signals, and spatial localization to generate coordinated, real-time commands. It is intended to let wearables, cameras, and physical devices act as one system for applications like mobility, safety, navigation, and smart home control.

Does Wetour Robotics (WETO) manufacture wearables or hardware for Orchestra?

Wetour states that it does not manufacture wearable devices or physical end-devices. Instead, it develops the platform layer, including the Orchestra operating system, the VisionLink and Conductor perception modules, and the Orchestra Connect Protocol, which together allow third-party wearables and hardware to interoperate in real time.

Filing Exhibits & Attachments

2 documents