UNITED STATES
SECURITIES AND EXCHANGE COMMISSION
WASHINGTON, D.C. 20549
FORM 6-K
REPORT OF FOREIGN PRIVATE ISSUER
PURSUANT TO RULE 13a-16 OR 15d-16 UNDER
THE SECURITIES EXCHANGE ACT OF 1934
For the month of May 2026
Commission File Number: 001-42536
Wetour Robotics Limited
(Translation of registrant’s name into English)
Room 7003
3300 N Interstate 35 Ste 700
Austin, TX 78705
(Address of principal executive offices)
Indicate by check mark whether the registrant files or will file annual
reports under cover of Form 20-F or Form 40-F.
Form 20-F ☒ Form 40-F ☐
Incorporation by Reference
This report on Form 6-K (the “Report”) shall be deemed to be incorporated by reference into the registration statements on
Form S-8 (File No. 333-291960) and Form F-3 (File Nos. 333-294373 and 333-295457) of the Company, including any prospectuses forming a
part of such registration statements, and to be a part thereof from the date on which this Report is filed with the U.S. Securities and
Exchange Commission (the “SEC”), to the extent not superseded by documents or reports subsequently filed or furnished.
EXHIBITS
| Exhibit No. | Description |
| 99.1 | Press Release dated April 29, 2026 |
| 99.2 | Press Release dated May 1, 2026 |
SIGNATURES
Pursuant to the requirements of the Securities
Exchange Act of 1934, the registrant has duly caused this report to be signed on its behalf by the undersigned, thereunto duly authorized.
Wetour Robotics Limited
By: /s/ Nan Zheng
Name: Nan Zheng
Title: Chief Executive Officer
Date: May 1, 2026
Exhibit 99.1
Wetour Robotics Demonstrates Real-Time Multi-Modal Edge AI Across
VisionLink and Conductor Modules of the Orchestra Physical AI Platform
Four development milestones released: camera-based gesture control,
environmental haptic feedback, sEMG real-time hand tracking, and in-development spatial localization — all processed at the edge
with no cloud dependency
Austin, TX, April 29, 2026 (GLOBE NEWSWIRE) — Wetour Robotics
Limited (NASDAQ: WETO) (“Wetour Robotics” or the “Company”), a Physical AI infrastructure and wearable robotics
company headquartered in Austin, Texas, today released four development milestone demonstrations of its Orchestra platform. The demonstrations
span VisionLink and Conductor — the first two perception modules built on Orchestra — and together illustrate Spatial Intent
Fusion: the simultaneous, real-time understanding of where a user is, what they are looking at, and what their hand intends, processed
entirely at the edge.
VisionLink Module — Vision-Based Bidirectional Interaction
Human → Machine. A chest-mounted camera captures the
user’s hand gestures. VisionLink recognizes each gesture; Orchestra classifies the intent and sends commands to a connected exoskeleton
operating at three speed levels plus rest. The same pipeline can command robotic arms, mobility devices, or any actuator on the Orchestra
Connect Protocol.
World → Human. VisionLink detects a person approaching,
measures distance and direction in real time, and Orchestra translates this into directional haptic feedback. Closer means stronger; left-side
approach triggers left-side response. No user input required.
Conductor Module — EMG-Based Neural Gesture Recognition
Real-Time Hand Tracking. A developer wears an sEMG wristband.
Conductor processes the muscle signals in real time, and a 3D virtual hand on a connected display mirrors the wearer’s physical
movement — vertical, lateral, and rotational — without the hand needing to be in any camera’s field of view.
Spatial Localization — In Active Development
Spatial localization is a foundational layer for Spatial Intent Fusion and is currently in active
development. Wearable sensor data is processed in real time to produce precise positional information within a 3D environment, displayed
as the wearer moves through a room. Once integrated, this capability will combine with visual context (VisionLink) and gestural intent
(Conductor) to enable unified commands for connected physical devices.
Demonstration videos for all four milestones are available at www.wetourrobotics.com and
on the Company’s LinkedIn (Wetour Robotics) and X (@WETO_IR_TEAM) channels.
Why Spatial Intent Fusion Matters
Today’s wearable devices and physical machines are fragmented.
A smartwatch cannot command a robotic arm. A camera cannot coordinate with a wheelchair. A wristband cannot direct a drone. Spatial Intent
Fusion is Orchestra’s answer: an application layer that lets a wearable on the wrist, a camera in the room, and a connected device
across the space act as one coordinated system.
Both VisionLink and Conductor are software pipelines running on the
Orchestra edge hub — hardware-agnostic and compatible with any conforming wearable or camera device. Orchestra is designed to be
device-agnostic, sensor-flexible, and open to builders. The Company does not manufacture wearable devices or physical end-devices. It
develops the platform layer that makes them work together.
CEO Statement
“Before Orchestra, your wearables and the machines around you
lived in separate worlds. Smart, but alone,” said Nan Zheng, Chief Executive Officer of Wetour Robotics. “With VisionLink
and Conductor now generating real-time signals on the same edge hub, we are building the unified perception layer the Physical AI era
requires — not a single feature, but the integration that makes the entire category usable. We call this capability Spatial Intent
Fusion, and it is the foundation of what we will debut on the product launch day.”
About Wetour Robotics
Wetour Robotics Limited (NASDAQ: WETO), formerly known as Webus International
Limited, is a technology company operating AI-driven premium travel and mobility services under its Wetour brand. Building on its foundation
in AI and intelligent mobility, the Company is expanding into Physical AI infrastructure, developing Orchestra — a next-generation
operating system that coordinates human intent with intelligent physical devices including wearable robotics. Wetour Robotics is headquartered
in Austin, Texas. For more information, visit www.wetourrobotics.com.
Forward-Looking Statements
This press release contains forward-looking statements within the meaning
of the Private Securities Litigation Reform Act of 1995. Words such as “plans to,” “designed to,” “believes,”
“expects,” “in development,” and similar expressions identify forward-looking statements. These statements involve
risks and uncertainties that could cause actual results to differ materially, including risks related to technology development progress,
the timing and outcome of demonstrations, market acceptance, competition, capital requirements, and regulatory changes. The Company’s
characterization of its competitive positioning and the term “Spatial Intent Fusion” reflect management’s current belief
based on its review of publicly available information and are not based on a comprehensive market survey. The Company undertakes no obligation
to update forward-looking statements. For additional risks, see the Company’s filings with the SEC.
Investor Relations Contact:
Annabelle Li
Investor Relations — Wetour Robotics Limited
Email: ir.annabelle@webus.vip
Exhibit 99.2
Wetour Robotics to Debut Orchestra, a Physical AI Operating System
for Human-Machine Interaction, on May 28 in Austin
Austin debut event will introduce Orchestra as a new platform for
real-time human-machine interaction in the Physical AI era
Austin, TX, May 01, 2026 (GLOBE NEWSWIRE) — Wetour Robotics Limited
(NASDAQ: WETO) (“Wetour Robotics” or the “Company”), a Physical AI infrastructure and wearable robotics company
headquartered in Austin, Texas, today announced that it will host its inaugural product launch event on May 28, 2026 in Austin, Texas,
to publicly launch Orchestra — the Company’s Physical AI operating system and a new paradigm for how humans interact with
the physical world.
Why This Matters
The world is full of intelligent devices that often do not talk to
each other. Smart glasses, wristbands, and motion sensors frequently operate within their own ecosystems while robotic arms, drones, smart
wheelchairs, and industrial equipment typically rely on distinct control systems. In many cases, a smartwatch cannot command a robotic
arm, a camera cannot coordinate with a wheelchair, and a wristband cannot directly interact with a drone. Every new device may add another
remote, another app, another silo — and the human becomes the manual integration layer.
The Company believes this fragmentation is the central bottleneck holding
back the Physical AI era — and that Orchestra is the operating system that can resolve it.
What Will Be Shown on May 28
The launch event will feature the first public live demonstration of
Spatial Intent Fusion: Orchestra simultaneously processing visual scene context (through VisionLink), continuous EMG (electromyography)
gesture signals (through Conductor), and spatial localization — and translating all three into coordinated, real-time commands
for everyday connected devices, including smart lighting, audio, drones, and other connected hardware.
The event will also include an introduction to Orchestra’s technical
architecture, the Orchestra Connect Protocol, and the Company’s strategic roadmap following the May 28 debut.
From Smartphones to Physical AI
The Company believes the transition from the smartphone era to the
Physical AI era requires a fundamentally new approach to human-machine interaction. The smartphone era connected people to information
through screens and apps. The Physical AI era will connect people to the physical world through wearable sensors that perceive and physical
devices that act — coordinated by an operating system purpose-built for this function. Interaction moves from the screen to the
body. Commands move from taps to gestures, gaze, and presence.
Orchestra is designed to be that operating system: device-agnostic,
sensor-flexible, and open to builders. The Company does not manufacture wearables or end-devices. It develops the platform layer —
including the VisionLink and Conductor perception modules — and the application capabilities such as Spatial Intent Fusion that
will make these devices work together.
Potential applications are expected to span assistive mobility, industrial
safety, spatial navigation, guidance for the visually impaired, warehouse logistics, sports training, smart home control, and consumer drones
— any domain where real-time coordination between wearable perception and physical actuation can augment human capability.
CEO Statement
“For a decade, every new wearable and every new connected device
has made the world more intelligent and the user more confused,” said Nan Zheng, Chief Executive Officer of Wetour Robotics. “Orchestra
is our answer. On May 28, we will present a demonstration of our approach that enables the wearables on your body and the devices in your
environment to speak the same language. We view this approach not as a new product category, but as a new application layer for the Physical
AI era, and we call it Spatial Intent Fusion.”
Event Information
Date: Thursday, May 28, 2026
Location: Austin, Texas
Registration: launch.wetourrobotics.com
Open to partners, developers, media, and institutional investors and
analysts. Capital markets participants may register through the event page or contact Investor Relations directly for coordinated access.
About Wetour Robotics
Wetour Robotics Limited (NASDAQ: WETO), formerly known as Webus International
Limited, is a technology company operating AI-driven premium travel and mobility services under its Wetour brand. Building on its foundation
in AI and intelligent mobility, the Company is expanding into Physical AI infrastructure, developing Orchestra — a next-generation
operating system that coordinates human intent with intelligent physical devices including wearable robotics. Orchestra’s sensory
modules include VisionLink (computer vision) and Conductor (sEMG-based neural gesture recognition), both processed in real time on a portable
edge AI hub. Wetour Robotics is headquartered in Austin, Texas. For more information, visit www.wetourrobotics.com.
Forward-Looking Statements
This press release contains forward-looking statements within the meaning
of the Private Securities Litigation Reform Act of 1995. Words such as “will,” “may,” “to launch,”
“plans to,” “intends to,” “designed to,” “believes,” “expects,” and similar
expressions identify forward-looking statements. These statements involve risks and uncertainties that could cause actual results to differ
materially, including risks related to the successful execution of the planned product launch event, technology development, market acceptance,
competition, the Company’s ability to develop and grow a developer ecosystem, capital requirements, and regulatory changes including
continued listing requirements of the Nasdaq Capital Market. The Company’s characterization of its competitive positioning and the
term “Spatial Intent Fusion” reflect management’s current belief based on its review of publicly available information
and are not based on a comprehensive market survey. The Company undertakes no obligation to update forward-looking statements. For additional
risks, see the Company’s filings with the SEC.
Investor and Media Contact:
Annabelle Li
Investor Relations — Wetour Robotics Limited
Email: ir.annabelle@webus.vip
Event Registration: launch.wetourrobotics.com