
At the prestigious World Robot Conference (WRC), MiiVii's BRD601 THOR carrier board served as the intelligent heart of Galbot, powering a groundbreaking joint demonstration with NVIDIA. This marked the world's first successful deployment of a Jetson Thor inside a robot, stunning the audience with a live industrial material handling task.
Leveraging its proprietary models, Galbot autonomously identified the position, shape, and stacking relationship of the bins. It then made the autonomous decision to deftly use a single arm to create a gap before cooperatively lifting the targeted bin with both hands.
It was the immense performance of Jetson Thor, unlocked by the BRD601 THOR, that allowed this entire sequence of perception, planning, and execution to run fluidly, flawlessly, and completely offline.
NVIDIA Jetson Thor, built on the revolutionary Blackwell GPU architecture, is purpose‑built to lead the wave of humanoids and autonomous machines. Delivering up to 2,070 FP4 TFLOPS of compute and 128 GB of onboard memory, it is the ultimate engine for robotics workloads that must process massive sensor streams in real time—bringing complex generative‑AI reasoning to the edge.
As the ecosystem rolls out full‑stack Jetson Thor solutions and live demos, ultra‑low‑latency data paths from camera to GPU memory are becoming production‑ready. From cutting‑edge humanoid research to industrial deployment, this transformation is gathering momentum worldwide.
1. Stronger generalization to unlock complex scenarios
Thanks to Jetson Thor’s extreme performance, vision‑language‑action (VLA) inference speeds up dramatically. Robots become smarter than ever—understanding more complex instructions, adapting in unstructured environments, and demonstrating robust general capabilities.
2. True edge intelligence without the cloud
With 128 GB of memory, a single robot can now deploy and run multiple advanced AI pipelines entirely on device—ASR, LLM, TTS, and VLA—enabling more natural human‑robot interaction and faster response, without waiting on cloud round trips.
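The on‑device interaction loop described above can be sketched as a simple pipeline. The three stage functions below are hypothetical stand‑ins for real local models (a Whisper‑class ASR, a locally hosted LLM, a neural TTS engine); only the wiring is illustrative, not any actual MiiVii or NVIDIA API.

```python
# Minimal sketch of an on-device interaction loop: ASR -> LLM -> TTS.
# Each stage is a placeholder; a real robot would swap in local
# inference calls, keeping the entire round trip off the cloud.

def asr(audio: bytes) -> str:
    """Speech-to-text stub: a real system runs a local ASR model here."""
    return audio.decode("utf-8")  # pretend the audio is pre-transcribed

def llm(prompt: str) -> str:
    """Language-model stub: a real system runs a local LLM here."""
    return f"Executing task: {prompt}"

def tts(text: str) -> bytes:
    """Text-to-speech stub: a real system synthesizes audio locally."""
    return text.encode("utf-8")

def interaction_loop(audio_in: bytes) -> bytes:
    """The whole round trip stays on device: no cloud call in any stage."""
    transcript = asr(audio_in)
    response = llm(transcript)
    return tts(response)

if __name__ == "__main__":
    print(interaction_loop(b"pick up the red bin").decode("utf-8"))
```

Because every stage executes locally, response latency is bounded by on‑device inference time rather than network round trips.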
3. Multi‑agent collaboration at the edge
Jetson Thor’s ample memory and compute let it run multiple agent workflows simultaneously. Robots become not just executors but powerful edge computing and decision hubs, coordinating complex tasks locally.
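As a rough illustration of the edge‑hub idea, the sketch below fans several agent workflows out concurrently and gathers their results on one device. The agent names, tasks, and delays are invented for the example; real workloads would be perception, planning, or inference jobs.

```python
# Hypothetical sketch: several agent workflows running concurrently on a
# single edge device, coordinated locally with asyncio.
import asyncio

async def agent(name: str, task: str, delay: float) -> str:
    # Stand-in for a real perception/planning/inference workload.
    await asyncio.sleep(delay)
    return f"{name}: {task} done"

async def coordinate() -> list:
    # The robot acts as a local decision hub: it fans subtasks out,
    # waits for all of them, and merges results without leaving the edge.
    results = await asyncio.gather(
        agent("perception", "scan shelf", 0.01),
        agent("planner", "compute grasp", 0.02),
        agent("navigator", "update map", 0.01),
    )
    return list(results)

if __name__ == "__main__":
    for line in asyncio.run(coordinate()):
        print(line)
```

`asyncio.gather` returns results in submission order regardless of which agent finishes first, which keeps downstream coordination logic deterministic.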

BRD601 THOR: Designed specifically for space‑constrained humanoid robots.
● Humanoid‑first design: A compact form factor that fits demanding head and torso cavities.
● Built for high‑dynamics: Reinforced USB‑C for shock and interference resistance; integrated audio‑amp interface for natural, clear, energy‑efficient multi‑channel voice output.
● Upgraded visual perception: GMSL high‑speed interfaces ensure low‑latency, high‑quality multi‑camera streaming.
APEX THOR: Engineered for mobile robots that demand higher reliability.
● Industrial‑grade reliability: System design options targeting IP65 protection to withstand dust and humidity.
● Powerful sensor fusion: Up to 12 GMSL2 video inputs plus multiple 10G Ethernet ports to provide high‑bandwidth paths for dense sensor fusion.
● Precision timing and functional safety: Microsecond‑level time sync across subsystems; integrated TC397 safety MCU for functional‑safety supervision in mission‑critical deployments; built‑in “black‑box” data logging for post‑event analysis and performance tuning.
● Intelligent operations: Real‑time health monitoring, diagnostics, and predictive‑maintenance hooks that simplify fleet‑scale management.
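The "black‑box" logging idea above can be sketched as a bounded flight recorder: a fixed‑size ring buffer that always retains the most recent telemetry for post‑event analysis. The field names and capacity here are illustrative, not the actual BRD601/APEX recording format.

```python
# Hedged sketch of a "black-box" flight recorder using a ring buffer.
from collections import deque
import time

class BlackBox:
    def __init__(self, capacity: int = 1000):
        # deque with maxlen silently discards the oldest entries, so
        # memory stays bounded no matter how long the robot runs.
        self._events = deque(maxlen=capacity)

    def record(self, subsystem: str, payload: dict) -> None:
        # Timestamp each event with a monotonic clock (immune to NTP steps).
        self._events.append({"t": time.monotonic(), "src": subsystem, **payload})

    def dump(self) -> list:
        # After an incident, dump the retained window for offline analysis.
        return list(self._events)

if __name__ == "__main__":
    box = BlackBox(capacity=3)
    for i in range(5):
        box.record("motor", {"torque": i})
    # Only the 3 most recent events survive.
    print([e["torque"] for e in box.dump()])
```

The bounded buffer trades history depth for a hard memory ceiling, which is the property that matters on an embedded board running continuously.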
Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency.
Optimized with the Jetson software stack to deliver the latency and performance required by real-time applications, Jetson Thor supports all popular generative AI frameworks and reasoning models. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.
With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.
Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing.
With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.
Build now. Reshape what’s next.
With MiiVii’s Jetson Thor solutions, you can start building today:
● Next‑generation humanoids that perform multi‑camera stereo perception, run large models locally, and execute dexterous manipulation.
● Autonomous mobile robots (AMRs) that deliver robust navigation, intelligent obstacle negotiation, and human‑aware interactions in factories and warehouses.
● Mission‑critical edge‑AI applications—such as rail safety and smart heavy equipment—that demand ruggedization and deterministic timing.
A new, AI‑driven physical world is here.
Contact our engineering team to choose the best fit, BRD601 THOR or APEX THOR, for your humanoid or autonomous‑machine platform.