Daimon Robotics’ new data acquisition system brings haptic intelligence to robot teleoperation

zeeforce


At CES 2026, the robotics conversation has quietly shifted. Fewer people are asking whether robots can move faster or lift heavier objects. More are asking something harder: why are robots still struggling outside controlled demos — and what’s missing to make them reliable in the real world?

For many in the industry, the answer is data. Not synthetic data or scripted motions, but real interaction data that captures how objects behave when they are touched, pushed, squeezed, or moved.

That’s the problem Daimon Robotics is trying to address with the DM-EXton2, a teleoperation-based data acquisition system unveiled at CES this year. It isn’t a consumer product. It’s a professional tool designed to help robots learn from human interaction at scale.

Robots aren’t dumb — they’re inexperienced

Recent advances in AI have dramatically improved perception, language understanding, and reasoning. But physical interaction remains a weak point. A robot may recognize an object perfectly and still fail when asked to pick it up, insert it, or manipulate it safely.

The reason is straightforward: the physical world is messy. Force, friction, deformation, and contact change from moment to moment, and those signals are difficult to capture cleanly. Most robots simply haven’t seen enough of this data.

Traditional data collection methods come with trade-offs. Dedicated capture environments are expensive and labor-intensive, yet still produce limited reusable data. Simulation is cheaper, but the gap between virtual physics and reality often leads to models that work in the lab and fail in practice.

Worse, many existing systems interfere with the very behavior they’re trying to record. Bulky equipment restricts natural movement, while limited sensing misses the subtle force and tactile cues humans rely on instinctively.

What a robotic data acquisition system actually does

A teleoperation-based data acquisition system approaches the problem differently.

Building on traditional teleoperation, it records interaction data in real time with greater consistency across multiple signals. A human operator remotely controls a robot to perform real tasks — grasping objects, inserting components, or manipulating tools — while the system captures motion, timing, contact, and force data simultaneously.

In effect, the robot learns by watching and feeling how a human does the job. The closer this setup is to natural human behavior, the more useful the resulting data becomes.
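To make the idea concrete, here is a minimal sketch of what such a recording pipeline might look like. The field names and functions are illustrative assumptions, not Daimon’s actual data schema or API:

```python
import time
from dataclasses import dataclass

@dataclass
class TeleopSample:
    """One timestamped record from a hypothetical teleoperation session.

    All field names are illustrative, not an actual vendor schema.
    """
    t: float                      # capture time in seconds
    joint_positions: list[float]  # robot joint angles (rad)
    gripper_force: float          # measured grip force (N)
    tactile: list[float]          # flattened tactile-array readings
    operator_cmd: list[float]     # the operator's command at this instant

def record_session(stream, duration_s: float) -> list[TeleopSample]:
    """Collect samples from a sensor stream for duration_s seconds.

    `stream` is any callable returning one TeleopSample per call, so the
    motion, force, and tactile channels arrive already bundled together.
    """
    samples = []
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        samples.append(stream())
    return samples
```

The key property the sketch illustrates is that every channel shares one timestamp per sample, which is what makes the resulting dataset usable for learning contact-rich behavior rather than just trajectories.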

Built for real-world data, not demos

The DM-EXton2 is the world’s first haptic-feedback teleoperation system for robotic data acquisition, designed to capture high-quality interaction data from real-world tasks.

It is designed around responsiveness and deployment flexibility rather than wearable specifications. Operating at a 1000Hz response rate, the system enables millisecond-level command synchronization that supports smooth, low-latency teleoperation during data collection.
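A 1000Hz response rate implies a fixed-period control loop with a 1 ms budget per tick. The article does not describe the implementation, but the general pattern can be sketched as follows (a simplification — real teleoperation stacks use real-time schedulers rather than `time.sleep`, and the function names here are invented for illustration):

```python
import time

def run_control_loop(step, rate_hz: int = 1000, n_ticks: int = 1000) -> int:
    """Run `step` at a fixed rate, sleeping away whatever time is left
    in each period so commands stay synchronized to a 1/rate_hz grid."""
    period = 1.0 / rate_hz
    next_deadline = time.monotonic()
    executed = 0
    for _ in range(n_ticks):
        step()  # e.g. forward the latest operator command to the robot
        executed += 1
        next_deadline += period
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # a missed deadline just starts the next tick late
    return executed
```

The point of running command transmission and data capture on the same fixed grid is that every recorded sample lands at a known millisecond boundary, which is what makes low-latency force feedback feel continuous to the operator.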

It also supports full-body teleoperation, including coordinated control of mobile bases and waist joints, expanding the range of tasks that can be captured. Together with adaptive motion scaling and quick end-effector switching, these capabilities allow a single system to support both fine manipulation and large-range movements without interrupting the data-collection process.

To accommodate different working environments, the DM-EXton2 is available in two configurations: a backpack version suited for mobile data-collection setups, and a stand-mounted version designed for fixed workstations. This allows operators to choose the format that best fits their workflow, whether data is being captured across dynamic spaces or within stable, repeatable environments.

Putting force and tactile into the loop

Where the DM-EXton2 stands out most is in pairing operator-side force feedback with tactile sensing for data collection.

The system brings force feedback and tactile sensing into a broader teleoperation framework, enabling more natural and precise manipulation during data collection. As the robot interacts with its environment, contact forces are fed back to the operator in real time. Tasks like handling fragile objects or performing precise insertions become more intuitive, even when the robot’s view is partially obstructed.

This isn’t just about improving the operator’s control experience. At the robot level, force and tactile signals are recorded alongside motion data, creating multimodal datasets that reflect how humans actually interact with objects. That data is critical for teaching robots not just how to move, but how to judge contact and adapt to physical constraints.
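In practice, motion, force, and tactile streams arrive at different rates, so building a multimodal dataset requires aligning them on a shared timeline. A minimal sketch of one common approach — nearest-neighbor timestamp alignment — is shown below (function and variable names are illustrative; production pipelines would typically interpolate and correct for per-sensor latency):

```python
import bisect

def align_to_motion(motion_ts, tactile_ts, tactile_vals):
    """For each motion timestamp, pick the tactile reading closest in time.

    motion_ts and tactile_ts must be sorted ascending; returns one tactile
    value per motion timestamp, so the two streams line up row-for-row.
    """
    aligned = []
    for t in motion_ts:
        i = bisect.bisect_left(tactile_ts, t)
        candidates = []
        if i > 0:
            candidates.append(i - 1)        # nearest reading before t
        if i < len(tactile_ts):
            candidates.append(i)            # nearest reading at/after t
        j = min(candidates, key=lambda k: abs(tactile_ts[k] - t))
        aligned.append(tactile_vals[j])
    return aligned
```

Once the streams are aligned row-for-row, each training example pairs a motion state with the contact signals the human felt at that moment — the multimodal structure the article describes.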

From isolated experiments to repeatable learning

By synchronizing motion, force, and touch, the DM-EXton2 acts as a bridge between human skill and machine learning. Human intuition becomes structured data that robots can learn from, reuse, and apply across tasks.

That shift matters. Instead of collecting small, task-specific datasets, teams can build ongoing pipelines for data generation. Over time, this supports faster model training and more reliable deployment.

Closing the loop

The system also fits into a broader change in how robots are developed. Data collection, model training and deployment are no longer separate stages. They increasingly form a loop.

High-quality interaction data feeds into multimodal models — including Vision-Tactile-Language-Action frameworks — which improve robot behavior. Real-world use then generates new data that refines the next training cycle.

For that loop to work, data has to move freely. Standardization and compatibility aren’t nice-to-haves; they’re prerequisites.

Where Daimon Robotics fits in

Daimon Robotics focuses on the technologies that support robot learning, rather than building complete robots. Its work spans tactile sensing, dexterous manipulation hardware, and teleoperation systems designed to support large-scale data collection.

The company was incubated at the Hong Kong University of Science and Technology and founded by Professor Yu Wang, founding director of the HKUST Robotics Institute, along with Dr. Jianghua Duan. The team combines academic research with experience in deploying robotics technology beyond the lab.

Within this approach, the DM-EXton2 serves as a key component of Daimon Robotics’ “3D” strategy — Device, Data, and Deployment. Drawing on the company’s long-term focus on tactile sensing and dexterous manipulation, the system helps turn force and touch data into usable inputs for advanced learning models, supporting progress toward more general-purpose robotic capability.

Why this matters

As robots move closer to everyday environments, progress will depend less on clever algorithms and more on whether machines can learn from the physical world they operate in.

The DM-EXton2 doesn’t promise instant autonomy. Instead, it serves as a critical bridge, enabling robots to be guided through real-world tasks so that high-quality interaction data can be captured as a foundation for more general capabilities.

You can learn more about Daimon Robotics via its company website, LinkedIn profile and YouTube account.


