Dexter Development Environment (DDE)

What are you seeing? A demo/tutorial of the software tool used to create jobs that direct Dexter to make something.

What is the science?   Because we can't hope to provide processes for making everything, we employ a general-purpose programming language, JavaScript (with extensions), to give you maximum flexibility in designing the processes that make things.

What is the impact/different from competition? DDE bends over backwards to simplify the task of automating Dexter. Our "Job" language helps you direct the creation of complex things by coordinating robots as well as humans. DDE's strategy is not to be "high level" or "low level" but rather "all level", giving you both power and versatility. We know of no other robotics IDE with these capabilities.
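As a conceptual sketch of the "Job as a list of steps" idea (this is not DDE's actual API; every name below is invented for illustration), a job can be modeled as a named list whose entries are either declarative robot instructions or arbitrary JavaScript:

```javascript
// Conceptual sketch only: a "job" as a named list of steps.
// The real DDE Job language differs; these names are invented for illustration.
function makeJob(name, steps) {
  return {
    name,
    run() {
      const log = [];
      for (const step of steps) {
        // A step is either plain data (a robot instruction) or a function
        // (arbitrary code), which is what "all level" flexibility means here.
        log.push(typeof step === "function" ? step() : step);
      }
      return log;
    }
  };
}

const job = makeJob("pick_and_place", [
  "move_to home",                    // declarative robot instruction
  () => `gripper closed at ${5} N`,  // arbitrary JavaScript for custom logic
  "move_to bin"
]);

console.log(job.run());
```

Mixing data and code in one ordered list is what lets a single job coordinate both scripted robot moves and open-ended logic.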


What are you seeing? Dexter moves from one point to the next along a path trained using another Dexter, then picks up a 500 g weight.

What is the science?   With the real-time physics engine and the real-time metrology, the software compensates not only for the mechanical deficiencies in the 3D-printed parts but also for the movement of the weight.

What is the impact/different from competition?  We see no other robot on the market able to compensate for both the mechanical deficiencies and the movement of the load. This requires massively parallel computing and real-time feedback.
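To give a feel for one thing the physics engine must account for, here is a minimal static calculation (illustrative only; the real engine models far more, including inertia and friction): the torque a payload adds at a joint grows with its lever arm, so the controller must continuously adjust as the arm extends.

```javascript
// Minimal static model: torque contributed by a payload at a lever arm.
// Illustrative only; Dexter's physics engine models much more than this.
const G = 9.81; // gravitational acceleration, m/s^2

function payloadTorque(massKg, leverArmM) {
  return massKg * G * leverArmM; // newton-metres
}

// A 500 g weight held at 0.2 m vs 0.4 m from a joint:
console.log(payloadTorque(0.5, 0.2).toFixed(3)); // "0.981" N·m
console.log(payloadTorque(0.5, 0.4).toFixed(3)); // "1.962" N·m: doubles with reach
```

Even this toy model shows why a moving load cannot be handled by a fixed calibration: the correction changes with every change in pose.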



What are you seeing?  A cotton ball is dropped on the robot.   We measure the deflection the robot senses. 

What is the science?  The real-time physics engine and the real-time metrology, combined with the rigid design of the robot (others have mechanical compliance sensors, which reduce the rigidity of the system), allow us to see the deflection of the entire system at 1 arc second, or 159 nm (yes, nanometers).

What is the impact/different from competition?   What you are seeing is a cornerstone of Dexter's sensitivity, which delivers a human-safe robot.  This sensing is intrinsic to the entire robot.  With a rigid system and sample points throughout the entire body (2 million measurements per second), the robot feels, and we can detect, anything that touches it.  IT FEELS YOU BEFORE YOU CAN FEEL IT.  The points that appear on the screen represent data collected from the deflection of the system; each pixel represents 159 nm.  Currently no other robot on the market uses digital compliance (proprietary algorithms).
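The 1 arc second to 159 nm correspondence is just the arc-length relation s = r·θ. Working backwards from those two figures (our inference from the numbers given, not a stated specification) implies an effective measurement radius of roughly 33 mm:

```javascript
// Arc length: s = r * theta. Checking the 1 arc-second <-> 159 nm figure.
// The effective radius below is inferred from those two numbers only;
// it is not a published specification of the robot.
const ARCSEC_IN_RAD = Math.PI / (180 * 3600); // ~4.8481e-6 rad

const deflectionM = 159e-9; // 159 nm
const impliedRadiusM = deflectionM / ARCSEC_IN_RAD;

console.log((impliedRadiusM * 1000).toFixed(1)); // "32.8" mm
```

At larger radii (e.g. the full reach of the arm), the same 1 arc second of angular resolution corresponds to a proportionally larger linear deflection.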


What are you seeing?  Dexter is showing off the rapid movement that can be achieved at the end-effector. 

What is the science?  Dexter was trained in this movement, in an attempt to capture how fast a human can move the robot.  To capture this rapid movement, all three engines are working together: physics, metrology, and kinematics.  Within the physics engine we measure force, acceleration, moment of inertia, and friction.  The training is then editable: you can add pauses and change positions (at 5 µm resolution, fine enough to cut a human hair width-wise 10+ times).

What is the impact/different from competition?  With other collaborative robots on the market, training is slow and must be done step by step.  Dexter lets the end user capture human dexterity and then edit those commands, combining human dexterity with robot precision to an accuracy of 50 µm.


What are you seeing?  A 3-D printed holder for a Dremel tool was attached to the robot.  Dexter is showing its movement.  We call it "self-modification" because it is drilling a hole in its own base (this is Dexter's equivalent of shooting itself in the foot!).  It is also showing off micro movements, useful if the tool were a grinder or polisher for fine finish refinements.

What is the science?  All the real-time engines work together to demonstrate an end-user application.  The base is constructed as a tool holder for the Automatic Tool Changing (ATC) feature.  Precision and force detection allow Dexter to swap out tools by detecting where each tool is and then knowing the angle and force required to engage or disengage its different end-effectors.

What is the impact/different from competition?  The end-effectors can be created to fit the needs of the end-user.  By having automatic tool changing, you can create a fully heterogeneous, direct digital manufacturing, autonomous robot.  This creates a personal micro factory.  Or, when serially linked to other Dexters, it creates a fully autonomous manufacturing line.  These lines can be set up so users can switch flexibly between different products and processes with robots, rather than needing large-scale production with high product volume.  This lowers production costs to be competitive in the global market.


What are you seeing?  Backlash, the lost motion (play) inherent in mechanical drive systems, is a perennial issue in robotics. Dexter's software eliminates backlash.

What is the science?  Dexter uses its massively parallel supercomputer to collect tremendous amounts of measurement data and uses that data in real time to eliminate backlash.

What is the impact/different from competition?   By using software instead of high-cost mechanical anti-backlash components, Dexter is able to be much lower in cost.
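One common software approach, shown here purely as a generic illustration (Dexter's real-time, measurement-driven method is proprietary and likely more sophisticated), is to add a compensating offset to motor commands whenever the commanded direction reverses:

```javascript
// Generic software backlash compensation: when the commanded direction
// reverses, shift subsequent commands by the measured backlash so the
// output lands on target. Illustrative only; not Dexter's actual algorithm.
function makeBacklashCompensator(backlash) {
  let lastDir = 0; // -1, 0, or +1
  let offset = 0;  // accumulated compensation applied to motor commands
  return function compensate(target, current) {
    const dir = Math.sign(target - current);
    if (dir !== 0 && lastDir !== 0 && dir !== lastDir) {
      offset += dir * backlash; // take up the slack on reversal
    }
    if (dir !== 0) lastDir = dir;
    return target + offset;     // command actually sent to the motor
  };
}

const comp = makeBacklashCompensator(0.05); // 0.05 units of play, assumed
console.log(comp(1.0, 0.0)); // prints 1    (first move, no reversal)
console.log(comp(0.5, 1.0)); // prints 0.45 (reversal: overshoot by the backlash)
```

A fixed-offset scheme like this assumes the backlash is constant; continuous real-time measurement, as described above, removes that assumption.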

250µm MOVEMENT @ 5µm

What are you seeing?  Dexter is demonstrating precise movement over a 250 µm (0.0098 inch) length.  At each step, Dexter moves 5 µm (fine enough to cut a human hair width-wise 10+ times).
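The arithmetic behind the demo (the 70 µm hair diameter is a typical value assumed here for illustration; human hair varies roughly 50 to 120 µm):

```javascript
// Step arithmetic for the 250 µm demo. The 70 µm hair diameter is an
// assumed typical value, not a figure from the demo itself.
const travelUm = 250;
const stepUm = 5;
const hairDiameterUm = 70;

const steps = travelUm / stepUm;               // steps across the full travel
const slicesPerHair = hairDiameterUm / stepUm; // cuts across one hair width

console.log(steps, slicesPerHair); // 50 14
```

So the full travel takes 50 discrete 5 µm steps, and a 5 µm step divides a typical hair about 14 times, consistent with the "10+ times" claim.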

What is the science?  Achieving this kind of precision in a robot built with 3D-printed (not finely machined) parts comes from all the engines on the FPGA supercomputer working together.  This eliminates all other sources of interference, leaving the motors we use as the only constraint on movement.

What is the impact/different from competition?  Achieving this precision with low-cost parts is unprecedented.  It is possible only through the FPGA supercomputer and our proprietary algorithms.  The result matters enormously for low-cost, high-precision robotics: a control system that delivers a working solution for sub-micron stepping and precision.

Garage Days - early videos