A new virtual workshop


I found this abandoned school and turned it into a virtual workshop.

After our last update, a few things have been going on in parallel.

I need to do a proper video to walk through all these features. Today is not that day.

~5 min read, 13 images


Preface

“Am I supposed to understand any of this?” - most of my friends.

No… not really. I’m not even trying to make this an easy read. Trying to get insights from the diagrams might be a waste of your time. Just skim over it and see that things are happening. Later videos will tie some of it together.

Sorry I didn’t have time to write a shorter letter - or to accurately attribute my quotes.

UFactory

I found that the mechArm 270 was difficult to work with and started to experiment with the UFACTORY Lite 6.

But something has been a little crazy with its kinematics. Here’s a shot of it not only clipping into the table, but… from behind?!

Not to mention the flailing. I was going to sync this to some SOAD, but couldn’t be bothered dealing with the rights.

Anyway, it’s much better today and will likely feature in future vids. It’s just not quite smooth yet. During the improvement process, a number of tools have been added to the ecosystem. The following chapters have some details.

AI operator client and a trajectory viewer

First step was to give agents the ability to control the robots and collect data on the results.

That way, one can just leave the simulation running and watch them struggle.

Is this an MCP? Not really… at least not yet.
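For the curious, the gist of it looks something like this. To be clear, this is a sketch, not the actual client - the endpoint, message shape, and field names below are all made up:

```python
import json
import time
import urllib.request

SIM_URL = "http://localhost:8000"  # hypothetical operator-client endpoint

def send_command(robot_id: str, command: dict) -> dict:
    """POST a control command, return the telemetry snapshot that comes back."""
    payload = json.dumps({"robot": robot_id, "command": command}).encode()
    req = urllib.request.Request(
        f"{SIM_URL}/control",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Leave the sim running and watch the agent struggle: command, log, repeat.
log = []
for step in range(100):
    telemetry = send_command("jetbot_0", {"linear": 0.2, "angular": 0.0})
    log.append(telemetry)
    time.sleep(0.1)
```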

On the left we see the telemetry of a JetBot falling off the table and tumbling into the void. Being nice, I told the AI that this was likely to happen.

Eventually, this proved effective in solving problems. Here it solved a physics issue in the simulation itself: I had a robot that had traction when moving forward, but not when steering. It was pretty weird.

The trajectory viewer has taken a few forms.


This is an old shot, for the record, showing the six degrees of freedom of a robot arm over time, with dashed lines marking the expected positions for a given control input.

This example was for one of the arms. The blue and green lines are movements in the Z and Y directions (and back). No dashed lines are visible there because the movement did what was expected. The later dashed purple and orange lines were attempts to pitch and yaw the end effector; those failed, so the deviations become visible.
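A plot like that is only a few lines of matplotlib. Here’s a stand-in sketch with random data (not the real viewer): solid lines for the measured pose of each DOF, dashed lines for the expected pose under the commanded input.

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0.0, 10.0, 500)  # time (s)
dofs = ["x", "y", "z", "roll", "pitch", "yaw"]
rng = np.random.default_rng(0)
actual = {d: rng.standard_normal(t.size).cumsum() * 0.01 for d in dofs}
expected = {d: actual[d] + rng.standard_normal(t.size) * 0.02 for d in dofs}

fig, ax = plt.subplots(figsize=(8, 4))
for d in dofs:
    (line,) = ax.plot(t, actual[d], label=d)               # measured pose
    ax.plot(t, expected[d], "--", color=line.get_color())  # expected pose
ax.set_xlabel("time (s)")
ax.set_ylabel("position (m) / angle (rad)")
ax.legend(ncol=3, fontsize="small")
plt.show()
```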

Multi-operator & Multi-target

This is what enables multiple people and agents working together in the same environment.

It’s quite a simplified diagram for what turned out to be some serious architectural rework.

Is there ROS here? It’s actually ROS-optional. In this example there isn’t any at all, but there are clear advantages to putting it in.
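To make the routing concrete, here’s a purely illustrative sketch: every command carries an operator id (human or agent) and a target id, and a router fans them out to per-robot controllers. None of these names come from the real codebase.

```python
from dataclasses import dataclass

@dataclass
class Command:
    operator_id: str  # human client or AI agent
    target_id: str    # which robot in the shared environment
    action: str       # e.g. "move_eef", "drive"
    args: dict

class Router:
    """Fans commands out to per-robot controllers."""
    def __init__(self):
        self.controllers = {}  # target_id -> callable(Command)

    def register(self, target_id, controller):
        self.controllers[target_id] = controller

    def dispatch(self, cmd: Command):
        self.controllers[cmd.target_id](cmd)

router = Router()
router.register("lite6_0", lambda c: print("arm got", c.action))
router.register("jetbot_0", lambda c: print("jetbot got", c.action))
router.dispatch(Command("human_1", "lite6_0", "move_eef", {"dz": 0.05}))
router.dispatch(Command("agent_7", "jetbot_0", "drive", {"linear": 0.2, "angular": 0.0}))
```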

Testing the services got tedious quickly


So I whipped up a Taskfile.dev management front end.

This allows starting and stopping the various services with clicks and key-bindings. It also has a tab that filters the system process list down to the relevant processes (with click-to-kill).
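Under the hood, the process tab boils down to something like this psutil sketch (the keyword list is invented for illustration):

```python
import psutil

RELEVANT = ("isaac", "ros", "task", "operator_client")  # example filters

def relevant_processes():
    """Yield processes whose command line matches one of the keywords."""
    for proc in psutil.process_iter(["pid", "name", "cmdline"]):
        cmd = " ".join(proc.info["cmdline"] or [])
        if any(key in cmd.lower() for key in RELEVANT):
            yield proc

def all_stop():
    """The 'all stop' button: terminate every matching process."""
    for proc in relevant_processes():
        proc.terminate()

for p in relevant_processes():
    print(p.info["pid"], p.info["name"])
```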

…and being able to just click “all stop” is SO nice…

(It’s on the right. The left is just JetBots watching other JetBots driving off tables.)

So now when you select a UFactory robot, you get a 6-DOF control system for moving the head (the end effector) around in all six axes.

But if you switch to a JetBot, you only get two axes to play with.
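Conceptually it’s just a per-robot capability map. A hypothetical sketch:

```python
CONTROL_AXES = {
    # UFACTORY Lite 6: full 6-DOF end-effector control
    "lite6": ["x", "y", "z", "roll", "pitch", "yaw"],
    # JetBot: differential drive, so just two axes
    "jetbot": ["linear", "angular"],
}

def axes_for(robot_type: str) -> list[str]:
    """Return only the control axes this robot type actually supports."""
    return CONTROL_AXES[robot_type]

print(axes_for("lite6"))   # ['x', 'y', 'z', 'roll', 'pitch', 'yaw']
print(axes_for("jetbot"))  # ['linear', 'angular']
```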

So a key use case is having a user work remotely with one robot while an AI works remotely with another.

But, right now, the biggest advantage of this is workflow.


One can have a single simulation running, with multiple agents working to configure different tasks on different robots at the same time. NVIDIA loves their ray tracing, so Isaac Sim is naturally GPU heavy. Even for the simplest project, expect your computer’s fans to be running. Not to mention, it takes several minutes to spin up the environment every time.

BUT… honestly, it’s nice to work in a pretty environment. So nice, I even started adding furniture.

And it’s nice to share screenshots of a sim that looks closer to the real thing, and not like… well, like Gazebo (sorry, guys).

Having said that: RTX (ray tracing) has a cost. If you’re simulating, you’re essentially allocating some of your resources to something you don’t need. It’s just plain inefficient.

So I’m now considering either trying to fork Isaac and develop a non-RTX version, or rolling in a Gazebo environment alongside Isaac Sim. This would also save on cloud deployment costs.
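In the meantime, Isaac Sim’s standard standalone-app pattern at least skips the interactive viewport. It doesn’t remove the RTX cost, but it’s the usual headless entry point (older releases import SimulationApp from omni.isaac.kit; newer ones from isaacsim):

```python
# Must be created before any other omni.isaac import.
from omni.isaac.kit import SimulationApp

simulation_app = SimulationApp({"headless": True})

# ... load the stage, step physics, run agents ...

simulation_app.close()
```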

And of course the new workshop

I could say I just got sick of working in an empty void.

But really… really it’s more about giving the agents something realistic to look at… (coming soon… I suppose)

And the room really did come from a “scary school” asset I found on BlenderKit.

(progress images)


NVIDIA provides a nice Omniverse add-on to help with the import… but it still took an unexpected amount of manual work to get the materials to import, even with agents helping. I’ll probably have the next room built by an Upwork contractor.

Then, when it all finally worked, gravity was sideways and, once again, tiny JetBots took trips to the void.
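For anyone hitting the same thing: sideways gravity usually means an up-axis mismatch (Blender is Z-up; some export paths come out Y-up). The check and fix with the USD Python API look roughly like this - the file path is a placeholder:

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open("scary_school.usd")  # placeholder path
print("current up axis:", UsdGeom.GetStageUpAxis(stage))

UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)  # match the sim's Z-up world
stage.GetRootLayer().Save()
```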

Anyway, I’ve seen way better Blender compatibility in Unity, where you can literally drop the .blend file into the scene and it does the magic for you. So my expectations were higher, but I also understand that Isaac Sim probably has other requirements to prioritize that make this harder than it is for a game engine.

Anyway, that’s enough for now.
