As mentioned in earlier posts, one of my goals for reviving the Stepbot project was to get it working with runmyrobot.com.
When my friend Chase introduced me to runmyrobot.com, I was excited: it provides a platform that lets anyone control telepresence robots through a web interface, along with source code for the robot firmware that interfaces with the system. Casually looking over the site, it seemed to fill my need for a control interface and protocol for my telepresence experiments, which could save me some time at this stage and let me focus more on the robot work itself.
After running into some roadblocks due to the hardware I selected, I upgraded Stepbot’s brain and I was able to get everything working.
As you can see, it’s technically “working”, but the results leave much to be desired.
I haven’t analyzed the results enough to say whether the poor performance comes from my robot, my re-implementation of the firmware, or the platform itself. Regardless, in its current state it’s not very usable.
The control interface I built for Sux0rz had similar problems, which for the most part boil down to lag. I assumed that in Sux0rz’s case this was due to the old-fashioned REST protocol I was using to send control signals and the simplistic video streaming (an old Linux webcam tool). That arrangement, while also plagued with lag, outperformed what you see in the video above.
The upside of this experience is that it shows the problems I was having with my home-grown system are not unique, and that there wasn’t necessarily something fundamentally wrong with my approach. Lag (in both directions) is the key problem to solve in developing a usable, general-purpose telepresence platform. This isn’t a surprise, but having more than one point of reference makes it easier to justify focusing energy in this area, since it is not yet a solved problem.