WebController for V-Sido Connect

WebController for V-Sido Connect is a development platform for humanoid robot remote control, built on top of V-Sido, a robot control platform provided by Asratec Corp.

Company: Asratec Corp.
Development: Atomos Design Co., Ltd., Robust Inc.

The platform is built entirely with web technologies, making it easy for web developers to integrate V-Sido with web services. As a consultant, I helped design the platform and its API, and designed and developed the default 3D web UI. The system later evolved to control SoftBank Robotics's Pepper robot and was used in field experiments at an airport and a nursing home.


Asratec developed a robot controller board called V-Sido Connect that lets robot developers use V-Sido on their own robots simply by embedding the board and connecting it to the servo motors. V-Sido Connect connects to the internet through Bluetooth. For WebController, we developed an Android app that communicates with V-Sido Connect and also works as a WebRTC node: it receives remote control commands and passes them to the robot through V-Sido Connect. The app also streams video from the phone's camera.


I assessed multiple web-based communication protocols, including HTTP and WebSocket, and selected WebRTC because we needed real-time video communication. TCP-based protocols such as HTTP were also ill-suited to real-time control: TCP guarantees the delivery and ordering of every packet, which introduces lag in the control loop, whereas WebRTC's data communication is built on top of UDP. We used PeerJS, a JavaScript library for WebRTC, but PeerJS at the time did not expose a switch to disable the delivery and ordering guarantees, so I had to modify the library's code to achieve smooth real-time control. WebRTC also requires a broker (signaling) service; we used SkyWay from NTT Communications.
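For reference, the standard RTCDataChannel API now exposes these guarantees directly. A minimal sketch of the configuration we effectively needed (the helper name and the channel label are illustrative, not from the actual codebase):

```javascript
// Hypothetical helper: RTCDataChannel init options for low-latency
// control traffic. ordered: false drops in-order delivery, and
// maxRetransmits: 0 disables retransmission, giving UDP-like behavior
// instead of TCP-like reliability.
function controlChannelOptions() {
  return { ordered: false, maxRetransmits: 0 };
}

// In a browser, assuming an existing RTCPeerConnection `pc`:
//   const ch = pc.createDataChannel('control', controlChannelOptions());
//   ch.onopen = () => ch.send(JSON.stringify({ joint: 'elbow', angle: 42 }));
```

With these options a stale control command that is lost or delayed is simply superseded by the next one, which is exactly what smooth teleoperation wants.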

The smoothness of the control also depends on the latency added by the network. We tested 128 different connection topologies across the 4G networks of the three major mobile carriers and a fixed landline network. Interestingly, we achieved our latency target in mobile-to-mobile setups, while the landline added more latency; even in the 4G era, the mobile networks were better on latency.
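Latency over each topology can be measured with a simple ping/echo probe over the data channel. The sketch below is transport-agnostic and illustrative, not the actual test harness; the clock is injectable so it can be driven deterministically:

```javascript
// Hypothetical RTT probe: timestamps each ping, and computes the
// round-trip latency when the matching echo comes back.
function createRttProbe(now = () => Date.now()) {
  const pending = new Map(); // id -> send time
  let seq = 0;
  return {
    // Build a ping message to send over the data channel.
    ping() {
      const id = ++seq;
      pending.set(id, now());
      return { type: 'ping', id };
    },
    // Call when the peer echoes the message back; returns RTT in ms.
    pong(msg) {
      const t0 = pending.get(msg.id);
      pending.delete(msg.id);
      return now() - t0;
    },
  };
}
```

Running many such probes per topology and comparing the RTT distributions (not just the averages) is what reveals differences like the mobile-vs-landline result above.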


We wanted the default UI to be 3D, so we used WebGL to implement a UI in which the user controls the robot by dragging a virtual figure to change its posture. There was no handy inverse kinematics (IK) library for JavaScript / WebGL at the time, so I implemented IK in JavaScript myself. The UI also displayed the video stream from the phone.
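As a minimal sketch of what such an IK solver involves, here is CCD (Cyclic Coordinate Descent) for a planar chain; this is one common IK technique, not necessarily the algorithm used in the actual UI, which had to handle a full 3D skeleton:

```javascript
// Forward kinematics for a planar chain: returns the position of the
// base, each joint, and the end effector, given link lengths and
// relative joint angles (radians).
function forward(lengths, angles) {
  const pts = [{ x: 0, y: 0 }];
  let theta = 0, x = 0, y = 0;
  for (let i = 0; i < lengths.length; i++) {
    theta += angles[i];
    x += lengths[i] * Math.cos(theta);
    y += lengths[i] * Math.sin(theta);
    pts.push({ x, y });
  }
  return pts;
}

// CCD IK: repeatedly rotate each joint, from the end effector back to
// the base, so that the end effector swings toward the target.
function solveIK(lengths, angles, target, iterations = 50) {
  const n = lengths.length;
  for (let it = 0; it < iterations; it++) {
    for (let i = n - 1; i >= 0; i--) {
      const pts = forward(lengths, angles);
      const joint = pts[i];
      const end = pts[n];
      const toEnd = Math.atan2(end.y - joint.y, end.x - joint.x);
      const toTarget = Math.atan2(target.y - joint.y, target.x - joint.x);
      angles[i] += toTarget - toEnd;
    }
  }
  return angles;
}
```

In the UI, the dragged position of the virtual figure's hand or foot becomes `target`, and the resulting joint angles are both rendered in WebGL and sent to the robot as control commands.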