J Robot


There is something profoundly satisfying when something that goes “viral” on the Web has a connection to your own life. This happened recently when my colleagues and I were pointed towards a video of some laboratory robots that had drawn almost a million views on YouTube alone. For an engineer, it was a staggering display of technical creativity and passion. The robots are the work of Prof. Masatoshi Ishikawa of the Ishikawa-Komuro Robotics Lab at the University of Tokyo. Coincidentally, I was scheduled to visit Japan a couple of weeks later to speak at our annual Maple TechnoForum Conference, so I sent a hopeful request to our Japanese colleagues to try to arrange a visit to the Ishikawa Lab. What I thought was a highly improbable meeting became reality rather easily. In fact, Prof. Ishikawa insisted on meeting us personally and presenting his group’s work himself. For a robo-nerd like myself, this is the equivalent of Mick Jagger offering to take me personally to the main Virgin Record Store in London’s Piccadilly Circus to point me to his favorite obscure Stones single.

In modern robotics, the typical path towards high-performance robotic devices involves a fair amount of advanced modeling to work out the control systems. More often than not these days, roboticists are turning to symbolic computation to assist in the modeling. This, of course, is one of the reasons why MapleSim has become so popular within the mechatronics and robotics communities. Prof. Ishikawa was very generous with his time and gave us a comprehensive overview of his research -- lots of videos, but also lots of low-level information on his unique technology. As it turns out, his group’s strength is the design of custom computer chips that are finely tuned to perform extremely fast image processing. Combined with a parallel computing framework, this lets him sample and process video images at an amazing 1000 frames per second; by comparison, typical robotic vision systems operate at a meager 30 frames per second. What this means is that his vision systems work so fast that, using relatively simple, brute-force control techniques, he can get his robots to react at blistering speeds and put their parts exactly where they need to be.

Inevitably, I asked the question, “How did you model your robots?”, quietly hoping he would say something about Maple or MapleSim. To my disappointment, he said, “We do not develop models.” In essence, when you can sense and act at his speed, you have all the information you need to determine exactly how the robot should react, and you can execute the response very quickly. Conversely, if you had a slow sampling system, you would have to develop more intelligent control strategies to predict possible outcomes and prepare the robot for a wider range of circumstances. This is where good models come in. Thankfully, my disappointment was short-lived. He went on to say that he has bigger and better ideas for future robots, that he believes those will require modeling, and that he is optimistic there will be opportunities for collaboration in the future. He was very happy that we visited. I can live with that. And I am very happy that we have made a good connection into one of the premier robotics labs on the planet.
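To make that trade-off concrete, here is a toy sketch in Python -- emphatically not anything the Ishikawa Lab actually runs, and built on made-up assumptions (a 1D point mass, a simple proportional-derivative rule, invented gains). A simulated robot chases a step target while its “camera” refreshes the measured position at either 30 or 1000 frames per second. At 1000 fps the naive rule settles cleanly; at 30 fps the stale measurements make the very same rule oscillate out of control, which is exactly the regime where predictive, model-based control earns its keep.

```python
# Toy illustration only: invented point-mass dynamics, an assumed PD control
# rule, and made-up gains -- not the Ishikawa Lab's actual setup.

def simulate(frame_rate_hz, kp=2000.0, kd=90.0, sim_time=0.5, dt=0.0005):
    """Drive a 1D point mass to a 0.5 m step target.

    The 'camera' only refreshes the measured position every 1/frame_rate_hz
    seconds; between frames the controller simply holds its last command.
    """
    target = 0.5                      # step target position (m)
    pos, vel = 0.0, 0.0               # true robot state
    meas, prev_meas = 0.0, 0.0        # last two camera measurements
    cmd = 0.0                         # force held between camera frames
    frame_dt = 1.0 / frame_rate_hz
    next_frame = 0.0
    t = 0.0
    while t < sim_time:
        if t >= next_frame:           # a new camera frame arrives
            prev_meas, meas = meas, pos
            est_vel = (meas - prev_meas) / frame_dt    # finite-difference velocity
            cmd = kp * (target - meas) - kd * est_vel  # simple PD rule
            next_frame += frame_dt
        vel += cmd * dt               # point-mass dynamics, explicit Euler
        pos += vel * dt
        t += dt
    return abs(target - pos)          # tracking error left after sim_time

for hz in (30, 1000):
    print(f"{hz:>4} fps vision -> final error {simulate(hz):10.3f} m")
```

The specific gains and the 0.5 m step are arbitrary; the point is only that, at a high enough sampling rate, the sensing speed rather than the sophistication of the control law does the heavy lifting.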

Christina Spirou (Product Director), Chad Schmitke (MapleSim Lead), and Yours Truly posing with the wickedly fast robotic hands at the Ishikawa Lab.

The famous “hand” twirling a stick at mind-numbing speed

 

Same motion slowed down

On to the Tokyo Institute of Technology

As before, we asked the researchers how they developed their models. The answer was a slight variation on Professor Ishikawa’s: in essence, they don’t do computerized modeling, relying instead on very conventional techniques for model development – i.e., an army of grad students and countless sleepless nights. In fact, they felt they were hitting the limits of this approach, and they were very keen to explore the application of MapleSim and more modern physical modeling techniques.

Although I had really hoped that there would be some existing connection between these pioneering works and our software, I must say that these meetings were some of the most engaging and motivating visits I’ve had in a long time. These brilliant researchers have been relying on pure engineering creativity and diligence for almost three decades and have been wildly successful – so successful that they now have the insight and wisdom to explore, and indeed embrace, new techniques that will let them continue to lead the way into the future. I am very thankful that they consider our work part of the solution to these emerging challenges.

The hopping “ant-bot”

The multi-talented robot with fully autonomous wheel-bots that can disengage and reconnect as needed

 
