3D printing robots – embodied AGI has to start somewhere :)

I’ve been getting quite interested in building robots lately. The reason is that I strongly believe AGI systems need to be embodied in order to make sense as perception-action and learning agents.

To further these machine-learning embodiment goals, I recently acquired a MakerBot Replicator 2 Desktop 3D printing system. Here is a picture of the system with some 3D parts I’ve printed:


I’ll talk more about the system specifically and what I’ve discovered through using it in a later post.

I decided to start with some robot builds, specifically this one:
InMoov 3D printed robot
Here is the blog of the robot designer, hairygael:
InMoov Blog
All credit goes to him for the beautiful design of these pieces!

I think I first saw this robot reported on a tech news site and thought it looked awesome. I’ve since been printing and assembling some of the components. I started with the hand. Here are a few pics!






The entire hand is printed in PLA – interestingly, not the material the designer used for the original robot. The build documented on the blog linked above used ABS, the “other” common 3D printing material.

The hand works using an animatronic-inspired “tendon” method, where strings run the length of the fingers. Tension can be applied to each string to either curl or uncurl the corresponding finger. Here is a picture of some of the tendons threaded through the robot hand:
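To get a feel for the tendon mechanism, here is a quick back-of-the-envelope sketch. The string travel available for curling a finger depends on the servo horn radius and its rotation range (the horn radius and angle below are my example numbers, not values from the InMoov design):

```python
import math

def tendon_travel(horn_radius_mm, rotation_deg):
    """Approximate string pull (mm) when a tendon wraps around a servo
    horn of the given radius through the given rotation (arc length)."""
    return horn_radius_mm * math.radians(rotation_deg)

# A hypothetical 10 mm horn sweeping 120 degrees pulls ~21 mm of string.
pull = tendon_travel(10.0, 120.0)
```

This is one reason scaling the hand down is tricky: the finger joints shrink with the print, but the string travel a given servo can produce does not.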


I also tried to make a smaller hand, printed at 50% scale. This introduces a number of difficulties – not least that the servos I bought do not shrink to 50% scale along with the printed parts. Here is the result so far:


I’ll have to experiment with printing the hand at different sizes to see if my smaller servos can be made to fit. I’m a little worried that even if I get the hand servos to work, there will be some problem later on with the servos for the shoulder, head, etc. Anyway, it is all a learning process!

I think it is super-cool that people can print their own robots at home now. We definitely need many more designs submitted to Thingiverse (one of several open-source repositories for 3D printing designs) for robots of all kinds. The more robots exist, the easier it will be to develop, test and improve machine learning algorithms for their brains.

The Robot Artist Project

I’ve been interested in art for as long as I can remember. But lately I have also been extremely interested in robotics. And yet another interest I have is in AGI (artificial general intelligence) and specifically machine creativity. So I’ve been trying to combine these interests in a fun and unique project. As you do.

This project isn’t very far along yet, but I thought I’d start talking about it anyway… I have several projects on the go and talking about them will help me keep working on them! I’m going to separate projects with different logos and tags, so people can easily find them.


One of the things that we (and by we I mean the Artificial Intelligence/Machine Learning communities in general) are not very good at is demonstrating cool algorithmic results in a way that people can understand and connect to. For example, this:


Is pretty exciting if you’re presenting at a machine learning conference. However, this:


Is more exciting (or at least thought provoking) for nearly everyone else.

At work we’d been musing for a while over the idea that we needed to craft more cool demos. And so I made a little deal with two of my co-workers that we’d each try to come up with a cool, creative, robot-based project to work on in our spare time, power it with some serious AI algorithms, and pit the resulting entities against one another to achieve the highest level of nerdy robot coolness.

The challenge: Build a creative robot demo powered by an advanced machine learning algorithm (in your spare time).

I code-named my attempt The Robot Artist Project. The robot I would like to build would be an embodied cognitive system with a ‘savant-like’ ability in the graphic arts, specifically canvas painting (more on this later). Once I had decided I liked the idea of building a robot artist, I had a look around the internet for drawing and painting robots. You can get quite a long way towards your goal by learning from successes and mistakes made in previous attempts to do similar things. I didn’t find that many socially-challenged but creatively gifted savant painting robots in my search, but this project stood out:

The thing I like especially about this is the artist’s story and reasoning behind building the robot – he suddenly had an overwhelming feeling that he could not produce art anymore, had lost his creativity, and needed to create something to do the work for him. I sympathize with this artist’s viewpoint. He is a troubled, angst-ridden creative soul and I can relate to that. Personally, my desire to build artist robots comes from slightly different reasoning. I have a strong negative reaction to anthropocentrism, and as such I strive to build systems and objects that blur the boundary between humans and other entities. Through building intelligent systems, I wish to unveil some of the contradictions evident in people’s kin-selecting, anthropocentric belief systems. Anyway, that’s veering more into the philosophy of AGI, and I came here to talk about robots. So, back to the build!

I wanted to get to something that drew stuff on paper AS FAST AS POSSIBLE, so I didn’t really care how it looked or to what MacGyverish extent I had to take things to get the robot drawing.


I had a couple of servos lying around, some Lego Mindstorms NXT pieces and an Arduino, which I purchased last summer. The first thing I did was connect the servos to a makeshift frame built from the Lego Mindstorms pieces. Interestingly, the NXT brick wasn’t used for any active control, but it did prove useful as a weight to anchor the robot arm! I also used some custom parts (picture-hanging hooks) to attach the Lego to the servo rotor. I ended up with two position-controllable joints in the system: one for the arm of the artist (sweeping the hand in an arc) and one which lifted the pen up and down (a bit like turtle graphics: http://en.wikipedia.org/wiki/Turtle_graphics).
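With only one rotary joint for the arm (plus the binary pen lift), the reachable drawing positions all lie on an arc. A quick sketch of that geometry, using a made-up arm length rather than my actual build dimensions:

```python
import math

ARM_LENGTH = 120.0  # mm, hypothetical distance from servo axis to pen tip

def pen_position(sweep_deg):
    """Pen (x, y) on the paper for a given arm-sweep servo angle.
    A single rotary joint confines the pen to an arc of radius ARM_LENGTH
    centred on the servo axis."""
    theta = math.radians(sweep_deg)
    return (ARM_LENGTH * math.cos(theta), ARM_LENGTH * math.sin(theta))

x, y = pen_position(90.0)  # arm at 90 degrees: pen directly "above" the axis
```

Combined with the pen-lift joint, this gives a drawing vocabulary of arcs and gaps – which explains the curved character of the early test scribbles below.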



I used the servo sketch in my SparkFun electronics kit as a template to get the Arduino to control the servo. Next I needed to hook the whole thing up to my PC to enable higher-level control from Python, so I installed pySerial to send simple serial commands over USB to the Arduino. The Arduino listens on the serial port and adjusts the position of the servo based on what Python sends.
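A minimal sketch of the Python side. The one-byte protocol here (a single byte carrying an angle of 0–180, which the Arduino sketch decodes) is my invention for illustration, not necessarily the exact format my code used:

```python
def encode_angle(angle):
    """Clamp to the servo's 0-180 degree range and pack as a single byte."""
    clamped = max(0, min(180, int(angle)))
    return bytes([clamped])

def send_angle(port_name, angle):
    """Open the Arduino's serial port and send one position command."""
    import serial  # pySerial; imported here so encode_angle stays standalone
    with serial.Serial(port_name, 9600, timeout=1) as arduino:
        arduino.write(encode_angle(angle))

# Usage (hypothetical port name):
# send_angle('/dev/ttyUSB0', 90)  # centre the servo
```

Clamping on the PC side is cheap insurance: a stray value sent to the servo can slam the arm into its mechanical end stop.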

Getting the loop delays sorted out was a little fiddly. When controlling servos this way, I found you have to carefully balance two delays: an inner delay, which sets the pulse width of the pulse-width-modulated (PWM) signal the servo interprets, and an outer delay, which gives the servo enough time to reach the desired position before it is told to move to the next one.
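In sketch form, the two delays look something like this. The numbers are typical hobby-servo values (a pulse roughly every 20 ms, with a high time of ~1000–2000 µs mapping to 0–180°) rather than measurements from my setup, and the settle time is a guess you would tune by hand:

```python
def pulse_width_us(angle_deg, min_us=1000, max_us=2000):
    """Inner delay: map a target angle to the high time (microseconds)
    of each PWM pulse sent to the servo."""
    return min_us + (max_us - min_us) * angle_deg / 180.0

PERIOD_MS = 20    # hobby servos expect one pulse per ~20 ms frame
SETTLE_MS = 300   # outer delay: time allowed for the horn to reach position

def frames_to_settle():
    """Number of PWM frames to keep repeating the same pulse before
    commanding the next position."""
    return SETTLE_MS // PERIOD_MS
```

Make the outer delay too short and the servo never reaches each target before the next command arrives; make it too long and the drawing crawls.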



The first thing the robot did was manoeuvre the pen off the edge of the paper and onto the surface of my table. The pen was supposedly a dry-wipe board pen, but it had more the effect of a permanent marker. I chastised the robot; frowned at the table-marks. Then I thought how awesome it was that I had created a robot artist and the first thing it had done was scribble angstily on my furniture. Sweet.

Here are some results from the first tests.



I knew that the little proto-artist would be disassembled and used for other builds in the future; it was way too hacked together to be kept intact. But it was fun to play with and I was inspired to continue with this project. There’s no machine learning or creativity in there yet, but there will be soon…

(Image credits: Shutterstock 114077734)