It's MeArm Pi on Kickstarter

Kickstarter can be a wonderful place to support great new ideas. One project that has sprung up and captured the hearts, minds (and pledges) of folk is MeArm Pi from Mime Industries. Following on from the very successful original MeArm robot arm, Mime are presenting something great to the Raspberry Pi community. The project has already smashed its £10k goal, with almost £47k pledged at the time of writing. Doing the maths on the pledges, that represents 844 arms at present. That's a lot of robotic hands to shake! Best of all: you still have until 6pm on March 8 to support the project and acquire your very own robot arm.

Mime's pitch describes MeArm Pi well: "Easy to assemble and not requiring extensive knowledge of electronics, the MeArm Pi STEM kit helps kids and adults learn robotics and teaches them how to code." That's cool. Very cool: robot arms are fun, programming is fun, and programming robot arms is twice the fun.

[Image: MeArm Pi]

I briefly interviewed Ben Pirt, joint founder of Mime. His passion for the new MeArm is clear: a desire to create a functioning robotic arm platform that enormously simplifies the construction process.

CD: What was the motivation to change the design of MeArm for MeArm Pi?

Ben: "The first MeArm has been built thousands of times (including a fair few times ourselves!) and we wanted to broaden its appeal and try to get even more children involved in making and programming it. So we decided to look at which parts of the build were particularly difficult. The number of screws came out as a big issue that was catching people out, so we tried to re-work the design wherever possible not to need screws. Now the only screws left are on the joints where two pieces hinge together. The grip had a major re-work (from 9 screws down to 1) which made it much simpler to build."

It's worth pausing and considering this: the number of screws and fiddly components in a build really can influence the complexity, and hence the accessibility, of the product. When I received the Maplin robot arm for Christmas a few years back, I spent several hours putting together gearboxes, ensuring all was aligned and assembling the thing. While highly enjoyable in its own way (who doesn't like to build things?) it was also frustrating: that's a lot of components to assemble *just* to get a fairly simple robot arm up and running! Mime's keen attempt to solve this build-complexity problem is admirable.

Once I had built the Maplin arm I wanted to program it using a record-and-playback mechanism in Python. It was at this point I hit a few snags, as precision playback just isn't easily possible with ordinary motors; it looks like MeArm Pi has overcome this issue too.

CD: How accurate are the servos with MeArm Pi, i.e.: can you reliably pre-program repeatable movements?

Ben: "The servos are pretty accurate - they use metal gears for extra reliability. The big difference from the Maplin arm is that servos can be relied upon to be nicely repeatable so you can program them to do things again and again. Servos won’t drift out of calibration like motors."
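That repeatability is exactly what my record-and-playback idea needs. As a taster, here is a minimal sketch of replaying a stored sequence of positions using gpiozero's AngularServo; the GPIO pin and the angles here are my own invention for illustration, and the MeArm Pi will of course ship with its own software.

from time import sleep
from gpiozero import AngularServo

# One joint of an arm on GPIO 17 (the pin choice is arbitrary here)
servo = AngularServo(17, min_angle=-90, max_angle=90)

# A previously "recorded" sequence of angles to play back
recorded = [-45, 0, 30, 60, 0]

for angle in recorded:
    servo.angle = angle   # a servo seeks the same angle every time
    sleep(1)

Because a servo actively seeks whatever angle it is given, running this loop tomorrow produces the same movements as running it today; with plain geared motors the arm would slowly drift away from its starting position.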

Having non-drifting motors sounds like a dream come true! Don't get me wrong: I love the Maplin arm and happily recommend it to everyone as a low-cost way to get into robotics on the Raspberry Pi. Now, though, Mime are offering a viable alternative that combines the hardware with ease of programming. Talking of programming, I asked Ben what else makes the MeArm so great:

Ben: "I think there are a lot of things that make the MeArm Pi better than the Maplin arm:
  • children build it themselves so they get a better understanding of how the mechanics works
  • the motor control is easier from the Raspberry Pi and can be programmed in any number of programming languages
  • the software is better and more suited to beginners"

It's worth noting that supplying purpose-built control software to get up and running quickly is a great idea: it's what makes projects like Pi-Top so readily accessible, for instance. Software for the Maplin arm does exist (we covered it in earlier issues of The MagPi a couple of times), but it involves getting one's head around the internals of the USB protocol, and while learning about USB Vendor IDs is "fun" in one way, it certainly isn't conducive to encouraging people new to robotics into the hobby.
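To give a flavour of the difference, this is roughly what talking to the Maplin arm looks like with pyusb. The vendor/product IDs and the three-byte command format below are as commonly reported for this arm rather than anything official, so treat the values as illustrative.

import usb.core

# The IDs commonly reported for the Maplin/OWI USB arm
arm = usb.core.find(idVendor=0x1267, idProduct=0x0000)
if arm is None:
    raise RuntimeError("Maplin arm not found")

# One raw USB control transfer: the three bytes select which motors
# run and whether the LED is lit. [0x00, 0x01, 0x00] is commonly
# documented as "rotate the base".
arm.ctrl_transfer(0x40, 6, 0x100, 0, [0x00, 0x01, 0x00])

Compare that with a beginner-friendly "move the arm left" API and it is easy to see why purpose-built software matters.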

Ben also tells me that the age range of the arm is "officially...11+ but with some parental supervision it can be built by as young as 8 or 9 without too many problems." Producing a product that is interesting and accessible to age groups from primary to adult is a great achievement: "We believe in helping children to have fun whilst learning about technology and the MeArm Pi is completely designed around that goal". Superb.

It seems that MeArm Pi is not the only product Mime are looking at for the future, either:

Ben: "This is the first new product from Mime Industries since we formed the company. We’re going to be taking another look at updates to Mirobot as well as rolling the improvements to the MeArm mechanical design over to the other versions. We’ve got lots of ideas for new products but you’ll have to stay tuned for those!"

And stay tuned I most definitely shall.

MeArm Pi, available for another 6 days on Kickstarter.


George visits Digimakers to learn about whiskers

I spent yesterday at Bristol Digimakers having a fantastic time meeting lots of young people who had come along to the event to learn more about coding, robotics, Minecraft, robotics and robotics. There was definitely a theme going on. Digimakers has grown to be the place to go for hands-on experience of hacking and making. Backed by the University of Bristol (kudos to the ever-energetic Caroline, who does a great deal of the organising) and supported by a host of students and other individuals running their latest coding and hardware inventions, the event had a great vibe all day long.

As usual at Digimakers I set up a table with various demonstrations using Raspberry Pis, mainly focussed around Zumo George, my Behaviour-Driven Development robot. This time around I also included some Crafty Robots, a Hexbug Ant and Cartmanzilla.

[Image: my stand]

[Image: Cartmanzilla, Zumo George, the Crafty Robots and the Hexbug Ant]
Cartmanzilla towers over the city. The little robots wonder how they will escape.

The aim of my table was to present two concepts. Firstly, that programming robots by defining behavioural outcomes (a right-to-left approach, for example Event Storming) rather than a list of functional requirements (a left-to-right approach that may not lead to the desired outcome) allows non-technical people to be more involved in the creation of the robots they will share their environment with. I've written about BDD with Zumo George before. Distilling the essence of BDD (conversations that discover outcomes of value, which enable us to write tests that drive code) down to something easily digestible by youngsters proved challenging, but in general most seemed to understand. I think this was helped by having a working demonstration: Zumo George was given the behaviour "don't get caught by Cartmanzilla", which in practical terms meant using his inbuilt IR sensor to retreat when Cartmanzilla approached and to advance when Cartmanzilla retreated (all over the top of a lovely cityscape given to me by the great Tim Cox).
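For the curious, the essence of that behaviour fits in a very small loop. Everything below is a stand-in (the sensor is faked with random numbers and the drive functions just print), but the structure is the point:

import random
import time

SAFE_DISTANCE = 15  # cm: how close to Cartmanzilla is too close

def distance_cm():
    return random.uniform(4, 30)   # pretend IR sensor reading

def advance(): print("advancing")
def retreat(): print("retreating")
def halt():    print("halting")

while True:
    distance = distance_cm()
    if distance < SAFE_DISTANCE:    # monster too close: run away
        retreat()
    elif distance > SAFE_DISTANCE:  # monster backing off: give chase
        advance()
    else:                           # exactly on the boundary
        halt()
    time.sleep(0.1)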

Secondly, I wanted to explore the idea of how prey avoids predators (and how predators catch prey) by looking at three different robots:
  • The Crafty Robot just moves randomly and cannot react to external stimuli (it can, however, sometimes bounce off things it bumps into).
  • The Hexbug Ant has bump sensors front and rear and can therefore run away from anything it touches.
  • Zumo George can sort-of see (via his infrared sensor) what is in front of him and respond accordingly.
After playing with Cartmanzilla and the robots I asked two questions of the youngsters who came to my table:
  • If you were a mouse escaping from a cat which method (random, touch, sight) would you use to keep away from the cat?
  • If you were a cat trying to catch a mouse which method would you use?
For the first question everyone said sight, which is the obvious answer: assuming there is enough light for the mouse to see, this keeps a decent distance between it and the claws. For the second, I was genuinely surprised that about a third of the students realised the cat would likely use a combination of sight and touch. Cats do just this: as they approach prey they primarily use sight, but when they make the final strike their whiskers swing forward to make contact with the prey, which helps guide their jaws and claws in. To help reinforce this point I played a snippet from a BBC documentary that covers exactly this:

[Video: BBC documentary clip showing a cat's whiskers swinging forward as it strikes]

Watch the whole video, or skip forward to 2m15s where they explain why and show a cat doing this. As the cat gets very close to the mouse it can no longer focus, so it uses its whiskers to guide the prey to its mouth. If you have a pet cat you can likely see this in action: if your cat chases string or small toys, drag a toy in front of the cat to get it to almost-but-not-quite pounce (you may need to do this several times!). When the cat thinks about pouncing but then gives up, you can often (it's quick) see its whiskers twitch: that's the reflex starting to move them forwards (but stopping as the cat gives in). It is harder to see when the cat does pounce, as that happens in the blink of an eye.

The interesting thing here is that my robot, Zumo George, would benefit from exactly this kind of whisker technology. The Sharp GP2Y0A41SK0F infrared sensor is effective from about 4cm to 30cm. When an object is closer than ~4cm the sensor's vision is "blurred" and ineffective. This can be seen in the graph on page four of the sensor's datasheet, which I have reproduced below: it shows the voltage returned on the analog pin for a given distance, and below about 3-4cm the output voltage becomes wildly inaccurate. This is the point at which George's vision blurs, resulting in him sometimes advancing and sometimes retreating, seemingly at random: at this distance he becomes little better at avoiding Cartmanzilla than the Crafty Robots.

[Image: Sharp GP2Y0A41SK0F datasheet graph of output voltage against distance]
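If you fancy reading the sensor yourself, here is a rough sketch. The Pi has no analogue inputs, so I am assuming an MCP3008 ADC wired up over SPI (a common choice, supported by gpiozero), and the conversion constants are an approximate fit to the datasheet curve rather than anything authoritative:

from gpiozero import MCP3008

adc = MCP3008(channel=0)   # sensor's analogue output into channel 0

def distance_cm():
    voltage = adc.value * 3.3             # MCP3008 reports 0.0-1.0
    if voltage < 0.4:
        return None                       # too far away to measure
    distance = 12.08 * voltage ** -1.058  # rough fit to the datasheet
    if distance < 4:
        return None                       # inside the "blurred" zone
    return distance

Returning None below 4cm is the software equivalent of admitting that George's vision has blurred.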

Fortunately this is generally not a problem, as we define George's behaviour such that he should not get within 4cm of Cartmanzilla. We do this in a .feature file that our Behaviour-Driven Development tool of choice (Lettuce, in my case) can parse:

Feature: Move around the city avoiding monsters
In order for Zumo George to keep a safe distance from the monsters
As Zumo George
I want to retreat when the monsters get near

Rules:
- Retreat if a monster is less than 15 cm away
- Advance if a monster is greater than 15 cm away

From the above feature we have articulated and agreed a general outcome: don't get trodden on by Cartmanzilla, as it will ruin your day. We then continue the conversation to discover scenarios of importance. It turns out that there are three, wrapped up with a single line of Background in which we agree how close to Cartmanzilla we think is safe, and we add these to the .feature file:

Background:
Given the minimum distance to the monster is 15cm

Scenario: Advance, there are no monsters
When the distance to the monster is 16cm
Then I should "advance" in relation to the monster

Scenario: Stand still, hoping the monster won't notice me
When the distance to the monster is 15cm
Then I should "halt" in relation to the monster

Scenario: Retreat, there are monsters
When the distance to the monster is 14cm
Then I should "flee" in relation to the monster


As you can see, we have defined George's behaviour to be that he should attempt, whenever possible, to stay at least 15cm from Cartmanzilla (the monster).
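Behind the scenes, a Lettuce step file turns those lines into assertions. The sketch below is my own illustration rather than George's actual code; decide_action() stands in for the real control logic under test:

from lettuce import step, world

@step(r'the minimum distance to the monster is (\d+)cm')
def set_minimum_distance(step, distance):
    world.minimum = int(distance)

@step(r'the distance to the monster is (\d+)cm')
def set_current_distance(step, distance):
    world.distance = int(distance)

@step(r'I should "(\w+)" in relation to the monster')
def check_behaviour(step, expected):
    actual = decide_action(world.distance, world.minimum)
    assert actual == expected, "expected %s, got %s" % (expected, actual)

def decide_action(distance, minimum):
    # The behaviour under test: advance beyond the safe distance,
    # halt exactly on it, flee inside it
    if distance > minimum:
        return "advance"
    if distance == minimum:
        return "halt"
    return "flee"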

Behaviour-Driven Development works when people have conversations to discover new and different outcomes. It was great to work with the youngsters at my table to vary the minimum distance for George. We could immediately see the little robot scurry backwards when our outcome was that it was unsafe to be so close to Cartmanzilla, or scuttle forwards to a new minimum distance when we felt it was safe to be closer. Being able to talk about, model and try out the effects of varying outcomes in a safe way, without causing George to immediately run amok and leap to certain doom from the table edge, was great. The kids definitely seemed to enjoy this modelling exercise, and I did too.

Across the rest of the event a large number of other robots could be seen. Here's Steve. He talks back when you talk to him (and sometimes makes embarrassing mistakes):

[Image: Steve]
This is Steve. Steve was apparently "getting a bit murderous".

Tim Cox ran an excellent workshop and had set up a cityscape full of vehicles, interconnected traffic lights (each using a PiStop, available from 4tronix) and an enormous (by comparison) MeArm controllable by a Raspberry Pi and, I'm guessing, python-curses, judging by the look of the output on the screen. I was impressed with the MeArm. I have previously done something similar using the Maplin Robot Arm and the Pi, but I don't like the imprecise geared motors in the Maplin arm. By contrast the MeArm was much more precise, even though it too is not using stepper motors. The screen you can see is from Banggood.com.
[Image: Tim Cox's MeArm]
"Watch out for the Evil Claw" cried the residents of PiCity.

Someone (sorry, Someone, I didn't catch your name) had created a sentry gun, complete with the required "beep......beep.....beep...beep..beep.beep.beep" of the motion tracker technology heard in the film Aliens.
[Image: the sentry gun]
If you hear this noise run away and hide.

A couple of students presented a fruit music maker connected to a Raspberry Pi. Their approach was different to what I have seen before: they were not relying on you completing a circuit (touch a wire and, with your other hand, touch the fruit to make a sound play), but instead relying on (we think) a capacitive drop when you touched the fruit ("touch the kiwi fruit and absorb its power")... or perhaps it was due to electromagnetic fields. They are currently going through a process of elimination as they work out exactly how it works. However it worked, it worked well.

[Image: the fruit music maker]
Play that funky banana!

Various other workshops and exhibits ran throughout the day, including working with a BBC Buggy and, separately, Hedgehog Bots controlled by Arduino Nanos and invented by Scott and Joe, graduates of the University of Bristol. There was also a horizontal-scrolling game controlled by a device you wear that picks up on electrical activity in the brain: you moved up by thinking more and down by thinking less... it was important not to actively think about thinking less. Sadly I forgot to get a photo of these great projects.

Saving the best till last, there was Josh, who presented an AWESOME persistence-of-vision project: several rows of LEDs spinning at about 1,000rpm (I think that was the speed...). He had animations running, could draw and persist a cube or the time, and all sorts of other patterns. It looked great, was a tidy build, and captivated us all like moths to a light bulb.

[Image: Josh's persistence-of-vision display]
Must...not...look...at...the...lights. Oooooh shiny.

Digimakers has again lived up to expectations, with Caroline and the team keeping everything running smoothly throughout the day.

The next event is currently scheduled for June; I hope to see you there.