Zumo George gets upgrades (part 1)

Everything eventually needs an upgrade. You may think that pencil 1.0 was great, but if Apple has taught us anything it's that we all need pencil 2.0. I jest, although that said it is time for Zumo George, one of my Raspberry Pi robots, to receive the 2.0 make-over. This is brought on by two things: the arrival of the Raspberry Pi Zero W, and Pimoroni's Explorer pHAT.


Previously I had thought of upgrading from the Raspberry Pi A+ to a Zero purely to save some space, enabling me to get a bit o' real estate back, as George measures but 10cm x 10cm. However I would still have the WiFi dongle a-dongling, only now dangling from a micro-to-full-size USB adapter. Dongles dangling from dongles (there's a song in there somewhere) made me sad: "if only a variant of the Zero came with WiFi", I thought. Fantastic news, Pi fans: the Foundation delivered.

The Raspberry Pi Zero W is essentially a Zero (same CPU, same RAM, same form factor) with the added bonus of a combined WiFi and Bluetooth chip. Also, for our inner geek, the Foundation has included the coolest antenna I've seen yet, which features a triangular resonant cavity. The MagPi magazine covered the antenna in detail just the other day in Issue 55; Proant, a Swedish company, have licensed the tech to the Foundation.
TheMagPi_55_ZeroWAntenna
The MagPi, Issue 55. Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported (CC BY-NC-SA 3.0)

Given the move to the slimmest of Raspberry Pis it is also time to move from the Pimoroni Explorer HAT Pro to the Explorer pHAT. This half-pint-sized board has many of the features of its larger sibling and is a perfect match for the Zero W.

Putting it all together, here is the collection of parts:

Zumo George Pi Zero W upgrade

Any observant bod will quickly notice something missing. Yes, I hang my head in shame and join the "forgot to order a 40-pin header for the Zero" club. D'oh! eBay quickly to the rescue. Given this tiny omission the build is on temporary hold for a few days. Still, let's get the blade in place, because sumo blades == awesome. While we're at it, let's have a preview of where the Zero is going to go. With all ports along one long edge I can now have these poking backwards from George. You can also see the extra space I am gaining from the move from the A+ to the Zero W.

ZumoGeorge_sumo_blade

I am expecting great things from the sumo blade and am already thinking about how to modify my BDD behaviours and code to take advantage: Zumo George shall no longer retreat in fear from Cartmanzilla.

Stay tuned for Part 2, entitled: "Ahah the header has arrived!"

PS: yes those wires are going to get significantly shortened ;)

It's all gone quantum at Digimakers

Last Saturday I had a great time at Bristol's Digimakers. I regularly attend this superb event, running a stand and getting the opportunity to talk computers and science with children, parents and teachers. This time around I focused on Behaviour-Driven Development (which I've covered before) with a side order of LED and ePaper displays for the Raspberry Pi and Pi Zero from Pi-Supply.
Digimakers June 2016

Several organisations and lots of students ran demonstrations, workshops and drop-in help sessions throughout the day. This is something especially neat about Digimakers: it's not focussed on a single technology as the supposed solution to all scenarios, but instead showcases lots of complementary technologies. We had Raspberry Pi, Arduino, custom things that I don't quite understand, and more besides, all used as the basis for a number of very interesting projects.

The computer science and engineering students from the University of Bristol continue to impress. Anthony really hit the nail on the head with his sound wave generator, which produced a fantastic musical accompaniment for the day when hooked up to Apple's Logic Pro X. If you're reading this and looking to hire an audio engineer then he definitely deserves the job!
Andrew's marvellous musical vibrations

Directly opposite was Matthew Hockley with a swarm of cute robots running a simple algorithm in which the proximity of their neighbours triggered different light patterns. We talked about how us fallible humans like to anthropomorphise whenever given the chance, and I postulated that the random movement of his swarm would be seen as "good" or "evil" if he put green smiley faces or red angry faces on top of each robot. Matthew agreed that we tend to read more into such critters than is deserved: they're not really responsible agents, as Alan Winfield notes in his excellent, accessible book Robotics: A Very Short Introduction (which also observes that Asimov's Three Laws were just a plot device, not something to base a real robot on).


They appear to be benign, but if you look closely you can see them plotting world domination.

Students and a teacher from Cotham School were back with their arcade cabinet, and this time also had two "Mini Me" versions (as I like to think of them) present. Sadly I forgot to get a photo, but these proved extremely popular. I think the brief goes along the lines of: "yes, you can play computer games at school providing you program those games." It's a great idea, very well executed.

Talking of schools: I had a great chat with Stewart Edmondson, CEO of the UK Electronics Skills Foundation. They believe absolutely that teaching software is not enough and that kids should be getting hands-on experience of electronics. I wholeheartedly agree! As I started secondary school in the 1980s I caught the last of the software-related computer lessons before "IT" became "ICT", with the "C" somehow (apparently) meaning "Word & Excel". However I never learnt electronics in school and feel I'm enormously behind the learning curve here. Although I've built my own circuits, read lots of tutorials in books and The MagPi magazine, and bought and experimented with stacks of components, it all feels very unstructured, as though I am missing the fundamental underpinnings that school ought to have taught me. There is a huge benefit to learning things when your brain is still wired to absorb knowledge like a sponge. At Digimakers they brought along an electronics project kit called MicroBox to get those brain cells firing and it proved very popular.

Ok, so what has all this to do with the title of this post? One of the workshops focussed on Quantum Computing for kids (yes, you did read that right!) While I unfortunately was unable to get away from my stand for long enough to listen in, I had a wonderful conversation with a 14-year-old girl who popped over afterwards. It started in just the way you don't expect a conversation with a teenager to start: "I'm off to Google to study quantum computing as a way to break ciphers." We then conversed about such things, including a detour to discuss the shape of the universe and the relative sizes of different infinities, the difference between passive and active hacking (which, fortunately, she is very aware of; this difference needs to be taught in schools!), that she'd spent the morning learning about ciphers in Python in one of the sessions, and that she's already up to speed on inspecting web elements and the like... Awesome. This was the highlight of the day for me.

The next Digimakers is on October 29th at At-Bristol. If you are planning on attending you should register in advance as this event is very popular.

Zumo George avoids Cartmanzilla at CukeUp!

BDD_for_8-year_olds
On Thursday 14th and Friday 15th April I went to CukeUp! London 2016. The Behaviour-Driven Development community met for two days to share ideas and skills relating to my favourite delivery methodology. The event was fantastic and on a par with last year. Inspired by day one, and with an open slot to deliver a lightning talk lasting just five minutes, I set about writing a presentation describing Cartmanzilla versus Zumo George... at 2am... after a few beers and a rather tasty gin. The following morning I re-wrote much of my talk to eliminate 95% of the whiz-bang transitions that had somehow crept into several slides (not sure how that happened). For some reason I had also thought a clearly marked slot for a "5 minute" talk was 10 minutes in duration (proof positive that gin slows the passage of time), and quickly edited it again after confirming with Matt Wynne that 5=5 and not 5=10. Still, overall I managed to get 80% of my message across in just five minutes.

Skills Matter, who hosted CukeUp!, have kindly put a video of my talk online. You will need to register (painless and quick) with Skills Matter to view it.

A quick recap: Cartmanzilla the monster has invaded robot city and the plucky little robots have to keep away from him. Only Zumo George is programmable, and his general behaviours (keep away from monsters) are determined by feature files that contain behavioural specifications written in the Gherkin syntax of BDD:

Given [a precondition]
When [an event]
Then [an outcome]

Each line of the Gherkin causes a related block of test code to be executed, and when every line of test code passes your software is green, i.e.: the behaviours are working as expected.
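
To make that mapping concrete, here is a minimal sketch of a step definition in Lettuce (the BDD tool I use with Zumo George). The step text and the zumo.flee() helper are illustrative assumptions rather than my actual code:

from lettuce import step

import zumo  # hypothetical module wrapping George's motor control

@step(u'Zumo George should flee from the monster')
def zumo_george_should_flee(step):
    # Runs whenever a Gherkin line matches the step text; the assert
    # decides whether that line goes green (pass) or red (fail).
    assert zumo.flee(), "Zumo George failed to flee"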

I've covered Zumo George and the use of BDD with this robot over a few prior posts, including a specific write-up of Cartmanzilla vs Zumo George at Bristol Digimakers. What is interesting as I read back over previous posts, and I noted this in my talk, is that my first attempts to write scenarios were essentially attempts to describe the functional aspects of George, whereas my later attempts are closer to the behaviours that I originally envisaged: sneaking towards the monster when he's not paying attention and fleeing when the monster gives chase.


George visits Digimakers to learn about whiskers

I spent yesterday at Bristol Digimakers having a fantastic time meeting lots of young people who had come along to the event to learn more about coding, robotics, Minecraft, robotics and robotics. There was definitely a theme going on. Digimakers has grown to be the place to go for hands-on experience of hacking and making. Backed by the University of Bristol (kudos to the ever-energetic Caroline who does a great deal of the organising) and supported by a host of students and other individuals running their latest coding and hardware inventions, a great vibe could be felt all day long.

As usual at Digimakers I set up a table with various demonstrations using Raspberry Pis, mainly focussed around Zumo George, my Behaviour-Driven Development robot. This time around I also included some Crafty Robots, a Hexbug Ant and Cartmanzilla.

Stand

Cartmanzilla_Zumo_Crafty_Hexbug
Cartmanzilla towers over the city. The little robots wonder how they will escape.

The aim of my table was to present two concepts. The first was that programming robots by defining behavioural outcomes (a right-to-left approach, for example Event Storming), rather than from a list of functional requirements (a left-to-right approach that may not lead to the desired outcome), allows non-technical people to be more involved in the creation of the robots that they will share their environment with. I've written about BDD with Zumo George before. Distilling the essence of BDD (conversations that discover outcomes of value that enable us to write tests that drive code) down to something that is easily digestible by youngsters proved challenging, but in general most seemed to understand. I think this was helped by having a working demonstration: Zumo George was given the behaviour of "don't get caught by Cartmanzilla", which in practical terms meant using his inbuilt IR sensor to retreat from Cartmanzilla when he approached, and to advance when Cartmanzilla retreated (all over the top of a lovely cityscape given to me by the great Tim Cox).

Secondly, I wanted to explore the idea of how prey avoids predators (and how predators catch prey) by looking at three different robots:
  • Crafty Robot just moves randomly and cannot react to external stimuli (it can however sometimes bounce off things it bumps into).
  • Hexbug Ant has bump sensors front and rear and therefore can run away from anything it touches.
  • Zumo George can sort-of see (via his infrared sensor) what is in front and respond accordingly.
After playing with Cartmanzilla and the robots I asked two questions of the youngsters who came to my table:
  • If you were a mouse escaping from a cat which method (random, touch, sight) would you use to keep away from the cat?
  • If you were a cat trying to catch a mouse which method would you use?
For the first question everyone said sight, which is the obvious answer: assuming there is enough light for the mouse to see, sight keeps a decent distance between it and the claws. For the second I was genuinely surprised that about a third of the students realised the cat would likely use a combination of sight and touch. Cats do just this: as they approach prey they primarily use sight, but when they make the final strike their whiskers swing forward to make contact with the prey, which helps guide their jaws and claws in. To help reinforce this point I played a snippet from a BBC documentary that covers exactly this:



Watch the whole video or skip forward to 2m15s where they explain why and show a cat doing this. As the cat gets very close to the mouse it can no longer focus, so it uses its whiskers to guide the prey to its mouth. If you have a pet cat you can likely see this in action: if your cat chases string or small toys then drag a toy in front of the cat to get it to almost-but-not-quite pounce (you may need to do this several times!) When the cat thinks about pouncing, but then gives up, you can often (it's quick) see its whiskers twitch: that's the reflex starting to move them forwards (but stopping as the cat gives in). It is harder to see if the cat does pounce as this happens in the blink of an eye.

The interesting thing here is that my robot Zumo George would benefit from exactly this kind of whisker technology. The Sharp GP2Y0A41SK0F infrared sensor is effective from about 4cm to 30cm. Hence, when an object is closer than ~4cm the sensor's vision is "blurred" and ineffective. This can be seen on the data sheet for the sensor in the graph on page four, which I have reproduced below. This graph shows the voltage returned on the analog pin for a given distance. Below about 3-4cm the output voltage becomes wildly inaccurate. This is the point at which George's vision blurs, resulting in him sometimes advancing and sometimes retreating, seemingly at random: at this distance he becomes little better at avoiding Cartmanzilla than the Crafty Robots.

Sharp_GP2Y0A41SK0F
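
In code this blind zone is easy to guard against. Below is a minimal sketch, assuming the read_distance() helper described in an earlier post and a cut-off of 4cm taken from the datasheet; neither the constant nor the function name is from George's actual codebase:

SENSOR_MIN_CM = 4.0  # below this the sensor's output is untrustworthy

def reliable_distance():
    distance_cm = read_distance()
    if distance_cm < SENSOR_MIN_CM:
        return None  # too close to trust: George's vision is "blurred"
    return distance_cm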

Fortunately this is generally not a problem as we define the behaviour of George such that he should not get within 4cm of Cartmanzilla in a .feature file that our Behaviour-Driven Development tool of choice (Lettuce in my case) can parse:

Feature: Move around the city avoiding monsters
In order for Zumo George to keep a safe distance from the monsters
As Zumo George
I want to retreat when the monsters get near

Rules:
- Retreat if a monster is less than 15 cm away
- Advance if a monster is greater than 15 cm away

From the above feature we have articulated and agreed a general outcome: don't get trodden on by Cartmanzilla as it will ruin your day. We then continue the conversation to discover scenarios of importance. It turns out that there are three, wrapped up in a single line of background in which we agree how close to Cartmanzilla we think is safe, and we add these to the .feature file:

Background:
Given the minimum distance to the monster is 15cm

Scenario: Advance, there are no monsters
When the distance to the monster is 16cm
Then I should "advance" in relation to the monster

Scenario: Stand still, hoping the monster won't notice me
When the distance to the monster is 15cm
Then I should "halt" in relation to the monster

Scenario: Retreat, there are monsters
When the distance to the monster is 14cm
Then I should "flee" in relation to the monsters


As you can see, we have defined George's behaviour to be that he should attempt, whenever possible, to stay at least 15cm from Cartmanzilla (the monster).
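
For illustration, the production code these scenarios drive boils down to a three-way comparison. This is a sketch of the idea rather than George's actual implementation; the function name and layout are mine:

def action_for(distance_cm, minimum_cm=15):
    # Decide George's behaviour relative to the monster, mirroring
    # the three scenarios above.
    if distance_cm > minimum_cm:
        return "advance"  # the monster is comfortably far away
    elif distance_cm == minimum_cm:
        return "halt"     # stand still and hope it doesn't notice
    else:
        return "flee"     # too close for comfort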

Behaviour-Driven Development works when people have conversations to discover new and different outcomes. It was great to work with the youngsters at my table to vary the minimum distance for George. We could immediately see the little robot scurry backwards when our outcome was that it was unsafe to be so close to Cartmanzilla, or scuttle forwards to a new minimum distance when we felt the outcome was that it was safe to be closer. Being able to talk about, model and try out the effects of varying outcomes in a safe way, without causing George to immediately run amok and leap to certain doom from the table edge, was great. The kids definitely seemed to enjoy this modelling exercise, and I did too.

Across the rest of the event a large number of other robots could be seen. Here's Steve. He talks back when you talk to him (and sometimes makes embarrassing mistakes):

Steve
This is Steve. Steve was apparently "getting a bit murderous".

Tim Cox ran an excellent workshop and had set up a cityscape full of vehicles, interconnected traffic lights (each using a PiStop, available from 4tronix) and an enormous (by comparison) Me Arm controllable by a Raspberry Pi and, I'm guessing, python-curses, judging by the look of the output on the screen. I was impressed with the Me Arm. I have previously done something similar using the Maplin Robot Arm and the Pi, but I don't like the imprecise geared motors in the Maplin arm. By contrast the Me Arm was much more precise even though it too is not using stepper motors. The screen you can see is from Banggood.com.
MeArm_TimCox2
"Watch out for the Evil Claw" cried the residents of PiCity.

Someone (sorry Someone, I didn't catch your name) had created a Sentry Gun complete with the required "beep......beep.....beep...beep..beep.beep.beep" of the motion tracker heard in the film Aliens.
TrackingGun2
If you hear this noise run away and hide.

A couple of students presented a fruit-music maker connected to a Raspberry Pi. Their approach was different to what I have seen before, as they were not relying on one completing a circuit (touch a wire, and with your other hand touch the fruit to make a sound play), but were instead relying on (we think) a capacitive drop when you touched the fruit ("touch the kiwi fruit and absorb its power")... or perhaps it was due to electromagnetic fields. They are currently going through a process of elimination as they learn exactly how this works. However it worked, it worked well.

Fruit_music
Play that funky banana!

Various other workshops and exhibits ran throughout the day, including working with a BBC Buggy and, separately, Hedgehog Bots controlled by an Arduino Nano and invented by Scott and Joe, graduates of the University of Bristol. There was also a horizontal-scrolling game controlled by a device one wears that picks up on electrical activity in the brain; you moved up by thinking more and down by thinking less... it was important not to actively think about thinking less. Sadly I forgot to get a photo of these great projects.

Saving the best till last, there was Josh, who presented an AWESOME Persistence of Vision project: several rows of LEDs spinning at about 1000RPM (I think that was the speed...). He had animations running and could draw and persist a cube, the time, and all sorts of other patterns. It looked great, was a tidy build, and captivated us all like moths to a light bulb.

Persistance_Of_Vision_PoV_Josh
Must...not...look...at...the...lights. Oooooh shiny.

Digimakers has again lived up to expectations with Caroline and the team keeping everything running smoothly throughout the day.

The next event is currently scheduled for June; hope to see you there.

Robotics 2: Using the Sharp GP2Y0A41SK0F IR distance sensor with Explorer HAT Pro

The HC-SR04 is a clever component. By measuring the time delay between signals from the included board one can easily calculate the distance to objects. Well, that's the theory. Unfortunately it turns out that it does not play that well with the Explorer HAT Pro board it is connected to when using the provided explorerhat library. I've observed that ranges to perpendicular objects (which should give the best result) are miscalculated by up to a metre (plus or minus). This seems to be a timing issue: the HC-SR04 does not return the distance to an object, but instead holds a connected pin high for the same length of time as it took a pulse of ultrasound to bounce back.
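
To see why timing matters so much, here is a rough sketch of how the HC-SR04 is conventionally read, using RPi.GPIO and example pin numbers rather than George's actual wiring. The distance comes entirely from timing the echo pulse in userspace Python, which is exactly where scheduling jitter can creep in:

import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24  # example BCM pin numbers, not George's wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def hc_sr04_distance_cm():
    # A 10 microsecond pulse on TRIG starts a measurement
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    # ECHO stays high for as long as the ultrasound took to bounce
    # back; we time that window from Python, at the mercy of the
    # operating system's scheduler
    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:
        pulse_end = time.time()
    # Speed of sound is ~34300 cm/s; halve for the round trip
    return (pulse_end - pulse_start) * 34300 / 2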

Instead, Zumo George has received an upgrade in the form of the Sharp GP2Y0A41SK0F infrared distance sensor. This has the added benefit of making him look more Goliath* and less WALL-E. This is an analog device that runs at 5V which matches perfectly with the Explorer HAT’s analog inputs. It measures distances accurately from 4cm to 30cm which, for the purpose of “don’t crash into an object” is perfect.
ZumGeorge_with_IR
Wiring is straightforward: GND and Vcc go to a ground pin and a 5V pin respectively, and the third cable goes to one of the analog pins (I use pin four). I bought the sensor on eBay for a few quid and the cable it came with did not have 0.1" pins attached at the breadboard end of things. A quick bit of soldering and heat-shrinking later and we're ready to go.

Next I needed to add to my library of tests for Zumo George to ensure that when he boots up all is A-OK. Let’s write a scenario:

@ir
Scenario: Verify infrared range finder is responding
  Given the distance to an object is greater than 10 cm
  When I read the infrared range finder
  Then I should see a distance greater than 10 cm


Now we execute this with Lettuce. Note that I have added a tag with the @ symbol to enable me to run just this scenario while we get it working. Hence we run:

sudo lettuce -t ir

We need sudo because Pimoroni’s Explorer HAT Pro requires this, and we use the -t parameter to specify the tag to execute.

Immediately we see that our three steps have not been defined, and Lettuce helpfully returns suggested step definitions that assert False. Copying these into place and re-executing moves us from not implemented to not coded, with each of the three steps going red.
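
The suggested snippets look roughly like this (paraphrased from memory rather than copied verbatim from Lettuce's output):

@step(u'the distance to an object is greater than 10 cm')
def the_distance_to_an_object_is_greater_than_10_cm(step):
    assert False, 'This step must be implemented'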

At this point we need to implement some code to make everything work as it should. To do this I have decided to create a new Python module, zumo.py which will contain specific functions required by Zumo George. We are going to need a way of determining distance by using the GP2Y0A41SK0F sensor, hence in zumo.py I enter the following to create a read_distance() function that we can call from a step definition:

import explorerhat

def read_distance():
  # with help from: http://www.yoctopuce.com/EN/article/an-usb-optical-telemeter
  # and an idea from http://jeremyblythe.blogspot.co.uk/2012/09/raspberry-pi-distance-measuring-sensor.html
  # Read the voltage on analog pin four ten times so that we can take
  # a decent average and smooth out noisy individual samples
  v_readings = []
  for _ in range(10):
    v_readings.append(explorerhat.analog.four.read())

  av_voltage = sum(v_readings) / 10.0
  # Guard against a zero (or nonsensical negative) reading, which
  # would otherwise cause a division-by-zero error below
  if av_voltage <= 0:
    av_voltage = 0.001
  # Distance is inversely proportional to voltage; 13 is the magic
  # number from the datasheet (a 1V reading means ~13cm, see below)
  distance_cm = 13 / av_voltage
  return distance_cm


The GP2Y0A41SK0F works on the principle that the distance to an object is inversely proportional to the voltage read from the connection on analog pin four. In other words, the higher the voltage the lower the distance, and our equation takes this into account. The number 13 was determined by looking at the datasheet on the Pololu product page: from the graph we can determine that a reading of 1V means we are 13cm away from our object, i.e.: 13/1 = 13cm. I got the idea to do things this way from Yoctopuce, who used another Sharp IR sensor. Their magic number was 60 for the GP2Y0A02YK0F, which they obtained in the same way.
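
As a quick sanity check of the inverse relationship, plugging a few illustrative voltages (made-up readings, not measurements) into the formula gives sensible distances:

for v in (2.6, 1.0, 0.65):
    print("%.2fV -> %.1fcm" % (v, 13 / v))
# 2.60V -> 5.0cm, 1.00V -> 13.0cm, 0.65V -> 20.0cm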

Into steps.py I then add the following three step definitions, as well as the import zumo statement. The Given sets up the expected result, the When is the event (i.e.: read the distance) and the Then is the comparison. I will be honest and say that in Python I don't know if using global variables in this way is a good thing or not (note to self: must research this), but it works, so at this stage I am not too concerned:

from lettuce import step

import zumo

@step(u'the distance to an object is greater than (.*?) cm')
def the_distance_to_an_object_is_greater_than(step, distance):
    # Given: remember the minimum distance we expect to see
    global minimum_expected_distance
    minimum_expected_distance = distance

@step(u'I read the infrared range finder')
def i_read_the_infrared_range_finder(step):
    # When: take an actual reading from the sensor
    global actual_distance
    actual_distance = zumo.read_distance()

@step(u'I should see a distance greater than (.*?) cm')
def i_should_see_a_distance_greater_than(step, expected_distance):
    # Then: compare the reading against the expectation
    assert float(actual_distance) > float(minimum_expected_distance), "Distance returned = %d" % actual_distance


Running Lettuce a third time now shows everything is green, meaning our scenario is passing and our code to generate distances is working. Well, sort of. We have to remember that we are now mixing a controlled software environment with real-world robotics and as the adage goes, “anything can happen in the next half hour.” Clearly this scenario will only pass if the distance from Zumo George to an object is greater than 10cm. For me this is perfect as George always starts his working day facing away from obstacles. We could of course change our code to simulate the response from the GP2Y0A41SK0F (not the easiest of components to pronounce, or spell) but then we are not demonstrating the desired real-world behaviour: that when Zumo George is not facing a nearby object he shall be a happy robot ready to drive.
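
If we ever did want the simulated route, a one-line monkeypatch in the step file would be enough; the 42.0 here is an arbitrary stand-in reading, not anything from the real sensor:

import zumo

# Pretend nothing is nearby by replacing the real sensor read
zumo.read_distance = lambda: 42.0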

ir_test

You may note that something strange has happened though. We are clearly only running one scenario against a single feature by specifying the tag ir, yet Lettuce reports that 2 features have passed (which you would think implies a minimum of two scenarios executed). I think of it this way: by filtering absolutely to a single tag containing a single scenario, we have told Lettuce that only one feature is within scope, and have therefore taken responsibility for declaring that the other feature passes, so Lettuce counts it as such. This fits with BDD being all-or-nothing: to demonstrate that software is fit for purpose all of our scenarios must pass. Another way to think of it is that as the other feature contains no scenarios that we execute, there is nothing that can fail, i.e.: it passes by default. It is something to be aware of though, as it can make your numbers look a little strange if you are not used to seeing this outcome.

Next time: The Ryanteck 3 Line Follow sensor

* The evil truck from Knight Rider.

Robotics 1: Zumo George, a BDD rover

It is time to build a new rover, and to take a different look at how it is controlled: by a series of behaviours defined using the tenets of Behaviour-Driven Development. BDD is a superb way to undertake development as it emphasises genuine communication and collaboration between business stakeholders, developers and testers. Behaviours are defined as features and scenarios, the latter elicited as specific examples using the Gherkin syntax, for example:

Scenario: Drive forwards
   Given Zumo George is greater than 10cm from a wall
   When power is applied to the motors
   Then Zumo George should drive forwards

BDD scenarios are written before other code, and determine what code is written. Hence there is no wastage: we develop only what is required and it most likely passes first time. When the scenarios (the tests) are executed we can clearly see which steps pass (green), fail (red), or have not yet been implemented (that sort of muddy green-yellowy-brown).

I will be exploring the use of BDD to program this rover in quite some depth over a number of articles, intermixing some electronics to show how BDD is an ideal abstraction to explore behaviour-based robotic control. I’m going to term this BDR, Behaviour-Driven Robotics.

ZumoLasers2

In the above example you will note two important points. Firstly, in the second feature I am defining scenarios that can be used for an internal diagnostic on Zumo George, for example: "Confirm ultrasonic is responding". These tests will be executed every time Zumo George boots, and if any test fails then he will flash a red light and not commence roving, which should avoid the problem of an out-of-control robot. Perhaps more importantly, you can see that George has no lasers and has not been programmed to use them in any case (the first three undefined steps). This is fortunate for obvious reasons.

You will already have noticed that I have named my robot Zumo George and given him agency. I think this is a good thing as he will be expected to mimic certain human behaviours (e.g.: don’t bump your head on a wall while walking / driving). Agency in a robot enables me to mimic human-like behaviours in code. It does however mean I will find myself leaning towards anthropomorphism, referring to Zumo George as “he” rather than “it”.

The chassis of the rover is based on Pololu's Zumo (hence the rover's full name) and Pimoroni's recently released Explorer HAT Pro. The Zumo is available in a number of variants for Arduino, such as the all-singing Zumo 32U4 which has sensors, buzzer, LCD, accelerometer and more, and also the bare-bones Zumo Chassis Kit which is perfect for the Raspberry Pi as we can add our own electronics.

ZumoGeorge

The Zumo is very compact, meaning that any model B is borderline too large. Whoever would have thought the words "too large" would be used to describe the Raspberry Pi! This is because the Zumo is designed to take part in Mini Sumo competitions where the robot must conform to dimensions of 10cm by 10cm. To be honest, a B/B+/v2B would just about squeeze into the available space, and certainly others have created Zumo robots using the B. To minimise the footprint I have opted for a Model A+, which is almost small enough (ahh, if only it was 1.5mm thinner; more about this in a later article).

I purchased my Zumo from those excellent people at Pimoroni, and also elected for two 95:1 ratio motors. The motors are intentionally purchased separately to enable you to opt for those that best match your robotics need (essentially outright speed versus torque). You can easily drop in replacement motors if you change your mind at a later date as the plastic motor cover on the chassis is removed with just two screws.

ZumoGeorgeInPieces

At this point a confession is in order: I like to tinker with things and see if I can break them. I think this is because my day job is in software testing. Unfortunately this tendency to meddle caused me to break one of the motors, necessitating ordering a third. The lesson quickly learned is never manually rotate the drive shaft of the motor, as you will quickly grind down the gears until it slips horrendously in use. I share this in the hope that you only have to purchase two motors. On the plus side I do have a (slightly crippled) spare motor I can now disassemble to better understand how the gearing works. To be honest the motor mechanism itself is fine and it is just the cogs that are worn, so there is hope yet that I can resurrect this for a future project.

I considered various options to control Zumo George’s motors and in the end put four possible solutions up against each other in a winner-takes-Zumo knock-out:

PicoBorg is tiny, it really is, and is a great first step into robotic motor controllers. I've had great success using this with a larger robot based on the Magician Chassis. However it is not bi-directional without the use of a pair of 5V relays, and including those immediately bulks out the parts list as one also needs a board and cabling to mount them on. Bi-directional capability is essential in a tracked robot (IMHO) as it provides the ability to turn on the spot and not just in an arc. PicoBorg Reverse was briefly considered as a possible solution, but at £31 for the board was felt to be pushing the budget a bit.

The L298N is the go-to staple of bi-directional controllers. With prices in the £1.50 to £4 region it wins on cost. However it is a comparatively bulky thing with a big heat sink rising vertically, and as a 5V device requires additional circuitry for safe usage with the Raspberry Pi. This makes using it in a compact platform somewhat tricky.

The Ryanteck RPi Motor Controller Board is a great bit of kit. It provides bi-directional motors and is compact. Ryan has done an excellent job of documenting the board and providing example code in Python. The GPIO pins are also exposed making it easy to add further electronics (PicoBorg can be mounted on TriBorg for the same effect). I was all set to purchase this board when I spotted an announcement from Pimoroni...

The Pimoroni Explorer HAT Pro is a crazy good board. Using the available easy-peasy Python library one can control both bi-directional motors and take advantage of a large number of other available inputs and outputs. These include four capacitive touch sensors, four pads to attach crocodile leads to, four buffered 5V input pins, four buffered 5V output pins, two ground pins, and buffered 5V-tolerant analog pins. Also down one edge of the board is an array of unbuffered 3v3 pins. To finish the board off there is even space for an included mini breadboard to be mounted on top. Coming in at the same size as the A+, this is my new favourite wonder board for the Raspberry Pi.
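
As a taster, driving the motors from the library is only a few lines. This is a minimal sketch using the explorerhat library's motor API; the speeds and timings are arbitrary choices of mine:

import time
import explorerhat

explorerhat.motor.one.forwards(100)  # left track, full speed
explorerhat.motor.two.forwards(100)  # right track, full speed
time.sleep(1)
explorerhat.motor.one.backwards(50)  # counter-rotate the tracks...
explorerhat.motor.two.forwards(50)   # ...to spin on the spot
time.sleep(1)
explorerhat.motor.stop()             # all stop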

My parts list is now:

  • Pololu Zumo Chassis
  • 2x 95:1 micro metal gear motors (I needed 3 *ahem*)
  • Pimoroni Explorer HAT Pro with mini breadboard
  • Generic tiny WiFi network adaptor
  • 16GB Integral Micro SD Card
  • Raspberry Pi Camera
  • Bendy arm thing to hold camera up
  • Pimoroni Camera Mount
  • HC-SR04 ultrasonic distance sensor, since replaced by a Sharp GP2Y0A41SK0F IR distance sensor sourced from eBay (see Part Two)
  • 3x plastic legs to raise up the Raspberry Pi Model A+ (more on this next time)
  • An assortment of wires to hook everything up.
  • Ryanteck 3 Line Follow Sensor (not shown in the above photographs)
You will note two seemingly obvious missing items, namely a battery and some kind of game controller to drive Zumo George when he is not in auto-roving mode. I am also investigating pan and tilt mechanisms for the camera and/or distance sensor. These parts will be covered in later articles. I have something cunning in mind for the battery but can't say more about this at present.

Next time: Replacing the HC-SR04 due to a technical hiccup