Happy New Year!

Wow, it’s been a month since our last update.  That’s partly because we’ve been working on a talk for the upcoming conference, and partly because we’re too lazy after Christmas.  Anyway, let’s update you on progress.

Feed the Fish

We have progress on this game, but there’s more to do.

The robot performs the appropriate moves around the course, aligns itself on the aquarium using video guidance, then stops at the right place based on the front time-of-flight sensor.  The catapult fires, and sometimes, five projectiles land in the aquarium.  But only sometimes.

Why only sometimes?  The robot generally manoeuvres to within a few millimetres of the stop position; it seems unlikely that this is the problem.  More likely is that the catapult does not fire the projectiles consistently.
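
For what it’s worth, the “drive up and stop” step is conceptually simple, which is another reason we don’t suspect it.  Here is a minimal sketch of the idea in Python; the helper functions (read_front_tof_mm, drive, fire_catapult) and the numbers are placeholders standing in for PiDrogen’s real interfaces, not the actual code.

    import time

    TARGET_MM = 250       # firing distance the catapult is calibrated for (placeholder)
    TOLERANCE_MM = 5      # how close is close enough before we fire

    def approach_and_fire(read_front_tof_mm, drive, fire_catapult):
        """Creep towards the aquarium until the front ToF reading hits the firing distance."""
        while True:
            error = read_front_tof_mm() - TARGET_MM
            if abs(error) <= TOLERANCE_MM:
                drive(0)             # stop dead on the mark...
                fire_catapult()      # ...and let rip
                return
            drive(10 if error > 0 else -10)   # creep forward, or back up if we overshot
            time.sleep(0.02)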

The next step for this game is to film some shots in slow motion to see if we can see where the problem is.

Oh, and we need to build an aquarium.

Tidy Up the Toys

’Toys is a much better story.  So long as the moves are kept slow, the game works.  We have quite a few videos of the boxes being autonomously collected and nicely stacked in the target box.  For now, we’re going to call this done.

It might be that we could make the robot move faster, but that would require re-tuning some parameters.  Perhaps if there’s still time once the other games are in the bag?

Up the Garden Path

This is the next game to address.

To begin with, we are going to use the old line following code that was originally used in P19.  It uses a Pi camera to follow the line.  The code will need to be updated because it doesn’t understand junctions, but this doesn’t seem that tricky to do.

Also, the code follows a white line on a black surface, but we will have a black line on a beige background (since our arena is made of hardboard).  Again, a pretty easy change.
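
To show how small a change that is, here is a minimal sketch (not the actual P19 code) of finding a dark line on a light background with OpenCV; the white-on-black version is the same apart from a non-inverted threshold, and the threshold value itself will need tuning for our hardboard.

    import cv2

    def line_offset(frame):
        """Horizontal offset of the line from the image centre (pixels), or None if no line."""
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        roi = grey[-60:, :]                            # look only at the bottom of the frame
        _, mask = cv2.threshold(roi, 80, 255, cv2.THRESH_BINARY_INV)   # dark pixels become white
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None                                # no line in view
        centre_x = m["m10"] / m["m00"]                 # centroid of the line pixels
        return centre_x - roi.shape[1] / 2             # positive means the line is to the right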

The next step for this game is to mark up the course.  Break out the tape!

DIY Obstacle Course

Nothing to report on this game yet.  Our chief driver frequently drives P21 around under radio control so the robot (and driver) are completely ready to take on a course.  It’s just that we don’t have a course yet.

We would like to do the course soon though.  We have a pond which sometimes freezes over.  It would be cool to see some ice driving!  So we need to film this game while the weather is cold.

That’s all for now.  Until next time…

Move Over P20!

P20 (the robot we built for PiWars 2020) had loads of improvements over P19 (the machine we entered for PiWars 2019).  But most of the improvements were kind of… “incremental”: none of them resulted in a big performance increase.  For PiWars@Home we decided there was opportunity for fundamental change.

So, we present P21, our machine for the 2021 games.

P21, Our New Robot for PiWars@Home

I know, it looks just like P20.  But P21 has one big upgrade: it uses new motors.  The motors have encoders, which, when combined with PID-enabled motor drivers, allow much finer control.  And since we’re expecting better control, we’ve gone for a 50% faster output shaft RPM so that the robot is faster too.

This change has taken a long time to complete since it has required some fundamental changes.

  • Our designs employ an Arduino-based peripheral control board which performs the low-level, timing-constrained jobs on behalf of the Raspberry Pi.  We have replaced this board with a new one since each motor now requires a couple of digital signals from its encoder.
  • The peripheral controller software has also been re-written.  The motors are now interfaced via PID algorithms; this means that when the Raspberry Pi requests a motor speed (expressed as a percentage of full speed), that is what it gets.  With the previous (pulse-width-modulated) control scheme, you got a vague approximation.  Control at low speeds is particularly poor with PWM systems that employ no motor speed feedback, because of the friction that is inherent in gearboxes.  (A minimal sketch of the idea follows this list.)
  • The chassis has also been redesigned and re-printed since the motors are a different size.
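
Here is the promised sketch.  The real control loop runs in C on the Arduino, so treat this Python as an illustration of the idea only: a PID controller that trims the PWM duty cycle until the speed measured by the encoder matches the speed the Raspberry Pi asked for (the gains are placeholders).

    class MotorSpeedController:
        def __init__(self, kp=0.8, ki=2.0, kd=0.0):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.previous_error = 0.0

        def update(self, requested, measured, dt):
            """Return a PWM duty cycle (-100..100) given requested and measured speeds (%)."""
            error = requested - measured               # how far off the mark we are
            self.integral += error * dt                # accumulated error removes steady-state offset
            derivative = (error - self.previous_error) / dt
            self.previous_error = error
            pwm = self.kp * error + self.ki * self.integral + self.kd * derivative
            return max(-100.0, min(100.0, pwm))        # clamp to the driver's range

The measured speed comes from counting encoder edges over each control period; at low speeds the integral term is what lets the controller push through the gearbox friction that defeats open-loop PWM.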

There is one other difference between P20 and P21: the implement interface has been changed.  P20 provided a bunch of servo outputs for implements, so it was simple to connect servos with no further electronics.  P21’s implement interface provides all the internal voltages (5.2v, 6v and the battery voltage) plus a two-way serial interface.  This change was largely imposed by the toy box lift, which uses motor pulses to measure the position of the jaws above the ground; the distance of travel is too much for a conventional servo.  The new interface is far more flexible than the old system, but it requires every implement to have a dedicated control board.  There’s no such thing as a free lunch.
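
To give a feel for the new interface, here is a minimal sketch of the sort of exchange it allows between the Raspberry Pi and an implement’s control board.  The port name and the command strings are invented for illustration; the real protocol is still settling down.

    import serial

    def set_lift_height(height_mm, port="/dev/ttyUSB1"):
        """Ask a (hypothetical) implement board to move its lift, and return its reply."""
        with serial.Serial(port, 115200, timeout=1.0) as link:
            link.write(f"LIFT {height_mm}\n".encode())     # command travels one way...
            return link.readline().decode().strip()        # ...status comes back the other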

P21’s hardware and low-level software are now largely complete.  We now need to work back through the higher-level code in the Raspberry Pi to take advantage of P21’s increased performance.  We’re hoping to see some big improvements soon!

The Tidy Up the Toys Challenge

We have progress to report on the Tidy Up the Toys challenge.

To begin with we thought we would adapt our Eco-Disaster barrel lifter (shown below) for the Tidy Up the Toys challenge.

Our Eco-Disaster Barrel Lift

However, this has a few shortcomings:

  • The barrel lift only raises the barrels by about 15mm.  This is plenty for Eco-Disaster but not enough to place one toy box (the boxes are 50mm tall) on top of another.
  • When the barrel lift is carrying a barrel, the camera is more-or-less blinded.  To be honest, this is not ideal for Eco-Disaster either.
  • When PiDrogen is picking up a barrel it uses its forward-facing time of flight (ToF) sensor to measure the distance to the barrel.  But the ToF sensor is mounted too high in the chassis to see a box; instead, it looks over the top.
  • Finally, various components of the barrel lift are mounted under PiDrogen’s hull.  As a result, it is necessary to use large diameter tyres when using it.  However, autonomous manoeuvres work better when PiDrogen is fitted with smaller diameter wheels.

We ended up designing a new lift system; see below:

Tidy Up the Toys Lift

The new design addresses the issues previously mentioned:

  • It can lift boxes beyond 50mm, so it can place one box on another.  It can also lift one, two or three boxes, so it can place two boxes on a third box, then lift all three.
  • The lift mechanism now uses a rack and pinion system which operates more smoothly than the barrel lift (which used a servo to create the lifting motion).  This means the lift can reliably lift three boxes without them toppling.
  • If the lift moves high enough (up to about 80mm) then the camera can see under the lift mechanism, even if it is carrying boxes.  So, the lift can pick up a box, raise it above the camera’s field of view, then use camera guidance to find a further box.
  • A new ToF sensor is mounted low on the static frame of the lift so that it can measure the distance to a box when lining up to pick it up.
  • The new design is compatible with the smaller diameter wheels, as can be seen in the image.

The toy box lift works reliably under radio control.  The next step is to write the code to allow the robot to play the game autonomously.

Hopefully, we will get to play Eco-Disaster one day.  Assuming we do, we will probably adapt this design to lift the Eco-Disaster barrels as it has several advantages over the previous barrel lift:

  • It could carry two (or possibly even three) barrels at once.
  • The robot can be fitted with small diameter wheels which are better for delicate manoeuvres.
  • The camera can still see forwards when a barrel has been lifted.  So, it should be possible to steer around obstructing barrels.

Now, let’s go write some toy-box-locating vision software.
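
For the curious, here is a minimal sketch of the kind of thing we mean: an OpenCV colour threshold that finds the largest red blob in a frame.  The HSV thresholds are placeholders and will need calibrating against our boxes and our lighting.

    import cv2

    def find_red_box(frame):
        """Return the bounding box (x, y, w, h) of the largest red blob in a BGR frame, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Red wraps around the hue axis, so combine a low range and a high range.
        low = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
        high = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
        mask = cv2.bitwise_or(low, high)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        return cv2.boundingRect(max(contours, key=cv2.contourArea))

Steering then becomes a matter of keeping that bounding box centred in the frame as the robot closes in.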

Progress on Feed the Fish

It was soon clear that our existing Nerf gun design was not well suited to the new shooting game, “Feed the Fish”.  The gun is designed to hit a target with a projectile trajectory that is as flat as possible, but the new game needs the projectiles to drop into the opening in the top of a box.  Also, the game is timed, and the rules permit firing five projectiles at once.

We decided that it would be better to start again, this time with a catapult-style device that could lob projectiles rather than shoot them like a gun.  It could then also fire five rounds at once.

This is a render of the first design, which is intended to be 3D printed:

Render of our Feed the Fish Catapult

We realised that the device does not need to be particularly powerful, so it uses a rubber band (connecting the arm to the rubber band mount) to store energy.  The amount of energy stored can be adjusted by adding more bands or by tightening them.

The amount of travel in the arm can also be adjusted with the trajectory adjuster.  The adjuster is mounted between two arced slots; if it is fitted towards the bottom of the slots then the arm moves a small amount, and the projectiles fly mainly upwards with little forward movement.  If the trajectory adjuster is fitted towards the far end of the slots, the projectiles fly lower but travel further.  After a few test fires we found a good position, roughly halfway round the arcs.

To fire the catapult the servo lifts the trigger momentarily, releasing the arm.  Reloading is then a case of pushing the arm back down to tension the band and replacing the five Nerf “cannon” balls.  When the arm is pushed down the trigger automatically locks it in place, ready to fire.  Reloading can be done quickly, which was one of our requirements for the new design.
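
The firing action is about as simple as code gets.  Here is a minimal sketch assuming the trigger servo were driven straight from a Pi GPIO pin with gpiozero; in practice it hangs off the robot’s peripheral controller, and the pin number and angles here are placeholders.

    from time import sleep
    from gpiozero import AngularServo

    trigger = AngularServo(17, min_angle=0, max_angle=90)   # placeholder pin and range

    def fire():
        trigger.angle = 60      # lift the trigger clear of the arm
        sleep(0.3)              # long enough for the arm to fly
        trigger.angle = 0       # drop the trigger, ready for the next reload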

This system works well; once adjusted it frequently achieves five out of five shots into the aquarium.  But it does require that the robot return to the fire position very accurately.

Having the robot return to the same position rapidly might be difficult to achieve.  So, we designed a second version which includes a turntable and camera mount, the idea being to rectify small location inaccuracies by aiming the catapult autonomously with a vision system.

Here is a render of version 2:

Version 2 Catapult. Modified for Autonomous Aiming.

This version has the same catapult mechanism as version 1.  But it is mounted on a turntable which can be turned by the targeting servo.  A Raspberry Pi camera module is also attached so that the whole system can be rotated (by about +/-10 degrees).  This is a small amount, but we think it is about right for targeting a gun.

Version 2 also fires forward (version 1 fires sideways).  This is useful since it allows the robot to approach the firing position while measuring the distance to the aquarium with its forward-facing ToF sensor.

To begin with we will try to use PiDrogen’s odometry, ToF and inertial measurement systems to align the whole machine on the target.  But if this is too unreliable (i.e. too many shots miss) then we will write code to automatically target the aquarium based on camera feedback to make small turntable adjustments.
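
If we do need the vision fallback, the correction itself is only a couple of lines.  Here is a minimal sketch; the field-of-view figure is for a Pi camera module v2, and the rest (how we find the aquarium in the image, how the servo is commanded) is left as placeholders.

    FOV_DEG = 62.2    # approximate horizontal field of view of a Pi camera module v2

    def aim_correction(target_x_px, frame_width_px):
        """Degrees to turn the turntable so the target sits on the image centreline."""
        offset_px = target_x_px - frame_width_px / 2
        return offset_px * FOV_DEG / frame_width_px    # small-angle approximation

Each frame we would find the aquarium opening in the image, nudge the targeting servo by a fraction of this correction, and fire once the error is small enough.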

We’re pleased with the turntable design.  Assuming it works well we will adjust the Nerf gun to use it too.

Getting ready for PiWars@Home

So, what about all these new challenges then?  The new challenges for PiWars@Home were announced at the start of September 2020.

As in previous competitions, there are several ways to perform each challenge with differing amounts of points available for each approach.  We will make every effort to gain maximum points for each round so will generally attempt the most challenging method.

Let’s look at each game:

Feed the Fish

The rules say that the robot can be remote controlled or autonomous, but it is not clear if an autonomous approach yields more points.  However, the game is against the clock and it is possible that an autonomous machine could be quicker: delicate aiming by radio control can be time consuming.  For this reason, we will keep an autonomous design in mind.  To begin with we will try to use PiDrogen’s odometry, inertial measurement and time of flight systems to manoeuvre the robot from the loading to the firing location on the arena.  However, we might also use video guidance if the simpler systems are insufficient.

Since the game is against the clock, we will try to create a design which fires five rounds at once.  The design will also need to be quick to re-load (since we think the re-loading time is part of the game).  Finally, the launching mechanism will be based on a catapult so that projectiles can be lobbed rather than shot with a flat trajectory so that they can drop into the top of the “aquarium”.

Extra “artistic merit” points are available for a nicely decorated aquarium.  Since this is a robotics competition, we’re thinking automaton.  But only if there’s time at the end.

Tidy Up the Toys

Again, we will be seeking the highest scoring approach to the game.  This requires that the robot stacks the boxes using an autonomous control system.

We have a barrel lift mechanism that was built for the Eco-Disaster game in the 2020 games.  This works well for barrels and could easily be adjusted to lift the toy boxes (by changing the jaws).  But it could not lift one box and place it on top of another.  So, a redesign is needed.

The game is against the clock, so we hope to lift the red box, drive to the green, place the red on it, then lift both the red and green box and drive them together to the blue box.  Finally, the robot will lift all three boxes and deliver them; hopefully, this is quicker than delivering the boxes one at a time.

We’re thinking that we might take another look at the motor interface software; it will be useful for this challenge if the robot can be made to make smooth movements, i.e. where acceleration and deceleration rates are controlled.  This will be a nice addition to the robot in any case.
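
The smoothing we have in mind is nothing fancy; a slew-rate limiter along these lines (the step size is a placeholder), applied to each motor’s requested speed, would bound both acceleration and deceleration.

    MAX_STEP = 4.0    # maximum change in speed (% of full speed) per control tick

    def ramp(current, requested):
        """Move the commanded speed one bounded step towards the requested speed."""
        delta = requested - current
        if delta > MAX_STEP:
            delta = MAX_STEP
        elif delta < -MAX_STEP:
            delta = -MAX_STEP
        return current + delta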

Up the Garden Path

This game is a variation of previous line follow games.  PiDrogen can already use video guidance to follow a line, although it has no code to resolve routes at junctions.  It would be fun to add this functionality, but video guidance is not the highest scoring approach to the game.

Instead, we will attempt to provide PiDrogen with a speech recognition system, this being the highest-scoring method.

Analysis of the route suggests that we could use a fairly minimal repertoire of commands; we propose to use directions based on the numbers of a clock face.  To follow the route the driver should say “three”, “two”, “three” to get to the start of the first curve.  Then the driver should say “anti”, meaning drive around the semi-circular part of the curve.  The full set of words to be resolved will be: two, three, nine, ten, anti and clock.  The robot can stop at the end of the course based on the front-facing time-of-flight sensor.
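
Once a word has been recognised, acting on it is straightforward.  Here is a minimal sketch of the command mapping we have in mind; the heading values and the helper functions are placeholder assumptions, and the speech recognition front end itself is still to be chosen.

    # Clock-face directions as headings relative to straight ahead (12 o'clock = 0 degrees).
    CLOCK_HEADINGS = {"two": 60, "three": 90, "nine": -90, "ten": -60}

    def handle_word(word, turn_to_heading, follow_curve):
        """Dispatch one recognised word to a (hypothetical) manoeuvre function."""
        word = word.lower().strip()
        if word in CLOCK_HEADINGS:
            turn_to_heading(CLOCK_HEADINGS[word])               # head off towards that clock position
        elif word in ("anti", "clock"):
            follow_curve(anticlockwise=(word == "anti"))        # drive round the semi-circular section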

We have a pair of Bluetooth headphones with excellent sound quality.  We hope to be able to connect these to PiDrogen’s Raspberry Pi brain and have it resolve verbal commands; this approach is explicitly allowed in the rules.  We think this will be more reliable than mounting a microphone on the chassis, where ambient noise is likely to be a problem.

PiDrogen is quick at following a line with the camera.  It is even quicker making moves based on odometry and the IMU.  We may have to choose between a fast time recorded using odometry / video guidance or a slower time using voice guidance.

DIY Obstacle Course

At the moment we don’t have much of a plan for the obstacle course.  But there are a couple of thoughts at this time:

  • We think PiDrogen is pretty good at moving over obstacles.  P19 did quite well in the obstacle-based challenges in 2019 and P20 has had many updates that should make it better still, such as improved ground clearance, a smooth underside etc.  We want to demonstrate how well our machine can traverse obstacles, so we need to develop a challenging course.
  • In this game we will not go for the maximum points approach (which is an autonomous machine).  Instead we will use radio control so that our chief driver can demonstrate his skills.

That’s all for now.  Next time we will get into detail about one of the new game implements.

Hello Again

Hmm.  Keeping this blog up to date is one of the trickier aspects of PiWars.  A lot has happened since the last entry, both with our robot build and in the world.  This year has rather taken the wind out of our sails (and everyone else’s I think). Throughout lockdown, and now lockdown 2.0, we’ve been very slowly working on our robot and now we are working towards PiWars @ Home. 

So, let’s start with where we had got to in March, the time PiWars 2020 was postponed.  PiDrogen2020 (P20) was looking pretty impressive sporting the re-designed Nerf gun for Zombie Apocalypse.

P20 armed and ready for the Zombie Apocalypse

P20 was ready to take on many of the challenges:

  • P20 is quicker and has better ground clearance than PiDrogen 2019 (P19) so we were confident it was going to perform well on the obstacle course.
  • We can now fit mecanum wheels.  Nathan had been practising driving with them ready for Pi Noon, which is pretty tricky to do well.
  • We were excited to show our upgraded Nerf gun to the world for Zombie Apocalypse.  The new design is more accurate and powerful.  It is also easier to align as the laser sight can be adjusted with thumb screws (machine screws actually).
  • Everything was also ready for Lava Palava which was to be performed using video camera guidance; this was a straight lift from P19.
  • We were also fully prepared for Escape Route.  P20 has three time-of-flight sensors mounted in its face plate; one facing forward, one left, one right (as can be seen in the photo above).  These allow P20 to follow a wall and detect when the wall comes to an end.  The maze route was pre-programmed.  (A minimal sketch of the wall-following idea follows this list.)
  • Eco-Disaster was a little behind the curve.  An arm was developed that mounts under the robot chassis and this worked well; it can easily pick up a barrel.  But the code was easy to defeat. A barrel could easily be knocked over if it were between another barrel and the delivery zone – more work was needed (or kind/lucky barrel placement on the day!)
  • We also needed to do more work for Mine Sweeper.  The plan was to mount a mirror ball above an upward facing camera.  Experiments showed that it should work and that the robot should be able to see the entire arena.  The code was set up for mecanum wheels and looked ready to test.  However, we never built the mirror ball mount or the test arena.
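
As promised above, here is a minimal sketch of the Escape Route wall-following idea.  The distances, the gain and the helper names are placeholders rather than P20’s actual code; the point is simply to hold the side ToF reading constant and treat a sudden large reading as the end of the wall.

    TARGET_MM = 120       # desired distance from the wall
    WALL_GONE_MM = 400    # a reading this large means the wall has ended
    GAIN = 0.3            # steering correction per millimetre of error

    def follow_wall_step(side_tof_mm, drive):
        """One control step; returns False once the wall disappears."""
        distance = side_tof_mm()
        if distance > WALL_GONE_MM:
            return False                              # hand over to the next pre-programmed leg
        steer = GAIN * (distance - TARGET_MM)         # drift back towards the target distance
        drive(speed=40, steer=steer)
        return True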

We were pleased with P20. It was on target to be a strong competitor for PiWars 2020.  But we can do better…P21 is coming soon.

To be the first to see it in all its glory, follow @PiDrogen on Twitter or subscribe to our blog.

Hardware Almost Finished

The finish line is in sight for the hardware now.

P20-5

We have hopefully gone through the last major redesign (number 7!) for this year’s robot.  The new design incorporates all the lessons from the test chassis and last year’s robot.  All that remains is to settle on mounting holes in the lid for the Nerf gun.

Printing has started.  Here’s the back controller module…

IMG_0974

This allows us to select the game and provides start and stop buttons.  The bottom line reports the battery voltage and goes red when the battery is getting close to minimum voltage.  Connection to the Pi is via a USB port.

The face plate is also printed.  Assembly of the front module, which contains a lot of components (3 time-of-flight sensors, 4 LEDs with their drivers and a microcontroller), can start soon.

Pi Noon Practice

IMG_0934

Last year we didn’t do very well in Pi Noon as we were knocked out in the second round.  We decided to try to improve that this year.

To that end we have developed a “Pi Noon Simulator”.  The cylinder at the top bends when pressed and makes a connection.  This lights an LED via a pulse stretcher which counts as a “POP”.

We’ve got one unit mounted on last year’s PiDrogen and another mounted on our new mecanum test chassis.  We can now have unlimited games of Pi Noon without going through a shed load of balloons.

This really is a lot of fun!

Mecanum Wheel Trial

Since our last blog there’s been lots of progress.

IMG_0976

We have drilled out the centres of the mecanum wheels and printed some new hubs.  This allows the gap between the motor and the wheel rim to be reduced so that the robot remains within the maximum width specification.  It also allows a common attachment method between the “balloon tyre” wheels and the mecanum wheels so that a wheel swap between games is viable.

We’ve all, but particularly chief driver Nathan, been driving the test chassis with the mecanum wheels installed.  We’ve been trying to imagine that we are in a Pi Noon battle.  We try to always face the opposition then lunge at the appropriate moment.  Quite honestly it’s tricky.  Last year’s machine is easier to drive although it is not as versatile.  This is going to need some practice.

There’s good news though.  The 3D printed chassis is definitely robust enough.  This leaves us free to go through (hopefully) one last design pass and then to print the final version.

Hmm. One step forward, two back.

ElecStack1

We built some electronics.  This is an Arduino Beetle driving four Pololu motor drivers (which are based on MAX14870 chips).  The lower board (which you can’t really see) holds a couple of DC-DC converters which convert our 4S LiPo voltage down to a nice smooth 5.2v for the Pi and the other electronic bits.  The Pi can talk to the motor driver board via the USB port and direct the speed and direction of the motors.  Well it should do anyway.

Unfortunately, it doesn’t work.  The motor drivers (which are rated at 2.5A peak) report an error (which I think means “too much current”), so the motors just buzz rather than move.  On paper this should all work; the motors have a stall current of 1.4A.  But unless we limit the motor current to about 800mA the drivers refuse to play ball.

I’ve heard it said that you have to massively over-specify motor drivers.  Now I see that’s true.

We are not amused.  This is a setback.