The Bad News & the Good News

Argh!

First the bad news: we may have been too ambitious when filming the obstacle course. PiDrogen has taken a significant tumble and is now in “dry dock”. I can’t tell you what happened just yet (you’ll have to wait for the out-take at the end of our obstacle course video at the competition), but it was from a substantial height!

Now the good(ish) news: we rarely dismantle PiDrogen’s back axle because it’s too fiddly. But one of the motors is stuck on full power, so there’s no choice. Here’s a rather rare picture inside it…

Inside P21’s back axle

To be honest, I’m amazed there wasn’t more damage.

The Obstacle Course

Yikes!  It’s June.  18 days left.  Better get on with it.

When we record the obstacle course, we plan to run PiDrogen’s recording strategy.  This records a video from the robot’s camera and marks it up with various information, a bit like a Head-Up Display (HUD).  See below.

The HUD reports the heading of the robot (assuming it was booted facing due North, measured by the IMU), the pitch and roll angles (also from the IMU), the battery voltage, and the forward speed.  An artificial horizon is used to represent the pitch and roll angles.
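For anyone curious about the mark-up step, here is a minimal sketch of drawing a HUD like this onto each camera frame with OpenCV. The function name, layout and scaling are all invented for illustration; this is not PiDrogen’s actual recording strategy code.

```python
import cv2
import math

def draw_hud(frame, heading_deg, pitch_deg, roll_deg, battery_v, speed_ms):
    """Overlay heading, pitch/roll, battery and speed on one camera frame."""
    h, w = frame.shape[:2]
    font = cv2.FONT_HERSHEY_SIMPLEX

    # Text readouts down the left-hand side of the frame
    readouts = [
        f"HDG   {heading_deg:6.1f} deg",
        f"PITCH {pitch_deg:6.1f} deg",
        f"ROLL  {roll_deg:6.1f} deg",
        f"BATT  {battery_v:6.1f} V",
        f"SPEED {speed_ms:6.2f} m/s",
    ]
    for i, text in enumerate(readouts):
        cv2.putText(frame, text, (10, 25 + 22 * i), font, 0.6, (0, 255, 0), 2)

    # Crude artificial horizon: a line rotated by roll and shifted by pitch
    cx, cy = w // 2, h // 2 + int(pitch_deg * 2)   # 2 px per degree of pitch (made up)
    dx = int(80 * math.cos(math.radians(roll_deg)))
    dy = int(80 * math.sin(math.radians(roll_deg)))
    cv2.line(frame, (cx - dx, cy - dy), (cx + dx, cy + dy), (255, 255, 255), 2)
    return frame
```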

The plan is to submit an uninterrupted video from a phone (as required in the rules) but use snippets from PiDrogen’s camera as a picture-in-picture.

When the software was written, the maximum speed of the robot was estimated (i.e. not measured) at 1.9 m/s.  But when I thought about it, covering nearly two metres every second seemed pretty quick.  I thought I should check my facts before submitting any videos!

The speed estimation was based on the following data:

  • The motors are rated at 350 RPM at 12 V with no load.  However, the robot uses 4S LiPos, which deliver between 16.8 V (fully charged) and 12.4 V (when the battery protection circuit shuts down).  For the calculation I assumed that, thanks to the extra voltage, the motors would still rotate at 350 RPM when driving the robot (i.e. under load).
  • The obstacle course wheels are 104 mm in diameter.

So, the estimated speed = 350 RPM × 104 mm × π / 60 ≈ 1.9 m/s.

Today we measured out a 5 metre course on our drive and timed the robot driving the length of the course at full speed using fully charged batteries.  We took four measurements, two in each direction (since the drive isn’t quite level).  The measurements were taken by hand (so they are unlikely to be particularly accurate) and were recorded to 0.1 s resolution.

The times measured were 2.2, 2.8, 2.6 and 2.9 seconds.

The average time is 2.6 seconds. 5 metres in 2.6 seconds equates to a speed of 1.9 m/s.
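For the record, here is the same arithmetic as a few lines of Python, using only the numbers quoted above:

```python
import math

# Estimate from the motor spec and wheel size
rpm = 350                 # no-load speed at 12 V, assumed to hold under load on a 4S pack
wheel_diameter = 0.104    # obstacle course wheels, in metres
estimated = rpm * math.pi * wheel_diameter / 60      # ~1.91 m/s

# Measurement: 5 m course, four hand-timed runs
times = [2.2, 2.8, 2.6, 2.9]
measured = 5.0 / (sum(times) / len(times))           # ~1.90 m/s

print(f"estimated {estimated:.2f} m/s, measured {measured:.2f} m/s")
```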

Cool, the estimate was right!  That’s satisfying / a relief!

YACA5 Update

Sorry!  I forgot to update the blog on the new catapult arm. (See my previous blog entry).

Sadly, it didn’t make a difference on the Feed the Fish game.  But I’ll explain it anyway.

Nerf Rival balls are obviously made with a mould which has two hemispherical parts.  The seam where the two parts come together is clear to see.  The balls also have dimples like a golf ball.  YACA5 has slots so that the ball seams can be aligned with the flight path.  I wondered if aligning them might make the balls fly more consistently.

As I said, it doesn’t.

However, I have realised that not all Nerf Rival balls are born equal.  I test fired 40 balls 10 times each.  Some balls were on target 10 times.  One ball only flew correctly 4 times out of 10.

Perhaps the best hint I can give here is this: test your balls and pick the best!

Yet Another Catapult Arm

I’ve fallen into the trap of searching for the Holy Grail of Feed the Fish: that is 15/15 shots.  I’ve done so many “takes” that the course has gone darker (from tyre rubber) where the robot makes a right-angle turn to line up on the aquarium! 

This is my latest attempt at designing a more reliable catapult arm which I will call YACA5:

Yet Another Catapult Arm V5

If it works, I’ll explain it in another blog post.

Wish me luck!

Take a look at our Aquarium!

I’m rather proud of our aquarium. Here’s a picture:

iPads need less cleaning than real fish.

It is a 3D printed frame into which you can slot two (or three) iPads.  The red circle on the bottom is a target for the robot to align on.

The outside of the box is the regulation 200 mm, but the hole in the top is rather smaller (because of the thickness of the iPads).  I reckon it’s worth it, though!

It’s a proud Dad moment: my eldest son (who is also the team driver) created a Scratch animation of the fish swimming about.  We then captured video of it running on a PC, did some Python reformatting of the video to make it full screen in portrait mode, then used Giphy.com to convert it into a gif.  We can open the gif in the Photos app and the animation cycles forever.  Just the job for doing dozens of takes while trying to get 15 out of 15 shots!
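The Python step was nothing clever. It was roughly along these lines (a sketch using OpenCV; the file names and target resolution here are made up):

```python
import cv2

IN_FILE, OUT_FILE = "fish_capture.mp4", "fish_portrait.mp4"
TARGET_W, TARGET_H = 1080, 1920          # portrait, roughly iPad-shaped

cap = cv2.VideoCapture(IN_FILE)
fps = cap.get(cv2.CAP_PROP_FPS) or 30
out = cv2.VideoWriter(OUT_FILE, cv2.VideoWriter_fourcc(*"mp4v"), fps,
                      (TARGET_W, TARGET_H))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.rotate(frame, cv2.ROTATE_90_CLOCKWISE)   # landscape -> portrait
    frame = cv2.resize(frame, (TARGET_W, TARGET_H))      # stretch to fill the screen
    out.write(frame)

cap.release()
out.release()
```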

Here’s the gif for your entertainment.

Well fed fish

Testing the new Catapult

I built the new catapult (that I talked about in my last post) and tested it.

Immediately I could see many of the improvements that were expected of the new design:

  • It is easy to adjust.
  • It fires straight.
  • The turntable moves as expected (+/- 30 degrees).
  • The new design is much easier to install on the robot.

However, I found that it was still too inconsistent.  Specifically, the energy stored in the system would gradually decrease as the catapult was used for many shots.  Initially I imagined that the rubber band was stretching and becoming weaker.  But then I realised that the band was slipping along the catapult arm.

So here is the new design for the arm.  It now incorporates a hole for the band to pass through so that it can’t slip along the arm.

Catapult arm (version lost count).

I also found that there was insufficient travel in the trajectory adjuster.  I went for the very high-tech approach of gluing on a piece of Lego to fix that:

Catapult with Lego “tweak”.

This new (dare I say final) version is now yielding much better results.  I’m optimistic it will achieve close to 15 out of 15 shots!

A New Catapult for Feed the Fish

Back in November I made this design for Feed the Fish.

Feed the Fish Catapult, Version 3

At the time, I was pleased with it.  Initial tests went well, and I had occasions where the catapult delivered all five rounds into the aquarium in one go.  Since then, however, I have become less satisfied with the design.  Here are the problems with it:

  • I misinterpreted the initial results.  The catapult can deliver five shots into the aquarium.  But it would be lucky to get three consecutive deliveries of five shots.  The catapult needs to deliver five shots almost every time for this game to be viable without hundreds of attempts.
  • Adjustments to the catapult are too difficult to make.  It is difficult to add or subtract energy storage (by adjusting the rubber bands) since it is necessary to partially dismantle the device.  Adjusting the trajectory is also tricky since it is necessary to fight the rubber bands whilst doing it.
  • The catapult generally fires to the left.  This might be because the trigger acts on one side of the catapult arm rather than through the middle.  It might also be because the rubber band is tighter on one side than the other.  I would like a design that fires straight!
  • The turntable only moves +/-10 degrees: in November I thought this would be about right.  But the catapult fires too far left for this to work.  Since I want the turntable design for our Nerf gun, I think the turntable is worth revisiting, and I think +/-30 degrees is a better choice.
  • It is difficult to set the robot up with this implement because it has the camera fitted.  It is necessary to juggle the lid, catapult and battery during installation: you need lots of hands.  This is probably not a problem for PiWars@Home, but it feels like the camera cable would be easy to break in the process.
  • There are quite a few aspects of the design which probably contribute to inconsistent results: the arm bearing has too much play, the turntable is not stable enough and the trigger does not act centrally on the catapult arm.

So, here is version 4.

Feed the Fish Catapult, Version 4.

These are the changes:

  • The trigger now acts centrally on the catapult arm.
  • The turntable has been re-worked to move +/-30 degrees.  It is also easier to tighten to remove excess play.
  • The rubber band can now be adjusted with a thumbscrew via a worm gear.
  • The trajectory can be adjusted by the two long bolts mounted on the sides of the catapult.
  • The arm now pivots on ball bearings rather than a bolt so it has less play.
  • The system uses the hull mounted camera.

That’s the theory anyway.  I’m currently printing the new parts to try it.

Voice Control Taking Shape

I’m a bit late to the voice control party.  But that’s good news because everyone else has done all the hard work.  In particular, Neil Stevenson from Team Dangle has blogged how to install Vosk (https://dangle.blogsyte.com/?p=199) on a Pi.  I copied Neil’s instructions and had a voice control system up and running (on a RPi400) in minutes.

The good news is that my wife’s posh Bose noise-cancelling headphones work beautifully with it too: I was worried that they might not be Pi-compatible.  Actually, setting them up wasn’t easy; I used the instructions on https://howchoo.com/pi/bluetooth-raspberry-pi but had to be very careful not to let them configure as “headphones” (the rather insistent default). They must be set up as a “headset” to work with Vosk.

I then tried to connect the headphones to P21’s Pi.  This didn’t go to plan.  The problem is that I redirected the Pi’s hardware UART to the Pi’s header ages ago.  This is used to pass messages between the Pi and the robot’s Arduino hardware controller and is fundamental to the robot’s design.  The snag is that the hardware UART is normally required by the Pi’s Bluetooth system.

I thought of the following options to counter this:

  1. Revert the robot interface to the Pi’s software UART.  I’m not keen on this; on https://howchoo.com/pi/bluetooth-raspberry-pi it says that the software UART timing fails if the Pi is heavily loaded.  The Pi is certainly heavily loaded when the vision system is running.
  2. I could try connecting a Bluetooth dongle to a USB port on P21’s RPi.  I have no idea if this will help so I will probably look into this in time.  I would probably have to reprint P21’s lid to fit a dongle in.
  3. I could place another RPi on the robot and connect it to the implement connector.  I might run into UART problems again though.  Vosk probably loads the Pi heavily, and I would not be able to use the hardware UART (obviously!).
  4. I could place another RPi on the robot and connect it via IP and WiFi. 

I decided to go with the last option, using the RPi and touchscreen from P19 since they are currently redundant.  This setup was appealing since it can be created with minimal changes to P21.

So, here’s what I did:

  1. I took the Pi and touchscreen off P19.  I updated the OS then installed the Bluetooth manager software as described in https://howchoo.com/pi/bluetooth-raspberry-pi . This time the headphones connected without a problem.
  2. I added a new socket interface into PiDrogen’s software framework.  A client can now connect to it and issue words.  I updated the general framework to understand some words like “go”, “stop” and “pause”.  I updated the line follower game to understand “straight” (go straight on at the next junction), “right” and “left”.  I also added “return” which has the robot do an about turn, reverse the directions it followed before, follow the line (back to the start) then do another about turn.
  3. I installed Vosk on the RPi following Neil’s instructions.  I then added a socket client to glue Vosk to P21’s framework (see the sketch after this list).  This is awkward because Vosk often repeats itself; the glue code filters out the repetition.
  4. I designed a carrier for the RPi, touchscreen and a battery pack, see below.
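To give a flavour of the glue code from step 3, here is a cut-down sketch: it runs Vosk against the headset microphone (using the standard sounddevice-based approach from the Vosk examples) and pushes each recognised utterance to P21’s word socket, skipping consecutive duplicates. The hostname, port, model path and the exact de-duplication rule are all assumptions for illustration, not the real code.

```python
import json
import queue
import socket

import sounddevice as sd
from vosk import Model, KaldiRecognizer

ROBOT_ADDR = ("p21.local", 5005)   # hypothetical address of P21's word socket
SAMPLE_RATE = 16000

audio_q = queue.Queue()

def on_audio(indata, frames, time, status):
    audio_q.put(bytes(indata))

model = Model("model")             # path to the downloaded Vosk model
recogniser = KaldiRecognizer(model, SAMPLE_RATE)
sock = socket.create_connection(ROBOT_ADDR)
last_utterance = ""

with sd.RawInputStream(samplerate=SAMPLE_RATE, blocksize=8000, dtype="int16",
                       channels=1, callback=on_audio):
    while True:
        data = audio_q.get()
        if recogniser.AcceptWaveform(data):
            text = json.loads(recogniser.Result()).get("text", "")
            # Vosk often emits the same utterance twice; drop exact repeats
            if text and text != last_utterance:
                for word in text.split():
                    sock.sendall((word + "\n").encode())
                last_utterance = text
```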

Does it work?

Yes.  But it took quite a few “takes” to get a complete video of the whole system working.  Three errors made the system unreliable:

  1. The line follower code made a number of errors where it either lost the line or it took the wrong turn at junctions.  At the time I countered this by down-tuning the robot speed but I now believe I should have adjusted the black threshold.
  2. Sometimes Vosk is too slow.  I start the robot by saying the sequence “go straight right left” (those being the directions to follow for Up the Garden Path).  If the word “straight” reaches P21 after the robot has reached the first junction then the robot defaults the first junction to straight on, but then the sequence is out of step with the robot’s location.  As a result, the robot takes a wrong turn at a subsequent junction.  I can bypass this problem by omitting the word “straight”, but I would prefer to get it working nicely.
  3. Sometimes Vosk misses a word.  If one of the direction words is missed then the robot might make a wrong turn.

At time of writing I have a video that could be submitted to the competition.  The robot is voice controlled and uses the camera to follow the line (which is the maximum scoring option).  But the video was made with down-tuned speed and I forgot to return the robot back to the start by saying “return” (which is not part of the game, but it’s fun).

I think I’m going to address some of the issues above then have another go!

I want Doughnuts.

Two blog posts in one day!  Whatever next?

Version 1 of the junction detector has a problem.  Frequently the angle of the road ahead does not match the preset angles of the coloured bars in the detector (see the earlier blog).  As a result, the code misses road options.

So, I have developed a new system.

Image 1

As with version 1 of the junction detector, the image is split into top, middle and bottom bands.  The bottom two bands are used to work out the orientation of the line in the image.  This is then used to locate a centre for the detector, at the point where the road is halfway up the top band; see the turquoise line in the image above.  This is all the same as version 1 of the junction detector (which was the subject of the previous blog entry).

However, at this point the code creates a mask consisting of two concentric circles; yes, like a doughnut.  The code then segments the line in the top band using the doughnut mask.  The result is the two purple segments in the image (one of which has been partially overwritten in blue).  The code can now see two portions of road in the top band.
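In OpenCV terms the doughnut is just an annular mask applied to the thresholded line image before looking for blobs. A rough sketch of the idea (the threshold, radii and function name are invented for illustration, not the real parameter set):

```python
import cv2
import numpy as np

def doughnut_exits(top_band, centre, r_inner=40, r_outer=60, black_thresh=60):
    """Return one contour per road exit visible in the top band of the frame.

    top_band : grey-scale image of the top band of the camera frame
    centre   : (x, y) detector centre worked out from the lower two bands
    """
    # Dark tape becomes white blobs
    _, line = cv2.threshold(top_band, black_thresh, 255, cv2.THRESH_BINARY_INV)

    # Build the annular ("doughnut") mask around the detector centre
    mask = np.zeros_like(line)
    cv2.circle(mask, centre, r_outer, 255, thickness=-1)
    cv2.circle(mask, centre, r_inner, 0, thickness=-1)

    # Keep only the parts of the line that cross the doughnut,
    # so each exit shows up as its own blob
    ring = cv2.bitwise_and(line, mask)
    contours, _ = cv2.findContours(ring, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours
```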

There are a few other things to note in the frame:

  • The blue line shows the target that the line follower is using to direct the robot. Note there is an inherent assumption that the robot’s position is halfway across the bottom of the frame.
  • The steerDegrees caption shows the angle that the line follower is asking the robot to steer; 17 degrees left of centre.
  • NextTurn:R shows that the robot should turn right at the next junction; the image is the approach to the second junction on the PiWars@Home course.

Now we can roll the video forward to the junction:

Image 2

Now there are three doughnut portions visible.  As a result the code is reporting “NextTurn:R at junction”.  In other words, it can see the junction.

Because the next turn should be right (the code has a pre-coded list of turns: straight, right, left for the PiWars@Home course), the right-most doughnut portion is overwritten in blue, the blue target line passes through it and the target steer angle is 3 degrees left of centre.
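Choosing which portion to steer at then comes down to the angle of each blob’s centroid relative to the detector centre and the pre-coded next turn. Something like this sketch (again illustrative, not the real code):

```python
import math
import cv2

def choose_exit(exit_contours, centre, next_turn):
    """Pick the doughnut portion matching the pre-coded turn: 'L', 'R' or 'S'."""
    cx0, cy0 = centre
    candidates = []
    for contour in exit_contours:
        m = cv2.moments(contour)
        if m["m00"] == 0:
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        # Angle of this exit relative to straight up the frame, positive to the right
        angle = math.degrees(math.atan2(cx - cx0, cy0 - cy))
        candidates.append((angle, contour))

    if next_turn == "R":
        return max(candidates, key=lambda c: c[0])[1]   # right-most exit
    if next_turn == "L":
        return min(candidates, key=lambda c: c[0])[1]   # left-most exit
    return min(candidates, key=lambda c: abs(c[0]))[1]  # closest to straight on
```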

This system is better than the coloured lines in version 1 because it reads the directions of the exits from the image, rather than needing them to conform to a preset ideal.  The “doughnut” system ensures that the segmentation yields a blob for each exit rather than one contiguous blob.

However, an additional system is needed in this algorithm. As the robot gets close to a junction the following occurs:

Image 3

The centre of the doughnut has now moved beyond the junction intersection.  But the robot has yet to reach the junction, so it is important that the junction detector does not move on to the next direction (in the list of directions) until the camera can no longer see the current junction.

To achieve this a counter is employed. When a junction is detected the counter is set to a value (from the parameter set) of about 10 (equating to 0.5 seconds of robot travel).  The counter is decremented for each video frame following the junction: the junction is deemed traversed when the counter reaches zero.
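A minimal sketch of that counter, written as a small helper class (the hold value of 10 is the figure quoted above; everything else is illustrative):

```python
class JunctionHold:
    """Hold the current pre-coded direction until a junction has been fully passed."""

    def __init__(self, hold_frames=10):          # ~0.5 s of travel at the frame rate
        self.hold_frames = hold_frames
        self.remaining = 0

    def update(self, junction_visible):
        """Call once per video frame.  Returns True when the junction is traversed."""
        if junction_visible:
            self.remaining = self.hold_frames    # (re)arm while the junction is in view
            return False
        if self.remaining > 0:
            self.remaining -= 1
            return self.remaining == 0           # traversed once the count reaches zero
        return False
```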

In the image above the counter is more than zero.  But the doughnut has only detected two line segments.  This is inconsistent, so the code re-segments a strip through the centre of the top band; this is shown in yellow.  The code now steers towards the centroid of the yellow segment (as denoted by the blue outline).  Had there been multiple yellow segments, the algorithm would have chosen the left-most, since the next turn is marked as left; see the next image:

Image 4

This algorithm works well.  The robot consistently identifies the junctions correctly and it accurately follows the pre-defined route.

A new video of the robot completing the course has been made.  It is now completely vision controlled (unlike the first video that used vision and odometry).  The new code is also faster; the robot completes the course in under 15 seconds.

The next job is to replace the pre-coded directions with a voice controlled system.