A “Minimum Viable Racer” for OpenMV

This is the cheapest good computer vision autonomous car you can make — less than $85! It uses the fantastic OpenMV camera, with its easy-to-use software and IDE, as well as a low-cost chassis that is fast enough for student use. It can follow lanes of any color, objects, faces and even other cars. It’s as close to a self-driving Tesla as you’re going to get for less than $100 😉

It’s perfect for student competitions, where a number of cars can be built and raced against each other in an afternoon.

Parts:

Total: $85 to $95, depending on whether you can 3D print your own parts.

The code is optimized for standard RGB tracks, which can be made with tape.

Instructions:

  1. Assemble the rover kit as per its instructions.
  2. 3D print (or have printed at a service) the camera mount. Attach it to the chassis with screws. Screw the OpenMV camera on to the pivoting holder and tilt it down about 15-20 degrees as shown above.
  3. Attach the motor wires to the two side terminals of the motor driver as shown below. Attach the battery wires to the “GND” (black wire) and “+12V” (red wire) terminals.
  4. Plug one jumper wire onto the “IN1” pin and another onto the “IN4” pin of the motor controller.
  5. If you have female-to-male jumper cables, attach one (ideally red or another bright color) to the “+5V” terminal of the motor controller and another (ideally black or another dark color) to the “GND” terminal, joining the battery wire you already attached there. If you only have female-to-female cables, snip off one end, strip the wire, and insert the bare wire into the terminal instead.
  6. Plug the other ends of these cables into the OpenMV board. Red goes to “VIN”, black goes to “GND”, the wire that went to “IN1” goes to “P7”, and the one that went to “IN4” goes to “P8”, as shown below.
  7. Load the code into the OpenMV IDE, plug your USB cable into the OpenMV board, and run it while it’s looking at a green object (it defaults to following green, although that’s easy to change to any other color in the IDE). Make sure your rover is powered on with batteries in. If one of the motors turns backwards, just swap that motor’s wires at the motor controller.
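With the motors on P7 and P8, steering comes down to driving the two wheels at different speeds. Here is a minimal sketch of that differential-drive math in plain Python (the mixing formula and function name are my own illustration, not the project’s exact code; on the board, the resulting duty cycles would be fed to PWM on P7 and P8):

```python
# Hypothetical differential-drive mixer: maps a steering value in [-1, 1]
# (negative = turn left, positive = turn right) to PWM duty cycles (in
# percent) for the left and right motors. On the OpenMV board these
# duties would drive the pins wired to "IN1" (P7) and "IN4" (P8).

def steering_to_duty(steer, base=50):
    """Return (left_duty, right_duty) as percentages in [0, 100]."""
    steer = max(-1.0, min(1.0, steer))   # clamp the steering input
    left = base * (1.0 + steer)          # turning right speeds up the left wheel
    right = base * (1.0 - steer)         # ...and slows down the right wheel
    return (max(0.0, min(100.0, left)),
            max(0.0, min(100.0, right)))

print(steering_to_duty(0.0))    # straight ahead: both wheels at base speed
print(steering_to_duty(-1.0))   # hard left: left wheel stops, right speeds up
```

If a motor spins the wrong way with this scheme, that matches step 7 above: swap that motor’s wires at the controller rather than changing the code.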

If you want the rover to follow a different color, just change this number in the code:
threshold_index = 1
# 0 for red, 1 for green, 2 for blue

If you want to tune it for another color, use the IDE’s built-in Threshold Editor (Tools > Machine Vision > Threshold Editor) and add a threshold set for the color you want (or replace one of the generic thresholds) in this section of the code:

thresholds = [(0, 100, -1, 127, -25, 127), # generic_red_thresholds
              (0, 100, -87, 18, -128, 33), # generic_green_thresholds
              (0, 100, -128, -10, -128, 51)] # generic_blue_thresholds
# You may pass up to 16 thresholds above. However, it's not really possible to segment any
# scene with 16 thresholds before color thresholds start to overlap heavily.
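Each tuple is a LAB color range: (L_min, L_max, A_min, A_max, B_min, B_max). A pixel counts as “that color” when all three of its channels fall inside the range. This tiny pure-Python helper (my own name for illustration; on the camera, `find_blobs()` applies this test for you) shows how a single pixel would be matched:

```python
# Each threshold is a LAB range: (L_min, L_max, A_min, A_max, B_min, B_max).
# A pixel matches when its L, A, and B values all fall inside the range.
# This hypothetical helper mirrors the per-pixel test find_blobs() performs.

def in_threshold(pixel, threshold):
    l, a, b = pixel
    l_min, l_max, a_min, a_max, b_min, b_max = threshold
    return (l_min <= l <= l_max and
            a_min <= a <= a_max and
            b_min <= b <= b_max)

generic_green = (0, 100, -87, 18, -128, 33)

print(in_threshold((50, -40, 10), generic_green))  # strongly green pixel -> True
print(in_threshold((50, 60, 10), generic_green))   # red-leaning pixel -> False
```

Negative A values lean green and positive A values lean red, which is why the generic green range caps A at 18 while the generic red range starts A at -1.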

In the example below, I’ve tuned it to look for red lanes. So I’d copy the “(0, 100, 30, 127, -128, 127)” and replace the generic red threshold numbers above with that. Then I’d change the line above to “threshold_index = 0”, so it looks for the first threshold in the list, which is red (Python lists are “zero-based”, so they start at zero).

When you’re done, the IDE will show it tracking a color lane like the one above (the lowest rectangular “region of interest”, or ROI, is weighted highest, with the other two weighted less). You can modify the ROIs by dragging boxes on the screen over the region you want to identify, and then give each a weighting as shown here:

# Each ROI is (x, y, w, h). The line detection algorithm will try to find the
# centroid of the largest blob in each ROI. The x position of the centroids
# will then be averaged with different weights where the most weight is assigned
# to the ROI near the bottom of the image and less to the next ROI and so on.
ROIS = [ # (x, y, w, h, weight)
    (38, 1, 90, 38, 0.4),
    (35, 40, 109, 43, 0.2),
    (0, 79, 160, 41, 0.6)
]
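The weighted averaging the comment describes can be sketched in plain Python. The helper below takes one blob-centroid x-position per ROI (the centroid values in the example are made up for illustration; on the camera they come from `find_blobs()`) and combines them using the weights above:

```python
# Weighted average of blob centroid x-positions, one centroid per ROI.
# The ROI with the largest weight dominates the steering decision.

ROIS = [  # (x, y, w, h, weight)
    (38, 1, 90, 38, 0.4),
    (35, 40, 109, 43, 0.2),
    (0, 79, 160, 41, 0.6),
]

def weighted_center(centroids, rois):
    """centroids: blob centroid x per ROI (None if no blob was found there)."""
    total = 0.0
    weight_sum = 0.0
    for cx, roi in zip(centroids, rois):
        if cx is None:          # no blob detected in this ROI; skip it
            continue
        total += cx * roi[4]    # roi[4] is the ROI's weight
        weight_sum += roi[4]
    return total / weight_sum if weight_sum else None

# Hypothetical centroids: the lane drifts right toward the bottom of the frame.
print(weighted_center([80, 85, 100], ROIS))
```

Dividing by the sum of the weights that actually contributed means the result stays a valid image x-coordinate even when some ROIs see no blob at all.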
