
The MasterAI Device Library (libauto)

AutoAuto Docs AutoAuto Labs

Use Python and A.I. to program your own autonomous vehicles! 🚗 🚁

All MasterAI devices use this library (libauto). For example, the AutoAuto car below:

AutoAuto Fleet 1 Car

Beginner or Advanced?

If you are a beginner, you will first want to work through the lessons on AutoAuto Labs. After you have leveled up through the beginner and intermediate lessons, come back here to explore this library more fully.

If you are an advanced programmer, you are welcome to dive right in! This library comes pre-installed on your MasterAI device. Have a look at the sections Connecting to Your Device and Examples, and you will be off to the races! 🏃

Library Overview

The library is segmented into four packages:

  • auto: The core package. Contains the critical components for every MasterAI device, such as the camera interface and the Machine Learning (ML) models.

  • cio: A package whose only job is to talk to the on-board microcontroller. The name cio is short for "controller input/output". It is pluggable and can support multiple backends.

  • cui: A package whose only job is to run the console application on the device's LCD screen. The name cui is short for "console UI". It is pluggable and can support multiple backends.

  • car: The car package contains helper functions for the AutoAuto cars, e.g. car.forward(), car.left(), car.right(), car.reverse(). If you look at the implementations of these helper functions, you will find they use the auto package under the hood (pun intended); a rough sketch of that layering follows this list.
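
For intuition only, here is a rough sketch of how a helper like car.forward() could be composed from the lower-level pieces documented later in this README. The real implementation lives in the car package and goes through the auto/cio layers, so treat this purely as an illustration of the layering, not as the actual code.

import time
from car.motors import set_throttle, set_steering, safe_forward_throttle

def my_forward(duration=1.0):
    """A hypothetical stand-in for car.forward(), for illustration only."""
    throttle = safe_forward_throttle()
    start = time.time()
    while time.time() - start < duration:
        # set_steering()/set_throttle() commands only persist for ~1 second
        # (see the "Precise steering" and "Precise throttle" sections), so
        # re-issue them in a loop for as long as the car should keep moving.
        set_steering(0.0)
        set_throttle(throttle)
        time.sleep(0.1)
    set_throttle(0.0)  # back to NEUTRAL

my_forward(1.0)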

Connecting to Your Device

Here are the ways you can connect to your device:

  • SSH: SSH'ing into your device is the quickest way to gain privileged access (i.e. to get sudo powers; remember Uncle Ben's words). You can log in to the device with the username hacker. You must obtain your device's default password from AutoAuto Labs (from the "My Devices" page, you can view your device's "Info for Advanced Users"). Every device has a different default system password. You are encouraged to change your device's system password (using the usual passwd command).

  • Jupyter: Every device runs a Jupyter Notebook server on port 8888. You must obtain the password for Jupyter from AutoAuto Labs (from the "My Devices" page, you can view your device's "Info for Advanced Users"). Every device has a different Jupyter password. Note that the Jupyter server does not run as a privileged user; if you need privileged access, you must log in to the device as the hacker user.

  • AutoAuto Labs: AutoAuto Labs offers a simple editor where you can write and run programs. It is pleasant to use, but it is only good for short and simple programs.

Examples

Drive your car!

Note: Only applicable to AutoAuto cars, not other devices.

import car

# Each line below defaults to driving for 1 second (resulting in 4 seconds of total driving).
car.forward()
car.left()
car.right()
car.reverse()

# You can also specify the duration (in seconds), for example:
car.forward(2.5)

Print to the Console

Many MasterAI devices are equipped with an LCD screen which runs a console application. You can print your own text to the console, as shown below:

from auto import console

console.print("Hi, friend!")
console.print("How are you?")

The car package also has a print() function which prints both to stdout and to the console, which is convenient if you are already using the car package.

import car

car.print("Hi, friend!")
car.print("How are you?")

Use the camera

Capture a single frame:

import car

frame = car.capture()
car.stream(frame, to_console=True, to_labs=True)

Note: The car.capture() and car.stream() functions are convenience functions; they use the auto package internally. For example, the following code uses the next-layer-down interfaces to capture frames continuously.

from auto.camera import global_camera
from auto.frame_streamer import stream

camera = global_camera()

for frame in camera.stream():
    # <process frame here>
    stream(frame, to_console=True, to_labs=True)

You can clear the frame from the console like this:

import car
car.stream(None, to_console=True, to_labs=True)

# --or--

from auto.frame_streamer import stream
stream(None, to_console=True, to_labs=True)

Detect faces

import car

while True:
    frame = car.capture(verbose=False)
    car.detect_faces(frame)
    car.stream(frame, to_labs=True, verbose=False)

The lower-level class-based interface for the face detector can be found in auto.models.FaceDetector. The face detector uses OpenCV under the hood.
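
As a sketch of that class-based style (which also applies to the PedestrianDetector and StopSignDetector classes mentioned below), something along these lines should work. Note that the detect() method name is an assumption here; check help(auto.models.FaceDetector) on your device for the actual interface.

from auto.models import FaceDetector
from auto.camera import global_camera
from auto.frame_streamer import stream

# Assumption: the detector exposes a detect(frame) method that annotates the
# frame and returns the detected rectangles; consult the class's docstring
# for the real interface on your device.
detector = FaceDetector()
camera = global_camera()

for frame in camera.stream():
    rectangles = detector.detect(frame)
    print(rectangles)
    stream(frame, to_console=True, to_labs=True)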

Detect people

We call this the "pedestrian detector" in the context of an AutoAuto car.

import car

while True:
    frame = car.capture(verbose=False)
    car.detect_pedestrians(frame)
    car.stream(frame, to_labs=True, verbose=False)

The lower-level class-based interface for the people detector can be found in auto.models.PedestrianDetector. The people detector uses OpenCV under the hood.

Detect stop signs

import car

while True:
    frame = car.capture(verbose=False)
    car.detect_stop_signs(frame)
    car.stream(frame, to_labs=True, verbose=False)

The lower-level class-based interface for the stop sign detector can be found in auto.models.StopSignDetector. The stop sign detector uses OpenCV under the hood.

Helper functions: object location & size

The following works with the returned value from:

  • car.detect_faces() (shown in example below)
  • car.detect_pedestrians()
  • car.detect_stop_signs()

import car

frame = car.capture(verbose=False)
rectangles = car.detect_faces(frame)
car.stream(frame, to_labs=True, verbose=False)

location = car.object_location(rectangles, frame.shape, verbose=False)
size = car.object_size(rectangles, frame.shape, verbose=False)

car.print("Object location:", location)
car.print("Object size:", size)

Raw OpenCV

In the example below, we will use OpenCV to do edge detection with the Canny edge filter.

import cv2
import car

print("OpenCV version:", cv2.__version__)

while True:
    frame = car.capture(verbose=False)
    frame_gray = cv2.cvtColor(frame, cv2.COLOR_RGB2GRAY)
    frame_edges = cv2.Canny(frame_gray, 100, 200)
    car.stream(frame_edges, to_labs=True, verbose=False)

Classify frame's center color

import car

frame = car.capture()
color = car.classify_color(frame)
car.stream(frame, to_labs=True)
car.print("The detected color is", color)

The lower-level class-based interface for the color classifier can be found in auto.models.ColorClassifier.

Read QR Codes

import car
from auto.qrcode import qr_scan

while True:
    frame = car.capture(verbose=False)
    car.plot(frame, verbose=False)
    qr = qr_scan(frame)
    print(qr)
    if qr:
        car.honk(1)

Precise steering

Note: Only applicable to AutoAuto cars, not other devices.

from car.motors import set_steering
import time

for angle in range(-45, 46):       # goes from -45 to +45
    set_steering(angle)
    time.sleep(0.05)

for angle in range(45, -46, -1):   # goes from +45 to -45
    set_steering(angle)
    time.sleep(0.05)

time.sleep(0.5)
set_steering(0.0)  # STRAIGHT
time.sleep(1.0)

Important Note: The call to set_steering() is asynchronous; that is, the function returns immediately, very likely before the wheels have actually had a chance to fully turn to the desired angle! Furthermore, the command only "lasts" for 1 second, after which the angle automatically reverts to straight. As a result, you must call set_steering() in a loop to keep it active. (This is a safety feature, allowing the car to revert to going straight if your program crashes or if the Pi loses communication with the microcontroller.)
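
In other words, to hold an angle you re-issue the command more often than once per second, as in the sketch below. The same looping pattern applies to set_throttle() in the next section.

from car.motors import set_steering
import time

# Hold a 30-degree turn for ~3 seconds by re-issuing the command, since
# each set_steering() call only persists for about 1 second.
start = time.time()
while time.time() - start < 3.0:
    set_steering(30)
    time.sleep(0.1)

set_steering(0.0)  # back to STRAIGHT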

Precise throttle

Note: Only applicable to AutoAuto cars, not other devices.

WARNING: You can easily injure the car, yourself, or others by setting the throttle too high. Use this interface with extreme caution. These cars are VERY powerful and very fast.

from car.motors import set_throttle, safe_forward_throttle
import time

throttle = safe_forward_throttle()
print("Safe throttle is:", throttle)

set_throttle(0.0)        # Car in NEUTRAL
time.sleep(1.0)

set_throttle(throttle)   # Car moves at safe forward speed
time.sleep(1.0)

set_throttle(min(100, 2*throttle))       # Twice the safe throttle, capped at 100 (DANGER! THIS IS VERY FAST!)
time.sleep(0.5)

set_throttle(0.0)        # Back to NEUTRAL
time.sleep(1.0)

Important Note: The call to set_throttle() is asynchronous; that is, the function returns immediately, very likely before the car's speed actually changes! Furthermore, the command only "lasts" for 1 second, after which the throttle automatically reverts to zero. As a result, you must call set_throttle() in a loop to keep it active. (This is a safety feature, allowing the car to automatically STOP if your program crashes or if the Pi loses communication with the microcontroller.)

Plot frames in Jupyter

The helper function car.plot() both streams a single frame to your AutoAuto Labs account and returns a PIL.Image object, so you can conveniently view frames inline in a Jupyter notebook.
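
A minimal example of that workflow, run from a Jupyter cell on the device:

import car

frame = car.capture()
img = car.plot(frame)   # streams the frame to Labs and returns a PIL.Image
img                     # a PIL.Image as the last expression renders inline in Jupyter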

Servos

If your device has extra servo outputs (e.g. Rhobota), you can control them via the auto.servos module.

from auto.servos import get_servo
import time

servo = get_servo(0)  # <-- servo index
                      # see your device's documentation

servo.on()

for i in range(0, 181):
    servo.go(i)
    time.sleep(0.02)

servo.off()

LEDs

If your device has LEDs, you can control them via the auto.leds module.

The following example works on both Fleet 1 and Fleet 2 AutoAuto cars:

from auto.leds import (
    led_map,
    set_many_leds,
)
import random
import time

leds = led_map()

while True:
    vals = [
        (led, random.randint(0, 1))
        for led in leds
    ]
    set_many_leds(vals)
    time.sleep(0.5)

Fleet 2 AutoAuto cars have RGB LEDs, so you can specify RGB colors for them. Note: The following example only works on devices (like the Fleet 2 AutoAuto car) that have RGB LEDs:

from auto.leds import (
    led_map,
    set_many_leds,
    set_brightness,
)
import random
import time

color_vals = [
    (1.0, 0.0, 0.0),  # red
    (0.0, 1.0, 0.0),  # green
    (0.0, 0.0, 1.0),  # blue
    (1.0, 1.0, 0.0),  # yellow
]

brightnesses = [
    *range(5, 255, 5),
    *range(255, 5, -5),
]

leds = led_map()

while True:
    rand_colors = random.sample(color_vals, k=len(leds))
    vals = list(zip(leds, rand_colors))
    set_many_leds(vals)
    for b in brightnesses:
        set_brightness(b)
        time.sleep(0.01)

List the device's capabilities

Different MasterAI devices (and different versions of the same device) may have a different set of hardware capabilities. You can ask your device to list its capabilities like this:

from auto.capabilities import list_caps

my_capabilities = list_caps()

print(my_capabilities)
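
The names in that list are the same ones you pass to acquire() (see the Battery voltage example further down). As a small sketch, assuming the returned capability names can be tested with the in operator:

from auto.capabilities import list_caps, acquire, release

caps = list_caps()

# Only acquire the 'Power' component if the device actually reports it.
# ('Power' is the capability used in the Battery voltage example below.)
if 'Power' in caps:
    power = acquire('Power')
    print('Battery is at {} millivolts.'.format(power.millivolts()))
    release(power)
else:
    print('This device does not report a Power capability.')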

Gyroscope

You can get instantaneous measurements from the gyroscope like this:

import car
from car import gyro
from car import motors

while True:
    x, y, z = gyro.read()
    string_vals = [
        f'{v:7.2f}'
        for v in (x, y, z)
    ]
    print(', '.join(string_vals))

Or you can get accumulated (or integrated, if you prefer) measurements like this (which is likely what you actually want):

import car
from car import gyro
from car import motors

while True:
    x, y, z = gyro.read_accum()
    string_vals = [
        f'{v:7.2f}'
        for v in (x, y, z)
    ]
    print(', '.join(string_vals))

Accelerometer

import car
from car import accel
from car import motors

while True:
    x, y, z = accel.read()
    string_vals = [
        f'{v:7.2f}'
        for v in (x, y, z)
    ]
    print(', '.join(string_vals))

Buzzer

The car package has two helper functions:

import car

car.buzz('!V10 O4 L16 c e g >c8')

car.honk()

See Buzzer Language to learn how to write notes as a string that the buzzer can interpret and play.

Photoresistor

You can use the photoresistor as a very simple ambient light detector. The photoresistor's resistance changes based on the amount of light hitting it.

import car
from car import photoresistor
import time

for i in range(100):
    millivolts, resistance = photoresistor.read()
    car.print(resistance)
    time.sleep(0.1)

The program above prints the resistance of the photoresistor (in Ohms). You can play around with where a good threshold is for your application, and you can quickly see the value change by covering the photoresistor with your hand or by shining a flashlight at it.
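
For example, a crude darkness alarm might look like the sketch below. The threshold is a placeholder, so pick one based on the readings you observe with the loop above (a photoresistor's resistance rises as the light level drops).

import car
from car import photoresistor
import time

# Placeholder threshold (in Ohms): tune it for your device and your room.
DARK_THRESHOLD = 50000

while True:
    millivolts, resistance = photoresistor.read()
    if resistance > DARK_THRESHOLD:
        car.honk()   # it's dark: resistance is above the threshold
    time.sleep(0.5)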

Push Buttons

import car
from car import buttons

car.print("""Press the buttons, and you'll see the
events being printed below:""")

while True:
    button, action = buttons.wait_for_action('any')
    car.print("The {}th button was {}.".format(button, action))

Battery voltage

from auto.capabilities import list_caps, acquire, release

power = acquire('Power')

millivolts = power.millivolts()
percentage, minutes = power.estimate_remaining(millivolts)

print('The power voltage is {} millivolts.'.format(millivolts))
print('It is at ~{}% and will last for ~{} more minutes.'.format(percentage, minutes))

release(power)

Note: There's a background task that will monitor the battery voltage for you and will buzz the buzzer when the battery gets to 5% or lower.

Encoders

Some devices have motor encoders to track how many "clicks" the motor has rotated.

import car
from car import enc

N = enc.num_encoders()

car.print(f'This device has {N} encoders.')

for i in range(N):
    enc.enable(i)

while True:
    vals = [
        f'{enc.read(i):8.0f}'
        for i in range(N)
    ]
    car.print(', '.join(vals))

Calibration

Depending on the device you have, you can run the appropriate calibration script.

Device Name                          Calibration Script Name
AutoAuto Car with v1 Controller      calibrate_car_v1
AutoAuto Car with v2 Controller      calibrate_car_v2
AutoAuto Car with v3 Controller      calibrate_car_v3

Buzzer Language

The Buzzer Language [1] works as follows:

The notes are specified by the characters C, D, E, F, G, A, and B, and they are played by default as quarter notes with a length of 500 ms. This corresponds to a tempo of 120 beats/min. Other durations can be specified by putting a number immediately after the note. For example, C8 specifies C played as an eighth note (i.e. having half the duration of the default quarter note). The special note R plays a rest (no sound). The sequence parser is case-insensitive and ignores spaces, although spaces are encouraged to help with human readability.

Various control characters alter the sound:

Control character(s) Effect
A–G Specifies a note that will be played.
R Specifies a rest (no sound for the duration of the note).
+ or # after a note Raises the preceding note one half-step.
- after a note Lowers the preceding note one half-step.
1–2000 after a note Determines the duration of the preceding note. For example, C16 specifies C played as a sixteenth note (1/16th the length of a whole note).
. after a note "Dots" the preceding note, increasing the length by 50%. Each additional dot adds half as much as the previous dot, so that "A.." is 1.75 times the length of "A".
> before a note Plays the following note one octave higher.
< before a note Plays the following note one octave lower.
O followed by a number Sets the octave. (default: O4)
T followed by a number Sets the tempo in beats per minute (BPM). (default: T120)
L followed by a number Sets the default note duration to the type specified by the number: 4 for quarter notes, 8 for eighth notes, 16 for sixteenth notes, etc. (default: L4)
V followed by a number Sets the music volume (0–15). (default: V15)
MS Sets all subsequent notes to play staccato – each note is played for 1/2 of its allotted time, followed by an equal period of silence.
ML Sets all subsequent notes to play legato – each note is played for its full length. This is the default setting.
! Resets the octave, tempo, duration, volume, and staccato setting to their default values. These settings persist from one play() to the next, which allows you to more conveniently break up your music into reusable sections.

Examples:

  • The C-major scale up and back down: "!L16 cdefgab>cbagfedc"
  • The first few measures of Bach's fugue in D-minor: "!T240 L8 agafaea dac+adaea fa<aa<bac#a dac#adaea f4"
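
Either string can be passed straight to the buzzer, for example:

import car

# Play the C-major scale example from above.
car.buzz("!L16 cdefgab>cbagfedc")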

[1] Pololu Corporation developed and holds the copyright for the Buzzer Language and its documentation. Further information about the Buzzer Language's license and copyright can be found in the LICENSE file.