This repo contains the source code for a demo we created to show off machine learning on the edge using the Coral Dev Board with the Edge TPU.

Demo in all its glory

This is a video of the demo in action at the workshop where we were building it, just prior to Google Cloud Next 2019.

The source code is split into folders, one per link in the architecture that pieces the whole thing together. The demo uses a webcam connected over USB-C to the Coral board, which runs a Python script that pulls frames from the camera and passes them through our models. The models' output is then sent to a secondary machine (everything was networked over a LAN with a router) running a Node.js script. That script does a few things:

  1. Runs the business logic that interprets the model output and decides which bucket the gear is dropped into.
  2. Drives the Arduino, which controls the mechanical parts of the demo.
  3. Sends telemetry data to a Cloud Firestore instance, which is then picked up by the dashboard.

The Coral board also forwards the webcam's video stream to a Python process running on the secondary machine. That streaming script additionally saves an image buffer of the last "captured" gear, with a bounding box marking the missing gear that was detected; both the live stream and the captured image can be shown on the dashboard.
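For orientation, here is a minimal sketch of what the capture-and-inference loop on the Coral board could look like. This is not the repo's actual coral/recognize.py: the model path, server address, JSON payload shape, and the use of the current pycoral library (rather than the older Edge TPU Python API used in 2019) are all assumptions made for illustration.

```python
# Hypothetical sketch of the Coral-side loop: grab webcam frames, run the
# detection model on the Edge TPU, and push per-frame results to the server.
# MODEL_PATH, SERVER_ADDR, and the payload format are placeholders.
import json
import socket

import cv2
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

MODEL_PATH = "model_edgetpu.tflite"    # placeholder model file
SERVER_ADDR = ("192.168.1.10", 5000)   # placeholder secondary-machine address

interpreter = make_interpreter(MODEL_PATH)
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)

cap = cv2.VideoCapture(0)              # USB webcam
sock = socket.create_connection(SERVER_ADDR)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Resize and convert the frame to the model's expected RGB input, then infer.
    rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
    common.set_input(interpreter, rgb)
    interpreter.invoke()
    objs = detect.get_objects(interpreter, score_threshold=0.5)

    # One JSON line per frame; box coordinates are in model-input space.
    payload = [
        {"label": int(o.id),
         "score": float(o.score),
         "box": [int(o.bbox.xmin), int(o.bbox.ymin),
                 int(o.bbox.xmax), int(o.bbox.ymax)]}
        for o in objs
    ]
    sock.sendall((json.dumps(payload) + "\n").encode())
```

On the receiving end, the Node.js server would read one JSON line per frame and apply its bucket-selection logic to it.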

Architecture diagram

android/Sorting_Demo: The Android application used at Google I/O 2019 to show off the AutoML model compiled down to a mobile device, demonstrating that the same model can be used accurately across multiple form factors.

base_training_images: Holding folder for any images we capture and need to transfer between machines for training our model. Accuracy depends heavily on lighting conditions and similar factors.

dashboard: Front-end UI code for the dashboard shown on the monitor attached to the demo.

coral: Code that lives and runs on the Edge TPU development board.

server: Code that lives and runs on the Windows Lenovo box: the Node.js code containing all the business logic for handling the outputs of our AutoML model, as well as driving the demo's mechanicals by way of the Arduino.

imgs: Contains footage and images of the demo in action.

Running the demo

There is an order dependency to getting the demo running. Each of the following must run in its own terminal window, since the processes keep running.

  1. Streaming server pass-through for dashboard feed
    1. server/stream_video.py
  2. Node server receiving telemetry from Coral board
    1. Set the GOOGLE_PROJECT_ID environment variable to the GCP project containing your Firestore instance (if you don't want any telemetry data, you can remove the code around that piece and skip this environment variable)
    2. server/server.js
  3. Python Coral SDK script to do the actual inference on the Coral board
    1. coral/recognize.py (the model values are hardcoded in the script; you can change them, or use the script's flags to specify your own models, as seen in coral/run_python.sh and sketched after this list)
  4. Node server which handles serving up the Dashboard JavaScript page
    1. dashboard/server.js (only necessary if you want to run the dashboard)
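As a rough illustration of the flag-based option mentioned in step 3, the snippet below sketches how model paths might be passed on the command line. The flag names (--model, --labels, --threshold) and their defaults are assumptions; the actual flags are whatever coral/recognize.py and coral/run_python.sh define.

```python
# Hypothetical flag handling for a recognize-style script; the real flag
# names and defaults live in coral/recognize.py and coral/run_python.sh.
import argparse

def parse_args():
    parser = argparse.ArgumentParser(
        description="Run Edge TPU inference on webcam frames.")
    parser.add_argument("--model", default="model_edgetpu.tflite",
                        help="Path to the compiled Edge TPU .tflite model.")
    parser.add_argument("--labels", default="labels.txt",
                        help="Path to the label map for the model.")
    parser.add_argument("--threshold", type=float, default=0.5,
                        help="Minimum detection score to report.")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(f"Would load {args.model} with labels from {args.labels}")
```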