Link to video recording of the talk
Repository for my talk at Fluttercon USA 2024 on training Convolutional Neural Networks (CNNs) and using them in Flutter apps.
The Jupyter notebook `train_mlp_and_cnn.ipynb` shows how to train a simple Multilayer Perceptron (MLP) and a CNN on images of handwritten digits.
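For intuition about what the notebook's MLP is doing, here is a minimal NumPy sketch of a one-hidden-layer MLP trained with plain gradient descent on toy data. This is illustrative only: the notebook uses a deep-learning framework, and the shapes, learning rate, and synthetic data below are assumptions, not the notebook's actual values.

```python
import numpy as np

# Minimal one-hidden-layer MLP trained with gradient descent on toy
# "digit" data -- an illustrative sketch, not the notebook's real code.
rng = np.random.default_rng(0)

# Toy flattened images: 100 samples, 64 pixels, 10 digit classes.
X = rng.normal(size=(100, 64))
y = rng.integers(0, 10, size=100)
Y = np.eye(10)[y]                           # one-hot labels

W1 = rng.normal(scale=0.1, size=(64, 32))   # input -> hidden
b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 10))   # hidden -> output
b2 = np.zeros(10)
lr = 0.5

for step in range(500):
    # Forward pass: ReLU hidden layer, softmax output.
    h = np.maximum(0, X @ W1 + b1)
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Backward pass: gradient of cross-entropy loss for each parameter.
    d_logits = (p - Y) / len(X)
    dW2 = h.T @ d_logits
    db2 = d_logits.sum(axis=0)
    d_h = (d_logits @ W2.T) * (h > 0)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = (p.argmax(axis=1) == y).mean()
```

A CNN replaces the first matrix multiply with convolutions that share weights across image positions, which is why it handles images far better than this flat MLP.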
Check out this repository by my classmate Bernardo Ribeiro, which goes into more depth on training different models for image classification of hand gestures; specifically, this notebook trains MobileNetV3.
You can find the code that runs inference on the Flutter side here, and the rock-paper-scissors game logic here.
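The inference step boils down to preprocessing a camera frame into the model's input format, running the model, and taking the argmax over the class scores. The repo's actual implementation is Dart; the sketch below shows the same flow in Python, and the label list, function names, and input scaling are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the inference post-processing flow.
# The gesture label list and preprocessing details are assumptions;
# the app's real code (in Dart) may differ.
LABELS = ["fist", "stop", "stop_inverted", "palm",
          "peace", "peace_inverted", "ok", "call"]

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Scale uint8 pixels to [0, 1] and add a batch dimension."""
    return (frame.astype(np.float32) / 255.0)[np.newaxis, ...]

def predict_gesture(scores: np.ndarray) -> str:
    """Map the model's per-class scores to a gesture label via argmax."""
    return LABELS[int(np.argmax(scores))]
```

For example, if the model's output scores peak at index 6, `predict_gesture` returns `"ok"`, which the game then treats as the Lizard move.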
Download the demo app APK for Android here.
The gestures are mapped as follows:
- Rock <- fist
- Paper <- stop, stop inverted, palm
- Scissors <- peace, peace inverted
- Lizard <- ok
- Spock <- call
To enable the Lizard and Spock moves, tap the flash `IconButton` in the top-right corner of the app bar.
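The mapping above, plus the standard rock-paper-scissors-lizard-Spock rules (each move beats exactly two others), can be sketched as follows. The app's real game logic is written in Dart; this Python version is a hedged illustration with assumed names.

```python
# Rock-paper-scissors-lizard-Spock rules: each move beats two others.
# Illustrative Python sketch; the app's actual logic lives in Dart.
BEATS = {
    "rock":     {"scissors", "lizard"},
    "paper":    {"rock", "spock"},
    "scissors": {"paper", "lizard"},
    "lizard":   {"paper", "spock"},
    "spock":    {"rock", "scissors"},
}

# Recognized gesture labels -> moves, mirroring the list above.
GESTURE_TO_MOVE = {
    "fist": "rock",
    "stop": "paper", "stop_inverted": "paper", "palm": "paper",
    "peace": "scissors", "peace_inverted": "scissors",
    "ok": "lizard",
    "call": "spock",
}

def winner(player: str, opponent: str) -> str:
    """Return 'player', 'opponent', or 'draw' for the two moves."""
    if player == opponent:
        return "draw"
    return "player" if opponent in BEATS[player] else "opponent"
```

For example, `winner(GESTURE_TO_MOVE["ok"], "paper")` resolves a recognized "ok" gesture (Lizard) against Paper, which Lizard beats.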