Four optimization techniques for Machine Learning Inference on Raspberry Pi SBC
- Published Aug 24, 2021
- Link to the article:
www.hackster.io/dmitrywat/fas...
Hardware in the video:
reTerminal
www.seeedstudio.com/ReTermina...
Blink Blink ICE Tower CPU Cooling Fan for Raspberry Pi (Support Pi 4)
www.seeedstudio.com/Blink-Bli...
Links to the materials mentioned in the video (tell me if I forgot any):
Arm NN GitHub
github.com/ARM-software/armnn
XNNPACK GitHub
github.com/google/XNNPACK
TF-MOT Toolkit Documentation
www.tensorflow.org/model_opti...
Smart Pruning: Improve Machine Learning Performance on Mobile
community.arm.com/developer/i...
Support FP32 computations using INT8 weights in TensorFlow Lite XNNPACK delegate
github.com/tensorflow/tensorf...
0:10 Why I made this video
1:12 Designing ...
2:17 Reducing ...
3:51 Optimized ...
8:17 ... inference
9:35 Don't set your Pi on Fire
Credits for the music:
Nostalgia Drive - A Nostalgic Synthwave / Chillwave / Retrowave mix
• Nostalgia Drive - A No...
ED-209 - Frequency
Photon - Cosmos
Credits for the artwork:
Awesome Pi On Fire Demo on Bare metal Raspberry Pi
• Pi On Fire Raspberry P...
Thanks, it was helpful!
You're welcome!
Thank you for the information, everything is very well explained.
Glad it was helpful! :)
Well explained, thank you
Glad you liked it
Very good
Thanks!
Hi! Are these results from XNNPACK and Arm NN using the CPU? Have you tried delegating to the GPU? I'm currently getting beaten up trying to compile the Arm NN libraries and dependencies, it's just a nightmare!
Yes, it's the CPU. On the Raspberry Pi, at least at the time this video was made, it was not possible to utilize the GPU.
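To make the CPU-only point concrete, here is a minimal sketch of one of the techniques the video covers: post-training dynamic-range quantization with TensorFlow Lite (INT8 weights, FP32 compute), the setup the linked XNNPACK issue discusses. The tiny Keras model and the layer sizes are made up purely for illustration; substitute your real model.

```python
# Hedged sketch: dynamic-range quantization + CPU inference with TFLite.
# The model below is a hypothetical stand-in, not the one from the video.
import numpy as np
import tensorflow as tf

# Tiny placeholder model (illustrative sizes only).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(4),
])

# Dynamic-range quantization: weights are stored as INT8,
# activations stay FP32 at runtime.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Run on CPU; XNNPACK is the default CPU delegate in recent
# TensorFlow builds, and num_threads matters on a quad-core Pi 4.
interpreter = tf.lite.Interpreter(model_content=tflite_model, num_threads=4)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.random.rand(1, 16).astype(np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

On a Pi you would typically install the lighter `tflite_runtime` package instead of full TensorFlow; the `Interpreter` API is the same.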