How to Use the Coral USB Accelerator with the Raspberry Pi - Increase TensorFlow Lite FPS!

  • Published 4 Oct 2024

COMMENTS • 187

  • @AffectiveApe
    @AffectiveApe 4 years ago +2

    Feel fortunate! I just so happen to be setting up my raspberry pi and coral usb accelerator for the first time TONIGHT, of all times :P thanks for the content!

  • @shaheerasghar3424
    @shaheerasghar3424 4 years ago +2

    Clicked the like button before playing the video. Your videos are top quality and very helpful!

  • @diggleboy
    @diggleboy 4 years ago +1

    A Raspberry Pi 4 with the Coral Tensorflow accelerator is a great alternative to the NVidia Jetson Nano (completely sold out!).
    I'm going to give this a try because I can't afford to wait 18 weeks for them to be made in China and distributed to North America.
    Thank you for putting these great Tensorflow Light demonstrations together. Really great production quality.

  • @mtheory1999
    @mtheory1999 4 years ago +2

    Edje Baby!!! You're back!!! I love it! Hope you're doing well, Mann!!!

  • @kyleheppler2860
    @kyleheppler2860 4 years ago +2

    Excellent video. Follows the written guide very well. I had already followed this guide and moved on to the one about training models on/for the RasPi. Really looking forward to that video; I tried the written version and can't get it to work correctly for some reason.
    I have a fixed-wing drone with a few hours of flight time soaring over the mountains every day, and I want to train models based on the sky camera perspective. With proper programming, I'll have it automatically enter holding patterns if certain animals or people are seen. I'll tag you in the video for credit once it's up!
    Looking forward to the last video to finish this series!! Thanks bro

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Thanks! That sounds like a VERY cool project, I look forward to seeing it. Feel free to comment a link to it on this video once you've got it done! My next video will just show how to convert a TensorFlow Lite model into an Edge TPU model using edgetpu-compiler (it will only be a 3 minute video). I am going to make a series of videos stepping through how to train a custom TensorFlow Lite model, but it won't be until later this summer when I start working on them.

    • @mtheory1999
      @mtheory1999 4 years ago

      Dude thats dope good luck with that! You should look into Google Cloud Vision.

  • @MakerMadness
    @MakerMadness 4 years ago +4

    Have been waiting for this video for a super long time! Thank you!

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Thanks for your support! I'm glad to finally have it finished 😁

  • @peterpirog5004
    @peterpirog5004 4 years ago +2

    Great tutorial :) It's not often that everything works exactly like in the tutorial! There were no errors - great job.

  • @niteshverma2310
    @niteshverma2310 4 years ago +1

    Much awaited video. Thanks for posting. Super cool. Stay Safe!

  • @FanMaoyi
    @FanMaoyi 1 year ago +2

    Thank you so much. You did such a great job, and explained all related topics clear and easy to follow!

  • @achimherrmann1692
    @achimherrmann1692 8 months ago +7

    I appreciate your videos very much.
    Do you have any plans to make one with Raspberry Pi 5 + TensorFlow Lite + Coral Accelerator USB or mini PCIe?
    I know there is something around already, but I think yours are the most professional ones.

    • @armisis
      @armisis 4 months ago

      Yes this please!

  • @wxfield
    @wxfield 4 years ago +1

    I had no idea Google was making something like this. I've been using Nvidia GPUs for some time with Tensor... but this really piques my interest in devices specifically optimized for tensor models. Great video... I think I'll throw some parts together this weekend and try your non-edge tensor setup first. I doubt with all the shipping delays happening I'll be able to get an Edge TPU very quickly to eval.

  • @musicmx2772
    @musicmx2772 4 years ago +3

    You and your video are excellent and amazing!!!

  • @christianegana4443
    @christianegana4443 4 years ago +1

    Excellent tutorial again!!!! Thank you so much for sharing!!!!

  • @AffectiveApe
    @AffectiveApe 4 years ago +2

    And just like that I have everything working! Running the Coral on std, I am getting about 16 FPS max, but I think this might be because I am connecting to the Pi via AnyDesk, which might be taking up memory. Sincerely, thank you for sharing your projects with the community. I find it pretty remarkable that it was so easy to get set up.
    Quick question: Does your previous tutorial on how to train your own tensorflow lite model NOT apply when you are using the coral USB accelerator? Is that the next video you are releasing?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Awesome! Glad you were able to get it all working. This tutorial (see link) does work for training your own TensorFlow Lite model which can then be compiled to run on the USB Accelerator. My next video will just show how to compile the TensorFlow Lite model for the USB Accelerator. github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi

  • @ricetv942
    @ricetv942 3 years ago +1

    Can't wait for your new tutorial

  • @YarikSychov
    @YarikSychov 3 years ago +2

    Great content, thanks!
    Very interested in tutorials on how to train your model/improve the accuracy of the existing one and on how to forward the output of the model to the app for notifications.
    *edit: found the link to your colab file, thank you

  • @murattsarakishvili8164
    @murattsarakishvili8164 4 years ago +1

    Can't wait for your new videos!

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Thanks! I've been crazy busy lately and haven't had much time to make videos... But I will get them done eventually!

    • @TenshiTV
      @TenshiTV 4 years ago +1

      @@EdjeElectronics I'm really looking forward to it! Your guides have been INCREDIBLY helpful. I've been able to have a working prototype but stuck at being able to compile a custom model. Thank you for your guides!!

  • @gopal7675
    @gopal7675 11 months ago +2

    Your documentation is still working. Ubuntu 18 + RPi 4 + Coral USB rocking. It would be great if you have any garden- or farm-related model for the Edge TPU. I'd like to buy a Super Thanks for you, but I couldn't find the option. Awesome work.

    • @EdjeElectronics
      @EdjeElectronics  11 months ago

      Awesome, thanks for letting me know it's still working! It's been a while since I've tried it out. I don't have any garden or farm models, but you may be able to find one on Roboflow Universe or similar. universe.roboflow.com/

  • @Wingly113
    @Wingly113 3 years ago +4

    Would running on an Nvidia Jetson Nano improve the FPS? The USB Accelerator doesn't justify the price.

  • @Agronomistapolo
    @Agronomistapolo 4 years ago +3

    Great tutorial. I get this error: "ValueError: Failed to load delegate from libedgetpu.so.1.0"

  • @BeniWinter
    @BeniWinter 3 years ago

    Really cool and helpful video! Thanks a lot for making it! You are great!!

  • @rafeemiracle
    @rafeemiracle 4 years ago +1

    Thank you so much for helping us 😘🙏

  • @tobieabel7474
    @tobieabel7474 4 years ago +1

    Great video, adding to a great series! I'll be buying the USB Accelerator from your link. Like others here I'm using Google Colab in conjunction with your videos, as it simplifies the process for the less experienced amongst us.
    I noticed that on some of your clips the frame rate is about 19/20 FPS and for others it's over 30; I was just wondering if you knew what was making the difference? My particular project is to make an automated goalkeeper for Subbuteo football, so the quicker the FPS, the less blurry the captured image, and the more likely the Pi camera will identify the ball and save it! Thanks again for the great tutorials.

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Good question! The video that has 30FPS is only 640x480 resolution, while the video at the end is 1280x720 resolution. The lower the webcam/video resolution, the faster it will run. Your project can probably use a 640x480 resolution, so try that! You can set the resolution to 640x480 by using the --resolution argument: "python TFLite_detection_webcam.py --modeldir=Sample_TFLite_model --resolution=640x480 --edgetpu"

  • @brechtdebackere
    @brechtdebackere 3 years ago +1

    Awesome! Works like a charm. Somehow on an RPi 4 with 4GB RAM the video tends to freeze and go, freeze and go... while the FPS keeps saying ~20. A problem that doesn't happen on the RPi 4 with 8GB RAM...

    • @sundownsupper7409
      @sundownsupper7409 2 years ago +1

      That would be the, uh... lack of RAM talking to you.

  • @JeromeDemers
    @JeromeDemers 3 years ago +3

    21 FPS with the Coral vs 4 FPS on the Pi alone!

  • @_Bernardo
    @_Bernardo 4 years ago +1

    Excellent! Thank you so much for putting work into these videos, mate, I truly appreciate it. And as a sign of appreciation, I'm buying now with your link. May I suggest another video? Since you mention it is necessary to compile with a Linux machine, could you make a video on how to make a bootable, persistent Linux USB? I'm struggling with that, as I have a laptop which I use for work and I don't want to risk an error repartitioning the Windows boot drive. The bootable USB option seems like the way to go, but then some methods are not persistent or the space allocations are limited.
    Thanks again for your work!

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Thanks for your support! My next video will show how to either create a Bootable Ubuntu Linux USB or install an Ubuntu virtual machine on your PC. I'm still figuring out which method is easiest and most robust (i.e. will result in the least amount of errors for users 😅). I love my bootable Ubuntu USB drive though! Here are some great instructions on how to set one up, straight from the Ubuntu website: ubuntu.com/tutorials/tutorial-create-a-usb-stick-on-windows#1-overview

  • @bennedictbyy
    @bennedictbyy 4 years ago +1

    Hi Edje, thanks for the great tutorial! It is really well explained. One question: is there a way to let this program run on boot? If so, what is recommended?

  • @puerlatinophilus3037
    @puerlatinophilus3037 3 years ago +7

    3:35 If I were an AI, I'd say that you could achieve 480 FPS at 480°C

  • @gamingbystarlight6405
    @gamingbystarlight6405 4 years ago

    Thanks for the videos! Very helpful. I'm a newbie and will experiment at creating a burglar detector. Every year or two, we get a bunch of bored kids visit our subdivision and try to burglarize vehicles. I might build a system that monitors the street and looks at cars, deer and turkeys, and pedestrians. If it's between midnight and 5:00a and one or more pedestrians are detected, I'll get notified. I'll share any interesting findings. Cheers!

  • @楊贄豪
    @楊贄豪 4 years ago +3

    Hi, if I just want to return the label in the terminal, what can I do?

    • @contractorwolf
      @contractorwolf 4 years ago

      Hey, sorry I can't tell your name, but I was wondering the same thing (interested in a similar problem for an idea I am thinking about). I took a look at his code and found that it would be fairly easy to print the labels of detected objects to the terminal by just editing the Python script he is running that displays the camera feed with the labels and scores. If you look at this file:
      github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/TFLite_detection_video.py#L138
      you can see the detected "labels" of the objects seen in each frame, and a simple print statement for each item in that labels array would do it for you. You could also print the corresponding item in the scores array to the terminal if that would help as well. Contact me on Twitter @contractorwolf if you are still stuck.
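
A minimal sketch of that suggestion. The names `labels`, `classes`, and `scores` mirror the variables used in the repo's detection scripts, but this helper itself is hypothetical, for illustration only:

```python
def report_detections(labels, classes, scores, min_conf=0.5):
    """Return one 'label: score%' line per detection above the threshold.

    labels:  list mapping class index -> class name
    classes: per-detection class indices from the interpreter
    scores:  per-detection confidences in [0.0, 1.0]
    """
    lines = []
    for cls, score in zip(classes, scores):
        if score > min_conf:
            lines.append(f"{labels[int(cls)]}: {score * 100:.1f}%")
    return lines

# Inside the script's per-frame loop you would call it like:
#     for line in report_detections(labels, classes, scores):
#         print(line)
print(report_detections(["person", "car"], [0, 1], [0.91, 0.42]))
# → ['person: 91.0%']
```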

  • @Sullixio
    @Sullixio 4 years ago +1

    I have a Pi 4 / USB Accelerator setup and would like to retrain the model to recognize a new object, which I have hundreds of pictures and annotated pics for. Is there a tutorial that explains how best to do this? Thanks!

  • @vishal01mehra
    @vishal01mehra 4 years ago +1

    Coral or Movidius? Which one is the best bang for the buck?

  • @lorenzoleongutierrez7927
    @lorenzoleongutierrez7927 4 years ago +1

    Great, bravo!

  • @BooBar2521
    @BooBar2521 2 years ago +1

    honestly you are my hero!!

    • @EdjeElectronics
      @EdjeElectronics  2 years ago +1

      Thanks man! One of these days I'll put out more TensorFlow videos... as soon as I get the time 🤞🤞

  • @davidverheyen6635
    @davidverheyen6635 3 years ago +2

    Looks promising. Would this combination (Pi 4 with Google Coral) be able to detect which species a bird is, out of a list of about 50, in real time?

    • @EdjeElectronics
      @EdjeElectronics  3 years ago +3

      Yes, it definitely would be able to do it in real time. However, the accuracy might not be as good as you'd like. Lightweight models like MobileNet aren't very good at distinguishing between visually similar objects (like a finch vs a sparrow). You should check out this bird classification model on TF Hub (it even lets you upload your own images to test out): tfhub.dev/google/lite-model/aiy/vision/classifier/birds_V1/3

  • @justin9558
    @justin9558 4 years ago +1

    Can you show us how to use the "stream" feature? I'm having problems setting it up.

  • @spectralcodec
    @spectralcodec 4 years ago +1

    Thank you!

  • @jdpantoja442
    @jdpantoja442 3 years ago

    I am waiting for the next amazing tutorial.

  • @kemfes5263
    @kemfes5263 4 years ago +3

    Is it possible to export a normal TF model (for example RCNN -> Edge TPU model) to run it on RPi + Coral?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Nope 😭 unfortunately, the Edge TPU only supports running SSD-MobileNet detection models for now. They might add RCNN support in the future. You should also keep an eye on EfficientDet, the newest state-of-the-art lightweight object detection model from Google.

  • @raghaviyer3097
    @raghaviyer3097 4 years ago +1

    loved your vid

  • @Lotte-qn1qd
    @Lotte-qn1qd 4 years ago +1

    Thank you so much, you saved me!

  • @armisis
    @armisis 4 months ago +1

    Sigh, I need to find this updated for the Raspberry Pi 5; I can't get past the cv2 error.

  • @pranitpokhrel470
    @pranitpokhrel470 3 years ago +3

    Any update on when you'll be releasing a video on how to install the compiler?

    • @EdjeElectronics
      @EdjeElectronics  3 years ago +5

      Here's Google Colab session that will allow you to compile a TFLite model into EdgeTPU format. You just need to upload your .tflite file and run the commands. If I make a video on this, it will be a quick one! colab.research.google.com/drive/1o6cNNNgGhoT7_DR4jhpMKpq3mZZ6Of4N?usp=sharing

    • @pranitpokhrel470
      @pranitpokhrel470 3 years ago

      @@EdjeElectronics Perfect, thank you. I will try this, since they discontinued support for 32-bit systems. Hopefully this works on my 32-bit Raspbian.

    • @luisFelix
      @luisFelix 3 years ago +1

      @@EdjeElectronics Hi, I can't compile TF from source; is there any alternative? I've spent the last week lost, trying on different computers and versions but with no success. I already have the .pb and .pbtxt... Can you give me some help?

    • @EdjeElectronics
      @EdjeElectronics  3 years ago +1

      @@luisFelix I couldn't compile TF from source last time I tried either! I guess I was lucky when I got to work one year ago :) . Here is a link to a Colab that will allow you to convert your .pb model to a .tflite model. colab.research.google.com/drive/1Px7I6PxeeLhCepyA9Dv22pwz66NuT1JR?usp=sharing

    • @luisFelix
      @luisFelix 3 years ago

      @@EdjeElectronics Perfect!!! big big tks!!!

  • @thecubemaster
    @thecubemaster 2 years ago +3

    Looks like this no longer works on Raspberry Pi OS 11 (Bullseye). Is there a way to fix that? Looks like it's an issue with OpenCV not finding the camera. I tried using the camera in legacy mode but no luck.

    • @EdjeElectronics
      @EdjeElectronics  2 years ago

      Can you let me know what type of camera you're using?

    • @BooBar2521
      @BooBar2521 2 years ago +3

      For me it's working perfectly on Raspberry Pi OS 11. Only the Pi Camera didn't work; I had to use a webcam.

  • @jaredfellows9444
    @jaredfellows9444 4 years ago +1

    Should I get the RPi 4 with the Coral USB Accelerator, or the Coral Dev Board? (Use case: use the AI model to tell a camera servo to track a person.)

  • @richarddenboer5364
    @richarddenboer5364 2 years ago

    I love your videos......just brilliant

  • @liamhan8145
    @liamhan8145 3 years ago +1

    Thank you for the video! Big thumbs up! Did you use a high-speed (5 Gbps) USB-C cable to get up to ~30-40 FPS? I only get around 20 FPS using the Google Coral.

    • @EdjeElectronics
      @EdjeElectronics  3 years ago

      You're welcome! Yes, I used a USB 3.0 cable to plug in the Coral USB Accelerator. The reduced framerate you're seeing might be because you're running at a higher resolution. I get 30-40 FPS when running the camera at 640x480, and about 20FPS when running at 1280x720.

    • @liamhan8145
      @liamhan8145 3 years ago +1

      @@EdjeElectronics The Google Coral came with a USB cable. Did you purchase a separate one with a higher data transmission speed? I see. I'm trying to design a mask detector using RPi 4 + Coral and display it on a 50" TV. Would using a bigger screen reduce the FPS?

    • @EdjeElectronics
      @EdjeElectronics  3 years ago +1

      @@liamhan8145 Nope, I just used the cable that it came with. And no, using a bigger screen will not reduce the FPS. (However, the video might look kind of grainy or blurry.) I've been developing a mask detection camera at work, and we're going to open-source all the code for it in a couple weeks! I'll share that with you once it's ready.

    • @liamhan8145
      @liamhan8145 3 years ago

      @@EdjeElectronics That would be amazing (: Thank you! As of now, I am using MobileNet SSD v2 (Faces) from coral.ai/models/. And I used transfer learning to add a couple of layers at the end for mask detection. With these two models, I'm trying to first detect a face and then, once I've found the ROI, pass that through the mask detection. There have been a lot of nice videos of others who did this, but none of them use the Google Coral, so the speed is very slow. I'm guessing you might be planning something similar! In terms of implementation with the Google Coral, both models just have to be in Edge TPU format, right? And then just pass them through the Coral? Thank you!

  • @maciejwojcik7226
    @maciejwojcik7226 4 years ago +1

    Did you maybe try to test the TFLite model's performance using an implementation with the C++ API for TF instead of Python? It would be interesting to see if it increases performance.

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      I haven't tested that! You're right, it would be interesting to see how much the performance increases. Unfortunately, I just don't have time to try it out!

    • @namvu4180
      @namvu4180 4 years ago

      I haven't checked the performance increase yet, but here's my C++ repo: github.com/Namburger/edgetpu-detection-camera

  • @kentine2015
    @kentine2015 4 years ago +1

    Hi sir, do you have a tutorial on how to add an object to an existing model?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Unfortunately, you can't just add an object to existing models. You have to retrain the whole model from scratch with data from the old model plus the new data!
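
As a sketch of what "data from the old model plus the new data" means for the class list, here is a hypothetical helper (not from any tutorial) that combines the old and new label sets for the retraining run:

```python
def merge_label_maps(old_labels, new_labels):
    """Combine the old model's classes with the newly added ones,
    preserving order and dropping duplicates - the retrained model
    has to be trained on (and predict) all of them together."""
    merged = list(old_labels)
    for label in new_labels:
        if label not in merged:
            merged.append(label)
    return merged

print(merge_label_maps(["person", "car"], ["car", "drone"]))
# → ['person', 'car', 'drone']
```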

    • @kentine2015
      @kentine2015 4 years ago

      Okay thank you,

    • @revhappyrs2k
      @revhappyrs2k 4 years ago +3

      @@EdjeElectronics Would you happen to have a tutorial on how to make a custom object detection model from scratch, converting it to tflite so it can run on the Coral+Pi? Thanks, btw.. your videos are great and you're a good teacher.

    • @kentine2015
      @kentine2015 4 years ago

      @@EdjeElectronics If your model was trained on Windows 10, can it be used on the RPi?

  • @christophertams8359
    @christophertams8359 4 years ago +2

    Hey Edje, my Raspberry Pi is having a little bit of trouble when I try to test the sample Edge TPU model. Whenever I type in that last command, the message VIDIOC_QBUF: Invalid argument continuously pops up. Think you can help?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Does a window still appear with a live camera feed and detected objects drawn on each frame? Or does the program not run at all? Either way, it might be an issue with your webcam. Try borrowing a friend's webcam to see if the error still occurs!

    • @christophertams8359
      @christophertams8359 4 years ago

      Edje Electronics hey, so I switched from the Pi camera to a USB camera, and when I now try to test the sample Edge TPU model, it gets stuck on /home/pi/tflite1/Sample_TFLite_model/edgetpu.tflite.

  • @razzaqhalim1946
    @razzaqhalim1946 3 years ago

    Really nice video. I hope you can make a video on how to apply this image processing to make an autonomous car. Can we make conditions for the autonomous car based on detected objects? If we can, I really want to know how. :D

  • @abdallahatef5823
    @abdallahatef5823 1 year ago +1

    Tried this but keep getting a "Segmentation Fault" error.

  • @WFB-ng
    @WFB-ng 1 month ago

    excellent video !

  • @MaeLSTRoM1997
    @MaeLSTRoM1997 4 years ago +2

    Can the RPi+Coral usb accelerator combination support regular tensorflow library rather than the lite version? I am using a python library that requires TensorFlow 2.1 and I don't know if using TF lite will work for the application.

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Yes, if you are using a 2.X version of TensorFlow, it will have the compatible TensorFlow Lite libraries built in. My code automatically handles importing packages from the correct TensorFlow library regardless of which one you have installed.
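
The import fallback described here can be sketched like this (a simplified illustration of the pattern, not the repo's exact code):

```python
def load_interpreter_class():
    """Pick the TFLite Interpreter from whichever package is installed.

    Prefers the lightweight tflite_runtime wheel (common on a Raspberry Pi);
    falls back to a full TensorFlow install, which bundles TFLite.
    """
    try:
        from tflite_runtime.interpreter import Interpreter
        return Interpreter, "tflite_runtime"
    except ImportError:
        pass
    try:
        from tensorflow.lite.python.interpreter import Interpreter
        return Interpreter, "tensorflow"
    except ImportError:
        return None, "none"

interpreter_cls, backend = load_interpreter_class()
print("Using TFLite backend:", backend)
```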

    • @MaeLSTRoM1997
      @MaeLSTRoM1997 4 years ago +1

      @@EdjeElectronics Ah I see. Thank you very much!

  • @Car_Ram_Rod
    @Car_Ram_Rod 3 years ago

    Cool video. I plan on getting a Coral due to this video. Any chance you could do an NCS2 vs Coral comparison? I don't see many videos comparing the two.

  • @brunesi
    @brunesi 1 year ago

    Very good content, thank you.

  • @mtheory1999
    @mtheory1999 4 years ago +1

    Hey Edje, have you considered using Google Colab for your tutorials?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Yes! I still need to look at Colab more, but I'm thinking of using it for when I do my video series showing how to train custom TensorFlow Lite models. I'm a little hesitant because Google Colab doesn't give a consistent amount of processing power to each user, and it's liable to change at any time. But I'm definitely looking in to it!

    • @mtheory1999
      @mtheory1999 4 years ago

      Edje Electronics yeah, that is true, the GPUs you get can be quite inconsistent. Another thing is also Google Cloud Vision; it's a lot more automated, but still worth looking into. Good luck and stay safe in the meantime. Thanks for the reply!

  • @ricetv942
    @ricetv942 4 years ago +1

    Can I use it for image classification with a custom data set?

  • @thomashu1095
    @thomashu1095 3 years ago

    Hi Evan, I ran your card model generated on Windows 10 on the Pi 4. Although it was able to detect, the FPS was extremely slow. Have you implemented it with the Coral TPU? Although I am a 70-year-old man, I have learned a lot from your previous videos. Do you have any new videos coming soon, such as using Colab to train a model?

  • @adr130590
    @adr130590 4 years ago +2

    Hello, could you help me with this error?
    usage: TFLite_detection_video.py [-h] --modeldir MODELDIR [--graph GRAPH]
    [--labels LABELS] [--threshold THRESHOLD]
    [--video VIDEO]
    TFLite_detection_video.py: error: unrecognized arguments: --edgetpu
    It's weird because plain TensorFlow Lite is working.
    Thanks

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Hmm, I think you have an older version of my code that doesn't support the Edge TPU. From inside the tflite1 folder, try issuing "git pull github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi.git". That should update your local files with the newer files from my repository.

    • @adr130590
      @adr130590 4 years ago

      yes i was using the older version, thanks!!

  • @bengabizon2558
    @bengabizon2558 7 months ago +2

    Segmentation fault error

  • @nguyenluu3082
    @nguyenluu3082 4 years ago +1

    Hi Edje, hope you're in good health when you see this.
    I was following your tutorial, and when I connect my Coral USB, the LED on it does not light up????
    And when I try to run the command with --edgetpu, "Failed to load delegate from libedgetpu.so.1.0" appears.
    Does my Coral USB have a problem?

    • @namvu4180
      @namvu4180 4 years ago

      Probably just need to do something like this: sudo usermod -a -G plugdev $USER and reboot

  • @woolfel
    @woolfel 4 years ago +1

    It would be interesting to see how well faster_rcnn_resnet_101 runs on the Coral USB. The last time I tried to run it on a Pi 3B+, I wasn't able to convert the model to tflite. I may have to try it again, since I haven't tried in over a year.

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Unfortunately, TFLite does not support Faster-RCNN models. It only supports SSD-MobileNet models. Maybe Google will update it to support heavier models some day!

    • @woolfel
      @woolfel 4 years ago +1

      @@EdjeElectronics bummer, I was hoping they would add support for it eventually. A couple of recent papers in 2019 were making gradual progress on neural net compression. I've been thinking or asking myself this question, "what if you remove the residual layers after the model is trained?" One of the primary benefits of resnet is reducing vanishing gradient during training. If we remove the residual layers, you probably wouldn't be able to retrain the model, but it might make it easier to convert to tflite.

  • @yalmadiable
    @yalmadiable 3 years ago

    Hey man, what about tracking, would it run at high FPS? And are you open to making projects for unmanned vehicle applications?

  • @davewhittaker159
    @davewhittaker159 4 years ago +1

    Hi, great video!
    Would I be able to add a Nest outdoor camera as the camera and use the Coral USB Accelerator to add object detection to a security system?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago +1

      Hmm, I haven't used a Nest outdoor camera before so I'm not sure how they work. Do they stream the video feed over an IP? If so, it might work. You'd have to set up your Raspberry Pi to grab the stream from the Nest camera and then process it with TensorFlow. Here's the code that lets it work with a web stream: github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/TFLite_detection_stream.py

    • @davewhittaker159
      @davewhittaker159 4 years ago

      @@EdjeElectronics thank you for the reply, I'll give it a try and let you know how it works out.

  • @contractorwolf
    @contractorwolf 4 years ago +1

    First off, great video and tutorial @edjeelectronics; your method of explanation and documentation is really terrific. I am currently up and running with the TPU, but only getting ~18 FPS with nothing else running. This is a fresh install of Raspbian Buster with nothing else installed, and I first went through the setup to get it up and running without the TPU (which performed at ~5 FPS). Does anyone have any tips to get the FPS up to something closer to 30 FPS without going to Max? Thanks in advance.

    • @itsmeintorrespain2714
      @itsmeintorrespain2714 4 years ago +1

      I've just this minute fired up my brand new Accelerator on my Pi 4 4GB and I'm getting 35 FPS from a Creative USB webcam (using std, not Max).

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      @james wolf Thanks! Which model of Raspberry Pi are you using? I used the Pi 4 4GB model for this video, and I do think the extra RAM helps it run a bit faster.

    • @contractorwolf
      @contractorwolf 4 years ago +1

      @@EdjeElectronics I am also using the Pi 4 with 4GB, I tried to match what you did exactly

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      @@contractorwolf Weird! Do you know if you're running at a higher resolution (1920x1080 instead of 1280x720) maybe? Sometimes a webcam will automatically force a high resolution. Also, do you have the TPU plugged in to a USB 3.0 port?

    • @contractorwolf
      @contractorwolf 4 years ago

      @@EdjeElectronics I am using the normal PiCam and running the Coral from the USB-3 port (blue). Would the PiCam have a slower refresh rate or something?

  • @baronzemo7224
    @baronzemo7224 4 years ago +1

    Hi there, in your tutorial How To Train an Object Detection Classifier Using TensorFlow (GPU) on Windows 10, after the cmd steps finish, what are the exact commands we should use? I am getting an error in the first line of the ipynb file, i.e.:
    ImportError Traceback (most recent call last)
    in
    15 # This is needed since the notebook is stored in the object_detection folder.
    16 sys.path.append("..")
    ---> 17 from object_detection.utils import ops as utils_ops
    18
    19 if StrictVersion(tf.__version__) < StrictVersion('1.9.0'):
    C:\tensorflow1\models\research\object_detection\utils\ops.py in
    26 from six.moves import zip
    27 import tensorflow.compat.v1 as tf
    ---> 28 import tf_slim as slim
    29 from object_detection.core import standard_fields as fields
    30 from object_detection.utils import shape_utils
    ImportError: No module named 'tf_slim'
    How do I solve this problem? Help please.

    • @TenshiTV
      @TenshiTV 4 years ago

      try the following:
      pip install tf_slim

  • @BooBar2521
    @BooBar2521 2 years ago +1

    Is it possible to use two Google Coral TPUs?

  • @DanielSmith-yx6zm
    @DanielSmith-yx6zm 4 years ago +1

    thanks for putting this out!

  • @sahrishmemmon4409
    @sahrishmemmon4409 4 months ago

    Everything worked perfectly, except the bounding box is not directly on the objects. The placement of the rectangle is away from the detected object. Does anyone know how to solve it?

  • @ciscomike82
    @ciscomike82 1 year ago

    Hi Edje, I guess you used the 32-bit version of the OS on the Pi in your videos?

  • @liamhan8145
    @liamhan8145 3 years ago

    Would it be possible to make the window size much bigger? Also, if I were to run this on a 50" TV, would the FPS slow down??

  • @staebchen69
    @staebchen69 3 years ago +1

    Hi, which operating system must be installed on the Raspberry Pi 4 8GB? My Raspberry is brand new.

    • @EdjeElectronics
      @EdjeElectronics  3 years ago +1

      For this project, please use the official 32-bit Raspberry Pi OS. www.raspberrypi.org/software/

  • @thenoobgam3r190
    @thenoobgam3r190 4 years ago +1

    When will we finally get to see how to make our own Edge TPU model?

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      Good question! That video has moved way to the backburner for me. I did just create a Google Colab guide that you can use to compile Edge TPU models. If you have a quantized TFLite model, it's as easy as uploading the .tflite file to Colab and running the compiler. Try it out! colab.research.google.com/drive/1o6cNNNgGhoT7_DR4jhpMKpq3mZZ6Of4N?usp=sharing

    • @thenoobgam3r190
      @thenoobgam3r190 4 years ago

      Ok

    • @thenoobgam3r190
      @thenoobgam3r190 4 years ago

      How do I get a quantized TFLite model? I followed your tutorials and have a detect.tflite file that runs. However, Google Colab says it's not quantized. Is there a step I missed?

    • @thenoobgam3r190
      @thenoobgam3r190 4 years ago

      Edge TPU Compiler version 14.1.317412892
      Invalid model: detect.tflite
      Model not quantized

    • @thenoobgam3r190
      @thenoobgam3r190 4 years ago

      Please help!
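[Editor's note] The "Model not quantized" error in the thread above means the .tflite file still contains float weights; the Edge TPU compiler only accepts fully integer-quantized models. Below is a minimal sketch of post-training full-integer quantization. The tiny Keras model and the random calibration data are placeholders for illustration only — in practice you would convert your trained detection model and feed real preprocessed images through the representative dataset.

```python
import tensorflow as tf

# Tiny stand-in model; in practice you would convert your trained detection model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(4, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
])

def representative_data_gen():
    # Calibration samples so the converter can pick integer ranges.
    # Replace the random tensors with real preprocessed images.
    for _ in range(10):
        yield [tf.random.uniform([1, 64, 64, 3])]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Force full-integer quantization so edgetpu_compiler accepts the model
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open('detect_quant.tflite', 'wb') as f:
    f.write(tflite_model)
```

The resulting detect_quant.tflite can then be uploaded to the Colab compiler linked earlier in the thread.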

  • @protonlogy4198
    @protonlogy4198 4 years ago

    Hi, sorry, I'm quite new to TensorFlow. Is there a way to output the object detection results and their confidence percentages to a text file? Thanks
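[Editor's note] Writing detection results to a text file, as asked above, only takes a few lines. In this sketch the `detections` list is made-up example data; in the TFLite detection scripts you would build it each frame from the labels and scores read out of the interpreter's output tensors.

```python
# Made-up example detections: (label, confidence score from 0.0 to 1.0)
detections = [("person", 0.75), ("dog", 0.50)]

# Append mode ('a') could be used instead to accumulate results across frames.
with open('detections.txt', 'w') as f:
    for label, score in detections:
        # Write each detection as "label: percentage"
        f.write(f"{label}: {score * 100:.1f}%\n")
```

This produces a detections.txt containing one line per detected object, e.g. "person: 75.0%".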

  • @ちゃーあっとほーむ
    @ちゃーあっとほーむ 3 years ago

    Very cute!!

  • @falseee4445
    @falseee4445 4 years ago +1

    Hello, is this the same as the Dev Board from Coral? I live in France, and the USB Accelerator can't be shipped to where I live.

    • @EdjeElectronics
      @EdjeElectronics  4 years ago

      I don't know much about the Dev Board, so I'm not sure. They have some good information on the website about what kind of projects you can do with it (see link). I didn't know they couldn't ship to some places in France 😞. Are there any EU websites you can purchase the USB Accelerator from? coral.ai/docs/dev-board/get-started/

    • @falseee4445
      @falseee4445 4 years ago

      @@EdjeElectronics Hm, I tried a lot of EU websites and they all said it wasn't shippable, except one that said they would email me if they can ship to my country. If they don't, I guess I'll order the Dev Board and use the Coral docs. Thanks!

  • @saexpat
    @saexpat 9 months ago +1

    Can you add a second/third camera?

  • @deepakjose9889
    @deepakjose9889 3 years ago

    Sir, is there any video on using the Intel NCS2 instead of the Coral USB Accelerator? The Coral USB Accelerator has been out of stock for over 4 months now.

  • @kumruyont6071
    @kumruyont6071 3 years ago

    Hello, I'm trying to use an LED for person detection. I installed RPi.GPIO for my Python 3, but I get 'ModuleNotFoundError: No module named 'RPi''. Then I tried installing RPi.GPIO in the virtual environment, but I get the same error. Can you please help me with this problem?

  • @ArvindJuneja
    @ArvindJuneja 1 year ago

    Did you do the traffic counter?

  • @voldemore6300
    @voldemore6300 3 years ago

    Can we just add "--edgetpu" to the end of any script's command once the Coral USB is completely installed?

  • @pranitpokhrel470
    @pranitpokhrel470 4 years ago

    No module named "edgetpu"! Is this some path error? I have installed edgetpu and I can see the folder where it's installed.

  • @jnn01972
    @jnn01972 4 years ago +1

    👏

  • @allcore5172
    @allcore5172 4 months ago

    Would this help with an all-sky camera?

  • @chinmayrane7830
    @chinmayrane7830 3 years ago

    Hi, have you tried the intel neural compute stick 2?

  • @basavarajgangadhar7521
    @basavarajgangadhar7521 2 years ago

    Hi, is there any video for Coral on Windows?

  • @Doomslayer151
    @Doomslayer151 4 months ago

    I think you can use an RTX 2050 for this job. It has Tensor Cores alongside its CUDA cores, is way cheaper (gives 2x more performance than a same-priced TPU), and is way more effective (the hardest part is waiting for NVIDIA drivers to become available for Linux ARM).

    • @azimsametergin9954
      @azimsametergin9954 2 months ago +1

      It seems logical, but isn't power consumption one of the most important things in such embedded systems?

  • @thenoobgam3r190
    @thenoobgam3r190 4 years ago +2

    I have a quick question: with the Coral USB Accelerator, why am I only getting 10 FPS? Note that before the Coral Accelerator I was running at 0.9 FPS. I have a Raspberry Pi 4 with 4 GB of RAM. I don't know why it is running so poorly.

    • @alanhunter351
      @alanhunter351 4 years ago +3

      I had the same issue, and I doubled my frame rate by using a high-quality USB 3 cable. The cable that came with my Coral USB accelerator may have been damaged or defective.

  • @OpenYoureyes304
    @OpenYoureyes304 2 years ago

    When I run it with edgetpu, mine is not accurate. Please help.

  • @justinjanes3431
    @justinjanes3431 4 years ago +1

    FYI, Coral has a price drop right now, circa July 2020.

  • @cbt0949
    @cbt0949 3 years ago

    Where is the whole directory?

  • @Den-Geist-Befreien
    @Den-Geist-Befreien 11 months ago

    Has anyone experimented with the OAK-D Lite camera with this?

  • @alecmunguia9688
    @alecmunguia9688 3 years ago

    Can you do this with a Jetson Nano 2GB? :)

  • @wentaozhong6184
    @wentaozhong6184 4 years ago

    Hi, I followed the instructions, but why did my FPS only increase to 10?

    • @contractorwolf
      @contractorwolf 4 years ago

      Hey Wentao, look at my question (I have the same issue). I found that my RPi was failing the SD Card Speed Test, and that may affect the possible FPS when running it. I am interested to see if your SD card fails the same test as mine; that might be the issue? Let me know. The test is here: Menu > Accessories > RPI Diagnostics > SD Card Speed Test

    • @Spreme91
      @Spreme91 4 years ago

      @@contractorwolf Could you find a performance difference after changing the SD card? Because mine is stuck at 10 FPS as well.

    • @contractorwolf
      @contractorwolf 4 years ago

      @@Spreme91 It did not seem to help me. It passes all diagnostic tests now with the faster card, but I can still only achieve ~18 FPS with the cable that comes with the Coral. I made changes to his original code for my app that drop my performance down to about 12 FPS (doing calculations to find the largest identified object and writing a bunch of data to a tiny TFT). Not the greatest performance, but good enough for my project. Let me know what you are doing or if you find any tweaks that help. @contractorwolf on Twitter

  • @KadekSukaAstawa
    @KadekSukaAstawa 1 year ago +1

    Would you be willing to sell your USB Accelerator to me? I've looked everywhere and it's always out of stock.

  • @marsrocket
    @marsrocket 4 years ago

    Why not wait to plug in the Coral until after the libraries are installed? This isn't Windows, where drivers will be auto-installed.

  • @Ashukr711
    @Ashukr711 4 years ago +1

    Hey, can you please tell me if it's possible to convert the detected object into speech? That would help, as I'm trying to do this for a final-year project.

  • @davidploetz2796
    @davidploetz2796 4 years ago

    Hey, can Edje Electronics contact me directly? My inquiry is pertaining to OmniVision OV6948.

  • @EbrahimHasan
    @EbrahimHasan 1 year ago

    Is it possible to use it on a Windows machine, sir?