How to Deploy a Tensorflow Model to Production

  • Published 19 Sep 2024

COMMENTS • 196

  • @atikkhatri6942
    @atikkhatri6942 7 years ago +1

    I dived into the world of ML using scikit-learn and now I am learning TensorFlow. I searched a lot about deploying models, but I was having a hard time understanding the whole mechanism. I really appreciate your effort; this is the best content on ML deployment on YouTube 👍🏻

  • @radosccsi
    @radosccsi 7 years ago +11

    I made a model in Keras. Installed Keras and TensorFlow on an AWS instance in a virtualenv, created a single Python process listening to RabbitMQ with Pika, and used Flask over WSGI to put messages on the queue. The HTML client uploads a photo and is returned an ID; it then requests the info for that ID from the server at one-second intervals. Works fine, and the queuing is kind of bulletproof since it's running on a small CPU instance :)
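
    A minimal sketch of the producer side of that setup (the queue name "photo_jobs" and the upload path are hypothetical; assumes RabbitMQ is reachable on localhost):

    import json
    import uuid

    import pika
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/upload", methods=["POST"])
    def upload():
        # Save the uploaded photo and hand the work off to the queue.
        photo = request.files["photo"]
        job_id = str(uuid.uuid4())
        path = "/tmp/" + job_id + ".jpg"
        photo.save(path)

        # Publish a message that a separate worker process (running the Keras
        # model) consumes; the worker writes its prediction under job_id.
        connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        channel = connection.channel()
        channel.queue_declare(queue="photo_jobs", durable=True)
        channel.basic_publish(exchange="", routing_key="photo_jobs",
                              body=json.dumps({"id": job_id, "path": path}))
        connection.close()

        # The client then polls an endpoint such as /result/<job_id> once a second.
        return jsonify({"id": job_id})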

    • @nourhacker3734
      @nourhacker3734 7 years ago +1

      Hey rad, sounds very interesting. Where do I learn how to do this?

    • @altairpearl
      @altairpearl 7 years ago

      rad, RabbitMQ. I have heard about it and thought of using it.

    • @SirajRaval
      @SirajRaval  7 years ago

      very cool

    • @shreyanshvalentino
      @shreyanshvalentino 7 years ago

      That's awesome!

  • @q-leveldesign5342
    @q-leveldesign5342 7 years ago

    Thank you, I have been wondering what to do with a model once trained. No one seems to be talking about this and it seems like a very important step. And yes, I have been searching furiously to figure it out. Thanks again.

  • @michaelbell6055
    @michaelbell6055 6 years ago

    Siraj... my dude, yours are the shoulders I am standing on in my job. Thank you so much for all the incredible tutorials and additional resources!!!

  • @arjunsinghyadav4273
    @arjunsinghyadav4273 7 years ago +26

    Hey Siraj, firstly, great video.
    Request: a tutorial on how to build a deployed deep learning model that learns from live data and updates itself to a new version.

    • @2500204
      @2500204 5 years ago

      Just load the model and do model.fit(new_data), then overwrite the file using model.save() or whatever save function you are using.
      Incremental learning is the best solution for continuously updating models with new data.
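
      A minimal sketch of that, assuming a Keras model saved at a hypothetical path "model.h5" and new data already prepared as NumPy arrays (new_x, new_y are placeholders):

      from tensorflow import keras

      # Load the previously trained model from disk.
      model = keras.models.load_model("model.h5")

      # Continue training on the newly collected data; the existing weights are updated in place.
      model.fit(new_x, new_y, epochs=1, batch_size=32)

      # Overwrite the saved file so the serving side can pick up the new version.
      model.save("model.h5")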

    • @bharatsahu1599
      @bharatsahu1599 4 years ago

      @Shashwat Don't you think it will take a lot of time to retrain with the new data included? The user won't be waiting till infinity for results.

  • @jijojohn5168
    @jijojohn5168 7 years ago +35

    Long story short, Siraj earned around 864.84 dollars this month, lol. Go to 35:40. He deserves a lot more. Keep up the good work.

    • @stephk8316
      @stephk8316 7 years ago +1

      jijo john, not bad for a side job, and well deserved!

    • @tamgaming9861
      @tamgaming9861 7 years ago +6

      He deserves a lot more. I wish him the best!

    • @SirajRaval
      @SirajRaval  7 years ago

      ha! that slipped through. cool. i'll keep it there. transparency ftw

    • @chicken6180
      @chicken6180 7 years ago

      i mean, does he not deserve it?

    • @theempire00
      @theempire00 7 years ago

      Damn, imagine what those youtubers with millions of followers earn...

  • @vijayabhaskar-j
    @vijayabhaskar-j 7 years ago +1

    I always wondered "Ok, I created a model, now what?". Thanks, Siraj!

  • @angelomenezes6044
    @angelomenezes6044 7 years ago

    Man, you are really underrated! You deserve a lot for these great videos about ML. A big thanks from Brazil for the awesome work!!!

  • @adamyatripathi2743
    @adamyatripathi2743 7 years ago +76

    His notebook is Untitled... He chose the dark path....

    • @SirajRaval
      @SirajRaval  7 years ago +11

      renamed it to demo now, so much more content coming

    • @adamyatripathi2743
      @adamyatripathi2743 7 years ago +4

      Siraj Raval Your videos are good! May the force be with you...

    • @breakdancerQ
      @breakdancerQ 5 years ago

      @@adamyatripathi2743 Naming notebooks is for noobs

  • @KelvinMeeks
    @KelvinMeeks 6 years ago +1

    Siraj, excellent tutorial - thanks for creating this.

  • @mercolani1
    @mercolani1 6 years ago

    Loved the video, love the energy, he clearly has a deep understanding

  • @theempire00
    @theempire00 7 years ago +8

    24:18 When I run the command:
    'docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .'
    I get an error:
    'invalid argument "/tensorflow-serving-devel" for t: invalid reference format'
    Help? (On Windows 7, Docker Toolbox)
    UPDATE: The following does work:
    'docker build --pull -t tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .'

  •  7 years ago +1

    Love your teaching :) Keep it up☺

  • @andresvourakis6880
    @andresvourakis6880 7 years ago

    Your explanation was on point!! Thank you Siraj

  • @AbhishekKrSingh-ls5xu
    @AbhishekKrSingh-ls5xu 7 years ago +3

    Hey Siraj, firstly, great video.
    Request: can you post a tutorial on TensorFlow distributed training on GPUs and Kubernetes?

  • @svin30535
    @svin30535 7 years ago

    Great topic! Thanks Siraj.

  • @aug_st
    @aug_st 7 years ago

    Very useful. Thanks Siraj!

  • @gattra
    @gattra 5 years ago +1

    Please rehearse more and these would be 10000% better

  • @genricandothers
    @genricandothers 7 years ago

    I barely ever comment on videos but I have got to show love for all I've learned on your channel. I've been recommending you to everyone I can find. What software do you use to do the screen background with you in the foreground by the way? I want to start a channel teaching atmospheric science and I like this style...

  • @abhiwins123
    @abhiwins123 7 years ago

    Thanks for the end-to-end TensorFlow tutorial. The world applauds you for the AI revolution.

  • @eliassocrates338
    @eliassocrates338 7 years ago

    Siraj, could you please upload the weights of the models you trained as well, as neither online nor personal training of the models is financially viable for me.

  • @captainwalter
    @captainwalter 4 years ago

    I honestly don't get how to employ the model. At what stage do we use the neural net to make decisions about actionable data, in this case to see it decode the words?

  • @AlienService
    @AlienService 7 years ago

    Thank you for these. I've learned a lot already. The big question and use case that I'm interested in is using ML in Blender. The goal would be to create a Blender add-on that could be trained on and manipulate the mesh of a character model. With Blender and its add-ons all written in Python, this seems doable. The mesh data can be accessed through the Blender Python API pretty easily. My question is how best to set up a system that would take a character mesh (thousands of vertex coordinates), train a model on meshes with a "happy" shape key on each one, and then be able to create a shape key on a new character mesh that also produces a happy expression.

  • @Oneillphotographyithaca1
    @Oneillphotographyithaca1 6 years ago

    So cool! This is inspiring me to make some models. :)

  • @thoughtsmithinnovation5432
    @thoughtsmithinnovation5432 7 years ago

    Hi Siraj, you mentioned at 28:00 that Inception has hundreds of layers. If I am not wrong, it presently has only 48 layers. Please correct me if I am wrong or if you were referring to something else.

  • @abdelhaktali
    @abdelhaktali 6 years ago

    Hi Siraj,
    I trained a Keras model using ImageDataGenerator and flow_from_directory. When I deploy it in TensorFlow Serving I get the wrong class, due to shuffle=True in flow_from_directory. How can I resolve this problem?
    Thanks

  • @igorpoletaev8188
    @igorpoletaev8188 7 years ago +1

    I was very surprised that Bazel took a very long time to build my custom serving client... Does it need to compile so many sources every time I change the client code?

  • @JabaBanik
    @JabaBanik 7 years ago

    This is amazing, thanks Siraj. Since we are talking about production, can you please suggest the server configuration required for TensorFlow Serving?

  • @afshananwarali9462
    @afshananwarali9462 6 years ago

    Thanks for this. It works for me.

  • @Hustada
    @Hustada 7 years ago

    Thanks for sharing this. I've been wondering how to do this.

  • @xtr33me
    @xtr33me 7 years ago +2

    Thanks so much for this vid! Could you by chance in the future do the same thing, but for something custom like a TensorFlow model that simply adds two floats and returns the response? The reason I ask is that I have been having a big problem trying to figure out how to set up a custom model for serving with regard to configuring the proto files and client.

  • @ProfessionalTycoons
    @ProfessionalTycoons 5 years ago

    very good video

  • @adesojialu1051
    @adesojialu1051 3 years ago

    I am working on image classification and my model is in TFLite. How do I deploy it? Do I need to change anything from your video tutorial?

  • @themakeinfo
    @themakeinfo 7 years ago +2

    Hi @siraj, could you please explain how to deploy a Keras model to production?

  • @sig7813
    @sig7813 4 years ago

    If I use a saved scaler function from sklearn for the input data, can that be loaded onto the server along with the model?
    Basically, before the model is called I have to apply that function to every input.
    I had to use a scaler since I have many inputs and they are very different: one can be in a range of 1-3, another 50000-1000000. For that I used StandardScaler from sklearn and it works great. To get the right prediction I have to apply it to the new incoming data.
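
    A minimal sketch of one way to do that, assuming the scaler was fitted at training time and persisted with joblib (the file names are hypothetical):

    import joblib
    import numpy as np
    from tensorflow import keras

    # At training time: fit StandardScaler on the training inputs, then
    # joblib.dump(scaler, "scaler.joblib") next to the model artifact.

    # At serving time: load both artifacts once, at startup.
    scaler = joblib.load("scaler.joblib")
    model = keras.models.load_model("model.h5")

    def predict(raw_features):
        # Apply exactly the same scaling the model saw during training.
        x = scaler.transform(np.asarray(raw_features, dtype=np.float32).reshape(1, -1))
        return model.predict(x)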

  • @bibhu_107
    @bibhu_107 6 years ago +1

    To build the Docker image, use:
    sudo docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
    To run it:
    sudo docker run --name=tensorflow_container -it $USER/tensorflow-serving-devel

  • @MrSanselvan
    @MrSanselvan 6 years ago

    @Siraj: Can we train models and deploy them incrementally? Does TF Serving support multiple smaller models? If yes, how can we do it? I cannot find any help on the internet.

  • @kevinwong322
    @kevinwong322 6 years ago

    such a helpful video!

  • @CKSLAFE
    @CKSLAFE 6 years ago +6

    So sad this tutorial is broken now; they changed the GitHub repository. Now you don't have the tensorflow folder inside serving. If anybody knows of a tutorial please let me know.

    • @afshananwarali9462
      @afshananwarali9462 6 years ago +1

      There is no tensorflow folder inside of serving on github. What should I do?

    • @lakrounisanaa9156
      @lakrounisanaa9156 6 years ago

      Hi, what do you do in this case? I face the same issue.

    • @iulia2190
      @iulia2190 6 years ago

      Try to build in the serving directory.

    • @jenlee6693
      @jenlee6693 6 years ago

      You can do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory in the Docker container.

    • @FZ8Yamaha
      @FZ8Yamaha 6 years ago

      According to github.com/tensorflow/serving/issues/755, it looks like we can just skip the 'cd tensorflow' and './configure' steps.

  • @moelgendy_
    @moelgendy_ 7 years ago

    Great video, Siraj! Could you add resources on how to deploy Keras models?

  • @phurien
    @phurien 7 years ago

    Hey Siraj, Love the videos. Question: I am taking the Udacity DL course, and am getting more and more into it and plan to continue on to make a career out of this. Would you recommend I switch over to Ubuntu as my primary OS or is it feasible to stay in Windows?

  • @sandhyakale9054
    @sandhyakale9054 4 years ago

    Why do we want to train the model? I want to deploy my chatbot on a website. Can you tell me how?

  • @Superjeka1979
    @Superjeka1979 7 years ago

    Hi Siraj, nice video! But I'm a bit confused about classification_signature and predict_signature in the MNIST example. Should I use both of them? Is there any difference between them? Why is the classification signature's input a string, etc.? Or is it just an example showing that I can use a number of signatures to query a single model?
    Thank you.

  • @shivajidutta8472
    @shivajidutta8472 7 years ago

    I think an alternative would be to deploy the model in your code directly rather than calling a REST API. I have a model running on my iPhone and I don't see performance issues. The new chipsets are getting more and more powerful.
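
    A minimal sketch of that in-process style of inference using the TensorFlow Lite interpreter (the model path is hypothetical; on iOS the same idea goes through the TFLite or Core ML runtime instead):

    import numpy as np
    import tensorflow as tf

    # Load a converted .tflite model and run it directly inside the application process.
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Dummy input with the shape and dtype the model expects, just to show the call sequence.
    x = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()
    prediction = interpreter.get_tensor(output_details[0]["index"])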

  • @wasimnadaf11
    @wasimnadaf11 5 years ago

    super informative:)

  • @harshmunshi6362
    @harshmunshi6362 7 years ago +2

    I guess you have shared enough knowledge for someone to start a company :/

  • @jenlee6693
    @jenlee6693 6 years ago

    There is no /tensorflow folder to run 'configure' in, as Google has taken it out. The configure step is no longer required, according to Google's latest issue response. Just do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory (of course without the quotes).

  • @afshananwarali9462
    @afshananwarali9462 6 years ago

    Please share the link to part 2 of this tutorial, for pushing this to the cloud.

  • @machartpierre
    @machartpierre 7 years ago

    Hey Siraj! Thanks a lot for all this amazing content.
    I am working on generative models for symbolic (MIDI) music sequences. Your videos on the topic have been very useful.
    However, I'm intending to run the inference / generation part on a mobile device (iOS). I am using TensorFlow and things seem to gradually improve (more functions, more support, more documentation), but I still find it very tricky to port the model to the device (strip the unused / unsupported nodes, optimize, port the generation scripts, etc.).
    Even porting the fairly simple RBM model you used for one of your videos is challenging. Any suggestion on that?
    Given that running inference on mobile devices is becoming a trend, would you care to make a video about it?

  • @larryteslaspacexboringlawr739
    @larryteslaspacexboringlawr739 7 years ago

    thank you for tensorflow video

  • @xPROxSNIPExMW2xPOWER
    @xPROxSNIPExMW2xPOWER 7 years ago

    lol need this in about two weeks thanks for a dank upload siraj!!!! really hope I dont run into that docker problem you had, I have over 20 docker images I think. lol 27:00 building custom linux kernels amirite lol

  • @ttwan690
    @ttwan690 6 years ago

    May the force be with you

  • @pietart3596
    @pietart3596 6 years ago

    Stupid question: are we using the MNIST model here, or the ImageNet model?

  • @bhisal
    @bhisal 6 years ago

    What's the advantage of serving a model using TF Serving compared to a REST API?

  • @fabregas1291
    @fabregas1291 7 years ago

    Hi, how could we use this approach of deploying a TensorFlow model to production for an Inception model re-trained using transfer learning?

  • @yashsrivastava677
    @yashsrivastava677 6 years ago +2

    How can one do incremental training of models already deployed to serving?

    • @debu2in
      @debu2in 4 years ago

      I think once you have accumulated the data, you can wrap the phases of the model training in functions, then those functions in a class, and trigger the class to train the model, persist the model to disk, and save the path in a DB. At least this is how I do it :)
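
      A minimal sketch of that pattern (class, method, and path names are hypothetical; the database write is stubbed out):

      import datetime
      import os

      from tensorflow import keras

      class Retrainer:
          """Wraps the retrain -> persist -> register steps described above."""

          def __init__(self, model_path):
              self.model = keras.models.load_model(model_path)

          def train(self, x, y, epochs=1):
              self.model.fit(x, y, epochs=epochs)

          def persist(self, export_dir="models"):
              # Version the artifact by timestamp so older versions stay around.
              os.makedirs(export_dir, exist_ok=True)
              stamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
              path = export_dir + "/model-" + stamp + ".h5"
              self.model.save(path)
              return path

          def register(self, path):
              # Placeholder for recording the new artifact path wherever the
              # serving layer looks it up (a database row, a config file, etc.).
              print("new model registered at", path)

      # retrainer = Retrainer("model.h5")
      # retrainer.train(new_x, new_y)
      # retrainer.register(retrainer.persist())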

  • @jenlee6693
    @jenlee6693 6 years ago

    After uncompressing the Inception model, run 'bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=inception-v3 --output_dir=inception-export', as the command in the tutorial is old and no longer works.

  • @bhushanvernekar5121
    @bhushanvernekar5121 7 years ago

    I am not able to find a step-by-step procedure for how to work with TensorFlow in Android Studio.

  • @LeksaJ4
    @LeksaJ4 7 years ago

    Hi Siraj, thank you so much for the videos. The bazel build failed with some error and I am going to try it again tomorrow (it might be a problem with not enough memory for Docker). However, I am kind of lost with Docker and containers. Now when I shut it down, how do I get back to the step where I can run bazel build etc.? Thank you.

  • @MrKemusa
    @MrKemusa 6 years ago

    How would one go from building tensorflow in docker on a local CPU without CUDA support and then deploying the container to a GPU instance in the cloud with CUDA support? Would I need to build tensorflow again when I deploy the docker container to the GPU and just enable CUDA support there? Or is there a way to have CUDA support on my CPU and maintain that when I deploy the container?

  • @EpicMicky300
    @EpicMicky300 5 years ago

    what's the difference between a docker image and a simple executable file?

  • @simpleman5098
    @simpleman5098 7 years ago

    Hey Siraj, what software do you use to make those images like on 2:34 or 11:46 etc?

  • @bilalchandio1
    @bilalchandio1 3 years ago

    I am having an issue while deploying my deep learning model in .h5 format on Flask. It works fine on my local machine; however, it has issues on my pythoneverywhere hosting server.

  • @udaysah8038
    @udaysah8038 5 years ago

    I am currently facing a problem deploying my custom models where my image data is located on my local computer. Can you make a video on how to deploy custom models where the image data is on the local computer, save the models, and deploy them to Android devices?

  • @alexp5693
    @alexp5693 7 years ago

    Hello. I hope you will answer, as it's really important for me. I'm currently working on a project and my task is to generate meaningful, unique text from a set of keywords. It doesn't need to be long, at least a couple of sentences. I'm pretty sure I have to use an LSTM, but I cannot find any good examples of generating meaningful text. I have seen a few randomly generated ones, but that's all. I would be grateful for any advice. Thank you in advance.

  • @wahi_wahi
    @wahi_wahi 6 years ago

    When I run "bazel build -c ..",
    I get "no targets found beneath 'tensorflow_serving'".

  • @justinviola2479
    @justinviola2479 5 years ago

    How can we take that JSON output and have it display bounding boxes in the browser?

  • @saitaro
    @saitaro 7 years ago

    Siraj, if I wanna write an ML algorithm and make a web app based on it, would learning Django be useful for this task?

  • @heathervica1108
    @heathervica1108 7 years ago

    Awesomeeeeee.
    Hello guys, do you know if it is possible to use:
    • Variational Autoencoders (VAEs) or
    • Generative Adversarial Networks (GANs)
    for structured data? I have seen some examples, but only for unstructured data such as images, audio, etc. Do you have any example with structured data? Thanks a lot.

  • @tonydenion3557
    @tonydenion3557 7 years ago

    Nice vid, man! Do you like C? (Didn't see any vids about it :D)
    I would like to know more about the TensorFlow C API.
    Thanks a lot for all the knowledge you share.

    • @cameronfraser4136
      @cameronfraser4136 7 years ago +1

      My understanding is that the TensorFlow C API wasn't designed to be used for production directly. If you want to deploy a model in C/C++, consider writing the inference from scratch; it's not as bad as it sounds (inference is much simpler than training). Deep networks are mostly just a series of matrix multiplies.
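
      A minimal sketch of that idea in NumPy (the weights below are random placeholders standing in for whatever the trained model exported):

      import numpy as np

      def relu(x):
          return np.maximum(x, 0.0)

      def forward(x, layers):
          # Each layer is just a matrix multiply plus a bias and a nonlinearity.
          for w, b in layers[:-1]:
              x = relu(x @ w + b)
          w, b = layers[-1]
          return x @ w + b  # final logits

      # Placeholder parameters with made-up shapes (784 -> 128 -> 10).
      layers = [(np.random.randn(784, 128), np.zeros(128)),
                (np.random.randn(128, 10), np.zeros(10))]
      logits = forward(np.random.randn(1, 784), layers)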

    • @SirajRaval
      @SirajRaval  7 years ago +2

      more tf vids coming thx

    • @tonydenion3557
      @tonydenion3557 7 years ago

      ty for answer, world gonna change thanks to guys like you ;)

  • @Vijaykumar-jx8jq
    @Vijaykumar-jx8jq 5 years ago

    Hey Siraj, I have created an image classifier in Docker and now I want to integrate it into a system which is written in Python. How can I do that?

  • @souuu42
    @souuu42 6 years ago

    The process crashes when I try to create the Docker image; it goes on for about 10 minutes and then everything freezes. Any idea why? I have an Intel i5 processor.

  • @akashtripathi5947
    @akashtripathi5947 7 years ago

    Can you please explain how I can build and serve a CNN model using Deeplearning4j in Java?

  • @matrixzoo8434
    @matrixzoo8434 6 years ago

    Does this mean that in order to make an ML web app I don't have to learn Django or any other Python web framework, I could just use TensorFlow?

  • @600baller
    @600baller 6 years ago

    If I have an existing TF model and I trained my data with train_test_split, what do I do if I want to see my model's predictions on the entire dataset (including the original training and testing data)?

  • @sathyasarathi90
    @sathyasarathi90 7 years ago +1

    Siraj, I wonder if a similar strategy can be used to deploy a scikit-learn model?

    • @SirajRaval
      @SirajRaval  7 years ago +2

      Absolutely: loads.pickle.me.uk/2016/04/04/deploying-a-scikit-learn-classifier-to-production/
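
      The general shape of that approach, as a minimal sketch rather than the linked article's exact code (file and route names are made up):

      import pickle

      from flask import Flask, jsonify, request
      from sklearn.datasets import load_iris
      from sklearn.linear_model import LogisticRegression

      # Train and pickle the classifier once, offline.
      X, y = load_iris(return_X_y=True)
      clf = LogisticRegression(max_iter=1000).fit(X, y)
      with open("clf.pkl", "wb") as f:
          pickle.dump(clf, f)

      # Serve the pickled classifier behind a tiny HTTP endpoint.
      app = Flask(__name__)
      with open("clf.pkl", "rb") as f:
          clf = pickle.load(f)

      @app.route("/predict", methods=["POST"])
      def predict():
          features = request.get_json()["features"]  # e.g. [5.1, 3.5, 1.4, 0.2]
          return jsonify({"class": int(clf.predict([features])[0])})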

  • @AaronSarkissian
    @AaronSarkissian 7 years ago +1

    I don't get this part: 32:08. How did that bazel command work outside of Docker?

    • @prarthana1122
      @prarthana1122 6 years ago

      Same question... the bazel command didn't work in my Docker either. How did he do it? Could you please tell us, Siraj?

  • @shreyanshvalentino
    @shreyanshvalentino 7 years ago +2

    the only useful video that you have uploaded till date!

    • @SirajRaval
      @SirajRaval  7 years ago +3

      thx what else would be useful?

    • @shreyanshvalentino
      @shreyanshvalentino 7 years ago

      I was probably too excited when I typed that, hence the exaggeration!
      You probably don't want suggestions from a crappy coder like me.
      However, as much as I love your other tutorial videos, which are informative too, they are restricted to Jupyter notebooks.
      There is no way to send the information processed there anywhere a common person can use it.
      I started learning Django and RabbitMQ, thinking that was the only way to provide an interface to TensorFlow.

    • @shreyanshvalentino
      @shreyanshvalentino 7 years ago +1

      Also, I am not sure whether we used the MNIST numeral-recognition classifier in your Docker container.
      Why did we not use that, and instead use Inception?
      Edit: no need to answer, it got answered at 29:48.

    • @MrKemusa
      @MrKemusa 6 years ago

      Something else that could be useful: videos that showcase how to tailor out-of-the-box tutorials (e.g. the MNIST tutorial) to a completely different use case where the model is still useful (e.g. something with a dataset we've built from scratch). Sometimes there's friction going from these templates to your own use case. Eventually I figure it out, but it would be nice to have key things to consider when going from one use case to the next.

    • @Neonb88
      @Neonb88 5 years ago

      If you want more detailed tutorials, look at Melvin L. He's really good with step-by-step solutions

  • @theophilusananias1416
    @theophilusananias1416 7 years ago

    Siraj, Please, put together a video tutorial on how to generate an Image from Text with TensorFlow. (Text to Image)

  • @lotfiraghib7029
    @lotfiraghib7029 7 years ago

    Hello Siraj, firstly thank you for this great video. I trained a model in Python, then saved it with tf.train.Saver to generate my checkpoint. I want to load this model in C++; is there a way to do that?

  • @st0ox
    @st0ox 5 years ago

    "we have to deal with C++" count me in :DD

  • @deepanshuchoudhary4598
    @deepanshuchoudhary4598 4 years ago

    Come back buddy, we miss you!

  • @ShepardEffekt
    @ShepardEffekt 7 years ago

    Was waiting for this

  • @hussain5755
    @hussain5755 7 years ago

    Siraj, can you please recommend a book to get started on ML? Your videos are great but I am having a hard time grasping the concepts.

    • @SirajRaval
      @SirajRaval  7 years ago +1

      Deep Learning by Bengio

  • @vibhanshusharma3150
    @vibhanshusharma3150 7 years ago

    Any video on image localisation

  • @mockingbird3809
    @mockingbird3809 6 years ago

    Hey Siraj, you didn't teach what I wanted. I want to know how to deploy a model to the web. If you have that video please share it with me, or please reply with how to do that.

  • @adesojialu1051
    @adesojialu1051 3 years ago

    Please can I have a copy of your pipeline, or how do I build mine?

  • @wafaayad5899
    @wafaayad5899 7 years ago

    failed: gcc failed: error executing command /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -B/usr/bin -B/usr/bin -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG ... (remaining 106 argument(s) skipped): com.google.devtools.build.lib.shell.BadExitStatusException: Process exited with status 4.
    :( I can't get rid of this error. I'm new to ML/DL and that's what I get as my welcome message. Can anyone help, please?

    • @fabregas1291
      @fabregas1291 7 years ago

      You are likely running out of memory. Try reducing the number of parallel builds by passing '--local_resources 2048,.5,1.0', which instructs bazel to spawn no more than one compiler process at a time.

  • @zoranrazarac
    @zoranrazarac 7 years ago

    bazel-bin/tensorflow_serving/example/inception_export: No such file or directory
    Now what?

  • @Gerald-iz7mv
    @Gerald-iz7mv 6 years ago

    how can you upload new models at runtime?

  • @bibhu_107
    @bibhu_107 6 years ago +1

    #update
    The tensorflow submodule has been removed. You should no longer have to run TensorFlow's configure script manually

  • @RowdyReview
    @RowdyReview 5 years ago

    Hi Siraj, thanks for the great video.
    Please help me fix this issue. I have my own model; here I am using the faster_rcnn_inception_v2_pets.config architecture, and I currently have trained checkpoints.
    But whenever I export the checkpoints using the command below
    bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=my-model6 --export_dir=inception-export
    I get the error below:
    DataLossError (see above for traceback): Unable to open table file my-model6/model.ckpt-21292: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator?
    [[Node: save/RestoreV2_34 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2_34/tensor_names, save/RestoreV2_34/shape_and_slices)]]
    Here we have TF=1.4 and Bazel=0.5.4.
    While training I got checkpoints like
    model.ckpt-21292.data-00000-of-00001
    model.ckpt-21292.meta
    model.ckpt-21292.index
    and I renamed the above checkpoints to model.ckpt-21292.
    I followed your video, where you download a pre-trained model.
    My question is: we both have the same type of checkpoints, so why am I getting the above error?
    Thank you

    • @RowdyReview
      @RowdyReview 5 years ago

      I found the solution.
      Hello all, just follow the video below and export your own model within 10 seconds:
      ua-cam.com/video/w0Ebsbz7HYA/v-deo.html

  • @ChicagoBob123
    @ChicagoBob123 7 years ago

    Not really helping. Thought it would.
    I have seen SEVERAL videos online about training, how to train, etc. BUT if I have a mini PC onboard a device and I want to use the trained data, HOW do I do that?
    What does training produce? There is NO CLARITY in any of the videos on the ACTUAL workflow of what training produces and how to re-purpose the results across platforms. Let's take cars for example. Small computers on board. HOW do I run a trained CNN for detection of road signs? I don't want to train it, I just want to RUN it, because it will go on thousands of cars.
    Do you have ANYTHING that helps explain how to train something, and then LOAD and use the trained data on a machine?
    NOT a web-connected thing.

  • @rociogarcialuque6988
    @rociogarcialuque6988 4 years ago +3

    "If Google can use it, we can use it." is so 2017.

  • @drdeath2667
    @drdeath2667 4 years ago

    cardigan lol. inception network is savage

  • @johnnychan6755
    @johnnychan6755 7 years ago +1

    Has anyone got an error like this, at the bazel build step? (run on Macbook Pro, OSX 10.11.6, via Docker method. With bazel 0.5.4 in Dockerfile)
    ERROR: /root/.cache/bazel/_bazel_root/f8d1071c69ea316497c31e40fe01608c/external/org_tensorflow/tensorflow/core/kernels/BUILD:2904:1: C++ compilation of rule '@org_tensorflow//tensorflow/core/kernels:conv_ops' failed (Exit 4).
    gcc: internal compiler error: Killed (program cc1plus)

    • @johnnychan6755
      @johnnychan6755 7 years ago

      Solved! See this GitHub issue thread (scroll down): github.com/tensorflow/serving/issues/227

    • @charlesaydin2966
      @charlesaydin2966 6 years ago

      Thanks a lot!

  • @charrystheodorakopoulos4843
    @charrystheodorakopoulos4843 4 years ago

    Hi @Siraj, great video.
    Please help with this if you can.
    I want to convert a TF model (only the .pb file is available) into a TFLite model.
    While running this script:
    import tensorflow as tf
    converter = tf.lite.TFLiteConverter.from_saved_model("C:/tmp/")
    tfl_model = converter.convert()
    open("converted_model.lite", "wb").write(tfl_model)
    I get this error:
    Traceback (most recent call last):
    File "C:/Users/user/PycharmProjects/converter/convert.py", line 4, in
    tfl_model = converter.convert()
    File "C:\Users\user\PycharmProjects\converter\venv\lib\site-packages\tensorflow_core\lite\python\lite.py", line 400, in convert
    raise ValueError("This converter can only convert a single "
    ValueError: This converter can only convert a single ConcreteFunction. Converting multiple functions is under development.
    While running this command line:
    tflite_convert --saved_model_dir=C:/tmp --output_file=saved-model.tflite --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE --input_shape=1,416,416,3 --input_array=input \ --output_array=output \ --inference_type=FLOAT \ --input_data_type=FLOAT
    I get this error:
    RuntimeError: MetaGraphDef associated with tags serve could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`

  • @kariuki6644
    @kariuki6644 7 years ago

    Where would i be without you?

  • @limyohwan
    @limyohwan 4 years ago

    On 'docker build --pull -t tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel' I get "docker build requires exactly 1 argument".

  • @AbdulWahid-vn4kp
    @AbdulWahid-vn4kp 6 years ago

    Does anyone have the github.com/tensorflow/serving/ repo from when this video was published? The latest version is missing some files and it's hard to follow from there.
    Thanks.

    • @jenlee6693
      @jenlee6693 6 years ago

      You can do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory in the Docker container.