
How to Deploy a Tensorflow Model to Production 

Siraj Raval
769K subscribers
118K views

Once we've trained a model, we need a way to deploy it to a server so we can use it from a web or mobile app! We're going to use the TensorFlow Serving library to run a model on a server that we can then send HTTP requests to. We'll have the user upload an image, and it will return a classification for that image.
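To make the end result concrete, here is a minimal client sketch for querying a served model over HTTP. It assumes a TensorFlow Serving instance exposing the REST API on port 8501 (available in later releases; the video itself builds a gRPC client), and the model name "inception" and input key "image_bytes" are illustrative placeholders that must match your exported signature.

import base64
import json
import requests

with open("cat.jpg", "rb") as f:  # any local image file
    encoded = base64.b64encode(f.read()).decode("utf-8")

# "image_bytes" must match the input name of the exported serving signature.
payload = {"instances": [{"image_bytes": {"b64": encoded}}]}
resp = requests.post("http://localhost:8501/v1/models/inception:predict",
                     data=json.dumps(payload))
print(resp.json())  # classification scores/labels returned by the served model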
Code for this video:
github.com/llSourcell/How-to-...
Please subscribe! And like. And comment. That's what keeps me going.
More learning resources:
www.tensorflow.org/deploy/tfs...
/ tensorflow-serving-pra...
tensorflow.github.io/serving/...
gist.github.com/avloss/01e43d...
github.com/tensorflow/serving
fdahms.com/2017/03/05/tensorfl...
books.google.com/books?id=rsy...
Join us in the Wizards Slack channel:
wizards.herokuapp.com/
And please support me on Patreon:
www.patreon.com/user?u=3191693
Follow me:
Twitter: / sirajraval
Facebook: / sirajology
Instagram: / sirajraval
Signup for my newsletter for exciting updates in the field of AI:
goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Join my AI community: chatgptschool.io/
Sign up for my AI sports betting bot, WagerGPT! (500 spots available):
www.wagergpt.co

Published: 2 Aug 2024

Comments: 196
@michaelbell6055 5 years ago
Siraj... my dude, yours are the shoulders I am standing on in my job. Thank you so much for all the incredible tutorials and additional resources!!!
@atikkhatri6942 7 years ago
I dived into the world of ML using scikit-learn and now I am learning TensorFlow. I searched a lot about the deployment of models, but I am having a hard time understanding the whole mechanism. I really appreciate your effort, this is the best content on ML deployments on RU-vid 👍🏻
@radosccsi 7 years ago
I made a model in Keras, installed Keras and TensorFlow on an AWS instance in a virtualenv, and created a single Python instance listening to RabbitMQ with Pika, with Flask over WSGI to put messages on the queue. The HTML client uploads a photo and is returned an ID, then it requests the result for that ID from the server at one-second intervals. Works fine, and the queuing is kind of bulletproof since it's running on a small CPU instance :)
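A minimal sketch of the setup described in the comment above, assuming RabbitMQ on localhost; the queue name, route, and field names are illustrative, and the worker that consumes the queue and runs the Keras model is omitted.

import uuid
import pika
from flask import Flask, request, jsonify

app = Flask(__name__)

def publish_job(job_id, image_bytes):
    # push the uploaded photo onto a RabbitMQ queue for the prediction worker
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="predictions", durable=True)
    channel.basic_publish(exchange="", routing_key="predictions",
                          body=image_bytes,
                          properties=pika.BasicProperties(message_id=job_id))
    connection.close()

@app.route("/upload", methods=["POST"])
def upload():
    job_id = str(uuid.uuid4())
    publish_job(job_id, request.files["photo"].read())
    return jsonify({"id": job_id})  # the client then polls for this ID's result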
@nourhacker3734 7 years ago
Hey rad, sounds very interesting. Where do I learn how to do this?
@altairpearl 7 years ago
rad, RabbitMQ. I have heard about it and thought of using it.
@SirajRaval 7 years ago
very cool
@shreyanshvalentino 7 years ago
That's awesome!
@angelomenezes6044 7 years ago
Man, you are really underrated! You deserve a lot for these great videos about ML. A big thanks from Brazil for the awesome work!!!
@vijayabhaskar-j 7 years ago
I always wondered "Ok, I created a model, now what?". Thanks, Siraj!
@jijojohn5168 7 years ago
Long story short, Siraj earned around 864.84 dollars this month, lol. Go to 35:40. He deserves a lot more. Keep up the good work.
@stephk8316 7 years ago
jijo john not bad for a side job, and well deserved!
@tamgaming9861 7 years ago
He deserves a lot more - I wish him the best!
@SirajRaval 7 years ago
ha! that slipped through. cool. i'll keep it there. transparency ftw
@chicken6180 7 years ago
i mean, does he not deserve it?
@theempire00 7 years ago
Damn, imagine what those youtubers with millions of followers earn...
@arjunsinghyadav4273 7 years ago
Hey Siraj, firstly, great video. Request: a tutorial on how to build a deployed deep learning model that learns from live data and updates itself to a new version.
@2500204 5 years ago
Just load the model and do model.fit(new data), then overwrite the file using model.save() or whatever save function you are using. Incremental learning is the best solution for continuously updating models with new data.
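A sketch of that incremental-update idea, assuming a Keras model saved as model.h5; the .npy files stand in for the newly accumulated data.

import numpy as np
from tensorflow.keras.models import load_model

x_new = np.load("x_new.npy")  # placeholder: newly accumulated inputs
y_new = np.load("y_new.npy")  # placeholder: corresponding labels

model = load_model("model.h5")
model.fit(x_new, y_new, epochs=1, batch_size=32)  # continue training from the saved weights
model.save("model.h5")  # overwrite so the serving layer picks up the updated model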
@bharatsahu1599 4 years ago
@Shashwat don't you think it will take a lot of time to retrain with the new data included? The user won't be waiting forever for results.
@q-leveldesign5342 7 years ago
Thank you, I have been wondering what to do with a model once trained. No one seems to be talking about this and it seems like a very important step. And yes, I have been searching furiously to figure it out. Thanks again.
@SirajRaval 7 years ago
np
@mercolani1 6 years ago
Loved the video, love the energy, he clearly has a deep understanding
@KelvinMeeks 6 years ago
Siraj, excellent tutorial - thanks for creating this.
@AbhishekKrSingh-ls5xu 7 years ago
Hey Siraj, firstly, great video. Request: can you post a tutorial on TensorFlow distributed training on GPUs and Kubernetes?
@Hustada 7 years ago
Thanks for sharing this. I've been wondering how to do this.
7 years ago
Love your teaching :) Keep it up☺
@SirajRaval 7 years ago
thx
@xtr33me 7 years ago
Thanks so much for this vid! Could you by chance in the future do the same thing, but for something custom, like a TensorFlow model that simply adds two floats and returns the response? The reason I ask is that I have been having a big problem trying to figure out how to set up a custom model for serving with regard to configuring the proto files and client.
@afshananwarali9462 6 years ago
Thanks for this. It works for me.
@igorpoletaev8188 7 years ago
I was very surprised by the fact that bazel has been building my custom client for serving for a very long time... Does it need to compile so many sources every time I change the client code?
@theempire00 7 years ago
24:18 When I run the command 'docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .' I get an error: 'invalid argument "/tensorflow-serving-devel" for t: invalid reference format'. Help? (On Windows 7, Docker Toolbox)
UPDATE: The following does work: 'docker build --pull -t tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .'
@yassineelb8735 6 years ago
just omit $USER/
@600baller 6 years ago
If I have an existing tf model, and I trained my data with train_test_split, what to do if I want to see the predictions for my model on the entire dataset (including the original training and testing data)?
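One way to do what's being asked here, assuming x_train/x_test and y_train/y_test are the arrays produced by the earlier train_test_split and model is the commenter's already-trained estimator:

import numpy as np
from sklearn.model_selection import train_test_split

# placeholders standing in for the commenter's data; `model` is their trained model
X, y = np.random.rand(100, 4), np.random.randint(0, 2, 100)
x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

x_all = np.concatenate([x_train, x_test])  # stitch the original split back together
y_all = np.concatenate([y_train, y_test])
predictions = model.predict(x_all)         # predictions over the entire dataset
# alternatively, just call model.predict on the full pre-split feature array X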
@Oneillphotographyithaca1 6 years ago
So cool! This is inspiring me to make some models. :)
@abdelhaktali 6 years ago
Hi Siraj, I have trained a Keras model using ImageDataGenerator and flow_from_directory. When I deploy it in TensorFlow Serving I get the wrong class due to shuffle=True in flow_from_directory. How can I resolve this problem? Thanks
@adamyatripathi2743 7 years ago
His notebook is Untitled... He chose the dark path....
@SirajRaval 7 years ago
renamed it to demo now, so much more content coming
@adamyatripathi2743 7 years ago
Siraj Raval Your videos are good! May the force be with you...
@breakdancerQ 5 years ago
@@adamyatripathi2743 Naming notebooks is for noobs
@andresvourakis6880 7 years ago
Your explanation was on point!! Thank you Siraj
@SirajRaval 7 years ago
np
@st0ox 5 years ago
"we have to deal with C++" count me in :DD
@svin30535 7 years ago
Great topic! Thanks Siraj.
@SirajRaval 7 years ago
np
@xPROxSNIPExMW2xPOWER 7 years ago
lol need this in about two weeks thanks for a dank upload siraj!!!! really hope I dont run into that docker problem you had, I have over 20 docker images I think. lol 27:00 building custom linux kernels amirite lol
@SirajRaval 7 years ago
dope u will do fine
@aug_st 7 years ago
Very useful. Thanks Siraj!
@SirajRaval 7 years ago
np
@kevinwong322 6 years ago
such a helpful video!
@JabaBanik 7 years ago
This is amazing, thanks Siraj. Since we are talking about production level, can you please suggest the server configuration required for TensorFlow Serving?
@MrSanselvan 6 years ago
@Siraj: Can we train the models and deploy them incrementally? Does TF Serving support multiple smaller models? If yes, how can we do it? I cannot find any help on the internet.
@adesojialu1051 3 years ago
I am working on image classification and my model is in TFLite. How do I deploy it? Do I need to change anything from your video tutorial?
@AlienService 7 years ago
Thank you for these. I've learned a lot already. The big question and use case that I'm interested in is using ML in Blender. The goal would be to create a Blender add-on that could be trained on and manipulate the mesh of a character model. With Blender and its add-ons all written in Python, this seems doable. The mesh data can be accessed within the Blender Python API pretty easily. My question is how to best set up a system that would take character meshes (thousands of vertex coordinates each) with shape keys for a happy expression, train a model on them, and then be able to make a shape key on a new character mesh that also produces a happy expression.
@moelgendy_ 7 years ago
Great video, Siraj! Could you add resources on how to deploy Keras models?
@LeksaJ4 6 years ago
Hi Siraj, thank you so much for the videos. The bazel build failed on some error and I am going to try it again tomorrow (it might be a problem with not enough memory for Docker). However, I am kind of lost with Docker and containers. Now when I shut it down, how do I get back to the step where I can run bazel build etc.? Thank you.
@fabregas1291 7 years ago
Hi, How could we use this approach of deploying a TensorFlow model to production, for a re-trained inception model using transfer learning?
@sig7813 4 years ago
If I use a saved scaler from sklearn for the input data, can that be loaded onto the server along with the model? Basically, before the model is called I have to apply that scaler to every input. I had to use a scaler since I have many inputs with very different ranges: one can be in a range of 1-3, another 50,000-1,000,000. For that I used StandardScaler from sklearn and it works great. To get the right prediction I have to apply it to the incoming data as well.
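A sketch of that workflow, persisting the fitted scaler next to the model and applying it to every incoming request before predict; the file name and x_train/x_new/model are illustrative placeholders for the commenter's data and model.

import joblib
from sklearn.preprocessing import StandardScaler

# at training time: fit the scaler on the training inputs and save it with the model
scaler = StandardScaler().fit(x_train)
joblib.dump(scaler, "scaler.joblib")

# at serving time: load it once, then transform every incoming request before predicting
scaler = joblib.load("scaler.joblib")
x_scaled = scaler.transform(x_new)
prediction = model.predict(x_scaled)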
@Vijaykumar-jx8jq 5 years ago
Hey Siraj, I want to know: I have created an image classifier in Docker and now I want to integrate it into a system which is written in Python. How can I do that?
@themakeinfo 7 years ago
Hi @siraj, could you please tell us how to deploy a Keras model to production?
@ttwan690 6 years ago
May the force be with you
@genricandothers 7 years ago
I barely ever comment on videos but I have got to show love for all I've learned on your channel. I've been recommending you to everyone I can find. What software do you use to do the screen background with you in the foreground by the way? I want to start a channel teaching atmospheric science and I like this style...
@larryteslaspacexboringlawr739 7 years ago
thank you for tensorflow video
@SirajRaval 7 years ago
np
@MrKemusa 6 years ago
How would one go from building tensorflow in docker on a local CPU without CUDA support and then deploying the container to a GPU instance in the cloud with CUDA support? Would I need to build tensorflow again when I deploy the docker container to the GPU and just enable CUDA support there? Or is there a way to have CUDA support on my CPU and maintain that when I deploy the container?
@wasimnadaf11 4 years ago
super informative:)
@justinviola2479 4 years ago
How can we take that JSON output and have it display bounding boxes in the browser?
@wahi_wahi 6 years ago
When I run "bazel build -c ..", I get "no targets found beneath 'tensorflow_serving'".
@yashsrivastava677 6 years ago
How can one do incremental training of models already deployed to serving?
@debu2in 4 years ago
I think once you have accumulated the data, you can wrap the phases of the model training steps in functions, then those functions in a class, and trigger the class to train the model, persist the model on disk, and save the path in the db. At least this is how I do it :)
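A rough sketch of that pattern; the class and paths are illustrative, "db" stands in for whatever store holds the path to the latest model, and model is assumed to be a Keras-style object with fit/save methods.

import os
import time

class ModelTrainer:
    def __init__(self, model_dir="models"):
        self.model_dir = model_dir
        os.makedirs(model_dir, exist_ok=True)

    def train(self, model, x, y):
        # wrap the training phase so it can be triggered on newly accumulated data
        model.fit(x, y)
        return model

    def persist(self, model):
        # version the saved file by timestamp; record the returned path in the db
        path = os.path.join(self.model_dir, "model-{}.h5".format(int(time.time())))
        model.save(path)
        return path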
@machartpierre 7 years ago
Hey Siraj! Thanks a lot for all this amazing content. I am working on generative models for symbolic (MIDI) music sequences, and your videos on the topic have been very useful. However, I'm intending to run the inference/generation part on a mobile device (iOS). I am using TensorFlow and things seem to gradually improve (more functions, more support, more documentation), but I still find it very tricky to port the model to the device (stripping the unused/unsupported nodes, optimizing, porting the generation scripts, etc.). Even porting the fairly simple RBM model you used for one of your videos is challenging. Any suggestion on that? Given that running inference on mobile devices is becoming a trend, would you care to make a video about it?
@pietart3596 6 years ago
Stupid question: are we using the MNIST model here? Or are we using the ImageNet model?
@captainwalter 4 years ago
I honestly don't get how to employ the model. At what stage do we use the neural net to make decisions about actionable data, in this case see it decode the words?
@AaronSarkissian 7 years ago
I don't get this part at 32:08: how did that bazel command work outside of Docker?
@prarthana1122 6 years ago
Same question... the bazel command didn't work in my Docker either. How did he do it? Could you please tell us, Siraj?
@matrixzoo8434 6 years ago
Does this mean that in order to make an ML web app I don't have to learn Django or any other python web framework, I could just use tensorflow?
@tonydenion3557 7 years ago
Nice vid man! Do you like C? (Didn't see any vids about it :D) I would like to know more about the TensorFlow C API. Thanks a lot for all the knowledge you share.
@cameronfraser4136 7 years ago
My understanding is the TensorFlow C API wasn't designed to be used for production directly. If you want to deploy a model in C/C++, consider writing it from scratch; it's not as bad as it sounds (inference is much simpler than training). Deep networks are mostly just a series of matrix multiplies.
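A toy illustration of that last point: inference for a small dense network really is just a couple of matrix multiplies plus nonlinearities (the weights and input here are random placeholders).

import numpy as np

def relu(z):
    return np.maximum(z, 0)

x = np.random.rand(1, 4)                        # one input example with 4 features
W1, b1 = np.random.rand(4, 8), np.zeros(8)      # layer 1 weights and biases
W2, b2 = np.random.rand(8, 3), np.zeros(3)      # layer 2 weights and biases

logits = relu(x @ W1 + b1) @ W2 + b2            # the whole forward pass
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over the 3 output classes
print(probs)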
@SirajRaval 7 years ago
more tf vids coming thx
@tonydenion3557 7 years ago
ty for answer, world gonna change thanks to guys like you ;)
@phurien 7 years ago
Hey Siraj, Love the videos. Question: I am taking the Udacity DL course, and am getting more and more into it and plan to continue on to make a career out of this. Would you recommend I switch over to Ubuntu as my primary OS or is it feasible to stay in Windows?
@ProfessionalTycoons 5 years ago
very good video
@harshmunshi6362 7 years ago
I guess you have shared enough knowledge for someone to start a company :/
@SirajRaval 7 years ago
yup
@Gerald-iz7mv 5 years ago
how can you upload new models at runtime?
@souuu42 6 years ago
The process crashes when I try to create the Docker image; it goes on for about 10 minutes and then everything freezes. Any idea why? I have an Intel i5 processor.
@theophilusananias1416 7 years ago
Siraj, Please, put together a video tutorial on how to generate an Image from Text with TensorFlow. (Text to Image)
@eliassocrates338 7 years ago
Siraj, could you please upload the weights of the models you trained as well, as neither online nor personalized training of models is a viable option financially.
@saitaro 7 years ago
Siraj, if I wanna write an ML algorithm and make a web app based on it, would learning Django be useful for this task?
@abhiwins123 7 years ago
Thanks for the end-to-end TensorFlow tutorial. The world wows you for the AI revolution
@SirajRaval 7 years ago
awesome thx
@sathyasarathi90 7 years ago
Siraj, I wonder if a similar strategy can be used to deploy a scikit-learn model?
@SirajRaval 7 years ago
absolutely: loads.pickle.me.uk/2016/04/04/deploying-a-scikit-learn-classifier-to-production/
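For a scikit-learn model, the shape of the solution is roughly the following (this is a generic sketch, not the code from the linked post; the pickle path and JSON format are illustrative).

import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)
with open("classifier.pkl", "rb") as f:   # a classifier saved earlier with pickle.dump
    clf = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # expects e.g. {"features": [[5.1, 3.5, 1.4, 0.2]]}
    features = request.get_json()["features"]
    return jsonify({"prediction": clf.predict(features).tolist()})

if __name__ == "__main__":
    app.run(port=5000)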
@simpleman5098 7 years ago
Hey Siraj, what software do you use to make those images like on 2:34 or 11:46 etc?
@bhisal 5 years ago
What's the advantage of serving a model using TF Serving compared to a REST API?
@EpicMicky300 5 years ago
what's the difference between a docker image and a simple executable file?
@ShepardEffekt 7 years ago
Was waiting for this
@SirajRaval 7 years ago
dope
@Superjeka1979 7 years ago
Hi Siraj, nice video! But I'm a bit confused about classification_signature and predict_signature in the MNIST example. Should I use both of them? Is there any difference between them? Why is the classification signature's input a string, etc.? Or is it just an example showing that I can use a number of signatures to query a single model? Thank you.
@alexp5693 7 years ago
Hello. I hope you will answer as it's really important for me. I'm currently working on a project and my task is to generate meaningful, unique text from a set of keywords. It doesn't need to be long, at least a couple of sentences. I'm pretty sure I have to use an LSTM but I cannot find any good examples of generating meaningful text. I saw a few randomly generated examples but that's all. I would be grateful for any advice. Thank you in advance.
@udaysah8038 5 years ago
I am currently facing a problem deploying my custom models where my image data is located on my local computer. Can you make a video on how to deploy custom models where the image data is on the local computer, save the models, and deploy them to Android devices?
@akashtripathi5947 7 years ago
Can you please explain how I can make and serve a CNN model using Deeplearning4j in Java?
@bhushanvernekar5121 7 years ago
I am not able to find a step-by-step procedure for how to work with TensorFlow in Android Studio.
@afshananwarali9462 6 years ago
Please share a link to part 2 of this tutorial for pushing this to the cloud.
@thoughtsmithinnovation5432 6 years ago
Hi Siraj, you mentioned at 28:00 that Inception has hundreds of layers. If I am not wrong, it presently has only 48 layers. Please correct me if I am wrong or if you are referring to something else.
@bilalchandio1 3 years ago
I am having an issue while deploying my deep learning model in h5 format on Flask. It works fine on my local machine; however, it has issues on my pythoneverywhere hosting server.
@bilalchandio1 3 years ago
It basically asks for GPU.
@bibhu_107 6 years ago
To build the Docker image:
sudo docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
To run it:
sudo docker run --name=tensorflow_container -it $USER/tensorflow-serving-devel
@deepanshuchoudhary4598 3 years ago
Come back buddy, we miss you!
@jenlee6693 6 years ago
There is no /tensorflow folder to run 'configure' in, as Google has taken it out. It is no longer required to run configure, according to Google's latest issue response. Just do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory (of course without the quotes).
@sandhyakale9054 4 years ago
Why do we want to train the model? I want to deploy my chatbot on a website. Can you tell me?
@AbdulWahid-vn4kp 6 years ago
Does anyone have the github.com/tensorflow/serving/ repo from when this video was published? The latest version is missing some files and it's hard to follow from there. Thanks.
@jenlee6693 6 years ago
You can do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory in the Docker container.
@lotfiraghib7029 7 years ago
Hello Siraj, firstly thank you for this great video. I trained a model in Python, then saved it with train.Saver to generate my checkpoint. I want to load this model in C++. Is there a way to do that?
@heathervica1108 7 years ago
Awesomeeeeee. Hello guys, do you know if it is possible to use Variational Autoencoders (VAEs) or Generative Adversarial Networks (GANs) for structured data? I have seen some examples, but only for unstructured data such as images, audio, etc. Do you maybe have any example with structured data? Thanks a lot
@vibhanshusharma3150 7 years ago
Any video on image localisation
@gattra 4 years ago
Please rehearse more and these would be 10000% better
@sohanjanaka6166 6 years ago
Bro, where is your untitled.ipynb? I can't find it. Please help.
@sohanjanaka6166 6 years ago
OK, got it. If anyone is wondering for the same reason, check below. :P github.com/llSourcell/How-to-Deploy-a-Tensorflow-Model-in-Production/blob/master/demo.ipynb
@johnnychan6755 6 years ago
Has anyone got an error like this at the bazel build step? (Run on a MacBook Pro, OS X 10.11.6, via the Docker method, with bazel 0.5.4 in the Dockerfile.)
ERROR: /root/.cache/bazel/_bazel_root/f8d1071c69ea316497c31e40fe01608c/external/org_tensorflow/tensorflow/core/kernels/BUILD:2904:1: C++ compilation of rule '@org_tensorflow//tensorflow/core/kernels:conv_ops' failed (Exit 4). gcc: internal compiler error: Killed (program cc1plus)
@johnnychan6755 6 years ago
Solved! See this GitHub issue thread - scroll down. github.com/tensorflow/serving/issues/227
@charlesaydin2966 6 years ago
Thanks a lot!
@CKSLAFE 6 years ago
So sad this tutorial is broken now, they changed the github repository. Now you don’t have the tensorflow folder inside serving. If anybody knows of a tutorial please let me know.
@afshananwarali9462 6 years ago
There is no tensorflow folder inside of serving on github. What should I do?
@lakrounisanaa9156 6 years ago
Hi, what do you do in this case? I face the same issue.
@iulia2190 6 years ago
try to build in serving directory
@jenlee6693 6 years ago
You can do 'bazel build -c opt tensorflow_serving/...' in the tensorflow-serving directory in the Docker container.
@FZ8Yamaha 5 years ago
According to github.com/tensorflow/serving/issues/755, it looks like we can just skip the 'cd tensorflow' and './configure' steps.
@kariuki6644 7 years ago
Where would i be without you?
@SirajRaval 7 years ago
love u
@shivajidutta8472 7 years ago
I think an alternative would be to deploy the models in your code directly rather than calling a REST API. I have a model running on my iPhone and I don't see performance issues. The new chipsets are getting more and more powerful.
@jagdeepsihota6647 7 years ago
Can you please share either a blog or a video on the steps you took to deploy to iPhone? Thank you
@SirajRaval 7 years ago
share github!
@shreyanshvalentino 7 years ago
share, please!
@hussain5755 7 years ago
Siraj, can you please recommend a book to get started on ML? Your videos are great but I am having a hard time grasping the concepts.
@SirajRaval 7 years ago
deep learning by bengio
@bibhu_107 6 years ago
#update The tensorflow submodule has been removed. You should no longer have to run TensorFlow's configure script manually
@jenlee6693 6 years ago
After uncompressing the Inception model, do 'bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=inception-v3 --output_dir=inception-export', as the command in the tutorial is old and no longer works.
@shreyanshvalentino 7 years ago
the only useful video that you have uploaded till date!
@SirajRaval 7 years ago
thx what else would be useful?
@shreyanshvalentino 7 years ago
I was probably too excited when I typed that, hence the exaggeration! You probably don't want suggestions from a crappy coder like me. However, as much as I love your other tutorial videos, which are informative too, they are restricted to Jupyter notebooks. There is no way to send the information processed there to anywhere a common person can use it. I started learning Django and RabbitMQ, thinking that only they can provide an interface to TensorFlow.
@shreyanshvalentino 7 years ago
Also, I am not sure if we have used the MNIST numerical-recognition classifier in your Docker. Why did we not use that and instead use Inception? Edit: no need to answer, it got answered at 29:48.
@MrKemusa 6 years ago
Something else that could be useful: videos that showcase how to tailor out-of-the-box tutorials (e.g. the MNIST tutorial) to a completely different use case where the model is still useful (e.g. something with a dataset we've built from scratch). Sometimes there's friction going from these templates to your own use case. Eventually I figure it out, but it would be nice to have key things to consider when going from one use case to the next.
@Neonb88 5 years ago
If you want more detailed tutorials, look at Melvin L. He's really good with step-by-step solutions
@chicken6180 7 years ago
ok ive been convinced.... i will stop being a stubborn js scrub... *sigh* welp time to learn tf
@SirajRaval 7 years ago
i made a js video called evolutionary tetris AI last week! check it out
@chicken6180 7 years ago
i know, i saw it. but as the majority of videos are in python it's working against me to be stubborn and not use that mainly
@adesojialu1051 3 years ago
Please, can I have a copy of your pipeline? Or how do I do mine?
@zoranrazarac 6 years ago
bazel-bin/tensorflow_serving/example/inception_export: No such file or directory Now what?
@rociogarcialuque6988 4 years ago
"If Google can use it, we can use it." is so 2017.
@RowdyReview 5 years ago
Hi Siraj, thanks for the great video. Please help me fix this issue. I have my own model, using the faster_rcnn_inception_v2_pets.config architecture, and I currently have trained checkpoints. But whenever I export the checkpoints using the command below:
bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=my-model6 --export_dir=inception-export
I get the following error:
DataLossError (see above for traceback): Unable to open table file my-model6/model.ckpt-21292: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator? [[Node: save/RestoreV2_34 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2_34/tensor_names, save/RestoreV2_34/shape_and_slices)]]
We have TF=1.4 and Bazel=0.5.4. While training I got checkpoints like model.ckpt-21292.data-00000-of-00001, model.ckpt-21292.meta and model.ckpt-21292.index, and I renamed them to model.ckpt-21292. I followed your video, where you download a pre-trained model, but my question is: we both have the same type of checkpoints, so why am I getting the above error?? Thank you
@RowdyReview 5 years ago
I found the solution. Hello all, just follow the video below and export your own model within 10 seconds: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-w0Ebsbz7HYA.html
@subhankarbhattacharya2940 4 years ago
The day he can show proficiency in linear algebra and differential equations etc., I would consider him to be a data scientist. Otherwise it's all smartness practiced with code available in public.
@OttoFazzl 5 years ago
Someone should invent Keras for Tensorflow Serving
@harshitagarwal5188 7 years ago
We wait for "How to tune hyperparameters"?
@wafaayad5899 7 years ago
failed: gcc failed: error executing command /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -B/usr/bin -B/usr/bin -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG ... (remaining 106 argument(s) skipped): com.google.devtools.build.lib.shell.BadExitStatusException: Process exited with status 4.
:( I can't get rid of this error. I'm new to ML/DL and that's what I get as my welcome message. Can anyone help, please?
@fabregas1291 7 years ago
You are likely running out of memory. Try reducing the number of parallel builds by passing '--local_resources 2048,.5,1.0', which instructs bazel to spawn no more than one compiler process at a time.