
Deploying production ML models with TensorFlow Serving overview 

TensorFlow
601K subscribers
18K views

Wei Wei, Developer Advocate at Google, gives an overview of deploying ML models into production with TensorFlow Serving, a framework that makes it easy to serve production ML models with low latency and high throughput. Learn how to start a TF Serving model server and send POST requests to it using a command line tool. Wei covers what TF Serving is, its architecture, its general workflow, and how to use it.
Stay tuned for upcoming episodes in Deploying Production ML Models with TensorFlow Serving, where Wei Wei will cover how to customize TF Serving, tune its performance, perform A/B testing and monitoring, and more.
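The video demonstrates the request from the command line (for example with curl); as a rough sketch of the same call from Python, the snippet below posts to TF Serving's documented REST predict endpoint. The model name my_model, the port 8501, and the example input values are placeholders, not taken from the video.

# Minimal sketch of querying a running TF Serving model server over its REST API.
# It assumes a server was started beforehand, e.g. with the documented Docker image:
#   docker run -p 8501:8501 \
#     --mount type=bind,source=/path/to/my_model,target=/models/my_model \
#     -e MODEL_NAME=my_model -t tensorflow/serving
import json
import requests

# TF Serving exposes prediction at /v1/models/<model_name>:predict (REST port 8501 by default).
SERVER_URL = "http://localhost:8501/v1/models/my_model:predict"

# The REST API expects a JSON body with an "instances" list, one entry per example.
# The feature vector below is a placeholder; use whatever your model's signature expects.
payload = {"instances": [[1.0, 2.0, 3.0, 4.0]]}

response = requests.post(SERVER_URL, data=json.dumps(payload))
response.raise_for_status()

# The server replies with a JSON object containing a "predictions" list.
print(response.json()["predictions"])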
Resources:
TensorFlow Serving → goo.gle/3tLWkqr
TensorFlow Serving with Docker → goo.gle/3tQHyi0
Training and serving a TensorFlow model with TF Serving → goo.gle/3HE2e2F
Deploying Production ML Models with TensorFlow Serving playlist → goo.gle/tf-serving
Subscribe to TensorFlow → goo.gle/TensorFlow
#TensorFlow #MachineLearning #ML

Science

Published: Aug 1, 2024

Comments: 9
@TensorFlow · 2 years ago
Subscribe to learn more about deploying production ML models with TensorFlow Serving!
@youseefahmed1649 · 2 years ago
س ،.
@paulallen1597 · 2 years ago
Thank you for taking the time to do these. I always greatly enjoy them, and really enjoy the tools you guys are always developing and improving!
@saivignesh1758 · 1 month ago
Thanks sir
@carlotonydaristotile7420 · 2 years ago
thanks
@theneumann7 · 1 year ago
Elegant 👌
@Med_YAHYAOUI · 1 year ago
Please, how do we add pre- and post-processing on the server side, so that calling the API returns the final result? Thank you for your efforts.
@malikkissoum730 · 1 year ago
I don't like this kind of video that doesn't detail the entire process, but it was still helpful: it gives you the big topics for further research.
@mfvhhh5815 · 1 year ago