
Introducing Cloud Run Jobs 

Google Cloud Tech
1.2M subscribers
16K views

Nightly invoice processing repository (Cloud Run Jobs) → goo.gle/3wztr2F
Does your organization receive hundreds of PDF and JPEG invoices daily? You could process these invoices with unattended jobs on a virtual machine, but that solution requires maintenance and incurs costs even when it's idle. Introducing a serverless way to run unattended jobs: Cloud Run Jobs! In this episode of Serverless Expeditions Extended, Martin and Karolina demonstrate Cloud Run Jobs in the Cloud Console. Watch along and learn how to save time and money when running daily unattended jobs.
Check out more episodes of Serverless Expeditions → goo.gle/ServerlessExpeditions
Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech​
#ServerlessExpeditions​ #ServerlessExpeditionsExtended #CloudRun

Science

Published: 7 Aug 2024

Comments: 45
@googlecloudtech 2 years ago
What do you think of Cloud Run Jobs? We’d love to hear from you in the comments below! Don’t forget to subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech
@TheMomander 2 years ago
@@adigabaie1093 Thank you for the feedback! I will pass on your comment to the product team.
@TheGothicSecret 2 years ago
Perfect solution just in time! Amazing
@JagdeepSinghKalsi 2 years ago
Been trying to run jobs using different tools (most are too cumbersome, even within the realm of GCP), and this one just hits that sweet spot... time to remove all that unnecessary code for those "tricks" to run jobs on Cloud Run... thanks Martin - around 3 years of following you have really helped me stay ahead of the curve... thanks a ton man...
@TheMomander 2 years ago
Good to hear you found it useful, Jagdeep, and thank you for the kind words!
@santoshperumal129 1 year ago
Martin, the content you post is really great. With a demo it is so easy to understand, and you tell it like a story. Love your content, please keep posting more like this.
@TheMomander 1 year ago
Happy to hear you find the videos useful, Santosh!
@Gary21H 2 years ago
Excited to try this!
@anilmm2005 2 years ago
Thanks for the wonderful update
@dailymeow3283 2 years ago
Awesome
@StraightCoding 4 months ago
Hi, good video. I have a question: leaving aside the container feature of Cloud Run Jobs, what would be the difference between executing the script with a Cloud Function (2nd gen) versus a Cloud Run Job? Thanks
@TheMomander 4 months ago
In my experience as a developer who's used both, these are the major differences:
* Cloud Run Jobs let you run a single job for up to 24 hours. With Cloud Functions 2nd gen it's one hour for HTTP functions or 9 minutes for event-driven functions.
* Cloud Run Jobs let you configure automatic retries in case of failure. That can't be done with Cloud Functions triggered by HTTP; it can be done for Cloud Functions triggered by other events.
* Cloud Run Jobs can start multiple parallel tasks (workers) for you, to finish quicker. With Cloud Functions you'd have to orchestrate parallel work yourself.
* Cloud Run Jobs make it easy to run shell scripts. Some tasks (like doing a database backup) are simpler to do with a gcloud command from a shell script, rather than writing code in a regular programming language.
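On the parallel-tasks point: Cloud Run Jobs pass each task its index and the total task count through environment variables, so a worker can shard the workload itself. A minimal Python sketch of that idea (the invoice file names are placeholders, not from the video):

    import os

    # Cloud Run Jobs injects these variables into every task of an execution.
    task_index = int(os.environ.get("CLOUD_RUN_TASK_INDEX", 0))
    task_count = int(os.environ.get("CLOUD_RUN_TASK_COUNT", 1))

    # Placeholder workload; in a real job this might come from a bucket or database.
    work_items = [f"invoice-{n}.pdf" for n in range(100)]

    # Each task takes every task_count-th item, starting at its own index,
    # so the items are split evenly across the parallel workers.
    for item in work_items[task_index::task_count]:
        print(f"Task {task_index + 1} of {task_count}: processing {item}")

Because the sharding is derived from the two environment variables, the same container works unchanged whether the job is configured with one task or fifty.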
@divyaalok5538 1 year ago
Awesome 😊
@ianyoung_ 2 years ago
Woah, this could be a game changer which removes the need for VMs or hacks for long-running tasks. Thanks for the demo!
@TheMomander 2 years ago
Happy to hear this is useful. Life becomes a little simpler with each eliminated VM.
@elephantum 2 years ago
Is it possible to provide run-specific arguments when triggering a job?
@karolinanetolicka5982 2 years ago
Unfortunately not; we're looking into adding this in the future.
@elephantum 2 years ago
@@karolinanetolicka5982 thanks for the answer. It would be super helpful for use cases like scheduling re-processing for a specific date or for a specific chunk of data.
@elephantum 2 years ago
@@karolinanetolicka5982 another question: what are the reasonable limits for the number of job definitions? Is it OK to create a new job definition for each hourly run? (I'm thinking of baking job run parameters into the job definition)
@TheMomander 2 years ago
@@elephantum Karolina told me there are no limits on the number of jobs per project - so yes, a new job for each hourly run is reasonable. We do have limits on the number of concurrently running executions in the project and will be introducing garbage collection of old executions. To find the page where limits are documented, search "google cloud run quotas".
@TheMomander 9 months ago
Run-specific arguments (the thing you asked for) have now been implemented. Do a web search for "google cloud run override job configuration for a specific execution" and you will find the documentation. Thank you for your feedback last year!
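For readers who want the programmatic version of those per-execution overrides, here is a sketch assuming the google-cloud-run Python client (google.cloud.run_v2) and its RunJobRequest overrides field work as in the current docs; the project, region, job name, and arguments are placeholders, so verify the exact field names against the library documentation:

    from google.cloud import run_v2

    client = run_v2.JobsClient()

    # Placeholder resource name; substitute your own project, region, and job.
    job_name = "projects/my-project/locations/us-central1/jobs/nightly-invoices"

    # Override the container arguments for this execution only; the job
    # definition itself is left unchanged.
    request = run_v2.RunJobRequest(
        name=job_name,
        overrides=run_v2.RunJobRequest.Overrides(
            container_overrides=[
                run_v2.RunJobRequest.Overrides.ContainerOverride(
                    args=["--date", "2024-08-07"],
                )
            ],
        ),
    )

    operation = client.run_job(request=request)
    execution = operation.result()  # wait for the execution to finish
    print("Finished:", execution.name)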
@user-zg2nl3gh1o 6 months ago
Can I make sure that only one instance of the job is running? I have a Telegram client (a Python script using the Telethon package), so under NO CIRCUMSTANCES can I allow two processes to run at the same time. Otherwise Telegram bans my session ID, and I have to re-log and redeploy. That's what happened today when a trigger started a new execution while the old one was still running. Ideally, I want some number of retries (in case it fails) and a trigger in the morning, but the trigger should kill the previous execution. Is Cloud Run Jobs a good choice at all?
@obscurecloud5575 2 years ago
This is great, thanks. I was wondering if it's possible to trigger jobs using HTTP.
@TheMomander 2 years ago
Yes, you can. See the URL that you enter in Cloud Scheduler to schedule a job: cloud.google.com/run/docs/execute/jobs-on-schedule. But be aware that you have to set the authentication header yourself. If you want to trigger your code with an HTTP call, consider putting it inside a regular Cloud Run service instead of a Cloud Run Job.
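As a rough illustration of making such an HTTP call yourself, with the authentication header handled by the Google auth library, here is a sketch against the Cloud Run Admin API v2 jobs.run method. The exact URL Cloud Scheduler expects is in the linked docs and may differ from this one; the project, region, and job names below are placeholders:

    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    # Placeholder identifiers; substitute your own project, region, and job.
    project, region, job = "my-project", "us-central1", "nightly-invoices"

    # Cloud Run Admin API v2 method that starts a new execution of the job.
    url = (
        "https://run.googleapis.com/v2/"
        f"projects/{project}/locations/{region}/jobs/{job}:run"
    )

    # AuthorizedSession attaches the required OAuth bearer token header for us.
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)

    response = session.post(url, json={})
    response.raise_for_status()
    print(response.json())  # long-running operation for the new execution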
@franciscomerdot 2 years ago
I'm wondering why you would use Cloud Run Jobs and not a Cloud Function for such a background operation?
@TheMomander 2 years ago
Good question! Cloud Functions are great if you prefer deploying source code. Cloud Run is great if you prefer deploying containers.
@shubhamchaudhari6884 2 years ago
Cloud Functions don't let you control your environment and can't run shell scripts; Cloud Run Jobs can do both.
@ramgopalnalluri 2 years ago
Cool
@DUCKJAIIIII 1 year ago
Hello Martin, I wonder if there are any plans to support CI/CD (via GitHub), like regular Cloud Run, for deploying Cloud Run jobs? Wondering if CI/CD is deliberately not included or a work in progress.
@ChrisJaydenBeats 1 year ago
Pretty sure there's a GH action for it. I saw it a week ago
@TheMomander 1 year ago
As you say, it's possible to set up regular Cloud Run services with CI/CD via GitHub. Cloud Run Jobs aren't there yet, but we will add it.
@katiefranz2101 1 year ago
A single Cloud Run job task has a max timeout of 60 minutes, but if you start a job with Cloud Scheduler, the scheduler has a max timeout of 30 minutes. How can I get around the 30-minute max timeout from the scheduler? Or would the job continue to run to completion even if Cloud Scheduler times out?
@karolinanetolicka5982 1 year ago
The Cloud Run job will continue to run until it's done. Make sure to monitor job success by looking at the status of the Cloud Run job, not the status of the Scheduler job - the Scheduler job will show as successful when the Cloud Run job is successfully started, not when it completes.
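To monitor the Cloud Run job side programmatically rather than relying on the Scheduler status, here is a sketch using the google-cloud-run Python client to list recent executions of a job. The field names and resource-name format should be checked against the client docs, and the names below are placeholders:

    from google.cloud import run_v2

    client = run_v2.ExecutionsClient()

    # Placeholder parent resource; substitute your own project, region, and job.
    parent = "projects/my-project/locations/us-central1/jobs/nightly-invoices"

    for execution in client.list_executions(parent=parent):
        print(
            execution.name,
            "started:", execution.start_time,
            "completed:", execution.completion_time,
            "succeeded tasks:", execution.succeeded_count,
            "failed tasks:", execution.failed_count,
        )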
@Babbili 1 year ago
Nice. Is it the same as a k8s Job, `apiVersion: batch/v1`, if I want to create it from a YAML file?
@knetolicka 1 year ago
The Cloud Run jobs API is inspired by the k8s jobs API, but it's not the same. You'll likely need to make a few changes to your yaml file.
@kirankumarboddu655 2 years ago
Hi, how many DAGs run in Airflow in parallel processing using Cloud Run jobs?
@TheMomander 2 years ago
Not sure I understand your question. Would you elaborate? If you want to find out more about Airflow on Google Cloud, check out the Cloud Composer documentation.
@shubhamchaudhari6884 2 years ago
Question: Is there a way to invoke these jobs from GCS events? Or an integration with Eventarc?
@karolinanetolicka5982 2 years ago
Not at the moment, but it's something we may add in the future.
@RodrigoFernandes-fg6me 9 months ago
@@karolinanetolicka5982 Are there any plans to add continuous deployment to Cloud Run jobs?
@TheMomander 8 months ago
@@RodrigoFernandes-fg6me Karolina is out of the office today, so I will respond. Today you can do continuous deployment of jobs with the command line (which means you can do it from a CI/CD pipeline defined in Cloud Build or GitHub Actions) or with Cloud Deploy.
@babusivaprakasam9846 2 years ago
Good one. Small batch processing dataflows can be eliminated
@patricknelson 2 years ago
I wonder if there’s a way to dynamically compute the number of workers at startup time. I’d imagine not, that you’d use Cloud Scheduler to instead trigger a task that then cascades into scheduling the correct number of workers. Not that _I_ need this, but it did make me wonder. p.s. For me: “Serverless” is _great_ because I don’t have to worry about maintaining a server anymore for these simple easily containerized workloads and applications. Now maintenance and updates are just done at the application container level, which would have been necessary anyway, and is easily streamlined or even automated with CI/CD.
@MartinOmander 2 years ago
Good question! I don't know of a way to compute the number of workers at startup time. One way of doing it would be to always start X workers, and each worker has the logic to terminate early if there isn't enough work for it. That approach would also decouple the logic in workers from the decision of how many workers to create. In other words, you can change the number of workers without having to worry about updating the worker code, and vice versa.
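A minimal Python sketch of the pattern Martin describes, assuming the pending work can be counted up front; the in-memory list below stands in for a real shared queue (Pub/Sub, Cloud Tasks, a database table):

    import os
    import sys

    # Cloud Run Jobs injects these variables into every task of an execution.
    task_index = int(os.environ.get("CLOUD_RUN_TASK_INDEX", 0))
    task_count = int(os.environ.get("CLOUD_RUN_TASK_COUNT", 1))

    # Stand-in for a real shared work queue.
    pending_items = ["invoice-1.pdf", "invoice-2.pdf", "invoice-3.pdf"]

    # Terminate early if there isn't enough work for this worker: with three
    # pending items and ten workers, tasks with index 3..9 exit immediately.
    if task_index >= len(pending_items):
        print(f"Task {task_index}: nothing to do, exiting early")
        sys.exit(0)

    # Otherwise take every task_count-th item, starting at this task's index.
    for item in pending_items[task_index::task_count]:
        print(f"Task {task_index}: processing {item}")

As Martin notes, this keeps the worker logic independent of how many tasks the job is configured to start: extra workers simply exit without doing anything.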