Cloud & AI Analytics
Cloud & AI Analytics provides tutorials, lectures, and guidance on the technologies and certifications of popular cloud platforms such as Google Cloud Platform, Microsoft Azure, and AWS. Our vision is to make cloud education accessible to every student for free.

By subscribing to this channel, you will never miss high-quality videos on trending topics in the areas of cloud platforms, the Big Data ecosystem, Data Engineering, Data Science, Kubernetes, Docker, Databricks, Machine Learning, Apache Spark, Python, PySpark, DevOps, SQL, and many more.

We also help students and professionals with interview preparation. Please reach out to us without hesitation.

You can join our WhatsApp group and Telegram group to discuss and get ready for interviews.
Stay updated with all materials on GitHub

Please share, subscribe, and support us.

If you want to connect with me, connect on LinkedIn
Cloud Workflows Pricing
2:07
7 hours ago
Use-cases of GCP Cloud Workflows
2:11
16 hours ago
Overview on Cloud Workflows
4:23
14 days ago
Intro on Cloud BigQuery - Part 01
10:59
2 months ago
Soft Delete in Cloud Storage bucket - GCP
13:18
4 months ago
SQL Interview QA's - Cloud BigQuery - 01
10:52
8 months ago
Upgrading external table to Biglake table in GCP
8:10
10 months ago
Creating your first Cloud BigLake in GCP
8:58
10 months ago
Comments
@khaulafarooqui-n6o 5 days ago
Nice explanation
@shreyas9309 25 days ago
Thanks a lot, man! <3
@user-em3gw8on5i 29 days ago
Thanks for the video. I have a question: is it possible to restrict updating a project (name) through a lien restriction? TIA
@joyhodling 1 month ago
This is very helpful. Could you point me to a video, or create one, on how to safely upgrade Data Fusion when there are pipelines connected to it?
@UTubeAcount1000 1 month ago
Hi admin, very nice explanation. Do you have a free coupon/discount voucher for SnowPro Core Certification registration?
@deepthimurali962 1 month ago
Do you know how to fetch the deleted bytes for all buckets in a GCP project?
@AnantPradhan-y7m 1 month ago
Couldn't understand. Complicated...
@rakshapadiyar 1 month ago
Apart from Skills Boost, did you go through any YouTube channels or Udemy courses?
@shamilak1 1 month ago
Please share the head_usa_names file.
@patriciodiaz2377 1 month ago
Thank you very much for your explanation! Pretty well explained, greetings from México.
@shwetapandey2308 2 months ago
It is helping me a lot. Thank you for making it simple for us!
@jakrac2790 2 months ago
Thanks for focusing on the documentation and the detailed breakdown of exam topics. I've just passed the exam, and to anybody starting out I recommend: read the documentation carefully, as they will test your deep knowledge and understanding of the topics.
@UddhavParab 2 months ago
How do I send pipeline alerts, e.g. emails when the pipeline fails? It's not sending when I try. Can you please help?
@sotos47 2 months ago
Why do I get "module not found" when pressing the run button?
@prasannakumar7097 2 months ago
Can you please explain how to write a DataFrame to BigQuery?
@nicstruebel3391 2 months ago
Doesn't work for me.
@cloudaianalytics6242 2 months ago
What's the error?
@figh761 3 months ago
Is anyone using Snowflake?
@salmansayyad4522 3 months ago
Thanks a lot bro, interesting content!
@user-gm7yt9dd2u 3 months ago
Hey monkey, you're just marketing yourself; you're making videos to earn money.
@ashraf_isb 3 months ago
Thanks man!
@Ar001-hb6qn 3 months ago
I am unable to create a GCP Composer environment. After around 45 minutes it shows the error "Some of the GKE pods failed to become healthy". I have configured the settings and granted the necessary access. I am using composer-2.7.0-airflow-2.7.3, but it fails to create the environment. Can you please help with this? Thanks.
@heenachhabra2977 3 months ago
This is a single Dataflow pipeline, right? How is this different from a Cloud Composer-orchestrated one?
@YugarajTamang 3 months ago
Hello bro, I have defined a schema in BigQuery, and I have a DataFrame without column names that has millions of rows. I am unable to upload that DataFrame to BigQuery. Can you make a video on that or help me out? Thanks.
@MonicaPatil-so3ml 4 months ago
You explained how to recover an object from a bucket. Can you also explain, with a demo, whether it's possible to recover an entire bucket if it's deleted accidentally?
@cloudaianalytics6242 4 months ago
No, deleted buckets are gone permanently. To guard against accidental deletion, we enforce retention policies on Cloud Storage buckets. Hope it helps.
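A minimal sketch of that protection with gsutil, assuming a placeholder bucket name and a 30-day period:

# Enforce a 30-day retention policy: objects cannot be deleted or
# overwritten until they are 30 days old, and a non-empty bucket
# cannot be deleted.
gsutil retention set 30d gs://my-example-bucket

# Optionally lock the policy so it can never be reduced or removed
# (this step is irreversible).
gsutil retention lock gs://my-example-bucket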
@emmanuelihetu9848 4 months ago
Thank you so much!
@riyanshigupta950 4 months ago
Amazing content! Thanks
@sagarsitap3540 4 months ago
What does your source data file in GCS look like? Can we make it streaming?
@venkatvlogs07 4 months ago
Too hurried; I was not able to understand it, as you keep switching tabs, doing everything at once, and not mentioning where you are writing the code. The course should be designed so that even a beginner can understand it. Please make a point-to-point explanation video so that everyone can understand. Thanks in advance ❤
@ushasribhogaraju8895 5 months ago
Thanks for your videos, I find them helpful. I managed to get a message published by a Python script to Pub/Sub written to the data column of a BigQuery table simply by creating a subscription on the same topic that writes to BigQuery, without using Dataflow. Since Pub/Sub is schemaless, it receives whatever schema the Python script publishes. My question is: is there a way to update a BigQuery table using the same schema received in Pub/Sub?
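For anyone else reading: if the topic has a schema attached, the subscription can write individual fields to like-named BigQuery columns instead of putting the raw payload into the data column. A rough sketch, with placeholder resource names:

# Create a BigQuery subscription that maps topic-schema fields to
# matching table columns (requires a schema attached to the topic).
gcloud pubsub subscriptions create my-bq-sub \
  --topic=my-topic \
  --bigquery-table=my-project:my_dataset.my_table \
  --use-topic-schema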
@NaveenPB-yg4vw 5 months ago
Hi, can you please paste the Python code here?
@webnoxtechnicalsupport8005 5 months ago
When you run ls, you have some files and folders, right? I only have this README-cloudshell.txt.
@ainvondegraff5233 5 months ago
Awesome explanation, really wanted to know this. If I migrate the Control-M workload automation tool to GCP, how will I connect Control-M to Pub/Sub?
@AnjaneyuluPonnam 5 months ago
Excellent job, you are doing great, sir!
@zzzmd11 5 months ago
Hi, thanks for the great informative video. Can you explain the flow when the data source is a REST API? Can Dataflow be configured to extract from a REST API into BigQuery without Cloud Functions or Apache Beam scripts involved? Thanks a lot in advance.
@honeylokesh2340 5 months ago
How do I enroll in your training?
@ayseguldalgic 5 months ago
Thanks a lot, I needed this tutorial very much.
@figh761 5 months ago
What is your background, sir? I am a data warehouse developer without much knowledge of data science. How do I learn data science, ML, and Vertex AI? Could you please share all the training documents?
@v5q211 5 months ago
Is there any syllabus? Because I don't think everything needs to be studied from the documentation.
@Rajdeep6452 5 months ago
Hey bro, thanks for the video. I have an ETL process running on a VM using Docker and Kafka, and the data gets stored in BigQuery as soon as I run the producer and consumer manually. I wanted to use Cloud Composer to automate this (so the ETL process starts automatically whenever I log in to my VM), but I couldn't. Can you tell me if it's possible to do this with Dataflow? I am having trouble setting it up.
@nilawarsakshi2431 5 months ago
Can you provide the coupon you mentioned at the end of the video? I am interested in doing the GCP Professional Data Engineer certification.
@varshasony2352 6 months ago
You mentioned that you can complete it in 10 days, but the description says 6 months of hands-on experience. Can you explain this, please? I am working to complete the course in a month's time.
@cloudaianalytics6242 5 months ago
It's an expectation set by the Snowflake team, but if you have already worked with other DW services like BigQuery, Synapse Analytics, or Redshift, it will be very easy to pick up Snowflake and clear the exam on your very first attempt.
@PujaKiPyaariDuniya 5 months ago
@@cloudaianalytics6242 I have worked with T-SQL and have no prior working knowledge of Snowflake. I am learning on my own from the Snowflake website learning track. Will I be able to crack the exam in 1 month?
@SK-rl3wu 3 months ago
@@cloudaianalytics6242 Is 6 months of hands-on experience required to attempt this test? I do have knowledge of data warehouses, but I have not worked with DW services like BigQuery, Synapse Analytics, Redshift, etc.
@pournimaambikar5857 6 months ago
I am getting the below error while trying to run a Dataflow job: "import apache_beam as beam" fails with ModuleNotFoundError: No module named 'apache_beam', on both the Cloud SDK and Cloud Shell, whereas apache_beam is installed.
@RajDas-uy2ro 6 months ago
pip install apache-beam[gcp]
@cloudaianalytics6242 5 months ago
pip install apache-beam[gcp], or try creating a virtual environment in Cloud Shell and run Dataflow jobs from there after installing Apache Beam.
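A minimal Cloud Shell sketch of that suggestion; the pipeline file name and Dataflow options are placeholders:

# Create and activate an isolated virtual environment.
python3 -m venv beam-env
source beam-env/bin/activate

# Install Beam with the GCP extras (quotes stop the shell from
# interpreting the square brackets).
pip install 'apache-beam[gcp]'

# Run your pipeline from the same environment, e.g.:
python my_pipeline.py --runner=DataflowRunner \
  --project=my-project --region=us-central1 \
  --temp_location=gs://my-bucket/tmp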
@anurak1166 6 months ago
Awesome. Can you provide a URL for the source code, please?
@Alfred_vinci 6 months ago
This is really good for me. Can I get your email so I can mail you if I need help with something?
@cloudaianalytics6242 5 months ago
cloudaianalytics@gmail.com
@sulemanshaikh731 6 months ago
Very informative. Appreciate your efforts!
@cloudaianalytics6242 5 months ago
Thanks a lot
@brjkumar 7 months ago
Thanks bro, nice info.
@cloudaianalytics6242 7 months ago
Always welcome
@brjkumar 7 months ago
Thanks for the BigLake explanation.
@cloudaianalytics6242 7 months ago
My pleasure
@archanajain99 7 months ago
Hi, I need to create a GCP Dataflow pipeline using Java. This pipeline should take a file in a GCS bucket as input and write the data into Bigtable. How do I create it? Please help.
@cloudaianalytics6242 7 months ago
You can use a predefined template to do it.
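For example, one plausible invocation of the Google-provided Avro-on-GCS-to-Bigtable template; the template name, region, and parameter names here are best-effort placeholders, so check the Dataflow templates documentation for the exact ones:

# Launch a Dataflow job from a predefined template that reads Avro
# files from GCS and writes the rows into a Bigtable table.
gcloud dataflow jobs run gcs-to-bigtable-job \
  --gcs-location=gs://dataflow-templates/latest/GCS_Avro_to_Cloud_Bigtable \
  --region=us-central1 \
  --parameters=bigtableProjectId=my-project,bigtableInstanceId=my-instance,bigtableTableId=my-table,inputFilePattern=gs://my-bucket/data/*.avro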
@archanajain99 6 months ago
@@cloudaianalytics6242 I mean I didn't understand, and I'm not able to create a cloud account because it is asking for charges.
@archanajain99 6 months ago
@@cloudaianalytics6242 But I am still not able to create an account on Google Cloud; they are asking for charges.
@archanajain99 6 months ago
@@cloudaianalytics6242 I am not able to enable Bigtable; it is asking me to pay. And you have created that documentation also. How can I create it?
@sanjaynayak2784 7 months ago
Is there any way we can update a side input value in the main collection based on some matching key?
@sanjaynayak2784 7 months ago
How do I add a side input column value to the main PCollection based on some lookup key?