
3.6 Spark Accumulator | Spark Interview Questions | Spark Tutorial

Data Savvy
29K subscribers · 18K views

As part of our Spark interview question series, we want to help you prepare for your Spark interviews. We will discuss various topics about Spark, such as lineage, reduceByKey vs groupByKey, YARN client mode vs YARN cluster mode, etc.
In this video we cover what an accumulator is in Spark.
Please subscribe to our channel.
Here is link to other spark interview questions
• 2.5 Transformations Vs...
Here is link to other Hadoop interview questions
• 1.1 Why Spark is Faste...
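As a rough illustration of the accumulator idea the video covers, here is a minimal plain-Python sketch (this is not the Spark API; the `Accumulator` class and `run_task` helper are invented for illustration): each task counts locally, and only the merged total is visible back on the driver.

```python
# Conceptual sketch of Spark accumulator semantics in plain Python.
# Each partition task computes a local delta; the "driver" merges the
# deltas. Workers can add to an accumulator but never read the total.
class Accumulator:
    def __init__(self, initial=0):
        self.value = initial          # readable only on the "driver"

    def merge(self, delta):
        self.value += delta

def run_task(partition):
    """Simulates one executor task: counts bad records locally."""
    local_count = 0
    for record in partition:
        if record is None:            # treat None as a bad record
            local_count += 1
    return local_count                # delta shipped back to the driver

partitions = [[1, None, 3], [None, None], [4, 5]]
bad_records = Accumulator()
for delta in map(run_task, partitions):   # driver merges task deltas
    bad_records.merge(delta)

print(bad_records.value)                  # -> 3
```

The key property mirrored here is that tasks only ever produce deltas; the aggregated value lives on the driver side.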

Published: 22 Aug 2024

Comments: 25
@raghadbaiad8201 2 years ago
Very simple and straightforward explanation of the accumulators.
@adijobs2582 5 years ago
@DataSavvy: You are awesome, man. Thanks for the videos; expecting more and any new concepts from Spark.
@RupeshKumar-nz9fg 5 years ago
Kindly make a video on how to process unstructured data, like blog data or something similar.
@bhargavhr1891 6 years ago
Clear explanation with examples. One question I have: can we use this in a continuously running streaming consumer application? I assume the accumulator keeps getting updated in that scenario; if so, when would the accumulator values be retrieved?
@bhargavhr1891 6 years ago
Can you please update the answer to my query?
@nafisaslam4605 5 years ago
It acts as a counter and can only be read by the driver. So at the end of your job, the accumulator value will be sent back to the driver.
@vikaskatiyar1120 2 years ago
Clearly understood. Thanks.
@SushantaKuSahu-se2hh 2 years ago
Thanks for the videos... these are awesome. Please make some videos on AWS interview questions.
@mmlmahesh 3 years ago
Sir, your videos are very helpful. Kindly check the audio setup; the voice is very feeble.
@user-mw4mt2mt1r 3 years ago
So, to boil it down: the whole purpose of accumulators is to substitute for i++ in regular programs?
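A likely answer to this question (not from the video itself): a plain i++ counter fails in Spark because each task receives a serialized copy of the closure, so increments on workers never reach the driver. A hedged plain-Python sketch, simulating closure serialization with a deep copy:

```python
import copy

# Why a plain counter ("i++") fails in Spark: each task gets a serialized
# copy of the closure, so increments on workers never reach the driver.
counter = {"n": 0}
partitions = [[1, 2], [3, 4, 5]]

for part in partitions:
    task_closure = copy.deepcopy(counter)   # simulate closure serialization
    for _ in part:
        task_closure["n"] += 1              # updates the worker's copy only

print(counter["n"])   # -> 0, the driver's counter never changed
```

Accumulators exist precisely to provide the write-only, merge-on-driver channel that a shared counter cannot.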
@srinivasgolete9232 5 years ago
Where do we use it in real time?
@ampolusantosh5350 6 years ago
Thank you.
@sandhiyaarumugam4572 5 years ago
The volume is very low. Can this be re-recorded and posted?
@DataSavvy 5 years ago
Hi Sandhiya... I plan to re-record the videos which had volume issues... I will create a new video on this. Please also let me know if you face issues with any other video.
@communicatortest5021 5 years ago
Good, but why don't you add some slides to your videos?
@DataSavvy 5 years ago
Slides are added in the new videos... YouTube does not allow editing existing videos.
@shyam8999 5 years ago
Nice work. What is the 3 in rdd (Seq(........), 3)?
@DataSavvy 5 years ago
The number of partitions in the RDD.
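In other words, the 3 is the number of slices the sequence is split into when the RDD is created. A plain-Python sketch of how such a split might work (illustrative only; `split_into_partitions` is an invented helper, not Spark's actual partitioner):

```python
def split_into_partitions(seq, num_partitions):
    """Roughly how parallelizing a sequence into n partitions slices the data."""
    n = len(seq)
    return [
        seq[i * n // num_partitions:(i + 1) * n // num_partitions]
        for i in range(num_partitions)
    ]

data = [1, 2, 3, 4, 5, 6, 7]
parts = split_into_partitions(data, 3)
print(parts)        # -> [[1, 2], [3, 4], [5, 6, 7]]
print(len(parts))   # -> 3, one task per partition
```

Each resulting partition is then processed by its own task, which is why the partition count directly controls the job's parallelism.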
@maheswaraputha1297 3 years ago
Also, can you please share the codebase on GitHub?
@DilliPH 5 years ago
Please increase the volume; it's too low.
@DataSavvy 5 years ago
Hi Dilli... Sorry for the problem... I have taken care of the volume in the new set of videos. Unfortunately, YouTube is not allowing me to change the volume settings of this old video...
@parulsharma8815 2 years ago
Low sound quality.
@arjunaare7950 6 years ago
How does Spark know whether it is a transformation or an action? Is there any logic?
@rahulsharma-dk5jf 5 years ago
Spark is lazily evaluated, which means nothing happens if you don't call an action after transformations. Transformations basically create a new RDD from a previous RDD; actions execute those transformations.
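This point can be illustrated with a tiny plain-Python sketch (not Spark code; `LazyRDD` is invented for illustration): transformations only record a plan, and an action walks the plan and triggers the actual computation.

```python
# Minimal sketch of lazy evaluation: transformations only record a plan;
# an action walks the plan and triggers the actual computation.
class LazyRDD:
    def __init__(self, data, plan=()):
        self._data = data
        self._plan = plan              # recorded transformations, unevaluated

    def map(self, fn):                 # transformation: returns a new LazyRDD
        return LazyRDD(self._data, self._plan + (("map", fn),))

    def filter(self, pred):            # transformation
        return LazyRDD(self._data, self._plan + (("filter", pred),))

    def collect(self):                 # action: now the plan actually runs
        out = self._data
        for kind, fn in self._plan:
            if kind == "map":
                out = [fn(x) for x in out]
            else:
                out = [x for x in out if fn(x)]
        return out

rdd = LazyRDD([1, 2, 3, 4])
doubled = rdd.map(lambda x: x * 2)     # nothing computed yet
evens = doubled.filter(lambda x: x > 4)
print(evens.collect())                 # -> [6, 8]
```

So "how Spark knows" is simply by definition: methods that return a new RDD are transformations and stay lazy; methods that return a value to the driver are actions and force evaluation.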
@thesadanand6599 2 years ago
Your audio quality needs improvement...
Up next
Spark Session vs Spark Context | Spark Internals
8:08
Closures & Accumulators in Apache Spark | Session-7
9:16
Broadcast and Accumulator Variable in Spark #spark
8:02