
Learn MapReduce with Playing Cards 

Jesse Anderson
3.9K subscribers
344K views

A special extended preview of my new MapReduce screencast, available for purchase at pragprog.com/sc....
To get access to my updated and in-depth course, go to my site at www.jesse-ander... and sign up. You'll get a free mini-course and then have the option to purchase the full 8-week course.

Published: 28 Aug 2024

Comments: 135
@kart00nher0 8 years ago
This is by far the best explanation of the MapReduce technique that I have come across. I especially like how the technique was explained with the least amount of technical jargon. This is truly an ELI5 definition for MapReduce. Good work!
@jessetanderson 8 years ago
+Subramanian Iyer Thanks!
@smushti 5 years ago
An innovative idea to use a pack of cards to explain the concept. Getting the fundamentals right with an example is great! Thank you
@ekdumdesi 9 years ago
Great explanation!! You Mapped the Complexity and Reduced it to Simplicity = MapReduce :)
@scottzeta3067 2 years ago
The only one I watched that clearly introduces MapReduce to a newbie
@kabirkanha 3 years ago
Never trust a man whose deck of playing cards has two 7s of Diamonds.
@rodrigofuentealbafuentes695 3 years ago
Really good illustration... really easy to understand for people like me who are not computer experts. Thanks
@Useruytrw 10 years ago
Jesse may you get all SUCCESS and BLESSINGS
@djyotta 9 years ago
Very well done - not too slow, yet very clear and well structured.
@vamsikrishnachiguluri8510 2 years ago
What a great effort, I am astonished by your teaching skills. We need teachers like you. Thanks for your great explanation.
@sukanyaswamy 9 years ago
Great presentation. The visualization makes it so much easier to understand.
@mmuuuuhh 8 years ago
To wrap this up: Map = split the data; Reduce = perform calculations on small chunks of data in parallel; then combine the sub-results from each reduced chunk. Is that correct?
@jessetanderson 8 years ago
+mmuuuuhh Somewhat correct. I'd suggest buying the screencast to learn more about the code and how it works.
@alphacat03 8 years ago
+mmuuuuhh merge-sort maybe?
@kemchobhenchod 7 years ago
divide and conquer
@BULLSHXTYT 6 years ago
Map transforms data too
@dennycrane2938 5 years ago
No no... Map = Reduce the Data, Reduce = Map the Data . .... ....
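Joking aside, here is a minimal sketch of the two user-written functions this thread is describing, in Python rather than the code from the screencast; representing a card as a (suit, value) tuple and the names mapper/reducer are illustrative assumptions, not Jesse's implementation. The mapper transforms each record into a key-value pair (so yes, map transforms data too), and the reducer aggregates every value that arrives under one key:

```python
def mapper(card):
    # Called once per input record; emits zero or more (key, value) pairs.
    suit, value = card          # e.g. ("hearts", 7)
    yield suit, value           # key = the suit, value = the card's number

def reducer(suit, values):
    # Called once per key, with every value emitted anywhere for that key.
    yield suit, sum(values)     # the per-suit total
```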
@vscid 8 years ago
And that's how you explain any technical concept. Simple is beautiful!
@amitprakashpandeysonu 2 years ago
Loved the idea. Now I understand how MapReduce works. Thank you.
@victorburnett6329 3 years ago
If I understand correctly, the mapper divvies up the data among nodes of the cluster and subsequently organizes the data on each node into key-value pairs, and the reducer collates the key-value pairs and distributes the pairs among the nodes.
@jessetanderson 3 years ago
Almost. Hadoop divvies up the data, the mapper creates key-value pairs, and the reducer processes the collated pairs.
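To make that division of labor concrete, here is a minimal single-process sketch (plain Python, not Hadoop itself; run_job and num_splits are made-up names) of what the framework does around the mapper and reducer sketched above: it splits the input, runs the mapper over each split, collates the pairs by key, and hands each key's group to the reducer exactly once:

```python
from collections import defaultdict

def run_job(records, mapper, reducer, num_splits=2):
    # 1. The framework, not the mapper, divvies the input into splits (one per node).
    splits = [records[i::num_splits] for i in range(num_splits)]

    # 2. Map phase: each split is processed independently, emitting (key, value) pairs.
    pairs = [kv for split in splits for record in split for kv in mapper(record)]

    # 3. Shuffle and sort: collate every value emitted under the same key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)

    # 4. Reduce phase: each key group goes to the reducer exactly once.
    return [result for key, values in sorted(groups.items())
            for result in reducer(key, values)]

# run_job([("hearts", 7), ("spades", 3), ("hearts", 2)], mapper, reducer)
# -> [("hearts", 9), ("spades", 3)]
```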
@menderbrains 4 years ago
Great explanation! This is how a tutor should simplify the understanding! Thanks
@rahulx411 9 years ago
an ounce of example is better than a ton of precept! --Thanks, this was great!
@doud12011990 9 years ago
really cool one. It is always nice to come back to the basics. Thanks for that one
@LetsBeHuman 5 years ago
4:51 - I'm kind of lost. You said the two papers are two sets of nodes: the left is node 1 and the right is node 2. Then you said, "I have two nodes, where each node has 4 stacks of cards". I also understood that you are merging two varieties of cards on node 1 and another two varieties on node 2. "A cluster is made of tens, hundreds or even thousands of nodes all connected by a network" - so in this example, let's say the two papers (nodes) are one cluster. The part I get confused by is when you say "the mapper on a node operates on that smaller part; the magic takes the mapper data from every node and brings it together on nodes all around the cluster; the reducer runs on a node and knows it has access to everything with the same key". So if there are two nodes A and B that have mapper data, will the reducer part happen on two other nodes C and D? I'm confused when you say "on nodes all around the cluster".
@vivek3350 8 years ago
Really liked your way of presentation....."Simple" and "Informative". Thanks for sharing!!
@ZethWeissman 8 years ago
It might be a bit clearer to show the advantage of this if, instead of having the same person run the cards on each node sequentially, two people did it at the same time. Or go further and have four people show it. Then each person can grab all the cards of their suit from each node and sum their values, again at the same time. Show a timer comparing how long it took one person to do everything on one node with how long it took all four running at the same time.
@moofymoo 9 years ago
Huge 1 TB file... anyone watching this in 2065?
@NuEnque 5 years ago
February 2019 (Go RAMS)
@anmjubaer 5 years ago
@@NuEnque July 21 2019
@devanshsrivastava4265 4 years ago
feb 2020
@jonathannimmo9293 4 years ago
more like 2025
@omrajpurkar 4 years ago
August 11, 2020!!
@furkanyigitozgoren3847 2 years ago
It was very nice, but I could not find the video where you show the shuffling "magic part".
@abhishekgowlikar 10 years ago
Nice video explaining MapReduce practically.
@bit.blogger 10 years ago
6:16 - got a question! Would you please elaborate more on that data movement? Since there are two separate reduce tasks on those two nodes, how do the two different reduce tasks come together? How do we choose which cards move to which node?
@jessetanderson 10 years ago
That is called the shuffle and sort. See more about that here: www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-6/shuffle-and-sort.
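As a rough sketch of the routing decision the question asks about (a simplification; Hadoop's default partitioner works along these lines, hashing the key modulo the number of reduce tasks), every pair with a given key lands on the same reducer no matter which node mapped it:

```python
def partition(key, num_reducers):
    # Every mapper applies the same function, so identical keys always agree
    # on a destination reducer, with no coordination between nodes.
    return hash(key) % num_reducers

# With two reducers, all "hearts" pairs end up on one reducer; "spades" may share
# that reducer or go to the other one, but a single key never splits across reducers.
```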
@chandrakanthpadi 3 years ago
Does the actual data on the node move, or are copies of the data moved?
@rohitgupta025 9 years ago
Just wow...very nicely explained
@IvanRubinson 7 years ago
Well, that explains the interview question: How would you sort a ridiculously large amount of data?
@mgpradeepa554 10 years ago
The explanation is wonderful.. You made me understand things easily.
@TheDeals2buy 10 years ago
Good illustration using a practical example...
@prasann26 10 years ago
Wow.. You have made this look so simple and easy... Thanks a ton !!!
@ahmedatallahatallahabobakr8712
Your explanation is magic! Well done
@nkoradia 7 years ago
Brilliant approach to teach the concept
@hexenkingTV 6 years ago
So it follows mainly the principle of divide and conquer?
@jessetanderson 6 years ago
Following that analogy, it would be divide, reassemble, and conquer.
@LetsBeHuman 5 years ago
When you say nodes and clusters, does a 1 TB input file definitely have to be run on more than one computer, or can we install Hadoop on a single laptop and virtually create nodes and clusters?
@hazelmiranda8587 8 years ago
Good to understand for a layman! So it's quite crucial to identify the basis of the grouping, i.e. the parameters based on which the data should be stored on each node. Is it possible to revisit that at a later stage?
@davidy2535 3 years ago
amazing explanation! I love it. Huge Thanks!
@asin0136-y6g 5 years ago
Wonderful explanation! Made it very simple to understand! Thanks a ton!
@urvisharma7243 1 year ago
What if the node with clubs and hearts breaks down during the reduce operation? Will data be lost? Or will the complete Map Reduce job be repeated using the replicated data?
@jessetanderson 1 year ago
The data is replicated and the reduce would be re-run on a different node.
@thezimfonia 7 years ago
That was very helpful Jesse. Thank you for sharing this!!
@piyushmajgawali1611 4 years ago
I actually did this with cards. Thanks
@logiprabakar 9 years ago
Wonderful, you have used the right tool (cards) and made it simpler. Thank you. Am I correct in saying that in this manual shuffle and sort the block size is 52 cards, whereas on a node it would be 128 MB?
@tkousek1 7 years ago
Great explanation!! Worth a bookmark. Thank you, sir!
@user-or7ji5hv8y 5 years ago
Best explanation of MapReduce. Thanks!
@mahari999 8 years ago
Superb. Thank you Jesse Anderson
@anandsib 9 years ago
Good Explanation with simple example
@krupakapadia2498 7 years ago
Great Explanation! Thanks!
@gboyex 6 years ago
Great video with good explanation technique.
@tichaonamiti4616 10 years ago
That's wonderful... you are a great teacher
@arindamdalal3988 8 years ago
Really nice video, and it explains the terms in a simple way...
@arnavanuj 2 years ago
Good illustration. 😃
@go_better 7 months ago
Thanks! Great explanation
@AnirudhJas 5 years ago
Thanks Jesse! This is a wonderful video! I have 2 doubts. 1. Instead of sum, if it is a sort function, how will splitting it into nodes work? Because then every data point should be treated in one go. 2. The last part on scaling, how will different nodes working on a file and then combining based on key, be more efficient than one node working on one file? I am new to this and would appreciate some guidance and help on the same.
@jessetanderson 5 years ago
1. This example goes more into sorting github.com/eljefe6a/CardSecondarySort 2. It isn't more efficient, but more scalable.
@AnirudhJas 5 years ago
@@jessetanderson Thank you!
@MincongHuang 9 years ago
Great video, thanks for sharing!
@AlexChetcuti 7 years ago
Thanks, this really helped me for my exam!!
@rogerzhao1158 8 years ago
Nice tutorial! Easy to understand
@sebon11 3 years ago
Great explanation!
@ZFlyingVLover 5 years ago
The 'scalability' of Hadoop has to do with the fact that the data being processed CAN be broken up, processed in parallel in chunks, and then the results tallied by key. It's not an inherent ability of the tech other than HDFS itself. Like most technology, or jobs for that matter, the actual 'process' is simple; it's wading through the industry-specific terminology that makes it unnecessarily complicated. Hell, you can make boiling an egg or making toast complicated too if that's your intent.
@jessetanderson 5 years ago
Sorry, you misunderstood.
@ZFlyingVLover 5 years ago
@@jessetanderson I didn't misunderstand you. Your explanation was great.
@abdulrahmankerim2377 7 years ago
Very useful explanation.
@rodrigoborjas7727 3 years ago
Thank you very much for the explanation.
@sarthakmane2977 4 years ago
Dude, what's the name of that magic??
@patrickamato8839 9 years ago
Great summary - thanks!
@wetterauerbub 6 years ago
Hi Jesse, can I use map reduce only on document-oriented DBs, or also e.g. on Graph databases?
@jessetanderson 6 years ago
Hessebub you can use it for both, but the processing algorithms are very different between them.
@wetterauerbub 6 years ago
Alright, thanks very much for answering & doing the video in the first place!
@ShoaibKhan-hy5nf 7 years ago
The magic part you mentioned in the video, does it reside in the reducer or the map?
@jessetanderson 7 years ago
Shoaib Khan mostly in between those two phases
@grahul007 8 years ago
Excellent video explanation
@amandeepak8640 8 years ago
Thank You sir for such a wonderful explanation. :-)
@trancenut81 9 years ago
Excellent explanation!
@thiery572 6 years ago
Interesting. Now I want to request that a bunny come out of a hat.
@amirkazemi2517 9 years ago
Great video. Why are there performance issues with Hadoop, though?
@jessetanderson 9 years ago
I'm not sure what you mean by performance issues.
@lerneninverschiedenenforme7513
A little long as an explanation; it could be done faster (e.g. the card sorting). But after watching, you know what's happening. So all thumbs up!
@anmjubaer 5 years ago
Great explanation. Thanks.
@Dave-lc3cd 4 years ago
Thanks for the great video!
@gypsyry 5 years ago
Best explanation. Thanks a lot
@ajuhaseeb 9 years ago
Aiwa. Simply explained.
@kart00nher0 8 years ago
IMO the key takeaway from the video is that MR only works when: a. There is one really large data set (e.g. a giant stack of playing cards) b. Each row in the data set can be processed independently. (e.g. sorting or counting playing cards does not require knowing the sequence of cards in the deck - each card is processed based on information on the face of card) To process real-world problems using MR, the data sets will need to be massaged and joined to satisfy the criteria listed above. This is where all the challenges lie. MR itself is the easy part.
@jessetanderson 8 years ago
+Subramanian Iyer agreed MR is difficult, but the understanding of how to use and manipulate the data is far more complex. This is why I think data engineering should be a specific discipline and job title. www.jesse-anderson.com/big-data-engineering/
@vincentvimard9019 9 years ago
Just a great explanation!
@sarthakmane2977 4 years ago
Great video, by the way!!
@Nyocurio 6 years ago
Why did they come up with such a terribly unintuitive name as "MapReduce" ??? It's basically just "bin by attribute, then process each bin in parallel". BinProcess.
@jessetanderson 6 years ago
It's a well-known functional programming paradigm.
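For anyone still puzzled by the name, it mirrors the map and reduce (fold) operations from functional programming, which Python also exposes; a tiny illustration over one suit's values (the sample numbers are made up):

```python
from functools import reduce

cards = [("hearts", 2), ("hearts", 7), ("hearts", 10)]
values = map(lambda card: card[1], cards)      # "map": transform each record
total = reduce(lambda a, b: a + b, values)     # "reduce": fold the results into one value
print(total)                                   # 19
```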
@MuhammadFarhan-ny7tj 3 years ago
Which music is this at the start of the video?
@jessetanderson 3 years ago
I'm not sure where they got it from.
@abdellahi.heiballa 4 years ago
My friend: I wish I had your calm. We have an exam tomorrow and you're watching playing cards...
@vigneshrachha8362 7 years ago
Superb video....thanks a lot sir
@Luismunoz-jf2zv 9 years ago
Now I get it, thanks!
@rrckguy 9 years ago
Great lesson. Thanks..
@mudassarm30 8 years ago
Spades, clubs... I think you used the wrong suit names for them :)
@irishakazakyavichyus 6 years ago
thanks! that is an easy explanation!
@SamHopperton 7 years ago
Brilliant - thanks!
@alextz4307 5 years ago
Very nice, thanks a lot.
@devalpatel7243 5 years ago
Hats off, man. Very well understood
@hemanthpeddi4129 4 years ago
awesome explanation super
@bijunair3807 9 years ago
Good explanation
@user-ho2kf2xr7v 9 years ago
Great video
@guessmedude9636 6 years ago
I like this technique, nice, keep it up
@__-to3hq 5 years ago
wow this was great
@sumantabanerjee9728 6 years ago
Easiest explanation.
@iperezgenius 7 years ago
Brilliant!
@RawwestHide 6 years ago
thanks
@pamgg1663 9 years ago
excellent!!!
@yash6680 6 years ago
awesome
@niamatullahbakhshi9371 8 years ago
so nice
@covelus 6 years ago
awesome
@varshamehra8164 4 years ago
Cool
@glennt1962 5 years ago
This is a great example video without the accent to deal with.