
Consumer vs Consumer group | Apache Kafka Tutorial 

Data Savvy
29K subscribers
8K views

After completing this Apache Kafka training, you will be able to build applications using Apache Kafka and make informed design decisions.
In this series we will cover the following:
1) Introduction to Kafka
2) What is a producer
3) What is a consumer
4) What are topics and partitions
5) What is a consumer group (a minimal consumer sketch follows this list)
6) What is topic replication and how to scale Kafka
7) How to install Apache Kafka
8) How to build applications using Kafka
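
As a quick illustration of the consumer-group idea (this sketch is not from the video), below is a minimal consumer built with the kafka-python client. The topic name "orders", the group id "order-processors", and the broker address localhost:9092 are placeholder assumptions. Running several copies of this script with the same group_id lets Kafka split the topic's partitions among them, while copies started with different group ids each receive every message independently.

```python
# Minimal consumer-group sketch using the kafka-python client.
# Assumptions (not from the video): topic "orders", group "order-processors",
# and a broker running on localhost:9092.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                           # topic to subscribe to
    bootstrap_servers="localhost:9092",
    group_id="order-processors",        # consumers sharing this id form one consumer group
    auto_offset_reset="earliest",       # start from the beginning if no committed offset exists
    enable_auto_commit=True,            # periodically commit consumed offsets back to Kafka
)

# Each message arrives from one of the partitions assigned to this group member.
for message in consumer:
    print(f"partition={message.partition} offset={message.offset} value={message.value}")
```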
- - - - - - - - - - - - - -
Who should go for this Course?
This course is a must for anyone who aspires to enter the field of big data and keep abreast of the latest developments in fast, efficient processing of ever-growing data using Spark and related projects. The course is ideal for:
1. Big Data enthusiasts
2. Software Architects, Engineers, and Developers
- - - - - - - - - - - - - -
Facebook: / xoomanalytics
Data Savvy: / data-savvy-10650369448...
Github: github.com/har...
LinkedIn: / harjeetk
#kafka #apachekafka #bigdata

Published: 11 Sep 2024

Comments: 7
@manideepkumar959, 2 years ago
Super clear and I understood it. You should have kept all the videos on Kafka.
@Jay-vh3gx, 3 years ago
Thanks for the crystal clear explanation 👍
@cheequsharma7391, 1 year ago
Great work 👍
@madhusudhann8590, 3 years ago
Thank you for this nice video. To achieve a parallelism of 3, should I replicate the same consumer code three times and run all three under the same consumer group?
@naudua9272, 2 years ago
If the consumer_offset is maintained in the consumer and the consumer goes down, how will another consumer get those offset details? Thanks
@rajalakshmim4779, 3 years ago
How do you add consumers to a consumer group in PySpark?
@teja_209, 3 years ago
Can you interview me, please?