
Processing 1 Billion Rows Per Second 

Postgres Conference
4.6K views

Everybody is talking about Big Data and about processing large amounts of data in real time or close to real time. However, processing a lot of data does not require commercial software or some NoSQL stuff. PostgreSQL can do exactly what you need and process A LOT of data in real time. During our tests we have seen that crunching 1 billion rows of data in real time is perfectly feasible, practical and definitely useful. This talk shows which things have to be changed inside PostgreSQL and what we learned when processing so much data for analytical purposes.
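
The talk itself covers the PostgreSQL-internal changes; as a rough illustration of the kind of workload involved, here is a minimal sketch (not taken from the talk) of a single analytical aggregate over a large table, driven from Python with psycopg2. The table name sensor_data, the connection string, and the chosen settings values are assumptions; max_parallel_workers_per_gather and work_mem are standard PostgreSQL parameters that typically matter for this kind of full-table scan.

```python
# Minimal sketch: one parallel aggregate pass over a large table.
# The DSN, table name, and setting values are hypothetical.
import time
import psycopg2

conn = psycopg2.connect("dbname=analytics user=postgres")  # hypothetical DSN
cur = conn.cursor()

# Encourage PostgreSQL to parallelize the sequential scan and aggregation.
cur.execute("SET max_parallel_workers_per_gather = 8;")
cur.execute("SET work_mem = '256MB';")

start = time.time()
# Hypothetical billion-row table: one aggregate over all rows.
cur.execute("SELECT count(*), avg(value) FROM sensor_data;")
row_count, avg_value = cur.fetchone()
elapsed = time.time() - start

print(f"aggregated {row_count} rows in {elapsed:.2f}s (avg value {avg_value})")

cur.close()
conn.close()
```

Whether such a query finishes in seconds depends on hardware, table layout, and the server settings the talk discusses; the sketch only shows the shape of the workload.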

Published: 28 Jul 2021

Comments: 4
@mrrolandlawrence · 14 hours ago
i do love some optimisations :)
@victormadu1635 · 14 days ago
Good job
@hkpeaks · 1 year ago
Do you mean the system can extract csv file for 1 Billion Row per second?
Up next:
1 Billion Rows Challenge · 8:58 · 113K views
What's New in 11.2? · 58:32 · 59K views
The NOSQL Store that Everyone Ignored · 49:47 · 3.6K views
Solving one of PostgreSQL's biggest weaknesses. · 17:12 · 178K views
How Fast can Python Parse 1 Billion Rows of Data? · 16:31
Faster geospatial queries in MySQL · 13:46 · 19K views