Today we will take a quick look at how to speed up a query on a database table with over 850,000 rows and see how indexing can take a query from minutes to seconds.
You can tune the query with something like "SELECT item, cashier, SUM(price) AS income FROM sales GROUP BY item, cashier" and add "WHERE" conditions to filter just the items and/or cashiers you want (e.g. WHERE cashier = 5). Note that every selected column has to appear in the GROUP BY or inside an aggregate, so selecting a bare "id" there wouldn't work. What you are facing is an N+1 problem. If the query were processed asynchronously that might not be an issue, but in your case it is synchronous, so you have to fix the N+1: either optimize the query as I suggested or create a projection. 11s normally isn't acceptable for loading a webpage. Anyway, a 4x performance gain from indexes alone is a good example of how powerful they can be.
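The grouped query and index suggested above can be sketched with SQLite, which ships with Python. The schema here (a "sales" table with "item", "cashier", and "price" columns) is an assumption taken from the example query in the comment, not the author's actual database:

```python
import sqlite3

# In-memory demo of the aggregate query from the comment above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (id INTEGER PRIMARY KEY, item TEXT, cashier INTEGER, price REAL)"
)
conn.executemany(
    "INSERT INTO sales (item, cashier, price) VALUES (?, ?, ?)",
    [("apple", 5, 1.50), ("apple", 5, 2.00), ("pear", 3, 0.75)],
)

# An index on the filter/group columns lets the engine avoid a full table scan.
conn.execute("CREATE INDEX idx_sales_cashier_item ON sales (cashier, item)")

rows = conn.execute(
    "SELECT item, cashier, SUM(price) AS income "
    "FROM sales WHERE cashier = 5 GROUP BY item, cashier"
).fetchall()
print(rows)  # [('apple', 5, 3.5)]
```

Putting "cashier" first in the composite index matches the WHERE clause, so the equality filter can use the index directly.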
Yeah. I purposely wrote bad data and queries to show the performance increase from the indexing itself. Since the index was the only thing that changed, it made a clean benchmark. But you are absolutely right.
Additionally, I think the query would speed up immensely if you changed the loop from querying a thousand times into building a single query with a range in it. Queries in loops are a big no-go in general.
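The "query in a loop" anti-pattern and its ranged replacement can be sketched like this; the table and the 1,000 ids are made up for the demo, assuming the loop was fetching rows by consecutive ids:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, price REAL)")
conn.executemany(
    "INSERT INTO sales (id, price) VALUES (?, ?)",
    [(i, float(i)) for i in range(1, 1001)],
)

# Anti-pattern: one round trip to the database per row (N+1).
slow = [
    conn.execute("SELECT price FROM sales WHERE id = ?", (i,)).fetchone()[0]
    for i in range(1, 1001)
]

# Better: a single query covering the whole range in one round trip.
fast = [
    r[0]
    for r in conn.execute(
        "SELECT price FROM sales WHERE id BETWEEN 1 AND 1000 ORDER BY id"
    )
]

assert slow == fast  # same results, one query instead of a thousand
```

With a networked database the gap is much larger than in this in-process demo, since each loop iteration also pays a network round trip.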
Is it bad if I index 7-8 out of 24 columns in the table? How much will it affect a database with more than 300k rows in the table? Can you please help me with it? Thanks in advance
I like to add one index at a time and test how much of a performance benefit I get. You may find that some things you think will help will actually slow things down.
@@AnotherMaker thank you mate. I tried it and got stunning results. I was just worried about the downsides; I read that indexes add a little overhead to write operations (insert, update, etc.) Thank you 😊