
A Better AspNetCore Background Service Approach 

The Pragmatic Programmer
1.8K subscribers
944 views

Published: 16 Sep 2024

Comments: 10
@pazzuto · 1 month ago
Interesting take. I'm concerned that if too many requests come in, you end up with all these queues/threads (creation/disposal). I did something similar, but to avoid exhausting threads I used a BlockingCollection: on startup, the BG service created as many worker threads as there were available cores (16 on my server). Each thread attached to the blocking collection through its enumerator, which managed dequeuing safely. So in the end I had 16 threads safely handling any request that came in. I tested it with K6 sending about 5000 requests at its peak. While the collection had to queue up, 16 requests were being processed in parallel. It beat the hell out of the server, but then I could easily scale it out to have more threads available -- I haven't done this yet, but so far the server in production has not caused me any issues.
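The fixed-pool pattern described in this comment can be sketched as a small console program (a minimal sketch; the worker count, the doubling "work item", and all names are illustrative, not taken from the video):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

var workItems = new BlockingCollection<int>();
var results = new ConcurrentBag<int>();
int workerCount = Environment.ProcessorCount; // 16 on the commenter's server

// Start a fixed number of long-running workers. GetConsumingEnumerable
// blocks until an item is available and exits once CompleteAdding is called,
// so the collection handles all of the safe-dequeuing coordination.
var workers = Enumerable.Range(0, workerCount)
    .Select(_ => Task.Factory.StartNew(() =>
    {
        foreach (var item in workItems.GetConsumingEnumerable())
            results.Add(item * 2); // stand-in for real per-request work
    }, TaskCreationOptions.LongRunning))
    .ToArray();

// Producers (e.g. http requests) just Add; no new threads are created per request.
for (int i = 0; i < 100; i++)
    workItems.Add(i);

workItems.CompleteAdding(); // drain and let the enumerators finish
Task.WaitAll(workers);
Console.WriteLine(results.Count); // 100
```

The key point is that thread creation happens once at startup, so a burst of requests only grows the in-memory collection, never the thread count.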
@asrajan55 · 5 months ago
Why can't we just use a service with a ConcurrentQueue as a singleton?
@thepragmaticprogrammer · 5 months ago
Thanks for watching. That depends on your use case and is perfectly valid. But using a single queue as the conduit between your http requests and your background service only gives you a fan-in pattern, i.e. all your work items are pushed into one queue. A list of queues allows for a fan-out to multiple threads in your background service.
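The fan-in variant being discussed can be sketched like this (a minimal sketch, assuming a singleton `ConcurrentQueue` shared by producers and a single consumer loop; in a real app the consumer loop would live in a `BackgroundService.ExecuteAsync`, and all names here are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Fan-in: every producer (http request) pushes into ONE shared queue,
// so all work funnels through a single conduit and is drained serially.
var queue = new ConcurrentQueue<int>();

var producers = Task.WhenAll(
    Task.Run(() => { for (int i = 0;  i < 50;  i++) queue.Enqueue(i); }),
    Task.Run(() => { for (int i = 50; i < 100; i++) queue.Enqueue(i); }));
await producers;

// Single consumer: with only one drainer there is no fan-out, which is
// exactly the limitation the reply above points out.
int processed = 0;
while (queue.TryDequeue(out _))
    processed++; // a BackgroundService would await the real work item here

Console.WriteLine(processed); // 100
```

Fan-out, by contrast, would keep a list of such queues and give each its own consumer, which is the design the video builds.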
@user-me7yi7ln2o · 5 months ago
I don't think it's safe to modify the WorkBufferItems list from multiple threads. It's just a List after all. Or am I missing something?
@thepragmaticprogrammer · 5 months ago
The only place where this might be a problem is when the background service removes a WorkItemBuffer whilst it is still iterating over the WorkItemBuffers list. We want to keep WorkItemBuffer elements to a minimum and process and remove them as quickly as we can, which minimises this scenario. I agree that a more robust implementation would take a lock around both the list iteration and the Remove(WorkItemBuffer) - see: theburningmonk.com/2010/03/thread-safe-enumeration-in-csharp/
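The lock-both-sides fix mentioned in this reply is usually done by taking a snapshot of the list under a lock and iterating the copy, while removals take the same lock. A minimal sketch (all names hypothetical; `WorkItemBuffer` is stood in for by strings):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var buffers = new List<string> { "a", "b", "c" }; // stand-in for WorkItemBuffers
var gate = new object();

// Copy under the lock, iterate outside it: the enumerator only ever sees
// the snapshot, so concurrent removals can't invalidate it.
List<string> Snapshot()
{
    lock (gate) return buffers.ToList();
}

// Removal takes the same lock, so it never races a snapshot in progress.
void Remove(string buffer)
{
    lock (gate) buffers.Remove(buffer);
}

foreach (var b in Snapshot())
    Remove(b); // safe even though we "iterate" and remove at the same time

Console.WriteLine(buffers.Count); // 0
```

The trade-off is one short list copy per iteration, which is cheap precisely because (as the reply says) the list is kept small.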
@codingbloke · 5 months ago
It's an interesting approach. That each request gets its own queue of items that are completed sequentially might be useful or convenient. On the other hand, this may not be desirable or necessary. What happens in this approach if 300 requests each queue up items? Do we really want the number of background tasks to also be 300? Might be ok, might not be; it would depend on the type of workload. I/O-bound load might be ok, but CPU-bound stuff would be a problem. Perhaps some further control is needed.

The deal breaker, though, is the WaitAll. In a set of concurrent requests where one of them has created a queue workload significantly larger than the others, the WaitAll will cause the item pump to stall whilst the large load is completed. Meanwhile new requests are busy adding items to their new queue buffers, but nothing is being done while the code sits at WaitAll.

The premise posed around 55 secs into this video is that queue items from multiple requests just get queued behind each other and processed sequentially. A better design, I think, would be to drop the assumption that queue items are processed this way. Instead, assume that a set of items can all be processed concurrently, even items generated by the same request. Use a single channel (that never completes) and write all items to that channel. Use `await foreach` to consume the channel via `Reader.ReadAllAsync` in the BackgroundService ExecuteAsync. In other words, this is the infinite loop (because the channel never completes), and the 500ms sleep is not needed. Inside the foreach, dispatch a task but do not await it. Before executing the queue item, this new task will first wait on a semaphore (use a `SemaphoreSlim` as an instance field in the BackgroundService) and ensure in a finally block that the semaphore is released. The semaphore controls how many concurrent tasks can be dispatched.

In this approach a queue item will begin execution immediately if we haven't saturated the semaphore, and if we have, the next item will start as soon as any currently running item completes. No sleeps, no WaitAll. Multiple queues could be created, say one for CPU-intensive tasks with a small semaphore and an I/O-bound queue with a large semaphore.
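The channel-plus-semaphore design this comment describes might look roughly like the following (a sketch under the commenter's assumptions; the 4-slot throttle, the `Task.Delay` work item, and the completion of the channel at the end are placeholders so the program terminates — in the real service the channel never completes):

```csharp
using System;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

var channel = Channel.CreateUnbounded<int>();
var throttle = new SemaphoreSlim(4); // cap: at most 4 items in flight
int processed = 0;

// The "pump": await foreach over ReadAllAsync IS the infinite loop,
// so there is no polling sleep and no WaitAll.
var consumer = Task.Run(async () =>
{
    await foreach (var item in channel.Reader.ReadAllAsync())
    {
        await throttle.WaitAsync(); // dispatch blocks only once saturated
        _ = Task.Run(async () =>    // fire-and-forget: do NOT await here
        {
            try
            {
                await Task.Delay(10); // stand-in for processing `item`
                Interlocked.Increment(ref processed);
            }
            finally
            {
                throttle.Release();   // always free the slot
            }
        });
    }
});

// Producers (http requests) just write into the single channel.
for (int i = 0; i < 20; i++)
    await channel.Writer.WriteAsync(i);

channel.Writer.Complete(); // test-only: lets the sketch terminate
await consumer;
while (Volatile.Read(ref processed) < 20) await Task.Delay(10);
Console.WriteLine(processed); // 20
```

As the comment notes, the next item begins the moment a semaphore slot frees up, and per-workload tuning is just a matter of choosing the semaphore size (or running separate channel/semaphore pairs for CPU- and I/O-bound work).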
@thepragmaticprogrammer · 5 months ago
This is exactly what I was hoping for: a good dialogue on the subject to highlight that background processing is not a trivial matter. You have made some excellent points and described a very good alternative. Personally, I had a constraint that meant I could not concurrently process items in the same request; I needed them processed in the order they arrived in the http request. This just highlights that you really need to understand your requirements and constraints when you design a solution. Many thanks for taking the time to respond.
@miskoralgol575 · 5 months ago
What is it intended for? What about MassTransit, for example?
@thepragmaticprogrammer · 5 months ago
MassTransit is for distributed applications, i.e. separate processes. This is all in-process. This presents how to implement a fan-in/fan-out pattern for background processing without tying up your http request.
@miskoralgol575 · 5 months ago
@thepragmaticprogrammer I think a simple in-memory queue does the job. But if you want to process tasks sequentially, the worker approach is right. The queue could also be configured so that it behaves as a synchronous process.