C#2y ago
SWEETPONY

✅ how can 1000 requests be completed quickly?

I have a list of ids. for each id I would go to the database to find something, and let's say I'd like to work with each id as fast as possible
var arr = new int[]{1,2,3,4 .. 1000};
var tasks = new List<Task>();
foreach (var id in arr)
{
    tasks.Add(Task.Run(() => {
        // code for executing a database request for a given id
    }));
}

Task.WaitAll(tasks.ToArray());
is this correct code? how will it work with 1000 items in the array? will 1000 threads be created?
20 Replies
ero
ero2y ago
you probably want Parallel For
SWEETPONY
SWEETPONYOP2y ago
hm let's see
MODiX
MODiX2y ago
ivefifthsence#7281
REPL Result: Success
static void Main(string[] args)
{
    Console.WriteLine("C# Parallel For Loop");

    Parallel.For(1, 11, number => {
        Console.WriteLine(number);
    });
    Console.ReadLine();
}
Compile: 617.175ms | Execution: 108.376ms
SWEETPONY
SWEETPONYOP2y ago
108.376ms
MODiX
MODiX2y ago
ivefifthsence#7281
REPL Result: Success
static void Main(string[] args)
{
    Console.WriteLine("C# For Loop");
    for (int i = 1; i <= 10; i++)
    {
        Console.WriteLine(i);
    }
    Console.ReadLine();
}
Compile: 612.330ms | Execution: 87.397ms
ero
ero2y ago
do not use this to measure it lmfao
SWEETPONY
SWEETPONYOP2y ago
87.397ms! heh
ero
ero2y ago
these are incomparable, and not useful at all for measurement. incomparable to your real use case, I mean. this also literally doesn't execute anything, as you can see by the lack of any console output
anita
anita2y ago
Hmm, a database call is IO-bound, so definitely no Parallel.For or Task.Run. Just use normal async/await. Something like this, but with a database call instead of Task.Delay:
var arr = new int[]{1,2,3,4};
var tasks = new List<Task>();
foreach (var id in arr)
{
    tasks.Add(Task.Delay(50));
}

await Task.WhenAll(tasks.ToArray());
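For concreteness, a minimal sketch of that pattern with a real query in place of Task.Delay. It assumes the Microsoft.Data.SqlClient package, a hypothetical Users table, and a placeholder connection string; swap in whatever data access you actually use. Opening a connection per task is fine because pooling reuses the underlying connections.

using Microsoft.Data.SqlClient;

// Hypothetical lookup: returns the Name column for a given id.
static async Task<string?> GetNameByIdAsync(string connectionString, int id)
{
    using var connection = new SqlConnection(connectionString);
    await connection.OpenAsync();

    using var command = new SqlCommand("SELECT Name FROM Users WHERE Id = @id", connection);
    command.Parameters.AddWithValue("@id", id);

    return await command.ExecuteScalarAsync() as string;
}

var connectionString = "<your connection string>";
var ids = Enumerable.Range(1, 1000);

// Start all 1000 requests, then await them together.
var tasks = ids.Select(id => GetNameByIdAsync(connectionString, id)).ToList();
var names = await Task.WhenAll(tasks);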

gerard
gerard2y ago
Also, Console.WriteLine is thread-safe (it locks the stream), so every thread waits for the others to finish writing their lines.
Thinker
Thinker2y ago
Also these didn't even execute the code you wrote
JakenVeina
JakenVeina2y ago
kocha's approach is correct, one of two correct possibilities
Sossenbinder
Sossenbinder2y ago
You can also consider Parallel.ForEachAsync if you don't want to flood your db with queries
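A minimal sketch of that, assuming .NET 6+ and a hypothetical FetchByIdAsync standing in for the real query; MaxDegreeOfParallelism caps how many queries are in flight at once:

// Hypothetical stand-in for the real database call.
static Task FetchByIdAsync(int id, CancellationToken cancellationToken) => Task.Delay(50, cancellationToken);

var ids = Enumerable.Range(1, 1000);
var options = new ParallelOptions { MaxDegreeOfParallelism = 16 };

// At most 16 queries run concurrently, so the database isn't flooded.
await Parallel.ForEachAsync(ids, options, async (id, cancellationToken) =>
{
    await FetchByIdAsync(id, cancellationToken);
});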
JakenVeina
JakenVeina2y ago
the one where you START all of the requests synchronously, at roughly the same time, and wait for them all to complete asynchronously
chrispie
chrispie2y ago
I recommend this article I found some time ago, specifically the "parallel, but smarter" example. it's pretty much what kocha said above, but also consider batching: https://www.michalbialecki.com/en/2018/04/19/how-to-send-many-requests-in-parallel-in-asp-net-core/
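A rough sketch of the batching idea from the article, assuming .NET 6+ (for Enumerable.Chunk) and a hypothetical GetUserAsync standing in for the real query; each chunk is awaited before the next one starts, so only a bounded number of queries hit the database at once:

// Hypothetical stand-in for the real database call.
static Task GetUserAsync(int id) => Task.Delay(50);

var ids = Enumerable.Range(1, 1000);

foreach (var batch in ids.Chunk(100))
{
    // At most 100 queries in flight at a time.
    var tasks = batch.Select(GetUserAsync);
    await Task.WhenAll(tasks);
}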
SWEETPONY
SWEETPONYOP2y ago
private static void Main(string[] args)
{
    int max = 100000;

    Stopwatch watch1 = new Stopwatch();
    List<int> primes1 = new List<int>();
    watch1.Start();
    for (int i = 2; i < max; i++)
    {
        if (IsPrime(i)) primes1.Add(i);
    }
    watch1.Stop();
    Console.WriteLine($"Primes found: {primes1.Count} for {watch1.ElapsedMilliseconds} msecs");

    ConcurrentBag<int> primes2 = new ConcurrentBag<int>();
    Stopwatch watch2 = new Stopwatch();
    watch2.Start();
    Parallel.For(2, max, i =>
    {
        if (IsPrime(i)) primes2.Add(i);
    });
    watch2.Stop();
    Console.WriteLine($"Primes found in parallel: {primes2.Count} for {watch2.ElapsedMilliseconds} msecs");
    Console.ReadKey();
}

private static bool IsPrime(int num)
{
    for (long i = 2; i < num; i++)
    {
        if (num % i == 0) return false;
    }
    return true;
}
To check whether Parallel really speeds up calculations, I wrote this code to search for all primes below 100,000 in both a "synchronous" and a "parallel" style, collecting the found numbers in a ConcurrentBag. parallel is faster
cap5lut
cap5lut2y ago
that doesn't mean it will be faster for the database access too. for the db stuff the bottleneck is the IO between your application and the db, not the computation power. with Parallel you would be doing blocking IO, meaning the thread is blocked/waiting until it has data to process (basically doing nothing in the meantime). in an async approach it would not block; the thread would do other stuff in the meantime, which would be faster again
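A toy illustration of that difference (not a benchmark), with Thread.Sleep and Task.Delay standing in for the database round trip:

// Blocking: each Task.Run occupies a thread pool thread that just sleeps while
// "waiting on the db"; the thread does nothing useful in the meantime.
var blocking = Enumerable.Range(1, 1000)
    .Select(id => Task.Run(() => Thread.Sleep(50)))
    .ToArray();
Task.WaitAll(blocking);

// Non-blocking: the delay only registers a continuation; no thread sits idle while it waits.
var nonBlocking = Enumerable.Range(1, 1000)
    .Select(id => Task.Delay(50))
    .ToArray();
await Task.WhenAll(nonBlocking);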
JakenVeina
JakenVeina2y ago
yes, parallel is faster for CPU-bound work, because it leverages more CPU cores. database I/O is I/O-bound work
M B V R K
M B V R K2y ago
You have good choices: the first is using parallelism and the second is using caching
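For the caching side, a rough sketch (it only helps if ids repeat or are requested again later), with a hypothetical GetUserAsync and User record standing in for the real call and result type; a ConcurrentDictionary of tasks means each distinct id is fetched once even when requested concurrently:

using System.Collections.Concurrent;

var ids = new[] { 1, 2, 3, 2, 1 }; // repeated ids are where a cache pays off
var cache = new ConcurrentDictionary<int, Task<User>>();

// At most one task per id ends up cached; later callers await the same task.
Task<User> GetCachedUserAsync(int id) => cache.GetOrAdd(id, key => GetUserAsync(key));

// Hypothetical stand-in for the real database call.
static Task<User> GetUserAsync(int id) => Task.FromResult(new User(id, $"user-{id}"));

var users = await Task.WhenAll(ids.Select(GetCachedUserAsync));

// Hypothetical record standing in for whatever the query returns.
record User(int Id, string Name);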
Accord
Accord2y ago
Was this issue resolved? If so, run /close - otherwise I will mark this as stale and this post will be archived until there is new activity.