❔ How would i go about making a Load balancer

So im currently learning how to properly deploy a REST API and i have the following questions:
1. a domain can only link to 1 IP address (or so i thought), so how would i go about deploying 10 copies of my build to 10 different servers under 1 domain?
2. how would i link all of them to run on api.example.com?
3. how do the servers decide which server takes on the request?
64 Replies
cap5lut
cap5lut16mo ago
ur assumption that a dns entry can have only one ip address is wrong, it can have multiple. there is a lot to load balancing, here are some examples:
- DNS Round Robin: the dns has multiple ip addresses and the dns resolver picks one in round robin manner
- GeoDNS: the closest ip based on the client ip will be resolved
- Anycast Routing: multiple servers share the same ip address, the network automatically routes the request to the closest server using the Border Gateway Protocol
- Global Load Balancer: a reverse proxy sits behind the domain and routes the traffic to the different servers based on some metrics
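the round robin idea is simple enough to sketch in a few lines. assume `record_ips` is whatever A records the resolver got back for the domain (the addresses below are made-up documentation IPs, not real servers):

```python
from itertools import cycle

# Toy model of DNS round robin: one domain, several A records,
# and the resolver hands them out in rotation.
record_ips = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]

_rotation = cycle(record_ips)

def resolve():
    """Return the next address in round-robin order, the way a
    round-robin resolver cycles through the records it cached."""
    return next(_rotation)
```

each client call to `resolve()` lands on the next server in the list, so over many clients the load spreads roughly evenly, with no central balancer involved.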
The Fog from Human Resources
thats the one i want, cause i want whichever server is least used at the given time to take the request
cap5lut
cap5lut16mo ago
i dont really have experience with using any in production, but as far as im told nginx is a valid option for that https://docs.nginx.com/nginx/admin-guide/load-balancer/http-load-balancer/
HTTP Load Balancing
Load balance HTTP traffic across web or application server groups, with several algorithms and advanced features like slow-start and session persistence.
The Fog from Human Resources
thanks
Anchy
Anchy16mo ago
seconding the nginx load balancer, it's also really easy to set up
IsNotNull
IsNotNull16mo ago
If you want a pure .NET implementation there is also YARP. It's not as mature and isn't designed to be as full-featured as commercial offerings... but it's free and really easy to understand if you are already familiar with ASP.NET Core conventions.
The Fog from Human Resources
Honestly I just want anything that can help me use multiple servers. I think I'll go with the nginx option since I'm already familiar with it. also while im at it, couldnt i just make my own load balancer using ASP.NET? i mean after all its just a reverse proxy to different servers
IsNotNull
IsNotNull16mo ago
Yes, sure. That is exactly what YARP is. It's just a library you add to an ASP.NET Core project, and it comes with conventions for making and configuring your own reverse proxy
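for reference, a minimal YARP config sketch in `appsettings.json` — the route/cluster names and destination addresses here are made up, and `LeastRequests` is YARP's built-in policy closest to "pick the least used server":

```json
{
  "ReverseProxy": {
    "Routes": {
      "api-route": {
        "ClusterId": "api-cluster",
        "Match": { "Path": "{**catch-all}" }
      }
    },
    "Clusters": {
      "api-cluster": {
        "LoadBalancingPolicy": "LeastRequests",
        "Destinations": {
          "s1": { "Address": "https://s1.example.com/" },
          "s2": { "Address": "https://s2.example.com/" }
        }
      }
    }
  }
}
```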
The Fog from Human Resources
this would also allow me to pick the currently least used server, cause i rly dont want to overload an already overloaded server. its my actual first attempt at making a functional CDN so i hope it works lmao. oooh i see, yes thats very helpful
IsNotNull
IsNotNull16mo ago
Nginx is mature and respected. You don't need to second guess that choice if you like it. Sounds like you have a couple options
cap5lut
cap5lut16mo ago
thats indeed good to know, i should play around with yarp as well
The Fog from Human Resources
its just that my main focus with this is that i can use the currently least loaded server, so i wouldnt need most of the fancy features that nginx provides (probably). ill def. look into both tho, both seem like good options
cap5lut
cap5lut16mo ago
well, the nginx configuration isnt that hard either:
upstream backend {
    least_conn;
    server backend1.example.com;
    server backend2.example.com;
}

server {
    location / {
        proxy_pass http://backend;
    }
}
thats it for the least-loaded part of the load balancing configuration (ofc i skipped the rest of the reverse proxy stuff, etc)
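`least_conn` itself is a simple idea: track how many requests each backend is currently handling and always hand the new one to the backend with the fewest. a rough model of that bookkeeping, using the backend names from the config above:

```python
# Rough model of nginx's least_conn: each new request goes to the
# backend that currently has the fewest active connections.
active = {"backend1.example.com": 0, "backend2.example.com": 0}

def route_request():
    """Pick the least-loaded backend and count the new connection."""
    target = min(active, key=active.get)  # fewest active connections wins
    active[target] += 1
    return target

def finish_request(target):
    """A response went out; release the connection slot."""
    active[target] -= 1
```

nginx keeps these counters per upstream server internally; the point of the sketch is just that "least loaded" here means "fewest open connections right now", not CPU load or memory.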
The Fog from Human Resources
so do i get this right: i could have 100 servers, 1 of them is the load balancer running on api.example.com, and all the 99 others run on s1.example.com, s2.example.com, s3.example.com and so on? and then i just direct api.example.com to one of them?
cap5lut
cap5lut16mo ago
let me expand the configuration:
cap5lut
cap5lut16mo ago
worker_processes 1;

events {
    worker_connections 1024;
}

http {
    upstream backend {   # all servers to distribute the load
        least_conn;      # use the server with the least connections
        server 192.168.0.123;
        server example.com;
        server some.other.server.org;
    }

    server {             # this is ur api.example.com load balancer server
        listen 443 ssl http2;
        server_name api.example.com;

        ssl_certificate     /etc/ssl/certs/api.example.com.crt;
        ssl_certificate_key /etc/ssl/certs/api.example.com.key;

        ssl_protocols TLSv1.3 TLSv1.2;

        location / {
            proxy_pass http://backend;  # could be https as well
            proxy_http_version 1.1;
            proxy_set_header Host $host;
            # set the other proxy headers here
        }
    }
}
The Fog from Human Resources
(my approach for making my own load balancer wouldve looked smth like this: each server wouldve made some sort of request to api.example.com, so api.example.com knows every server's IP at all times. api.example.com can then get some data back for analytics and status, and also reverse proxy to it)
cap5lut
cap5lut16mo ago
that would be a bare minimum for an ssl http2 load balancer thats using the server with the least amount of connections for each new connection
The Fog from Human Resources
wouldve allowed for a dynamic server thing yknow but idk if thats even a possibility like that
cap5lut
cap5lut16mo ago
u can edit the config file and let nginx reload it, all while it stays up and running, iirc
The Fog from Human Resources
tell me more abt the proxy_pass attribute, why does it lead to backend only, and what is located at backend?
The Fog from Human Resources
is backend a variable
The Fog from Human Resources
for this
cap5lut
cap5lut16mo ago
there is also some health check stuff, but i didnt touch that yet exactly
The Fog from Human Resources
oooh yes i understand now
cap5lut
cap5lut16mo ago
u couldve also called it unga_bunga or whatever ;p
cap5lut
cap5lut16mo ago
backend is basically a server group which also contains the settings for the load balancing (least_conn in this case)
The Fog from Human Resources
so my current idea for a server structure would be this: 1 dedicated database server running on db.example.com, 1 dedicated media server running on media.example.com, 1 dedicated / shared load balancer for api.example.com, and then however many servers i want for the actual processing / REST API. is that correct
cap5lut
cap5lut16mo ago
yeah
The Fog from Human Resources
yees good thanks a lot
cap5lut
cap5lut16mo ago
note that if these worker servers arent in some kind of vpn or so, u might want to use proxy_pass https://backend; instead, so that the traffic between load balancer and worker is encrypted as well
The Fog from Human Resources
and nginx will just simply skip servers that are for example unavailable atm right?
cap5lut
cap5lut16mo ago
yep
The Fog from Human Resources
id always use https anyways
cap5lut
cap5lut16mo ago
well if they r on the same machine its not worth the ssl overhead ;p
The Fog from Human Resources
they probably wont be
cap5lut
cap5lut16mo ago
yeah, then use https
The Fog from Human Resources
except yes, i would use the load balancer server as a REST API server as well, thats why i stated it would be shared. it would be a waste to have an entire server just for redirects
cap5lut
cap5lut16mo ago
maybe also http2 instead of http 1.1, so u get multiplexed streams over the connection
The Fog from Human Resources
also what exactly does this do
cap5lut
cap5lut16mo ago
im not entirely sure what worker_connections does yet, but worker_processes is the amount of, well, worker processes nginx spawns for handling the server
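per the nginx core docs: `worker_connections` is the maximum number of simultaneous connections each worker process can hold open (and for a reverse proxy that counts both the client connection and the connection to the backend), while `worker_processes auto` spawns one worker per cpu core. a commented fragment:

```nginx
worker_processes auto;   # one worker process per cpu core

events {
    # max simultaneous connections *per worker* -- for a proxy, each
    # request uses two (client side + backend side)
    worker_connections 1024;
}
```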
The Fog from Human Resources
also i had this idea, i just dunno if its good: i wanted to make a GUI program where i can drag my C# project folder into, the program would compile it and send the binary to a list of servers. on each server there is then a small rest api that will accept the file and set it up properly. this way i would never need to connect to each server when new versions come out or smth (i ofc would secure the endpoint somehow)
cap5lut
cap5lut16mo ago
well i guess u will be using docker or alike to run the services, so just add the worker service container to the server group and u can use its own server as well
The Fog from Human Resources
honestly i never used docker i used to run all my shit on raw root SCcrying
cap5lut
cap5lut16mo ago
i would use something like gitlab + ci/cd + docker for that
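a hypothetical `.gitlab-ci.yml` sketch of that flow — build the image once in CI and push it to a registry, then each server just pulls the new tag, instead of a GUI copying binaries to every server (the registry name and image tag below are made up):

```yaml
stages: [build]

build-image:
  stage: build
  script:
    - docker build -t registry.example.com/myapi:$CI_COMMIT_SHORT_SHA .
    - docker push registry.example.com/myapi:$CI_COMMIT_SHORT_SHA
```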
The Fog from Human Resources
docker has stuff like autostart and auto restart right?
cap5lut
cap5lut16mo ago
that and isolation
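re autostart / auto restart: in a compose file thats one line. a hypothetical `docker-compose.yml` fragment (the image name and port are made up):

```yaml
services:
  api:
    image: registry.example.com/myapi:latest
    restart: unless-stopped   # come back up after crashes and host reboots
    ports:
      - "8080:8080"
```

`restart: unless-stopped` restarts the container when it crashes and on boot, unless you stopped it on purpose.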
The Fog from Human Resources
as long as it prevents me from having to log into each server to drag and drop a server binary
cap5lut
cap5lut16mo ago
imagine u have 3 microservices running on ur machine and all need different versions of the same lib
The Fog from Human Resources
i mean i dont rly like relying on third parties so i barely use libraries, but yes yes i know what you mean. except for Newtonsoft JSON and maybe RestSharp, thats a must have. also what do i do about websockets
cap5lut
cap5lut16mo ago
they arent handled differently afaik
The Fog from Human Resources
does the proxy pass still apply to them?
cap5lut
cap5lut16mo ago
yeah, u just need to set the correct headers it seems:
location / {
    proxy_pass http://backend;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;
    proxy_set_header Host $host;
}
cap5lut
cap5lut16mo ago
i meant more like system packages and alike
The Fog from Human Resources
Ah Yes yes I'll take a look at docker Any good learning resources recommended?
cap5lut
cap5lut16mo ago
Get Started | Docker
Get started with Docker Desktop and join millions of developers in faster, more secure app development using containers and beyond.
The Fog from Human Resources
Bet Imma be back later, I'll leave this post open for some days so I can read back to it
Omnissiah
Omnissiah16mo ago
at work we use haproxy as load balancer
Accord
Accord16mo ago
Was this issue resolved? If so, run /close - otherwise I will mark this as stale and this post will be archived until there is new activity.