When creating a sign-out route, should a POST or GET request be used? Why?

Hello guys, sorry to disturb you all. I want to create a sign-out route in my Express server. I was reading a bit and saw that nowadays POST requests are used instead of the GET requests we used back in 2010. My question is: why has this changed, and why is POST now used? I know the simple answer is "for security", but what exactly happens when using GET or POST to sign out a user? How does POST handle it differently compared to GET, please?
18 Replies
ἔρως 3mo ago
POST requests aren't usually cached; GET requests could be cached. Sending a POST request is probably better, but it absolutely depends on the situation.
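A minimal sketch of what such a route can look like in Express, assuming express-session for the login state; the paths, secret, and cookie name here are illustrative, not from the thread:

```js
// Sketch only: assumes express-session; names and values are placeholders.
const express = require('express');
const session = require('express-session');

const app = express();
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

// Sign out with POST so link prefetchers and caches never trigger it.
app.post('/logout', (req, res) => {
  req.session.destroy(() => {
    res.clearCookie('connect.sid'); // express-session's default cookie name
    res.redirect('/login');
  });
});

app.listen(3000);
```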
Faker (OP) 3mo ago
yeah, I read that too. When we say "GET requests could be cached", does that mean we save information in the browser so it doesn't send a request each time a user wants something? But how does that affect security here?
ἔρως 3mo ago
no, it means that somewhere along the way, the GET request could be cached. For example, Cloudflare can cache pages served with GET requests, and instead of hitting the server, the requests hit the Cloudflare cache.
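Whether an edge cache like Cloudflare may store a GET response is largely driven by the response headers the server sends. A small sketch, continuing the app from the earlier sketch, with made-up routes:

```js
// Route paths and the session "user" field are illustrative assumptions.
app.get('/pricing', (req, res) => {
  res.set('Cache-Control', 'public, max-age=300'); // same for everyone, fine to cache
  res.send('<h1>Pricing</h1>');
});

app.get('/dashboard', (req, res) => {
  res.set('Cache-Control', 'no-store'); // per-user content, never cache
  res.send(`<h1>Welcome back, ${req.session.user}</h1>`);
});
```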
Faker (OP) 3mo ago
hmm, so it's like the browser knows that this particular request should trigger this particular action, and when sending the request, instead of interacting with the server, it interacts directly with the website's cache, like Cloudflare in this case?
ἔρως 3mo ago
that has nothing to do with the browser. In fact, you don't even need a browser for it, just something that makes an HTTP request.
Faker (OP) 3mo ago
yep, so it just has to be a request?
ἔρως 3mo ago
this isn't something about front-end or back-end: it's something to do with the network and infrastructure
Faker (OP) 3mo ago
when we send the request, what is cached? The data that the response usually sends?
ἔρως 3mo ago
depends. It's whatever you configure, and whatever the caching system can cache and was configured to cache.
Faker (OP) 3mo ago
ok, but in the case of sign out, why would we cache it? Also, is caching something that's done automatically? Like when we say "GET requests usually cache the data", does that mean it happens automatically and we have no control over it?
ἔρως 3mo ago
you wouldn't. That's why some might use a POST request: to make sure it isn't cached.
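On the client side, that usually means triggering the sign-out with a POST instead of a plain GET link. A browser-side sketch, with a made-up button id and redirect target:

```js
// Sketch: send the sign-out as a POST from a button click instead of an <a> link.
document.querySelector('#sign-out').addEventListener('click', async () => {
  await fetch('/logout', { method: 'POST', credentials: 'same-origin' });
  window.location.href = '/login'; // redirect target is illustrative
});
```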
Faker (OP) 3mo ago
hmm, imagine we have a sign-out link... When that web page refreshes/loads, our browser sends GET requests to fetch all the contents of the current page. When it sends a GET request for a particular image, it may happen that the image's URL points at the sign-out endpoint. Normally we'd only log out when we click, but since loading that image is itself a GET request, we would be signed out automatically? Now if we have a link instead, like in an anchor tag, why would the link be a problem here, please? We would still need to click the link to sign out, so how does POST prevent all that?
ἔρως 3mo ago
i never said the link was a problem. It isn't a problem; you can do whatever you want. Some may use a POST request to evade some caching systems, others may stick with a GET request, but what matters is that going back doesn't show an old page where you're still logged in.
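One way to get that back-button behaviour is to mark logged-in pages as uncacheable, so going back after sign-out forces a fresh request. A sketch, continuing the earlier Express setup:

```js
// Assumes the app and session setup from the earlier sketch.
app.use((req, res, next) => {
  if (req.session && req.session.user) {
    // Logged-in pages should not be stored, so Back re-requests them.
    res.set('Cache-Control', 'no-store');
  }
  next();
});
```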
13eck 2mo ago
Semantically speaking, GET requests are asking for data (and as epic said, can be cached) while POST is for submitting data. Since you're telling the server to change the state of something, a GET request doesn't make sense.

Another reason not to use GET is that if you're emailing a log-out link, many email services pre-fetch GET requests—to show a preview, or if it's an in-browser email service (like Gmail) to cache it so if you click on it the page has already been downloaded and cached. This is fine if you want to display, say, a welcome page. But you don't want Gmail to log out the user until they click on the link :p (this is especially problematic with services that use magic links and GET requests).

The HTTP spec "…defines caching semantics for GET, HEAD, and POST, although the overwhelming majority of cache implementations only support GET and HEAD." –https://httpwg.org/specs/rfc9110.html#rfc.section.9.2.3

Also, Cloudflare has a blog post about caching if you'd like to read more.
Faker (OP) 2mo ago
I had a look at the blog, really insightful, thanks! But there's one thing I'm confused about. Who/what does the caching? The browser or data centers? From what I read, I think there are different levels of caching: one done by the browser, which stores things locally (on the hard drive, maybe?), but there are also content delivery network servers which store cached data.
13eck 2mo ago
By default, your browser will cache any GET request. The server can, however, send [specific headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers#caching) that determine how long the cache is valid for. These help reduce overhead and latency by giving the browser more details on how long to cache the data and/or when to request an updated version of the content.

CDNs (content delivery networks) are something set up by the developer to act as an intermediary between the user agent (the browser usually, but for a JSON endpoint it could cache API data) and the server. This does a few things, but the two most important IMO are:
1. Reduces load on the server if the responses are cached
2. Allows data to live closer to the end user

For nº2, most (all?) CDNs are distributed and have data centers all around the world, allowing the end user to connect to the closest data center and GET the resource faster. As an example, if your server is in Germany and the end user is in Vancouver, normally the request would have to travel across North America, traverse the ocean, and get to Germany. Then the response follows the same route back. That's a long way, even on a fiber cable! But if there's a data center in, say, San Francisco, then the end user in Vancouver only needs to connect to SF—a lot shorter trip!

Both the response headers and the option of a CDN are up to the developers of the app/service/website/etc and are not "standard" for anything. And sometimes they just won't work. It's not a good idea to cache the response from a database call, right? If the data is updated before the cache times out you're out of luck (well, there are ways, but that's a more in-depth and complex reply that I am not qualified to speak on).
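To see where the browser cache and a shared/CDN cache each come in, the Cache-Control header can give them different lifetimes. A sketch continuing the earlier app, with a hypothetical renderPost helper:

```js
// max-age applies to the browser's cache; s-maxage applies to shared caches (CDNs).
app.get('/blog/:slug', (req, res) => {
  res.set('Cache-Control', 'public, max-age=60, s-maxage=3600');
  res.send(renderPost(req.params.slug)); // renderPost is a hypothetical helper
});
```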
Jochem 2mo ago
(CDNs can also cache some/most of your requests too btw, but yeah, that's getting really complex.) Also, cache invalidation is considered one of the hardest problems in computer science. It's wickedly complex.
Faker (OP) 2mo ago
I have a basic understanding of it now, thanks guys, really appreciate it. I won't dive any deeper into this topic yet, it seems really complex as you guys mentioned :c
