Does it make sense to use Workers to host a chat endpoint?

Since Workers is a serverless platform, an instance has to spin up every time you make a request to the endpoint. From a latency point of view this is fine because cold starts are very fast, but I'm not sure it makes sense for a chat endpoint. Is it wasteful to use it for an app where you're constantly going back and forth with a chatbot, so that every message you send spins up a new instance that is torn down right after?
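
For concreteness, this is roughly the kind of stateless handler I have in mind, written as a minimal sketch against the Workers fetch-handler API. The upstream model URL, the `MODEL_API_KEY` binding, and the `callModel` helper are just placeholders, not a real service:

```ts
// Each chat message arrives as an ordinary POST; the Worker forwards it to the
// model backend, returns the reply, and keeps no state between requests.
export default {
  async fetch(request: Request, env: { MODEL_API_KEY: string }): Promise<Response> {
    if (request.method !== "POST") {
      return new Response("Method not allowed", { status: 405 });
    }

    // The client sends the full conversation history with every message,
    // since nothing is persisted on the Worker side.
    const { messages } = (await request.json()) as {
      messages: { role: string; content: string }[];
    };

    // Placeholder for whatever AI backend the chatbot talks to.
    const reply = await callModel(messages, env.MODEL_API_KEY);

    return Response.json({ reply });
  },
};

// Hypothetical helper standing in for the upstream model call.
async function callModel(
  messages: { role: string; content: string }[],
  apiKey: string
): Promise<string> {
  const res = await fetch("https://api.example.com/v1/chat", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  const data = (await res.json()) as { reply: string };
  return data.reply;
}
```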

P.S. This isn't for peer-to-peer chat but for an AI-powered chatbot, so there's no need to keep an open connection (e.g. WebSockets), or is there?