Wasp-lang • 10mo ago
fkx

GPTs - bringing my OpenAI-hosted GPTs to Wasp and hosting them for free.

I hope I'm not missing the point and misunderstanding what Wasp is. Like many others, I built several GPTs and I'm OK with how they turned out, except that they require the visitor to sign up for a Pro subscription to see them. They are concept demos, so I'd like them to be free to view, or at most require an email signup or Google auth to register and sign in. I'd also like some extras such as a landing page to explain them, maybe a blog, etc. I don't mind rewriting the GPTs as a Python or JavaScript/TypeScript app calling the OpenAI API, though I would like to see a template for that. I'd also like low-cost hosting (Fly? Railway, Netlify, Replit, Colab, ???). Is Wasp the approach I should take, or am I missing the point? Any response or reaction is welcome. 👍
6 Replies
fkx
fkx OP • 10mo ago
I mean free to the visitor, not free for me 🤷‍♂️
matijash
matijash • 10mo ago
Thanks for the question! When you say you built your GPTs, does that mean your own, custom trained LLM models?
fkx
fkx OP • 10mo ago
1) I will be creating custom-trained GPTs by fine-tuning and other methods to refine general LLMs for specific domains. But in this case I'm referring to OpenAI's ChatGPT feature, "GPTs". They are easy to create, can be built with only prompts, but work much better with Actions and the Assistants API. The hitch is that they require a visitor to sign up for a Pro subscription to see them. While they can be embedded on any website, the visitor still has to subscribe to Pro to see any/all "GPTs". It occurs to me that it would be possible to recreate the GPTs with Wasp and host them on the inexpensive hosts Wasp refers to (Fly, Railway, Netlify, etc.). The app would have to be recoded in Wasp with API calls to OpenAI or another LLM, but that is expected.
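For reference, a minimal sketch of what "recoding with API calls to OpenAI" could look like on the Node/Wasp server side. The function name, model string, and system prompt are illustrative assumptions, not something from this thread; it just calls OpenAI's Chat Completions REST endpoint directly:
```ts
// Hypothetical server-side helper: the custom GPT's instructions become the
// system prompt, and the visitor's message is sent as the user message.
const OPENAI_URL = "https://api.openai.com/v1/chat/completions";

export async function askMyGpt(userMessage: string): Promise<string> {
  const response = await fetch(OPENAI_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder: any chat-capable model works
      messages: [
        { role: "system", content: "<your GPT's instructions go here>" },
        { role: "user", content: userMessage },
      ],
    }),
  });

  if (!response.ok) {
    throw new Error(`OpenAI API error: ${response.status} ${await response.text()}`);
  }

  const data = await response.json();
  return data.choices[0].message.content;
}
```
A function like this could then be exposed to the frontend as a Wasp action/query, with the API key kept in a server-side env variable so visitors never need an OpenAI account.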
MEE6
MEE6 • 10mo ago
Wohooo @fkx, you just became a Waspeteer level 1!
fkx
fkx OP • 10mo ago
I just want to hear if anyone sees a technical problem with this approach before I put the work in to find out the hard way. Yeah, I'm lazy.
martinsos
martinsos • 10mo ago
@fkx not lazy at all, that is a very good question to ask! So you could certainly do it with Wasp. The "unique" part of your app, which we should reason about to figure out if there are any limitations on the Wasp side, is your logic that will be calling the OpenAI API.

You could implement that logic in JS: in that case you can do it directly in your Wasp JS code. You could alternatively decide to do it in Python -> in that case you could either run that script as a process on the same server and call it from JS, or you could host it as a separate (micro)service that your Wasp app server will communicate with. In any case, it shouldn't be too hard.

We actually built something similar: https://usemage.ai/ -> it is an app that creates a codebase for a new Wasp app for you based on a short description of what you want to build. It is itself implemented in Wasp (the code is open source), so Mage is a Wasp app, but the core logic for calling the OpenAI API is not implemented in JS, but in Haskell (because that allowed us to also ship it with our wasp CLI, which is in Haskell), and we call it as a separate process from our Wasp JS server code. So it is basically like the approach I described above: writing the "AI" code in Python and calling it from the Wasp JS server.

I would personally recommend either doing it all in JS as part of Wasp, or doing it as a Python script/program that you spawn from the Wasp JS server code and call/use (see the sketch below). In the future you can always easily separate it into its own service if need be, but this way you avoid doing that prematurely and having to manage another deployment.

TLDR: no, there is nothing in Wasp that can stop you here!
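A minimal sketch of the "Python script spawned from the Wasp JS server" option described above, assuming a hypothetical scripts/ask_gpt.py that takes the prompt as a command-line argument and prints the model's reply to stdout (the script name and its interface are assumptions for illustration):
```ts
import { spawn } from "node:child_process";

// Spawns the Python script as a child process and resolves with whatever it
// prints to stdout; rejects if the process exits with a non-zero code.
export function askGptViaPython(prompt: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const proc = spawn("python3", ["scripts/ask_gpt.py", prompt]);

    let stdout = "";
    let stderr = "";
    proc.stdout.on("data", (chunk) => (stdout += chunk));
    proc.stderr.on("data", (chunk) => (stderr += chunk));

    proc.on("close", (code) => {
      if (code === 0) resolve(stdout.trim());
      else reject(new Error(`ask_gpt.py exited with code ${code}: ${stderr}`));
    });
  });
}
```
The trade-off versus a separate microservice is exactly the one mentioned above: spawning a local process keeps everything in one deployment, at the cost of the Python runtime having to live on the same server as the Wasp app.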
MAGE GPT Web App Generator ✨ MageGPT
Generate your full-stack React, Node.js and Prisma web app using the magic of GPT and the Wasp full-stack framework.