Is there any pre-defined (or recommended) package for creating robots.txt?

I was wondering whether there is a quick way to configure robots.txt in Wasp.
kapa.ai
kapa.ai•8mo ago
Hi @Ricardo Batista. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:

Hmm, I don't know enough to give you a confident answer yet. The knowledge sources provided do not mention a specific package or a recommended way to create a robots.txt file in a Wasp project. However, you can typically place your robots.txt file in the public directory of your project, as this is where static files are stored in a Wasp project (source: https://wasp-lang.dev/docs/tutorial/project-structure). For more detailed information, I would recommend checking the official Wasp documentation or their community support channels.

Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
Ricardo Batista
Ricardo BatistaOP•8mo ago
I see Ahrefs is detecting one, but I cannot find it in the code
martinsos
martinsos•8mo ago
@Ricardo Batista, we don't have such support yet; it is basically as the bot said: you can put robots.txt in public/ and that is it. I am curious to better understand how we could make this better. You were asking if there is a quick way to configure robots.txt. What would you expect that to look like? Wasp generating some kind of initial robots.txt for you? Any details you can share here help!
I see Ahrefs is detecting one, but I cannot find it in the code
Hm, that is interesting. Is it not present in .wasp/out/web-app/public? If not, maybe the place where you hosted the Wasp frontend/client is providing a default robots.txt?
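A minimal sketch of what that could look like: a plain robots.txt dropped into public/ is served from the site root once the client is built. The specific disallowed paths and the sitemap URL below are placeholders for illustration, not rules Wasp requires.

```
# public/robots.txt — example only; paths and sitemap URL are placeholders
User-agent: *
Disallow: /admin
Disallow: /checkout
Allow: /

Sitemap: https://example.com/sitemap.xml
```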
miho
miho•8mo ago
It might be down to your hosting provider; they might serve something by default 😄 Where did you host your client? @martinsos Oh, you wrote the same thing at the end of your answer! My bad
Ricardo Batista
Ricardo BatistaOP•8mo ago
@martinsos it would be helpful to have a function that automatically creates robots.txt with some rules (e.g., excluding a few URLs from being indexed). There is a package for Astro that covers this: astro-robots-txt. But I will create it manually indeed. @miho I am using Netlify. I just came across this guide, will have a look at it.
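A rough sketch of what such a helper might look like, assuming a small pre-build script that writes public/robots.txt from a list of rules; the generateRobotsTxt function, the Policy shape, and the output path are illustrative assumptions, not an existing Wasp or astro-robots-txt API.

```ts
// scripts/generate-robots.ts — hypothetical helper, not an existing Wasp feature.
// Writes public/robots.txt from a list of rules, similar in spirit to what
// astro-robots-txt does for Astro projects.
import { writeFileSync } from "fs";
import { join } from "path";

interface Policy {
  userAgent: string;
  disallow?: string[]; // paths to exclude from indexing
  allow?: string[];
}

function generateRobotsTxt(policies: Policy[], sitemapUrl?: string): string {
  const blocks = policies.map((p) =>
    [
      `User-agent: ${p.userAgent}`,
      ...(p.disallow ?? []).map((path) => `Disallow: ${path}`),
      ...(p.allow ?? []).map((path) => `Allow: ${path}`),
    ].join("\n")
  );
  const sitemap = sitemapUrl ? `\nSitemap: ${sitemapUrl}\n` : "";
  return blocks.join("\n\n") + "\n" + sitemap;
}

// Example rules: exclude a few URLs from indexing (placeholder paths).
const robots = generateRobotsTxt(
  [{ userAgent: "*", disallow: ["/admin", "/internal"], allow: ["/"] }],
  "https://example.com/sitemap.xml"
);

writeFileSync(join("public", "robots.txt"), robots);
console.log("Wrote public/robots.txt");
```

Under that assumption, the script could be run before building the client (for example via an npm script) so the generated file sits in public/ before Wasp copies static assets into the build output.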
martinsos
martinsos•8mo ago
Thanks @Ricardo Batista, that does sound useful, I will create an issue on GitHub! https://github.com/wasp-lang/wasp/issues/1964