Alaanor
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Just wanted to say that I have finally moved this particular service back to Railway and it's working great now 👍 thanks again for everything
97 replies
Railway
Created by Alaanor on 11/1/2024 in #✋|help
Are the volumes on the v2 runtime real now?
I'll spend some time retrying this then 😄 thanks
10 replies
Railway
Created by Alaanor on 11/1/2024 in #✋|help
Are the volumes on the v2 runtime real now?
I'm in Europe, so Metal isn't for me yet. But glad to hear it
10 replies
Railway
Created by Alaanor on 11/1/2024 in #✋|help
Are the volumes on the v2 runtime real now?
N/A
10 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
At least the UI said I was on it. I remember you said that might be a lie from the UI because it wouldn't work with volumes, but now we have volumes on v2, so I thought I could maybe trust that UI
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
@Brody @Angelo I saw that v2 volumes are now a thing, so I spent the day setting things up and trying Railway again, but unfortunately it's still stuck this way. Not to complain or anything, just wanted to share that it did not magically fix this, as we thought it perhaps would.
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
This is really cool, I appreciate the finding a lot. Thanks 👍 I'll be checking the Railway changelog frequently for v2 with volumes, and hopefully one day I can be fully back on Railway :)
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Since I could not find a solution with Railway for this particular thing, I bought a server somewhere else, although the disk I/O isn't as good as Railway's and this adds some complexity for deployment and monitoring :( But yeah, I can't afford adding $200+ to my monthly bill just to read a few files occasionally. I still use Railway for a lot of other things and I'm happy with those. I just figured I should use Railway where it helps me instead of trying to fight it. No hate, I can understand why buff/cache is counted. Just wanted to give an update for future people finding this thread.
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
this is pain
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
I had some hope for a moment with https://linux.die.net/man/2/open O_DIRECT: it works locally and doesn't bump buff/cache, but on Railway I get an error. I guess the filesystem doesn't accept this custom flag :sossa: (rough sketch of the attempt below)
97 replies
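A rough sketch of that O_DIRECT attempt, assuming Rust with the libc crate and a hypothetical file path on the mounted volume; on a filesystem that doesn't support the flag, the open or the first read fails with EINVAL, which would match the error seen on Railway:

use std::fs::OpenOptions;
use std::os::unix::fs::OpenOptionsExt;

fn main() -> std::io::Result<()> {
    // O_DIRECT bypasses the kernel page cache, so a large sequential scan no
    // longer shows up as buff/cache (which is what gets counted as RAM here).
    // The trade-off: every read must use a buffer, offset and length aligned
    // to the filesystem's logical block size.
    let file = OpenOptions::new()
        .read(true)
        .custom_flags(libc::O_DIRECT) // unsupported filesystems reject this with EINVAL
        .open("/data/big-file.bin")?; // hypothetical path on the mounted volume

    // ... block-aligned reads would go here ...
    drop(file);
    Ok(())
}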
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
No description
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
I could try a Docker image based on Ubuntu or something, but at this point we're back to what Nixpacks does
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Unfortunately I tried almost everything, but for Alpine I need a special compilation target and I can't get it to work easily with my Rust code. Too many deps
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
I could give it a try
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Are you sure you did not mix up two problems? 😄 I was on something unrelated to deployment; I don't have a deployment problem
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
[build]
builder = "NIXPACKS"

[deploy]
startCommand = "./api"
healthcheckPath = "/health"
healthcheckTimeout = 100

[phases.setup]
nixpkgsArchive = 'a459b363de387c078080b719b30c54d8a79b4a3e'
nixPkgs = ["...", "ffmpeg"]
nixLibs = ["...", "dav1d"]
Not sure that matters though. Also, I do my builds on GitHub Actions because they're not trivial, and I do a final railway up -e {env} -s {service_id} to upload
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
I tried everything I could think of, really 😕 even an explicit drop(variable_with_deserialized_data) at the end of each loop iteration (see the sketch below). Even after my job is done and cleaned up, it doesn't drop. I also want to point out that locally the same job never goes above a few MB of RAM, on the same dataset.
97 replies
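For context, a minimal sketch of the pattern described in that message, with hypothetical paths and names rather than the actual job: each file is read, used, and explicitly dropped per iteration, so the process heap stays flat and any growth shown in the dashboard comes from the kernel's page cache rather than the application:

use std::fs;

fn main() -> std::io::Result<()> {
    // Hypothetical directory on the volume; each file is read, used, and the
    // buffer is dropped before the next iteration.
    for entry in fs::read_dir("/data")? {
        let path = entry?.path();
        let bytes = fs::read(&path)?; // whole file into memory
        println!("{}: {} bytes", path.display(), bytes.len()); // stand-in for deserializing and using it
        drop(bytes); // explicit drop, as in the message above; it would be freed at end of scope anyway
    }
    // The heap is back to baseline here, but the pages the kernel cached while
    // reading still count toward the container's buff/cache.
    Ok(())
}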
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Well, if that doesn't sound like a bug/misuse to you, then I've got my answer, I guess 😄
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Locally it is discarded right after each read, so my RAM never goes up. I can run this program with 1 GB of RAM or less
97 replies
Railway
Created by Alaanor on 6/14/2024 in #✋|help
Reading ~30 GB sequentially from the volume makes the RAM go to ~30 GB
Once I have read and made use of the information, how can I discard it and let the unused RAM go back? (see the sketch below)
97 replies
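One hedged idea for that last question, not confirmed against Railway's volume filesystem: after finishing with a file, ask the kernel to drop its cached pages with posix_fadvise(POSIX_FADV_DONTNEED) via the libc crate. It is only a hint, but for read-only pages it usually lets buff/cache shrink back. The path and function name here are hypothetical:

use std::fs::File;
use std::io::Read;
use std::os::unix::io::AsRawFd;

fn read_then_release(path: &str) -> std::io::Result<Vec<u8>> {
    let mut file = File::open(path)?;
    let mut data = Vec::new();
    file.read_to_end(&mut data)?;

    // Advise the kernel that we will not need this file's cached pages again.
    // Offset 0 with length 0 means "the whole file". This is a hint, not a
    // guarantee, and whether it helps depends on the underlying filesystem.
    let ret = unsafe {
        libc::posix_fadvise(file.as_raw_fd(), 0, 0, libc::POSIX_FADV_DONTNEED)
    };
    if ret != 0 {
        eprintln!("posix_fadvise failed with errno {ret}");
    }
    Ok(data)
}

fn main() -> std::io::Result<()> {
    // Hypothetical path on the mounted volume.
    let data = read_then_release("/data/big-file.bin")?;
    println!("read {} bytes", data.len());
    Ok(())
}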