FTP: Uploading files bigger than 2GB
This is quite a difficult one to put to paper, but I have Teams/Discord, and if you need me I can live share the code; that may be an easier option.
In my WinForms application I can upload using the WinSCP API. Part of this is that it generates a file list, so that if I push the repository through again it only uploads new or modified files. To do this it puts the whole file in memory to generate some kind of checksum from the file information (I'm still not sure how this code works, as I took some of it from a game updater).
When I wasn't using files over 2GB this worked flawlessly; however, now that I'm moving on to pushing larger files, I'm hitting the 2GB limit of File.ReadAllBytes (and the 32-bit Windows memory issue).
Some research suggests I should be using FileStream, so I've modified my code for that, but now it's not copying the byte data over correctly. I tried a couple of 5GB+ files and it only wrote between 1GB and 1.7GB to the FTP server.
I think it will be something completely obvious, but I'm losing the will to live lol!
My byte-reading code:
TYIA, any and all help appreciated.
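For what it's worth, the checksum part doesn't need the whole file in memory; here is a minimal sketch of hashing over a stream (the path is a placeholder, and SHA-256 is just one choice of hash):

```cs
// Sketch only: hash a file through a stream so it never sits fully in memory.
using System;
using System.IO;
using System.Security.Cryptography;

using var stream = File.OpenRead(@"C:\repo\bigfile.dat"); // placeholder path
using var sha = SHA256.Create();
byte[] hash = sha.ComputeHash(stream); // HashAlgorithm reads the stream in blocks
Console.WriteLine(Convert.ToHexString(hash));
```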
Don't try to FTP such a large file; break it up into segments, send the smaller segments, and reassemble them on the other end.
I'll be honest, I'm not sure how to do that, and my brain's fried from getting this far lol
So a bit more researching suggests I should be using byte buffers? Something like...
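Something in this shape, as a minimal sketch; the detail that usually fixes short writes is using the count Read actually returns (paths and buffer size are placeholders):

```cs
// Sketch only: buffered do-while copy; write 'read' bytes, not buffer.Length.
using System.IO;

var buffer = new byte[81920]; // placeholder buffer size
using var input  = File.OpenRead(@"C:\repo\bigfile.dat");  // placeholder
using var output = File.Create(@"C:\staging\bigfile.dat"); // placeholder

int read;
do
{
    read = input.Read(buffer, 0, buffer.Length);
    if (read > 0)
        output.Write(buffer, 0, read); // writing only what was read keeps the copy intact
} while (read > 0);
```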
The above is pretty old school and hand-jammed. I just wouldn't try to send a 2GB file over FTP; that's a long time for something to go wrong, and you'd have to resend a lot of bytes again. Also, FTP is unencrypted; just an FYI.
I've got TLS set up, and I'll eventually switch over to SFTP.
I'll take a proper gander at this and see what I can do, thank you for helping.
NB: there are a number of gotchas with a many-files approach. Make sure you name the files appropriately and with zero-padded numbers; you'll want enough padding that sorting works correctly. You'll also need something to reassemble those files on the other side.
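A minimal sketch of that approach, with a segment size chosen for illustration (all paths and names here are placeholders):

```cs
// Sketch only: split a large file into zero-padded segments, then reassemble.
using System.IO;
using System.Linq;

const int SegmentSize = 64 * 1024 * 1024; // 64 MB per segment (placeholder)
Directory.CreateDirectory(@"C:\repo\segments");

// Split: zero-padded indices (D5 => 00000, 00001, ...) so sorting stays correct.
var buffer = new byte[SegmentSize];
using (var source = File.OpenRead(@"C:\repo\bigfile.dat")) // placeholder path
{
    int index = 0, read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        using var segment = File.Create(
            Path.Combine(@"C:\repo\segments", $"bigfile.dat.{index++:D5}"));
        segment.Write(buffer, 0, read); // Read can return short counts, so sizes may vary
    }
}

// Reassemble on the other side: the zero padding makes OrderBy(p => p) line up.
using (var output = File.Create(@"C:\restore\bigfile.dat")) // placeholder path
{
    foreach (var path in Directory.GetFiles(@"C:\repo\segments", "bigfile.dat.*")
                                  .OrderBy(p => p))
    {
        using var segment = File.OpenRead(path);
        segment.CopyTo(output); // appends each segment in order
    }
}
```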
@Mayor McCheese wouldn't using .Chunk prevent the need for do-while (as in, you could use foreach instead)?
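For illustration, a sketch of that shape (Chunk needs .NET 6+, and note it still reads the whole file into memory first, which is the caveat raised below):

```cs
// Sketch only: foreach over Enumerable.Chunk instead of a do-while Read loop.
// Each chunk is a freshly allocated array, and ReadAllBytes loads the whole file first.
using System.IO;
using System.Linq;

using var output = File.Create(@"C:\staging\bigfile.dat"); // placeholder
foreach (byte[] chunk in File.ReadAllBytes(@"C:\repo\bigfile.dat").Chunk(81920))
{
    output.Write(chunk, 0, chunk.Length);
}
```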
Granted, I think it's an extra allocation, so there's that.
I shall go away and look at both examples and see if I can work out how to progress.
Thank you both!
Yeah probably; I said it was old school
I'm curious now lol
This is why I have a sandbox
Because I didn't know Chunk existed until Nox mentioned it over the weekend
But that presupposes you can read 2GB into memory. You should be able to, but honestly I tend to shy away from reading all the bytes at once.
That's fair.
Was this issue resolved? If so, run /close; otherwise I will mark this as stale and this post will be archived until there is new activity.
Thanks all! It turns out uploading wasn't the issue; the code was writing the file to another location using a byte array.
I've done this as an alternative now, which has given me another issue when it comes to compressing the files, but I'll put that in my next thread lol.
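For reference, a minimal sketch of writing a file to another location without an intermediate byte array (paths are placeholders):

```cs
// Sketch only: stream-to-stream copy; no whole-file byte array needed.
using System.IO;

using var input  = File.OpenRead(@"C:\repo\bigfile.dat");  // placeholder
using var output = File.Create(@"D:\staging\bigfile.dat"); // placeholder
input.CopyTo(output); // copies in buffered blocks internally
```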
I'd still be wary of uploading a massive file via FTP; FTP is notorious for unexpected disconnects.
I'm using WinSCP for FTP management now.
I don't know if that's more reliable or not
Was this issue resolved? If so, run /close; otherwise I will mark this as stale and this post will be archived until there is new activity.