Bug: EAGAIN when using `plasmo package`
EAGAIN when using
plasmo package
It then starts to use a lot of memory.
@lab @LoXatoR fyi
Some resource related issues?
What is strange is that it's throwing there --
at new Worker (node:internal/worker:291:19)
at wk (file:///Users/dp/work/extension/node_modules/fflate/esm/index.mjs:22:13)
at wrkr (file:///Users/dp/work/extension/node_modules/fflate/esm/index.mjs:893:12)
at astrmify (file:///Users/dp/work/extension/node_modules/fflate/esm/index.mjs:930:13)
at new AsyncDeflate (file:///Users/dp/work/extension/node_modules/fflate/esm/index.mjs:1086:9)
at new AsyncZipDeflate (file:///Users/dp/work/extension/node_modules/fflate/esm/index.mjs:1949:18)
at file:///Users/dp/work/extension/node_modules/plasmo/dist/index.js:166:10563
at new Promise (<anonymous>)
at Qr (file:///Users/dp/work/extension/node_modules/plasmo/dist/index.js:166:10341)
at async Zh (file:///Users/dp/work/extension/node_modules/plasmo/dist/index.js:166:11510)
AsyncZipDeflate
What would you advise to debug this @lab -- what causes this resource issue? I tried --verbose but got nothing; we tried a VM with 64GB and 128GB of RAM and it still fails
Seems like simultaneous read/write at the same place or maybe resource unavailable?
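For context, EAGAIN at new Worker generally means the OS could not spawn another thread ("resource temporarily unavailable"), so one rough thing worth checking on the runner is its process/thread limits, e.g.:
# Check the runner's limits; a low max-user-processes value can make
# Node's worker_threads fail with EAGAIN when fflate spawns workers
ulimit -u    # max user processes/threads
ulimit -a    # full limit listing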
Actually yeah try running
plasmo build
then plasmo package
Separately
We already did this; I will add the verbose flag; we're running this on GitHub Actions.
cf.
Try running them in 2 steps
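For reference, a minimal sketch of the two-step run as separate CI steps (assuming plasmo is a local dependency invoked through yarn):
# Build first, then package the existing build output in a
# separate step, rather than doing everything in one invocation
yarn plasmo build
yarn plasmo package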
Ok will try after this attempt
Ok will retry with two different steps --
as it failed here
That's an entirely diff error now :-?
And it seems package is using v0.84.1
(which might contain some errors btw; iirc I downgraded some Parcel stuff because it wasn't working correctly :d)
I will bump to the latest; (this is on GitHub Actions runners; the first log was from my machine, but it crashes/freezes there)
Yes, it's not exiting and keeps going on the GitHub runner -- need to stop the job manually
I am running a new build with plasmo 0.85.2 now
hmm, fyi the package command is basically a simple zip implementation that streams assets/files into the right place
GitHub
plasmo/cli/plasmo/src/commands/package.ts at main · PlasmoHQ/plasmo
🧩 The Browser Extension Framework. Contribute to PlasmoHQ/plasmo development by creating an account on GitHub.
You can also just try using the zip command over the build directory
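For example, something along these lines over the build output (the directory name assumes a default Chrome MV3 production build):
# Zip the built extension manually instead of using plasmo package
cd build/chrome-mv3-prod
zip -r ../chrome-mv3-prod.zip .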
Ha nice; let me see
I've seen it works with a 30MB bundle
so not sure what's going on there, esp when
build
worked
if you can debug and take a look at the package source, feel free to PR!
Ok, it seems our build dir is 402MB; compressed manually with zip it's 114MB.
I will look into why it is as big as that
It seems the build's node_modules directory includes typescript, parcel... and a lot of other things (sass, react-dev-tools); we probably need to move those to devDependencies
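One quick way to see what is inflating the output (a generic check, nothing Plasmo-specific):
# List the largest entries under build/ to spot stray node_modules
# or dev-only tooling that got copied into the bundle
du -sh build/*/* | sort -rh | head -n 20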
I see the plasmo build CLI just calls Parcel -- but we can't pass many options through to Parcel
such as
yarn build --reporter @parcel/reporter-bundle-analyzer
https://github.com/PlasmoHQ/plasmo/blob/main/cli/plasmo/src/commands/build.ts#L38
Would it be possible to pass Parcel options through?
GitHub
plasmo/cli/plasmo/src/commands/build.ts at main · PlasmoHQ/plasmo
🧩 The Browser Extension Framework. Contribute to PlasmoHQ/plasmo development by creating an account on GitHub.
If you'd like to analyze your bundle, you can use the --bundle-buddy flag, combined with --source-maps, to generate a Bundle Buddy report
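So, with the flags mentioned above, something like:
# Emit source maps plus Bundle Buddy output to inspect what
# actually ends up in the bundle
plasmo build --bundle-buddy --source-maps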
Just read it ; nice
Ok we just zip it using
zip
and it worked
We are also facing an issue using the manual API to upload the zip & sign it. We are able to successfully upload the zip, but the sign call is timing out. Getting the error below.
An error occurred with your deployment
FUNCTION_INVOCATION_TIMEOUT
@lab
@abhishekjha how big is your extension BTW?
the zipped bundle
Extended the timeout so you should be able to try again for longer
100mb I believe
But we'll try to reduce it
@lab Still getting timeout
I suggest you reduce the size of your extension - 100mb is too high and I don't think we should support that.
Yeah it seems your extension was too large for us to generate a signed bundle for
Yes, it seems our build was keeping node_modules and creating some nested node_modules directories that were enlarging the directory before zipping.
I think you should try to exclude them and bundle those assets
extensions don't play well with dynamic script loading
any idea about this thread - https://discord.com/channels/946290204443025438/1219788890353827860 @lab ?
Also, it takes around 30 - 45s on average for
plasmo dev
to recompile, and if I stop the dev server and run it again, it keeps re-creating the build and placing it in the existing build folder, thus creating multiple levels of nesting, as seen in the pic
That should not be the case, unless there's some crazy introspection/recursion dependency
I'd strip out and see which import is messing it up for you
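As a stopgap while debugging, clearing the stale output before restarting should avoid the nested copies (assuming the default build/ output directory):
# Remove the previous build output so old artifacts are not
# copied into the next dev build
rm -rf build
plasmo dev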
@lab We started getting a 500 error from the sign API.
I saw a couple of 500s reported - it seems the key was incorrect (or that it was not able to resolve to your extension ID)
I downloaded a new private key & tried to sign it
But got same error
did you change anything in your code?
Nope.
The same pipeline was working fine so far, but for the last few days we have been getting a 500 error
What's the extension ID?
Just now I retried for eanafdjekpnbkamplmoinkeojkjinakn
I got 500 error
Hmm, seems like your zip was too large
it's over 100MB
It works from UI but not from api
Can we increase the limit ?
@lab Can we increase the limit please?
It appears it's not the file size that's the issue, but rather that the signing operation our backend does took way longer than the allowed execution time for the serverless function...
Tho I think it's really both
Any ETA of the fix ?
Just pushed a bump to the memory limit for the signing function, should be up soon
Okay
if the signing operation cannot fit within the 3GB box, you will need to upload manually
@lab Do you think we should also increase the timeout of the Lambda function?
I suspect the Lambda function might be timing out
It's at max rn for us (300s)
ohhh