Issue running service using ONNX models
Getting the following error when deploying a Fastify server that uses ONNX Runtime. A quick Google search brings up a few GitHub threads where folks are running into this issue on Vercel (i.e. here and here) due to memory issues. Since this isn't serverless, I don't think the memory issues mentioned are applicable. The code works completely fine locally.
Referenced GitHub issues:
- [Web] Failed to load model because protobuf parsing failed. — "This happens when executing this.session = await InferenceSession.create(./${this.model_name}); the model is converted from pytorch and the size is 63M."
- [Question] Issue with deploying model to Vercel using NextJS and tRPC — "Hi I'm trying to deploy my model to Vercel via NextJS and tRPC and have the .cache folder generated using the postinstall script."
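For reproduction context, here's a minimal sketch of this kind of setup (not my actual code; the model path, input name, and tensor shape are placeholder assumptions): a Fastify route that lazily loads an ONNX model with onnxruntime-node.

```ts
// Minimal sketch, assuming onnxruntime-node and Fastify; "./model.onnx"
// and the input name "input" are hypothetical.
import Fastify from "fastify";
import { InferenceSession, Tensor } from "onnxruntime-node";

const app = Fastify();
let session: InferenceSession | undefined;

async function getSession(): Promise<InferenceSession> {
  // This is the call that throws "Failed to load model because protobuf
  // parsing failed" when the model file is missing, truncated, or corrupt
  // in the deployed bundle.
  if (!session) {
    session = await InferenceSession.create("./model.onnx");
  }
  return session;
}

app.get("/predict", async () => {
  const s = await getSession();
  // Feed names must match the model's declared inputs; "input" is assumed.
  const feeds = { input: new Tensor("float32", new Float32Array(4), [1, 4]) };
  const results = await s.run(feeds);
  return { outputs: Object.keys(results) };
});

app.listen({ port: 3000 });
```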
Project ID: 93bdb69d-3975-4e76-bc29-f940b0028664
Solution
Actually this can be marked as closed - I was in a monorepo and hadn't explicitly included the ONNX runtime package in my server package's dependencies. Adding it explicitly and upgrading versions seemed to fix it.
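In case it helps anyone else in a monorepo: the fix amounts to declaring the runtime as a direct dependency of the server workspace instead of relying on hoisting. A sketch of what something like packages/server/package.json ends up containing (package names and versions here are illustrative assumptions, not the exact ones from this project):

```json
{
  "name": "server",
  "dependencies": {
    "fastify": "^4.26.0",
    "onnxruntime-node": "^1.17.0"
  }
}
```

Presumably, without the explicit entry the deploy step never installed onnxruntime-node (including its native binaries) into the server package, which would explain why the model load failed remotely while working fine locally.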