Railway3mo ago
caelin

Issue running service using onnx models

Getting the following error when deploying a Fastify server that uses ONNX. A quick Google search brings up a few GitHub threads where folks are running into this on Vercel (i.e. here and here) due to memory issues. Since this isn't serverless, I don't think the memory issues mentioned are applicable. The code works completely fine locally.
Error: Failed to load model because protobuf parsing failed.

at OnnxruntimeSessionHandler (/app/node_modules/@xenova/transformers/node_modules/onnxruntime-node/lib/backend.ts:16:30)
GitHub
[Web] Failed to load model because protobuf parsing failed. · Issue...
Describe the issue This happens when executing this.session = await InferenceSession.create(./${this.model_name}); the model is converted from pytorch and the size is 63M.(Not sure if it can be upl...
GitHub
[Question] Issue with deploying model to Vercel using NextJS and tR...
Hi I'm trying to deploy my model to Vercel via NextJS and tRPC and have the .cache folder generated using the postinstall script // @ts-check let fs = require("fs-extra"); let path = ...
3 Replies
Percy
Percy · 3mo ago
Project ID: 93bdb69d-3975-4e76-bc29-f940b0028664
caelin
caelin (OP) · 3mo ago
93bdb69d-3975-4e76-bc29-f940b0028664
Solution
caelin
caelin · 3mo ago
Actually this can be marked as closed - I was in a monorepo and didn't explicitly include ONNX in my server package. Did that, upgraded versions, and that seemed to fix it.
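For anyone hitting the same thing: in a monorepo, the fix described above amounts to declaring the ONNX dependency directly in the server package's own manifest instead of relying on hoisting from the workspace root. A minimal sketch of what that might look like (package name and versions are illustrative, not taken from the thread):

```json
{
  "name": "server",
  "dependencies": {
    "@xenova/transformers": "^2.17.2",
    "onnxruntime-node": "^1.17.0"
  }
}
```

With the dependency declared locally, the deploy build installs onnxruntime-node (and its native binaries) inside the server package rather than depending on workspace hoisting behavior that may differ between local and deployed installs.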