Missing API docs for MAX/gRPC/KServe?
First of all, stunning work on MAX/Mojo. Thank you for sharing it with the world 🙂
We're looking to integrate and run tests with MAX Serve (for stream processing into multi-model, real-time inference):
https://docs.modular.com/max/serve/
ISSUE: In the last 24 hours, the API docs for MAX/API/Serve etc. seem to be no longer available:
https://docs.modular.com/max/api/mojo/serve/kserve/server/GRPCServer
Are the MAX Serve APIs being deprecated, or is something else going on?
Thanks in advance.
Intro to Serving | Modular Docs
Learn how to serve and deploy MAX Engine
3 Replies
Thanks for your interest! Yes, those docs were pulled a while ago. Sorry for the inconvenience! We'll share more soon. Stay tuned!
Noted. Thanks. I'll stay tuned 🙂
Just to make sure my new question wasn't misunderstood amid my confusion between Serve in MAX vs. the recent Mojo API doc changes (and while I still have my tabs open from Monday, 26 Aug):
MAX Serve: 'you can serve your model using NVIDIA's Triton Server'
https://docs.modular.com/max/serve/
However, the following Mojo APIs for Serve are now no longer available?
e.g. max/api/mojo/serve
* serve
  * batch
    * Batch
    * Batchable
    * Batcher
  * http
    * server
      * PythonBatch
      * PythonServer
    * service
      * PythonService
  * kserve
    * client
      * GRPCClient
    * mux
      * FileModel
      * MuxInferenceService
    * server
      * GRPCServer
    * service
      * InferenceService
    * types
      * InferenceBatch
      * InferenceRequest
      * InferenceResponse
  * metrics
  * util
    * callbacks
    * config
    * debug
    * stats
(I ask simply because we were looking at ways to load streaming blockchain data via a Geyser-based gRPC interface (or Kafka) into the mentioned APIs or the MAX inference engine, with multiple prediction models in TorchScript/PyTorch format, given that TensorFlow support is now deprecated.)
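For context on what we'd be sending: Triton's endpoints speak the KServe v2 "open inference protocol", so the request bodies would look roughly like the sketch below. This is just a minimal illustration of the v2 payload shape (the input name and data are placeholders, not from the MAX docs):

```python
import json

def build_infer_request(input_name, data, datatype="FP32"):
    """Build a KServe v2 inference request body for a single 1-D input tensor.

    The v2 protocol expects each input to declare a name, shape, datatype,
    and the tensor data itself. Names here are illustrative placeholders.
    """
    return {
        "inputs": [
            {
                "name": input_name,
                "shape": [len(data)],
                "datatype": datatype,
                "data": data,
            }
        ]
    }

# Example: a single 3-element FP32 input, serialized for an HTTP POST to
# a v2 endpoint such as /v2/models/<model_name>/infer.
payload = build_infer_request("input__0", [0.1, 0.2, 0.3])
print(json.dumps(payload))
```

The gRPC variant carries the same fields (`name`, `shape`, `datatype`, tensor contents) in protobuf messages rather than JSON, which is what we'd expect to bridge a Geyser/Kafka stream into.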
Thanks in advance.
Yes, correct! The MAX Serve Mojo API was decommissioned.