Inference Server
Computational infrastructure that serves machine learning model predictions in real time or in batch mode, typically exposing REST or gRPC endpoints through which clients submit inputs and receive inference results.
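A minimal sketch of the idea using only the Python standard library: an HTTP server exposing a `/predict` endpoint that accepts JSON features and returns a prediction. The "model" here is a hypothetical linear function standing in for a loaded model artifact; the endpoint path, payload shape, and weights are illustrative assumptions, not a specific framework's API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy stand-in "model": a fixed linear function. A real inference
# server would load a trained model artifact at startup instead.
WEIGHTS = [0.5, -1.0]
BIAS = 0.1

def predict(features):
    # Weighted sum of the input features plus a bias term.
    return sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Only the /predict route is served; everything else is 404.
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(
            {"prediction": predict(payload["features"])}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # suppress per-request logging in this example

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), InferenceHandler).serve_forever()
```

A client would POST `{"features": [2.0, 1.0]}` to `/predict` and receive a JSON body with the prediction. Production servers (e.g. TensorFlow Serving, NVIDIA Triton) add batching, model versioning, and gRPC on top of this same request/response pattern.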