
Real-Time Inference with Aqueduct

Create REST endpoints that serve your models scalably, in your own cloud.

We're building support for real-time prediction serving in Aqueduct. If you'd like to discuss the details, please join our community Slack.
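To make the idea concrete, here is a minimal sketch of what a real-time REST prediction endpoint typically looks like. This is purely illustrative and is not Aqueduct's API (which, per the announcement above, is still under development); it assumes Flask and scikit-learn and a toy iris classifier.

```python
# Illustrative sketch only -- NOT Aqueduct's API. Shows the general shape of a
# real-time REST inference endpoint: load a model once, predict per request.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Train a toy model at startup; a real deployment would load a serialized model.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

@app.post("/predict")
def predict():
    # Expect a JSON body like {"features": [5.1, 3.5, 1.4, 0.2]}.
    payload = request.get_json(force=True)
    prediction = model.predict([payload["features"]])
    return jsonify({"prediction": int(prediction[0])})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

You could exercise the endpoint with `curl -X POST http://localhost:8080/predict -d '{"features": [5.1, 3.5, 1.4, 0.2]}'`. A managed serving layer would add the pieces this sketch omits: autoscaling, authentication, and deployment into your own cloud account.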
