multi-model-server 1.1.11

Multi Model Server is a tool for serving neural net models for inference

Stars: 973, Watchers: 973, Forks: 227, Open Issues: 102

The awslabs/multi-model-server repo was created 6 years ago, and the last code push was 3 months ago.
The project is popular, with 973 GitHub stars.

How to Install multi-model-server

You can install multi-model-server using pip:

pip install multi-model-server

or add it to a project with Poetry:

poetry add multi-model-server
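
Note that multi-model-server's frontend runs on the JVM, so per the project README a Java 8 runtime must also be installed. Once both are in place, a quick sanity check is to confirm the Java version and print the CLI help (the --help flag is standard argparse behavior, so this is a reasonable smoke test rather than a documented command):

java -version
multi-model-server --help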

Package Details

Author: Trinity team
License: Apache License Version 2.0
Homepage: https://github.com/awslabs/multi-model-server
PyPI: https://pypi.org/project/multi-model-server/
GitHub Repo: https://github.com/awslabs/multi-model-server

Errors

A list of common multi-model-server errors.

Code Examples

Here are some multi-model-server code examples and snippets.
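
For instance, the quickstart from the project README starts the server with a pretrained SqueezeNet model archive and sends it an image for inference over the default endpoint on port 8080 (the model archive URL, sample image, and port below are the README's defaults; adjust them for your own models):

multi-model-server --start --models squeezenet=https://s3.amazonaws.com/model-server/model_archive_1.0/squeezenet_v1.1.mar

curl -O https://s3.amazonaws.com/model-server/inputs/kitten.jpg
curl -X POST http://127.0.0.1:8080/predictions/squeezenet -T kitten.jpg

The prediction endpoint responds with a JSON list of class probabilities. When you are finished, stop the server with:

multi-model-server --stop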

GitHub Issues

The multi-model-server package has 102 open issues on GitHub.

  • Remove ineffective log4j 1 references from code
  • [Q] GPU support

See more issues on GitHub

Related Packages & Articles

thinc 8.2.3

A refreshing functional take on deep learning, compatible with your favorite libraries

spacy 3.7.4

Industrial-strength Natural Language Processing (NLP) in Python

mediapipe 0.10.11

MediaPipe is the simplest way for researchers and developers to build world-class ML solutions and applications for mobile, edge, cloud and the web.

kornia 0.7.2

Open Source Differentiable Computer Vision Library for PyTorch

horovod 0.28.1

Horovod is a distributed training framework for Python that lets you train deep learning models across multiple GPUs and servers quickly and efficiently. Built on top of TensorFlow, PyTorch, and other popular deep learning frameworks, it simplifies scaling up model training by handling the complexities of distributed training under the hood.