
parallelformers 1.2.7


An Efficient Model Parallelization Toolkit for Deployment

Stars: 777, Watchers: 777, Forks: 61, Open Issues: 28

The tunib-ai/parallelformers repo was created 3 years ago, and the last code push was 1 year ago.
The project is popular, with 777 GitHub stars.

How to Install parallelformers

You can install parallelformers using pip:

pip install parallelformers

or add it to a project with poetry:

poetry add parallelformers

Package Details

Author: TUNiB
License:
Homepage: https://github.com/tunib-ai/parallelformers
PyPI: https://pypi.org/project/parallelformers/
GitHub Repo: https://github.com/tunib-ai/parallelformers

Classifiers

  • Scientific/Engineering :: Artificial Intelligence
  • Software Development :: Libraries

Errors

A list of common parallelformers errors.

Code Examples

Here are some parallelformers code examples and snippets.
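
The snippet below is a minimal sketch of the library's parallelize() entry point, following the usage shown in the project README; the model name, GPU count, and generation settings are illustrative.

from transformers import AutoModelForCausalLM, AutoTokenizer
from parallelformers import parallelize

# Load a Hugging Face model on the CPU as usual.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-1.3B")

# Shard the model across 2 GPUs with fp16 weights; parallelformers
# handles device placement, so the inputs below can stay on the CPU.
parallelize(model, num_gpus=2, fp16=True)

inputs = tokenizer("Parallelformers is", return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, max_length=15)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])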

GitHub Issues

The parallelformers package has 28 open issues on GitHub.

  • GPU hang-up issue
  • GPT models hang on large token generation. Lower performance?

See more issues on GitHub

Related Packages & Articles

parmap 1.7.0

map and starmap implementations passing additional arguments and parallelizing if possible
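
To show the idea, here is a minimal sketch using parmap's map() with one extra fixed argument; the add() function and pool size are hypothetical.

import parmap

def add(x, y):
    return x + y

if __name__ == "__main__":
    # Apply add() to each element of the list, passing 10 as the
    # fixed second argument; pm_processes sets the worker-pool size.
    results = parmap.map(add, [1, 2, 3, 4], 10, pm_processes=4)
    print(results)  # [11, 12, 13, 14]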

horovod 0.28.1

Horovod is a distributed training framework for Python that lets you train deep learning models across multiple GPUs and servers quickly and efficiently. Built on top of TensorFlow, PyTorch, and other popular deep learning frameworks, it simplifies scaling up model training by handling the complexities of distributed communication under the hood.
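
For a feel of the workflow, here is a minimal PyTorch training sketch using Horovod's documented API (hvd.init, DistributedOptimizer, broadcast_parameters); the model, learning rate, and random data are placeholders.

import torch
import torch.nn as nn
import horovod.torch as hvd

# Typically launched with: horovodrun -np 4 python train.py
hvd.init()
torch.cuda.set_device(hvd.local_rank())

model = nn.Linear(10, 1).cuda()
# Scale the learning rate by the number of workers, as Horovod recommends.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# Average gradients across workers and sync the initial weights from rank 0.
optimizer = hvd.DistributedOptimizer(optimizer, named_parameters=model.named_parameters())
hvd.broadcast_parameters(model.state_dict(), root_rank=0)

for step in range(10):
    optimizer.zero_grad()
    x = torch.randn(32, 10).cuda()
    loss = model(x).pow(2).mean()
    loss.backward()
    optimizer.step()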