Adeepspeed 0.9.2
DeepSpeed library
Stars: 35070, Watchers: 35070, Forks: 4063, Open Issues: 1109. The microsoft/DeepSpeed repo was created 4 years ago and the last code push was yesterday.
The project is extremely popular, with a mind-blowing 35070 GitHub stars!
How to Install adeepspeed
You can install adeepspeed using pip:
pip install adeepspeed
or add it to a project with Poetry:
poetry add adeepspeed
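Once installed, DeepSpeed reads its runtime settings from a JSON configuration file. A minimal sketch of such a config (the specific values below are illustrative assumptions, not project defaults):

```json
{
  "train_batch_size": 8,
  "fp16": { "enabled": true },
  "zero_optimization": { "stage": 2 },
  "optimizer": {
    "type": "Adam",
    "params": { "lr": 1e-4 }
  }
}
```

This file is typically passed to the deepspeed launcher or to deepspeed.initialize at startup.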
Package Details
- Author: DeepSpeed Team
- License: MIT
- Homepage: http://deepspeed.ai
- PyPI: https://pypi.org/project/Adeepspeed/
- Documentation: https://deepspeed.readthedocs.io
- GitHub Repo: https://github.com/microsoft/DeepSpeed
Code Examples
Here are some adeepspeed code examples and snippets.
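The package is published as adeepspeed but, assuming it tracks the upstream DeepSpeed API, it imports as deepspeed. A minimal sketch of wrapping a PyTorch model with deepspeed.initialize (the model, data_loader, and config path here are illustrative assumptions, and a CUDA-capable environment is assumed):

```python
# Sketch only: assumes adeepspeed provides the standard `deepspeed` module.
import torch
import deepspeed

class SimpleNet(torch.nn.Module):
    """Toy model used purely for illustration."""
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(10, 2)

    def forward(self, x):
        return self.fc(x)

model = SimpleNet()

# `deepspeed.initialize` wraps the model and returns an engine that owns
# the optimizer, backward pass, and gradient handling.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",  # path to a DeepSpeed JSON config (assumed to exist)
)

for x, y in data_loader:  # data_loader is assumed to be defined elsewhere
    loss = torch.nn.functional.cross_entropy(model_engine(x), y)
    model_engine.backward(loss)  # DeepSpeed performs the backward pass
    model_engine.step()          # and the optimizer step
```

In multi-GPU setups the script is usually started via the deepspeed launcher rather than plain python, which sets up the distributed environment for the engine.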
GitHub Issues
The adeepspeed package has 1109 open issues on GitHub:
- [BUG] matmul_ext_update_autotune_table atexit error
- [BUG] Unexpected caculations at backward pass with ZeRO-Infinity SSD offloading
- update ut/doc for glm/codegen
- Multi-node and multi-GPU fine-tuning error: ncclInternalError
- Zero Stage-2 Frozen Layers [BUG]
- [PROBLEM] P2p recv waiting for data will cause other threads under the same process to be unable to perform any operations
- Spread layers more uniformly when using partition_uniform
- Issue with DeepSpeed Inference - Multiple Processes for Model Loading and Memory Allocation
- [BUG] CPU Adam failing
- [BUG] Cannot increase batch size more than 1 with ZeRO-Infinity SSD offloading
- [REQUEST] please provide clear working installation guide
- load linear layer weight with dtype from ckpt
- [QNA] How can i choose adam between fused and cpu?
- Refactor autoTP inference for HE
- [BUG] No runnable example for MoE / PR-MoE GPT inference