adeepspeed 0.9.2

DeepSpeed library

Stars: 32364, Watchers: 32364, Forks: 3810, Open Issues: 1025

The microsoft/DeepSpeed repo was created 4 years ago, and the last code push was 17 minutes ago.
The project is extremely popular, with a mind-blowing 32364 GitHub stars!

How to Install adeepspeed

You can install adeepspeed using pip:

pip install adeepspeed

or add it to a project with Poetry:

poetry add adeepspeed
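
After installation, you can check that the package is importable. adeepspeed tracks the microsoft/DeepSpeed codebase, so the module is assumed to import as deepspeed rather than adeepspeed:

python -c "import deepspeed; print(deepspeed.__version__)"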

Package Details

Author: DeepSpeed Team
License: MIT
Homepage: http://deepspeed.ai
PyPI: https://pypi.org/project/Adeepspeed/
Documentation: https://deepspeed.readthedocs.io
GitHub Repo: https://github.com/microsoft/DeepSpeed

Classifiers

No classifiers are listed for the adeepspeed package yet.

Errors

A list of common adeepspeed errors.

Code Examples

Here are some adeepspeed code examples and snippets.
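
Since adeepspeed follows the DeepSpeed API, a minimal training-step sketch looks like the following. This is an illustrative sketch, not an official snippet: the toy model, config values, and tensor shapes are assumptions, and the import name deepspeed is assumed to match the upstream package.

import torch
import deepspeed

# Toy model and config, purely for illustration.
model = torch.nn.Linear(10, 2)

ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
    "zero_optimization": {"stage": 2},
}

# deepspeed.initialize wraps the model in a DeepSpeed engine and
# returns (engine, optimizer, training_dataloader, lr_scheduler).
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# One training step on random data.
inputs = torch.randn(8, 10).to(engine.device)
targets = torch.randint(0, 2, (8,)).to(engine.device)
loss = torch.nn.functional.cross_entropy(engine(inputs), targets)

engine.backward(loss)  # engine-managed backward pass
engine.step()          # optimizer step plus gradient zeroing

Scripts like this are normally run through the bundled launcher, e.g. deepspeed train.py, which sets up the distributed environment that deepspeed.initialize expects.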

GitHub Issues

The adeepspeed package has 1025 open issues on GitHub.

  • [BUG] matmul_ext_update_autotune_table atexit error
  • [BUG] Unexpected calculations at backward pass with ZeRO-Infinity SSD offloading
  • update ut/doc for glm/codegen
  • Multi-node and multi-GPU fine-tuning error: ncclInternalError
  • Zero Stage-2 Frozen Layers[BUG]
  • [PROBLEM] P2p recv waiting for data will cause other threads under the same process to be unable to perform any operations
  • Spread layers more uniformly when using partition_uniform
  • Issue with DeepSpeed Inference - Multiple Processes for Model Loading and Memory Allocation
  • [BUG] CPU Adam failing
  • [BUG] Cannot increase batch size more than 1 with ZeRO-Infinity SSD offloading
  • [REQUEST] please provide clear working installation guide
  • load linear layer weight with dtype from ckpt
  • [QNA] How can i choose adam between fused and cpu?
  • Refactor autoTP inference for HE
  • [BUG] No runnable example for MoE / PR-MoE GPT inference

See more issues on GitHub

Related Packages & Articles

deepspeed 0.14.0

DeepSpeed is a Python package developed by Microsoft that provides a deep learning optimization library designed to scale across multiple GPUs and servers. It is capable of training models with billions or even trillions of parameters, achieving excellent system throughput and efficiently scaling to thousands of GPUs.

DeepSpeed is particularly useful for training and inference of large language models, and it falls under the category of Machine Learning Frameworks and Libraries. It is designed to work with PyTorch and offers system innovations such as Zero Redundancy Optimizer (ZeRO), 3D parallelism, and model-parallelism to enable efficient training of large models.
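
As a concrete illustration of the ZeRO feature mentioned above, the snippet below sketches a DeepSpeed config enabling ZeRO stage 3 with CPU offloading. The values are assumptions for illustration; see the DeepSpeed configuration docs for the full schema.

# Illustrative ZeRO stage-3 config with optimizer and parameter
# offloading to CPU memory; the batch size shown is a placeholder.
ds_config = {
    "train_batch_size": 16,
    "zero_optimization": {
        "stage": 3,
        "offload_optimizer": {"device": "cpu"},
        "offload_param": {"device": "cpu"},
    },
}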

fastai 2.7.14

fastai simplifies training fast and accurate neural nets using modern best practices.

auto-gptq 0.7.1

An easy-to-use LLM quantization package with user-friendly APIs, based on the GPTQ algorithm.

detectors 0.1.11

This is a comprehensive library for generalized Out-of-Distribution (OOD) detection research. It provides over 20 detection methods, evaluation pipelines, OOD datasets, and model architectures integrated with timm. Additionally, it offers fast OOD evaluation metrics, multi-layer detection methods, and pipelines for open set recognition and covariate drift detection.

flwr 1.8.0

Flower: A Friendly Federated Learning Framework

deeplake 3.9.0

Deep Lake is a Database for AI powered by a unique storage format optimized for deep-learning and Large Language Model (LLM) based applications. It simplifies the deployment of enterprise-grade LLM-based products by offering storage for all data types (embeddings, audio, text, videos, images, pdfs, annotations, etc.), querying and vector search, data streaming while training models at scale, data versioning and lineage for all workloads, and integrations with popular tools such as LangChain, LlamaIndex, Weights & Biases, and many more.

huggingface-hub 0.22.2

Client library to download and publish models, datasets, and other repos on the huggingface.co hub.