
jaxlib 0.4.34


XLA library for JAX


Stars: 30198, Watchers: 30198, Forks: 2766, Open Issues: 1771

The jax-ml/jax repository was created 5 years ago and the last code push was 18 hours ago.
The project is extremely popular, with a mind-blowing 30198 GitHub stars!

How to Install jaxlib

You can install jaxlib with pip:

pip install jaxlib

or add it to a project with Poetry:

poetry add jaxlib
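
To confirm that the install worked, a minimal sketch (assuming jax is installed alongside jaxlib, since jaxlib is normally used through the jax API):

# Quick sanity check that jaxlib's XLA backend can be loaded (assumes jax is installed too).
import jax
import jax.numpy as jnp

print(jax.__version__)   # version of the jax frontend
print(jax.devices())     # devices the jaxlib/XLA backend exposes, e.g. [CpuDevice(id=0)]

x = jnp.arange(4.0)
print(jnp.sum(x * x))    # a small computation executed by the XLA backend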

Package Details

Author: JAX team
License: Apache-2.0
Homepage: https://github.com/jax-ml/jax
PyPI: https://pypi.org/project/jaxlib/
GitHub Repo: https://github.com/google/jax

Classifiers

No classifiers are listed for jaxlib just yet.

Errors

A list of common jaxlib errors.
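
A common pitfall when working with jaxlib is a version mismatch between the installed jax and jaxlib packages; printing both installed versions is a quick first diagnostic. A minimal sketch:

# Check which jax and jaxlib versions are installed before digging deeper into an error.
from importlib.metadata import version

print("jax   :", version("jax"))
print("jaxlib:", version("jaxlib"))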

Code Examples

Here are some jaxlib code examples and snippets.
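
jaxlib is rarely imported directly; it provides the compiled XLA backend that the jax Python API dispatches to. A minimal sketch of typical usage through jax (jit compilation and autodiff), assuming both packages are installed:

import jax
import jax.numpy as jnp

# A small function that XLA compiles via jaxlib.
@jax.jit
def loss(w, x):
    return jnp.mean((x @ w) ** 2)

w = jnp.ones((3,))
x = jnp.arange(6.0).reshape(2, 3)

print(loss(w, x))            # forward pass, executed by the XLA backend
print(jax.grad(loss)(w, x))  # gradient with respect to w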

GitHub Issues

The jax repository, which hosts jaxlib, has 1771 open issues on GitHub.

  • Incorrect gradient of function with segment_prod
  • Allow comparing NamedShape to None
  • Implement jax2tf scatter_* ops with no XLA
  • unjitted_loop_body is not constantly re-compiled in the JIT tutorial
  • Add GDA to the API pages
  • [sparse] accept nse argument to sparse.empty()
  • Efficient argmin/argmax for bool type arrays
  • Fix auto-generated docstrings for JIT-compiled functions
  • jax.numpy.nanpercentile with axis as tuple
  • introduce custom_batching.sequential_vmap
  • Jitted function sometimes doesn't distinguish static arguments with different type
  • Jax profiler won't work with Cuda 11.5
  • jnp.[nan]argmin/max: implement keepdims
  • jax.numpy: add where and initial arguments to nan reductions
  • jax2tf no_xla implementation for scatter_*, advice requested

See more issues on GitHub

Related Packages & Articles

jax 0.4.34

Differentiate, compile, and transform Numpy code.

flax 0.9.0

Flax: A neural network library for JAX designed for flexibility

thinc 9.1.1

A refreshing functional take on deep learning, compatible with your favorite libraries

keras 3.6.0

Keras is a deep learning API written in Python, running on top of the machine learning platform TensorFlow. The core data structures of Keras are layers and models. The philosophy is to keep simple things simple, while allowing the user to be fully in control when they need to (the ultimate control being the easy extensibility of the source code via subclassing).