test-tube 0.7.5

Experiment logger and visualizer


Stars: 735, Watchers: 735, Forks: 75, Open Issues: 27

The williamFalcon/test_tube repo was created 6 years ago and the last code push was 1 year ago.
The project is popular, with 735 GitHub stars.

How to Install test-tube

You can install test-tube using pip:

pip install test-tube

or add it to a project with Poetry:

poetry add test-tube
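
After installing, a quick sanity check is to import the package and print the installed version. This is a minimal sketch; it assumes Python 3.8+ for importlib.metadata, and note that the distribution name is "test-tube" while the import name is "test_tube".

# confirm the install resolves and report the installed version
from importlib.metadata import version
import test_tube  # raises ImportError if the install failed
print(version("test-tube"))  # e.g. 0.7.5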

Package Details

Author: William Falcon
License:
Homepage: https://github.com/williamFalcon/test_tube
PyPI: https://pypi.org/project/test-tube/
GitHub Repo: https://github.com/williamFalcon/test_tube

Errors

A list of common test-tube errors.

Code Examples

Here are some test-tube code examples and snippets.
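
The snippet below is a minimal logging sketch based on the API shown in the project's README (Experiment, tag, log, save). The experiment name, save_dir path, and metric values are placeholders, and recent versions of test-tube expect PyTorch's SummaryWriter dependencies to be available.

# minimal test-tube logging sketch; names, paths, and values are placeholders
from test_tube import Experiment

exp = Experiment(name="example_run", save_dir="./test_tube_logs")

# record hyperparameters once for the run
exp.tag({"learning_rate": 0.02, "batch_size": 32})

# log metrics as training progresses (normally inside a training loop)
for step in range(3):
    exp.log({"train_loss": 1.0 / (step + 1), "step": step})

# flush everything to disk so the run can be visualized later
exp.save()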

Related Packages & Articles

keras 3.2.0

Keras is a deep learning API written in Python, running on top of the machine learning platform TensorFlow. The core data structures of Keras are layers and models. The philosophy is to keep simple things simple, while allowing the user to be fully in control when they need to (the ultimate control being the easy extensibility of the source code via subclassing).
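
As a quick illustration of the layers-and-models idea, here is a minimal sketch; it assumes Keras 3 with any supported backend installed, and the layer sizes are arbitrary.

# a small stack of layers composed into a trainable model
import keras

model = keras.Sequential([
    keras.layers.Input(shape=(784,)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()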

onnx 1.16.0

Open Neural Network Exchange (ONNX) is an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Currently we focus on the capabilities needed for inferencing (scoring).
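
For example, a serialized model can be loaded and validated against the ONNX spec. This is a sketch, and "model.onnx" is a placeholder path.

# load an ONNX model, validate it, and print a readable view of its graph
import onnx

model = onnx.load("model.onnx")      # placeholder path
onnx.checker.check_model(model)      # raises if the graph violates the spec
print(onnx.helper.printable_graph(model.graph))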

horovod 0.28.1

Horovod is a powerful distributed training framework for Python that allows you to train deep learning models across multiple GPUs and servers quickly and efficiently. It falls under the category of distributed computing libraries. Built on top of TensorFlow, PyTorch, and other popular deep learning frameworks, Horovod simplifies the process of scaling up your model training by handling the complexities of distributed training under the hood.
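
The sketch below shows the typical Horovod setup steps with PyTorch (initialize, pin one GPU per process, wrap the optimizer, broadcast initial state); the model and optimizer are placeholders.

# typical Horovod + PyTorch setup; launch with e.g. `horovodrun -np 4 python train.py`
import torch
import horovod.torch as hvd

hvd.init()                                    # one process per GPU/worker
if torch.cuda.is_available():
    torch.cuda.set_device(hvd.local_rank())   # pin each process to one GPU

model = torch.nn.Linear(10, 1)                # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# average gradients across workers and keep parameters in sync from rank 0
optimizer = hvd.DistributedOptimizer(optimizer,
                                     named_parameters=model.named_parameters())
hvd.broadcast_parameters(model.state_dict(), root_rank=0)
hvd.broadcast_optimizer_state(optimizer, root_rank=0)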