Keras Core: a new multi-backend Keras
Keras Core is a new multi-backend implementation of the Keras API, with support for TensorFlow, JAX, and PyTorch.
Backwards compatibility
Keras Core is intended to work as a drop-in replacement for tf.keras
(when using the TensorFlow backend).
In addition, Keras models can consume datasets in any format, regardless of the backend you're using: you can train your models with your existing tf.data.Dataset pipelines or Torch DataLoaders.
Why use Keras Core?
- Write custom components (e.g. layers, models, metrics) that you can move across framework boundaries.
- Make your code future-proof by avoiding framework lock-in.
- As a PyTorch user: get access to the power of Keras, at last!
- As a JAX user: get access to a fully-featured, battle-tested modeling and training library.
Credits
TODO