Keras Core: A new multi-backend Keras

Keras Core is a new multi-backend implementation of the Keras API, with support for TensorFlow, JAX, and PyTorch.

WARNING: At this time, this package is experimental. It has rough edges, and not everything may work as expected. We are currently hard at work improving it.

Once ready, this package will become Keras 3.0 and subsume tf.keras.

Local installation

Keras Core is compatible with Linux and macOS systems. To install a local development version:

  1. Install dependencies:
pip install -r requirements.txt
  2. Run the installation command from the root directory:
python pip_build.py --install
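
To check that the install worked and to see which backend is active, you can run a quick sanity check. This is an illustrative snippet, assuming keras_core.backend.backend() reports the active backend:

import keras_core as keras
print(keras.__version__)           # installed version
print(keras.backend.backend())     # active backend, e.g. "tensorflow"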

Note that Keras Core strictly requires TensorFlow, in particular because it uses tf.nest to handle nested Python structures. In the future, we will make all backend frameworks optional.

Configuring your backend

To configure your backend, export the KERAS_BACKEND environment variable or edit your local config file at ~/.keras/keras.json. Available backend options are: "tensorflow", "jax", "torch". Example:

export KERAS_BACKEND="jax"
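
Alternatively, you can set the backend in the local config file. A sketch of what ~/.keras/keras.json might contain, assuming the field names mirror the existing tf.keras config file:

{
    "floatx": "float32",
    "epsilon": 1e-07,
    "backend": "jax",
    "image_data_format": "channels_last"
}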

In Colab, you can do:

import os
os.environ["KERAS_BACKEND"] = "jax"

import keras_core as keras

Backwards compatibility

Keras Core is intended to work as a drop-in replacement for tf.keras (when using the TensorFlow backend). Just take your existing tf.keras code, change the keras imports to keras_core, make sure that your calls to model.save() are using the up-to-date .keras format, and you're done.
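
For example, a minimal migration could look like the following sketch (the model and filename are illustrative, not taken from the project docs):

import keras_core as keras  # previously: from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mean_squared_error")
model.save("my_model.keras")  # save in the up-to-date .keras format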

If your tf.keras model does not include custom components, you can start running it on top of JAX or PyTorch immediately.

If it does include custom components (e.g. custom layers or a custom train_step()), it is usually possible to convert it to a backend-agnostic implementation in just a few minutes.
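
As an illustration, a custom layer whose call() relies only on keras_core.ops runs unchanged on TensorFlow, JAX, and PyTorch. This is a minimal sketch, not an excerpt from the documentation:

import keras_core as keras
from keras_core import ops

class Linear(keras.layers.Layer):
    """A backend-agnostic dense layer: call() uses only keras_core.ops."""

    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        self.w = self.add_weight(
            shape=(input_shape[-1], self.units), initializer="glorot_uniform"
        )
        self.b = self.add_weight(shape=(self.units,), initializer="zeros")

    def call(self, inputs):
        return ops.matmul(inputs, self.w) + self.b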

In addition, Keras models can consume datasets in any format, regardless of the backend you're using: you can train your models with your existing tf.data.Dataset pipelines or PyTorch DataLoaders.
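
For instance, the same compiled model can be fit on either kind of pipeline; the random data below is purely illustrative:

import numpy as np
import tensorflow as tf
import torch
import keras_core as keras

x = np.random.rand(64, 8).astype("float32")
y = np.random.rand(64, 1).astype("float32")

model = keras.Sequential([keras.Input(shape=(8,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mean_squared_error")

# A tf.data pipeline works regardless of the selected backend...
tf_dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(8)
model.fit(tf_dataset, epochs=1)

# ...and so does a PyTorch DataLoader.
torch_dataset = torch.utils.data.TensorDataset(torch.from_numpy(x), torch.from_numpy(y))
model.fit(torch.utils.data.DataLoader(torch_dataset, batch_size=8), epochs=1)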

Why use Keras Core?

  • Run your high-level Keras workflows on top of any framework -- benefiting at will from the advantages of each framework, e.g. the scalability and performance of JAX or the production ecosystem options of TensorFlow.
  • Write custom components (e.g. layers, models, metrics) that you can use in low-level workflows in any framework.
    • You can take a Keras model and train it in a training loop written from scratch in native TF, JAX, or PyTorch.
    • You can take a Keras model and use it as part of a PyTorch-native Module or as part of a JAX-native model function.
  • Make your ML code future-proof by avoiding framework lock-in.
  • As a PyTorch user: get access to the power and usability of Keras, at last!
  • As a JAX user: get access to a fully-featured, battle-tested, well-documented modeling and training library.