
TensorFlow deprecates alpha in LeakyReLU in favor of negative_slope #117

Open
stnava opened this issue May 23, 2024 · 4 comments

Comments

@stnava
Member

stnava commented May 23, 2024

Solution is probably just a global search and replace of things like this:

LeakyReLU(alpha=0.2)

with:

LeakyReLU(negative_slope=0.2)

just a note.
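In context, the change looks something like the sketch below (assuming Keras 3, where the rename landed; the surrounding model is illustrative and not taken from this repo):

```python
import keras

# Old (Keras 2): keras.layers.LeakyReLU(alpha=0.2)
# New (Keras 3): the argument was renamed to negative_slope
inputs = keras.Input(shape=(64,))
x = keras.layers.Dense(32)(inputs)
x = keras.layers.LeakyReLU(negative_slope=0.2)(x)
outputs = keras.layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
```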

@ntustison
Member

Thanks @stnava. We should probably try to start migrating to Keras 3. Do you or @cookpa see any issues with this?

@ntustison
Member

Nm. I think the move might still be a bit premature.

@stnava
Member Author

stnava commented May 23, 2024

ChatGPT Summary ---- see point 1 ---- very interesting:

Keras 3 introduces several new features and improvements, making it a significant upgrade from its previous versions. Here are some of the key highlights:

  1. Multi-Backend Support: Keras 3 acts as a "super-connector," allowing you to run your Keras workflows on TensorFlow, JAX, or PyTorch. This flexibility enables developers to choose the best tool for their specific tasks without changing the codebase (see the backend-selection sketch after this summary).

  2. Performance Optimization: By default, Keras 3 leverages XLA (Accelerated Linear Algebra) compilation, optimizing mathematical computations for faster execution on hardware like GPUs and TPUs. This allows for more efficient model training and experimentation.

  3. Expanded Ecosystem: Keras 3 supports a wide range of pretrained models across different backends, including models from Keras Applications, KerasCV, and KerasNLP. This includes popular models like BERT, T5, and YOLOv8.

  4. Cross-Framework Data Pipelines: Keras 3 enables seamless integration with various data loading and preprocessing frameworks, such as TensorFlow's tf.data.Dataset, PyTorch's DataLoader, NumPy arrays, and Pandas dataframes. This cross-framework compatibility fosters greater flexibility in model training.

  5. Stateless API: Keras 3 introduces a stateless API for layers, models, metrics, and optimizers, which is particularly useful for JAX's requirement for stateless functions. This makes Keras more compatible with functional programming paradigms.

  6. Progressive Disclosure of Complexity: Keras 3 maintains its user-friendly design by allowing users to start with simple workflows and progressively access more advanced features as needed. This design principle supports both beginners and advanced users, providing a smooth learning curve and flexibility in model development.

  7. Improved Distributed Training: Keras 3 enhances distributed training capabilities with new APIs for data and model parallelism. This includes tools for sharding models across multiple devices, making it easier to train large models on distributed hardware setups.

These features collectively make Keras 3 a more versatile, efficient, and user-friendly deep learning framework. For detailed guides on migrating from Keras 2 to Keras 3 and utilizing the new features, you can refer to the official Keras documentation.
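For point 1, backend selection happens through an environment variable set before keras is imported. A minimal sketch, assuming Keras 3 is installed along with the chosen backend (the "jax" choice and the tiny model below are purely illustrative):

```python
import os

# Choose the backend before importing keras: "tensorflow", "jax", or "torch".
# Whichever backend is named here must be installed in the environment.
os.environ["KERAS_BACKEND"] = "jax"

import keras
import numpy as np

# The same model definition runs unchanged on any of the three backends,
# including the renamed LeakyReLU argument from this issue.
model = keras.Sequential([
    keras.Input(shape=(16,)),
    keras.layers.Dense(32),
    keras.layers.LeakyReLU(negative_slope=0.2),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(8, 16).astype("float32")
y = np.random.rand(8, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```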

@ntustison
Member

Thanks @stnava. Yeah, I'm sure it's great. I tried training with it last night but didn't get very far. But we should definitely start thinking about a migration strategy.
