Announcing Keras-MXNet v2.2

Contributors of Keras-MXNet are pleased to announce the release of v2.2.0, which brings a number of key improvements to the package. Most notably, the package has been updated to include the changes introduced in Keras v2.2.0: major backend design updates and a simplified API are the key highlights. You can now use the new, simpler Model API from Keras and get the high performance of MXNet at the same time.
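
Switching an existing Keras workflow to the MXNet backend only requires changing the `backend` field in Keras's configuration file. A minimal sketch, assuming the standard `~/.keras/keras.json` location (the `channels_first` choice here is an illustrative preference, since MXNet is typically fastest with NCHW data):

```python
import json
import os

# Standard Keras configuration file location.
config_path = os.path.expanduser("~/.keras/keras.json")

# Point Keras at the MXNet backend. image_data_format is set to
# channels_first because MXNet is generally fastest with NCHW tensors;
# adjust to taste.
config = {
    "backend": "mxnet",
    "image_data_format": "channels_first",
    "floatx": "float32",
    "epsilon": 1e-07,
}

os.makedirs(os.path.dirname(config_path), exist_ok=True)
with open(config_path, "w") as f:
    json.dump(config, f, indent=4)
```

Alternatively, setting the `KERAS_BACKEND=mxnet` environment variable selects the backend for a single run without touching the config file.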

Keras-MXNet further improves the coverage of Keras operators with the MXNet backend, bringing the number of unsupported operators down to just 15. Critical operators like depthwise_conv2d, separable_conv2d, and conv1d with causal padding are supported by the MXNet backend in this release. With these operators, you can now run MobileNet and Xception models on high-performance Keras-MXNet. Other layer-wise changes have been introduced too, such as deprecating the Merge layer in favor of the Concatenate layer.
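
Causal padding in a 1D convolution pads only the left edge of the sequence, so the output at step t never sees inputs later than t — the property autoregressive (WaveNet-style) models rely on. A small sketch of the padding arithmetic; the helper name is illustrative, not part of the Keras API:

```python
def causal_left_pad(kernel_size, dilation_rate=1):
    """Number of zeros a causal 1D convolution prepends to its input.

    Padding only the left edge by (kernel_size - 1) * dilation_rate
    keeps the output length equal to the input length while ensuring
    the output at time t depends only on inputs at times <= t.
    """
    return (kernel_size - 1) * dilation_rate

# In Keras this corresponds to Conv1D(..., padding="causal").
```
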

Although no major performance optimizations land in v2.2.0, the MXNet backend continues to be highly scalable and performant (see the benchmarks for the previous release). As before, RNN support remains experimental. This release also fixes several bugs and adds support for custom losses and custom callbacks.
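
A custom loss in Keras is just a callable taking `(y_true, y_pred)` that you pass to `model.compile(loss=...)`. A minimal sketch of a Huber-style loss; scalar Python math is used here for clarity, whereas a real implementation would use `keras.backend` ops so the loss runs inside the MXNet graph:

```python
def huber_loss(y_true, y_pred, delta=1.0):
    # Quadratic near zero, linear for large errors. Scalar sketch of the
    # element-wise math; a real Keras loss would express this with
    # keras.backend ops (e.g. K.abs, K.switch) over tensors.
    err = abs(y_true - y_pred)
    if err <= delta:
        return 0.5 * err * err
    return delta * (err - 0.5 * delta)

# With the tensor version you would compile as:
#   model.compile(optimizer="adam", loss=huber_loss)
```
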

Quick to install

Trying out Keras-MXNet takes only a minute. First, install keras-mxnet:

pip install keras-mxnet

If you’re using GPUs, install MXNet with CUDA 9 support:

pip install mxnet-cu90

If you’re running on CPU only, install the MKL-accelerated CPU build of MXNet:

pip install mxnet-mkl

Then train your Keras models with the MXNet backend and witness the speed increase! The Keras examples work out of the box. To test out training at scale, run the CIFAR10 multi-GPU script; its usage is covered in the AWS blog post’s CNN tutorial. The script expects four GPUs, but can be edited to match the number of GPUs you’re running.
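
When adapting the script to a different GPU count, a common convention is to scale the global batch size with the number of GPUs so each device keeps the same per-GPU batch. A minimal sketch under that assumption (the helper is illustrative; `multi_gpu_model` is the Keras 2.2 utility for data-parallel training):

```python
def global_batch_size(per_gpu_batch, num_gpus):
    # Each GPU processes per_gpu_batch samples per step, so the global
    # batch handed to fit() grows linearly with the GPU count.
    return per_gpu_batch * num_gpus

# With Keras 2.2 you would then wrap and train the model roughly as:
#   from keras.utils import multi_gpu_model
#   parallel_model = multi_gpu_model(model, gpus=4)
#   parallel_model.compile(optimizer="adam", loss="categorical_crossentropy")
#   parallel_model.fit(x, y, batch_size=global_batch_size(32, 4))
```
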

If you’re a Keras user and you like where this is going, join the project, provide feedback, or pitch in on a feature you want to see. As an open source project, these features are free to use and are shaped and improved by the open source community’s involvement. There are open calls for contribution to enhance RNN support, which is currently experimental. Also, make sure you follow Apache MXNet to stay posted on new features, like details on how you can use MXNet Model Server to serve your Keras-MXNet models!

Thanks to Lai Wei, Kalyanee Chendke, and Thom Lane.

Source: Deep Learning on Medium