Apache MXNet

Apache MXNet
Developer(s) Apache Software Foundation
Written in C++, Python, R, Julia, JavaScript, Scala, Go, Perl
Operating system Windows, macOS, Linux
Type Library for machine learning and deep learning
License Apache License 2.0
Website mxnet.apache.org

Apache MXNet is an open-source deep learning software framework used to train and deploy deep neural networks. It is scalable, allowing for fast model training, and supports a flexible programming model and multiple programming languages (including C++, Python, Julia, MATLAB, JavaScript, Go, R, Scala, Perl, and Wolfram Language).

The MXNet library is portable and can scale to multiple GPUs[1] and multiple machines. MXNet is supported by public cloud providers including Amazon Web Services (AWS)[2] and Microsoft Azure.[3] Amazon chose MXNet as its deep learning framework of choice at AWS.[4][5] MXNet is also supported by Intel, Dato, Baidu, Microsoft, Wolfram Research, and research institutions such as Carnegie Mellon University, MIT, the University of Washington, and the Hong Kong University of Science and Technology.[6]

Features

Apache MXNet is a lean, flexible, and highly scalable deep learning framework that supports state-of-the-art deep learning models, including convolutional neural networks (CNNs) and long short-term memory networks (LSTMs).

Scalable

MXNet is designed to be distributed on dynamic cloud infrastructure, using a distributed parameter server (based on research at Carnegie Mellon University, Baidu, and Google[7]), and can achieve nearly linear scaling across multiple GPUs or CPUs.
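
As a minimal sketch of multi-device training, the Gluon Python API can split a batch across whatever GPUs are visible and fall back to the CPU otherwise (MXNet 1.x assumed); the toy network, batch size, and learning rate below are illustrative choices, not part of any cited setup.

    # Hedged sketch: data-parallel training step across available GPUs with Gluon.
    import mxnet as mx
    from mxnet import autograd, gluon, nd

    # Use every visible GPU, or the CPU if none are available.
    ctx = [mx.gpu(i) for i in range(mx.context.num_gpus())] or [mx.cpu()]

    net = gluon.nn.Dense(10)
    net.initialize(ctx=ctx)

    trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})
    loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()

    # One illustrative step on a synthetic batch, split evenly across devices.
    data = nd.random.uniform(shape=(64, 100))
    label = nd.random.randint(0, 10, shape=(64,)).astype('float32')
    data_parts = gluon.utils.split_and_load(data, ctx)
    label_parts = gluon.utils.split_and_load(label, ctx)

    with autograd.record():
        losses = [loss_fn(net(x), y) for x, y in zip(data_parts, label_parts)]
    for loss in losses:
        loss.backward()
    trainer.step(batch_size=64)  # gradients from all devices contribute to the update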

Flexible

MXNet supports both imperative and symbolic programming, which makes it easier for developers who are used to imperative programming to get started with deep learning. It also makes it easier to track and debug code, save checkpoints, modify hyperparameters such as the learning rate, and perform early stopping.
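
A minimal sketch of the two styles side by side, assuming MXNet 1.x with its NDArray and Symbol Python APIs; the tiny expression is purely illustrative.

    # Hedged sketch: imperative vs. symbolic programming in MXNet.
    import mxnet as mx

    # Imperative: operations run immediately on NDArrays, so intermediate
    # results can be inspected and debugged step by step.
    a = mx.nd.ones((2, 3))
    b = a * 2 + 1
    print(b.asnumpy())

    # Symbolic: a computation graph is declared first, then bound to data and
    # executed, which lets the backend optimize and serialize the whole graph.
    x = mx.sym.Variable('x')
    y = x * 2 + 1
    executor = y.bind(mx.cpu(), {'x': mx.nd.ones((2, 3))})
    print(executor.forward()[0].asnumpy())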

Multiple languages

The optimized backend is written in C++ to get the most out of the available GPU or CPU, while Python, R, Scala, Julia, Perl, MATLAB, and JavaScript frontends provide a simple interface for developers.

Portable

Supports efficient deployment of a trained model to low-end devices for inference, such as mobile devices (using Amalgamation[8]), Internet of Things devices (using AWS Greengrass), serverless computing (using AWS Lambda), or containers. These environments may have only a weak CPU or limited memory (RAM), yet they can still run models trained in a more powerful environment (a GPU-based cluster, for example).
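
As a hedged sketch of this deployment path, assuming MXNet 1.x with the Gluon Python API, a hybridized model can be exported to a symbol file plus a parameter file and later reloaded on a CPU-only device; the architecture and the 'my_model' file prefix are illustrative assumptions.

    # Hedged sketch: export a Gluon model and reload it for CPU-only inference.
    import mxnet as mx
    from mxnet import gluon, nd

    net = gluon.nn.HybridSequential()
    net.add(gluon.nn.Dense(64, activation='relu'), gluon.nn.Dense(10))
    net.initialize()
    net.hybridize()                  # switch to a compiled symbolic graph
    net(nd.ones((1, 100)))           # run once so the graph is actually built
    net.export('my_model', epoch=0)  # writes my_model-symbol.json and my_model-0000.params

    # Later, on the target device, bind the exported graph to the CPU context.
    deployed = gluon.nn.SymbolBlock.imports(
        'my_model-symbol.json', ['data'], 'my_model-0000.params', ctx=mx.cpu())
    print(deployed(nd.ones((1, 100))).shape)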

References

  1. "Building Deep Neural Networks in the Cloud with Azure GPU VMs, MXNet and Microsoft R Server". Retrieved 13 May 2017.
  2. "Apache MXNet on AWS - Deep Learning on the Cloud". Amazon Web Services, Inc. Retrieved 13 May 2017.
  3. "Building Deep Neural Networks in the Cloud with Azure GPU VMs, MXNet and Microsoft R Server". Microsoft TechNet Blogs. Retrieved 6 September 2017.
  4. "MXNet - Deep Learning Framework of Choice at AWS - All Things Distributed". www.allthingsdistributed.com. Retrieved 13 May 2017.
  5. "Amazon Has Chosen This Framework to Guide Deep Learning Strategy". Fortune. Retrieved 13 May 2017.
  6. "MXNet, Amazon's deep learning framework, gets accepted into Apache Incubator". Retrieved 2017-03-08.
  7. "Scaling Distributed Machine Learning with the Parameter Server" (PDF). Retrieved 2014-10-08.
  8. [https://mxnet.incubator.apache.org/faq/smart_device.html Amalgamation