Comparison of deep learning software

The following table compares notable software frameworks, libraries and computer programs for deep learning.


Deep learning software by name

Software Creator Initial Release Software license[a] Open source Platform Written in Interface OpenMP support OpenCL support CUDA support Automatic differentiation[1] Has pretrained models Recurrent nets Convolutional nets RBM/DBNs Parallel execution (multi node) Actively Developed
BigDL Jason Dai (Intel) 2016 Apache 2.0 Yes Apache Spark Scala Scala, Python No Yes Yes Yes
Caffe Berkeley Vision and Learning Center 2013 BSD Yes Linux, macOS, Windows[2] C++ Python, MATLAB, C++ Yes Under development[3] Yes Yes Yes[4] Yes Yes No ?
Chainer Preferred Networks 2015 BSD Yes Linux, macOS Python Python No No Yes Yes Yes Yes Yes No Yes Yes
Deeplearning4j Skymind engineering team; Deeplearning4j community; originally Adam Gibson 2014 Apache 2.0 Yes Linux, macOS, Windows, Android (Cross-platform) C++, Java Java, Scala, Clojure, Python (Keras), Kotlin Yes No[5] Yes[6][7] Computational Graph Yes[8] Yes Yes Yes Yes[9]
Dlib Davis King 2002 Boost Software License Yes Cross-platform C++ C++ Yes No Yes Yes Yes No Yes Yes Yes
Intel Data Analytics Acceleration Library Intel 2015 Apache License 2.0 Yes Linux, macOS, Windows on Intel CPU[10] C++, Python, Java C++, Python, Java[10] Yes No No Yes No Yes Yes
Intel Math Kernel Library Intel Proprietary No Linux, macOS, Windows on Intel CPU[11] C[12] Yes[13] No No Yes No Yes[14] Yes[14] No
Keras François Chollet 2015 MIT license Yes Linux, macOS, Windows Python Python, R Only if using Theano as backend Can use Theano or TensorFlow as backends Yes Yes Yes[15] Yes Yes Yes Yes[16] Yes
MATLAB + Neural Network Toolbox MathWorks Proprietary No Linux, macOS, Windows C, C++, Java, MATLAB MATLAB No No Train with Parallel Computing Toolbox and generate CUDA code with GPU Coder[17] No Yes[18][19] Yes[18] Yes[18] No With Parallel Computing Toolbox[20] Yes
Microsoft Cognitive Toolkit (CNTK) Microsoft Research 2016 MIT license[21] Yes Windows, Linux[22] (macOS via Docker on roadmap) C++ Python (Keras), C++, Command line,[23] BrainScript[24] (.NET on roadmap[25]) Yes[26] No Yes Yes Yes[27] Yes[28] Yes[28] No[29] Yes[30] Yes
Apache MXNet Apache Software Foundation 2015 Apache 2.0 Yes Linux, macOS, Windows,[31][32] AWS, Android,[33] iOS, JavaScript[34] Small C++ core library C++, Python, Julia, MATLAB, JavaScript, Go, R, Scala, Perl Yes On roadmap[35] Yes Yes[36] Yes[37] Yes Yes Yes Yes[38] Yes
Neural Designer Artelnics Proprietary No Linux, macOS, Windows C++ Graphical user interface Yes No No ? ? No No No ?
OpenNN Artelnics 2003 GNU LGPL Yes Cross-platform C++ C++ Yes No Yes ? ? No No No ?
PyTorch Adam Paszke, Sam Gross, Soumith Chintala, Gregory Chanan (Facebook) 2016 BSD Yes Linux, macOS, Windows Python, C, CUDA Python Yes Via separately maintained package[39][40][41] Yes Yes Yes Yes Yes Yes Yes
Apache SINGA Apache Incubator 2015 Apache 2.0 Yes Linux, macOS, Windows C++ Python, C++, Java No Supported in V1.0 Yes ? Yes Yes Yes Yes Yes
TensorFlow Google Brain 2015 Apache 2.0 Yes Linux, macOS, Windows,[42] Android C++, Python, CUDA Python (Keras), C/C++, Java, Go, JavaScript, R,[43] Julia, Swift No On roadmap[44] but already with SYCL[45] support Yes Yes[46] Yes[47] Yes Yes Yes Yes Yes
Theano Université de Montréal 2007 BSD Yes Cross-platform Python Python (Keras) Yes Under development[48] Yes Yes[49][50] Through Lasagne's model zoo[51] Yes Yes Yes Yes[52] No
Torch Ronan Collobert, Koray Kavukcuoglu, Clement Farabet 2002 BSD Yes Linux, macOS, Windows,[53] Android,[54] iOS C, Lua Lua, LuaJIT,[55] C, utility library for C++/OpenCL[56] Yes Third party implementations[57][58] Yes[59][60] Through Twitter's Autograd[61] Yes[62] Yes Yes Yes Yes[63] No
Wolfram Mathematica Wolfram Research 1988 Proprietary No Windows, macOS, Linux, Cloud computing C++, Wolfram Language, CUDA Wolfram Language Yes No Yes Yes Yes[64] Yes Yes Yes Under Development Yes
  1. ^ Licenses here are a summary and are not intended to be complete statements of the licenses. Some libraries may internally use other libraries under different licenses.
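The "Automatic differentiation" column above[1] refers to a framework's ability to compute exact gradients of user-defined computations without hand-written derivative code, which is what makes training by backpropagation practical. As a minimal illustrative sketch, assuming PyTorch and its autograd package (one of the frameworks listed; the tensor values are chosen only for the example):

    import torch

    # Reverse-mode automatic differentiation with PyTorch autograd:
    # operations on `x` are recorded and replayed backwards to get dy/dx.
    x = torch.tensor([2.0, 3.0], requires_grad=True)
    y = (x ** 2).sum()   # y = x1**2 + x2**2
    y.backward()         # backpropagate to populate x.grad
    print(x.grad)        # tensor([4., 6.]), i.e. 2 * x

The same reverse-mode idea underlies the gradient facilities of the other frameworks marked "Yes" in that column, such as TensorFlow, Theano and MXNet.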
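The "Interface" column lists the languages through which each framework is driven. As a hedged sketch of the Keras interface noted in several rows above (the layer sizes are purely illustrative), a small feed-forward network can be declared in a few lines of Python, while the computation is delegated to the configured backend, which, per the table, can be Theano or TensorFlow, selected through the KERAS_BACKEND environment variable or the keras.json configuration file:

    from keras.models import Sequential
    from keras.layers import Dense

    # A small fully connected classifier defined through the Keras API;
    # the selected backend (Theano or TensorFlow) executes the computation.
    model = Sequential([
        Dense(32, activation='relu', input_shape=(8,)),  # 8 input features
        Dense(1, activation='sigmoid'),                  # binary output
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy')
    model.summary()  # print a textual description of the model

This declarative style is why "Python (Keras)" appears in the Interface column of several entries above (e.g., CNTK, TensorFlow, Theano, Deeplearning4j) in addition to Keras's own row.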

Related software

See also

References

  1. ^ Atilim Gunes Baydin; Barak A. Pearlmutter; Alexey Andreyevich Radul; Jeffrey Mark Siskind (20 February 2015). "Automatic differentiation in machine learning: a survey". arXiv:1502.05767 [cs.LG].
  2. ^ "Microsoft/caffe". GitHub.
  3. ^ "OpenCL Caffe".
  4. ^ "Caffe Model Zoo".
  5. ^ "Support for Open CL · Issue #27 · deeplearning4j/nd4j". GitHub.
  6. ^ "N-Dimensional Scientific Computing for Java".
  7. ^ "Comparing Top Deep Learning Frameworks". Deeplearning4j.
  8. ^ Chris Nicholson; Adam Gibson. "Deeplearning4j Models".
  9. ^ Deeplearning4j. "Deeplearning4j on Spark". Deeplearning4j.
  10. ^ a b "Intel® Data Analytics Acceleration Library (Intel® DAAL)". Intel Software.
  11. ^ "Intel® Math Kernel Library (Intel® MKL)". Intel Software.
  12. ^ "Deep Neural Network Functions".
  13. ^ "Using Intel® MKL with Threaded Applications". Intel Software.
  14. ^ a b "Intel® Xeon Phi™ Delivers Competitive Performance For Deep Learning—And Getting Better Fast". Intel Software.
  15. ^ https://keras.io/applications/
  16. ^ "Does Keras support using multiple GPUs? · Issue #2436 · fchollet/keras". GitHub.
  17. ^ "GPU Coder - MATLAB & Simulink". MathWorks. Retrieved 13 November 2017.
  18. ^ a b c "Neural Network Toolbox - MATLAB". MathWorks. Retrieved 13 November 2017.
  19. ^ "Deep Learning Models - MATLAB & Simulink". MathWorks. Retrieved 13 November 2017.
  20. ^ "Parallel Computing Toolbox - MATLAB". MathWorks. Retrieved 13 November 2017.
  21. ^ "CNTK/LICENSE.md at master · Microsoft/CNTK · GitHub". GitHub.
  22. ^ "Setup CNTK on your machine". GitHub.
  23. ^ "CNTK usage overview". GitHub.
  24. ^ "BrainScript Network Builder". GitHub.
  25. ^ ".NET Support · Issue #960 · Microsoft/CNTK". GitHub.
  26. ^ "How to train a model using multiple machines? · Issue #59 · Microsoft/CNTK". GitHub.
  27. ^ https://github.com/Microsoft/CNTK/issues/140#issuecomment-186466820
  28. ^ a b "CNTK - Computational Network Toolkit". Microsoft Corporation.
  29. ^ https://github.com/Microsoft/CNTK/issues/534
  30. ^ "Multiple GPUs and machines". Microsoft Corporation.
  31. ^ "Releases · dmlc/mxnet". Github.
  32. ^ "Installation Guide — mxnet documentation". Readthdocs.
  33. ^ "MXNet Smart Device". ReadTheDocs.
  34. ^ "MXNet.js". Github.
  35. ^ "Support for other Device Types, OpenCL AMD GPU · Issue #621 · dmlc/mxnet". GitHub.
  36. ^ https://mxnet.readthedocs.io/
  37. ^ "Model Gallery". GitHub.
  38. ^ "Run MXNet on Multiple CPU/GPUs with Data Parallel". GitHub.
  39. ^ https://github.com/hughperkins/pytorch-coriander
  40. ^ https://github.com/pytorch/pytorch/issues/488
  41. ^ https://github.com/pytorch/pytorch/issues/488#issuecomment-273626736
  42. ^ https://developers.googleblog.com/2016/11/tensorflow-0-12-adds-support-for-windows.html
  43. ^ Allaire, JJ (R interface); RStudio; Eddelbuettel, Dirk; Golding, Nick; Tang, Yuan; Google Inc (examples and tutorials) (2017-05-26), tensorflow: R Interface to TensorFlow, retrieved 2017-06-14
  44. ^ "tensorflow/roadmap.md at master · tensorflow/tensorflow · GitHub". GitHub. January 23, 2017. Retrieved May 21, 2017.
  45. ^ "OpenCL support · Issue #22 · tensorflow/tensorflow". GitHub.
  46. ^ https://www.tensorflow.org/
  47. ^ https://github.com/tensorflow/models
  48. ^ "Using the GPU — Theano 0.8.2 documentation".
  49. ^ http://deeplearning.net/software/theano/library/gradient.html
  50. ^ https://groups.google.com/d/msg/theano-users/mln5g2IuBSU/gespG36Lf_QJ
  51. ^ "Recipes/modelzoo at master · Lasagne/Recipes · GitHub". GitHub.
  52. ^ "Using multiple GPUs — Theano 0.8.2 documentation".
  53. ^ https://github.com/torch/torch7/wiki/Windows
  54. ^ "GitHub - soumith/torch-android: Torch-7 for Android". GitHub.
  55. ^ "Torch7: A Matlab-like Environment for Machine Learning" (PDF).
  56. ^ "GitHub - jonathantompson/jtorch: An OpenCL Torch Utility Library". GitHub.
  57. ^ "Cheatsheet". GitHub.
  58. ^ "cltorch". GitHub.
  59. ^ "Torch CUDA backend". GitHub.
  60. ^ "Torch CUDA backend for nn". GitHub.
  61. ^ https://github.com/twitter/torch-autograd
  62. ^ "ModelZoo". GitHub.
  63. ^ https://github.com/torch/torch7/wiki/Cheatsheet#distributed-computing--parallel-processing
  64. ^ http://resources.wolframcloud.com/NeuralNetRepository