From b25139f6f243edc7f419f3ece4aea48495c13303 Mon Sep 17 00:00:00 2001 From: Lv Tao Date: Thu, 13 Sep 2018 00:19:14 +0800 Subject: [PATCH 1/2] fix build from source doc for mkldnn backend --- docs/install/build_from_source.md | 8 ++++---- docs/install/ubuntu_setup.md | 10 +++++++++- 2 files changed, 13 insertions(+), 5 deletions(-) diff --git a/docs/install/build_from_source.md b/docs/install/build_from_source.md index 6c0a4dab251a..2a1f9e33e041 100644 --- a/docs/install/build_from_source.md +++ b/docs/install/build_from_source.md @@ -40,7 +40,7 @@ MXNet supports multiple mathematical backends for computations on the CPU: * [Apple Accelerate](https://developer.apple.com/documentation/accelerate) * [ATLAS](http://math-atlas.sourceforge.net/) * [MKL](https://software.intel.com/en-us/intel-mkl) (MKL, MKLML) -* [MKLDNN](https://github.com/intel/mkl-dnn) +* [MKL-DNN](https://github.com/intel/mkl-dnn) * [OpenBLAS](http://www.openblas.net/) Usage of these are covered in more detail in the [build configurations](#build-configurations) section. @@ -92,13 +92,13 @@ The following lists show this order by library and `cmake` switch. For desktop platforms (x86_64): -1. MKLDNN (submodule) | `USE_MKLDNN` +1. MKL-DNN (submodule) | `USE_MKLDNN` 2. MKL | `USE_MKL_IF_AVAILABLE` 3. MKLML (downloaded) | `USE_MKLML` 4. Apple Accelerate | `USE_APPLE_ACCELERATE_IF_AVAILABLE` | Mac only 5. OpenBLAS | `BLAS` | Options: Atlas, Open, MKL, Apple -Note: If `USE_MKL_IF_AVAILABLE` is set to False then MKLML and MKLDNN will be disabled as well for configuration +Note: If `USE_MKL_IF_AVAILABLE` is set to False then MKLML and MKL-DNN will be disabled as well for configuration backwards compatibility. For embedded platforms (all other and if cross compiled): @@ -129,7 +129,7 @@ It has following flavors: -* MKLDNN is a separate open-source library, it can be used separately from MKL or MKLML. It is +* MKL-DNN is a separate open-source library, it can be used separately from MKL or MKLML.
It is shipped as a subrepo with MXNet source code (see 3rdparty/mkldnn or the [mkl-dnn project](https://github.com/intel/mkl-dnn)) Since the full MKL library is almost always faster than any other BLAS library it's turned on by default, diff --git a/docs/install/ubuntu_setup.md b/docs/install/ubuntu_setup.md index 432310dd763d..3beb1e62af09 100644 --- a/docs/install/ubuntu_setup.md +++ b/docs/install/ubuntu_setup.md @@ -70,7 +70,7 @@ pip install mxnet-cu92mkl Alternatively, you can use the table below to select the package that suits your purpose. -| MXNet Version | Basic | CUDA | MKL | CUDA/MKL | +| MXNet Version | Basic | CUDA | MKL-DNN | CUDA/MKL-DNN | |-|-|-|-|-| | Latest | mxnet | mxnet-cu92 | mxnet-mkl | mxnet-cu92mkl | @@ -167,6 +167,14 @@ If building on CPU and using OpenBLAS: make -j $(nproc) USE_OPENCV=1 USE_BLAS=openblas ``` +If building on CPU and using MKL and MKL-DNN (make sure MKL is installed according to [Math Library Selection](build_from_source.html#math-library-selection) and [MKL-DNN README](https://github.com/apache/incubator-mxnet/blob/master/MKLDNN_README.md)): + +```bash + git clone --recursive https://github.com/apache/incubator-mxnet.git + cd incubator-mxnet + make -j $(nproc) USE_OPENCV=1 USE_BLAS=mkl USE_MKLDNN=1 +``` + If building on GPU and you want OpenCV and OpenBLAS (make sure you have installed the [CUDA dependencies first](#cuda-dependencies)): ```bash From 547c9bad7915620f785375ca4921c377c0dc07b6 Mon Sep 17 00:00:00 2001 From: Lv Tao Date: Thu, 13 Sep 2018 00:26:42 +0800 Subject: [PATCH 2/2] fix doc --- docs/install/build_from_source.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/install/build_from_source.md b/docs/install/build_from_source.md index 2a1f9e33e041..4f0235fa926a 100644 --- a/docs/install/build_from_source.md +++ b/docs/install/build_from_source.md @@ -130,7 +130,7 @@ It has following flavors: by the cmake script (see cmake/DownloadMKLML.cmake).--> * MKL-DNN is a separate open-source library, it can
be used separately from MKL or MKLML. It is - shipped as a subrepo with MXNet source code (see 3rdparty/mkldnn or the [mkl-dnn project](https://github.com/intel/mkl-dnn)) + shipped as a subrepo with MXNet source code (see 3rdparty/mkldnn or the [MKL-DNN project](https://github.com/intel/mkl-dnn)) Since the full MKL library is almost always faster than any other BLAS library it's turned on by default, however it needs to be downloaded and installed manually before doing `cmake` configuration.
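
After rebuilding with `USE_MKLDNN=1` as the patched instructions describe, it can be useful to confirm MKL-DNN is actually active at runtime. The sketch below is one way to check, not part of this patch: it relies on MKL-DNN's `MKLDNN_VERBOSE` environment variable, which makes the library log each primitive it executes to stderr, and it assumes the freshly built `mxnet` package is importable in the current Python environment.

```shell
# Run a small convolution with MKL-DNN verbose logging enabled.
# If the build picked up MKL-DNN, lines starting with "mkldnn_verbose"
# appear on stderr; a silent run suggests a non-MKL-DNN backend.
export MKLDNN_VERBOSE=1
python -c "
import mxnet as mx
x = mx.nd.ones((1, 3, 224, 224))
w = mx.nd.ones((8, 3, 3, 3))
y = mx.nd.Convolution(data=x, weight=w, kernel=(3, 3), num_filter=8, no_bias=True)
y.wait_to_read()
"
```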