Installing DeepFlame is straightforward; it requires OpenFOAM-7, LibCantera, and PyTorch.
If Ubuntu is used as the subsystem, please use Ubuntu 20.04 instead of the latest version, since OpenFOAM-7 (accompanied by ParaView 5.6.0) is not available for the latest Ubuntu release.
First install OpenFOAM-7 if it is not already installed.
sudo sh -c "wget -O - https://dl.openfoam.org/gpg.key | apt-key add -"
sudo add-apt-repository http://dl.openfoam.org/ubuntu
sudo apt-get update
sudo apt-get -y install openfoam7
OpenFOAM-7 and ParaView-5.6.0 will be installed in the /opt directory.
There is a commonly seen issue when installing OpenFOAM via apt-get install, with the error message: could not find a distribution template for Ubuntu/focal. To resolve this issue, you can refer to issue#54.
LibCantera and PyTorch can be easily installed via conda. If you have a compatible platform, run the following commands to install DeepFlame.
conda create -n deepflame python=3.8
conda activate deepflame
conda install -c cantera libcantera-devel
conda install pytorch torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia
conda install pybind11
conda install -c conda-forge easydict
Please go to PyTorch’s official website to check your system compatibility and choose the installation command suitable for your platform.
The packages will be installed in the Miniconda3/envs/deepflame directory; make sure the installation was successful (the lib/, include/, etc. subdirectories exist).
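As a quick sanity check, the directory test above can be scripted; a minimal sketch, assuming the environment lives under ~/miniconda3 (your conda prefix may differ, e.g. ~/anaconda3):

```shell
# Hypothetical conda prefix; adjust to your own installation.
ENV_DIR="$HOME/miniconda3/envs/deepflame"

# Report whether the key subdirectories produced by the install exist.
for d in lib include; do
    if [ -d "$ENV_DIR/$d" ]; then
        echo "ok: $ENV_DIR/$d"
    else
        echo "missing: $ENV_DIR/$d"
    fi
done
```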
1. Source your OpenFOAM-7 bashrc to configure the $FOAM environment.
The exact command depends on the path of your OpenFOAM-7 bashrc.
If you have installed using apt-get install, then:
source /opt/openfoam7/etc/bashrc
If you compiled from source following the official guide, then:
Check your environment using echo $FOAM_ETC; you should get the directory path of the OpenFOAM-7 bashrc you just sourced in the step above.
2. Clone the DeepFlame repository:
git clone https://github.com/deepmodeling/deepflame-dev.git
3. Configure the DeepFlame environment:
cd deepflame-dev
. configure.sh --use_pytorch
source ./bashrc
Check your environment using echo $DF_ROOT; you should get the path of the DeepFlame root directory.
1.3. Build and Install
Finally you can build and install DeepFlame:
. install.sh
You may come across an error regarding the shared library libmkl_rt.so.2 when libcantera is installed through the cantera channel. If so, go to your conda environment, check the existence of libmkl_rt.so.1, and then link it:
cd ~/miniconda3/envs/deepflame/lib
ln -s libmkl_rt.so.1 libmkl_rt.so.2
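If you script this workaround, it helps to guard the link so it is only created when actually needed; a minimal sketch, assuming the ~/miniconda3 prefix:

```shell
# Hypothetical environment path; adjust to your conda installation.
LIB_DIR="$HOME/miniconda3/envs/deepflame/lib"

# Create the compatibility symlink only if the source library exists
# and the link is not already present.
if [ -e "$LIB_DIR/libmkl_rt.so.1" ] && [ ! -e "$LIB_DIR/libmkl_rt.so.2" ]; then
    ln -s libmkl_rt.so.1 "$LIB_DIR/libmkl_rt.so.2"
fi
```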
If you have compiled DeepFlame successfully, you should see a success message printed in your terminal.
1.4. Other Options
DeepFlame also provides users with LibTorch and CVODE (no DNN) build options.
1. If you choose to use LibTorch (C++ API for Torch), first create the conda env and install LibCantera:
conda create -n df-libtorch
conda activate df-libtorch
conda install -c cantera libcantera-devel
Then you can pass your own libtorch path to DeepFlame.
cd deepflame-dev
. configure.sh --libtorch_dir /path/to/libtorch/
source ./bashrc
. install.sh
Some compilation issues may occur due to system compatibility. Instead of using the conda-installed Cantera C++ library and the downloaded Torch C++ library, try compiling your own Cantera and Torch C++ libraries.
2. If you just need DeepFlame’s CVODE solver without the DNN model, simply install LibCantera via conda.
conda create -n df-notorch
conda activate df-notorch
conda install -c cantera libcantera-devel
If the conda env df-notorch is activated, install DeepFlame by running:
cd deepflame-dev
. configure.sh
source ./bashrc
. install.sh
If df-notorch is not activated (or you have a self-compiled libcantera), specify the path to your libcantera:
. configure.sh --libcantera_dir /your/path/to/libcantera/
source ./bashrc
. install.sh
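If you are unsure where the conda-installed libcantera lives, one way to derive the prefix is from conda itself; a sketch, assuming the env is named df-notorch and conda is on your PATH:

```shell
# Derive the env prefix from conda's base directory; libcantera's
# lib/ and include/ live under this prefix after installation.
CANTERA_DIR="$(conda info --base 2>/dev/null)/envs/df-notorch"
echo "libcantera prefix: $CANTERA_DIR"
# Then pass it on: . configure.sh --libcantera_dir "$CANTERA_DIR"
```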