
portage

The portage library provides a framework for general-purpose data remapping - between meshes, between particles, or between meshes and particles - in computational physics applications. Remapping is facilitated through user-supplied wrappers around the meshes or particle swarms and their data. The remap algorithm operates on the wrappers for the original mesh/particles in three phases: search for intersection candidates, calculate the intersections with those candidates, and interpolate the results onto the new mesh or particle swarm. The algorithm for each phase can be customized (e.g. the order of accuracy of the interpolation) and, through the wrappers, can take advantage of hybrid parallelism (MPI+X).

Getting Started

To obtain a copy of portage and its submodules from GitHub, clone recursively:

git clone --recursive https://github.com/laristra/portage
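If you have already cloned the repository without the --recursive flag, the submodules can still be fetched afterwards with the standard git command:

git submodule update --init --recursive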

If you are familiar with Docker, take a look at our Dockerfile for a working build environment. In particular, the Dockerfile builds off of the portage-buildenv Dockerfile, and uses our travis.yml file with Travis CI.

Prerequisites

Portage uses standard C++11 features, so a fairly modern compiler is needed. We regularly test with Intel 18.0.1, GCC 6.4.0, and GCC 7.3.0.
Utilizing the full capabilities of portage will require an MPI implementation; we regularly test with OpenMPI 2.1.2. The build system requires CMake version 3.0 or higher.

The following libraries are also required (see examples below):

  • LAPACKE (3.8.0+)

  • Either Boost (1.68.0+) or Thrust (1.6.0+): We wrap some features of whichever of these packages you choose. If you would like to run with OpenMP or TBB threads, you must use Thrust.

Portage provides wrappers for a few third-party mesh types. Building support for these is optional:

  • Jali:

    We regularly test with version 1.0.0. You will need to set the Jali_DIR CMake variable if you wish to build support for Jali and its tests (see the examples below).

  • FleCSI Burton Specialization:

    The Burton specialization in the flecsi-sp repository is built on top of FleCSI. You will need both projects to build support for the Burton mesh specialization and its tests. You will need to set ENABLE_FleCSI=True and add the FleCSI and FleCSI-sp install paths to the CMAKE_PREFIX_PATH; see examples below. Both FleCSI packages are under constant development. This version of portage is known to work with hash 374b56b of the FleCSI stable branch, and hash e78c594 of the FleCSI-SP stable branch.

The documentation is built using doxygen (1.8+).

For more details regarding CMake settings, see the documentation page.

Installing

In the simplest case, where you have the appropriate versions mentioned above and Boost and LAPACKE are in the usual locations that CMake searches, the build step is:

portage $ mkdir build
portage $ cd build
portage/build $ cmake -DENABLE_APP_TESTS=True ..
portage/build $ make

This compiles the serial code and about a dozen application tests. To run the tests, simply execute

portage/build $ make test
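Alternatively, you can invoke CTest directly to see output from any failing tests, as the example builds below do:

portage/build $ ctest --output-on-failure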

If you wish to install the code into the CMAKE_INSTALL_PREFIX, simply execute

portage/build $ make install
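If CMAKE_INSTALL_PREFIX is not set, CMake uses its default system location; to install elsewhere, set the prefix at configure time (the path below is only a placeholder):

portage/build $ cmake -DCMAKE_INSTALL_PREFIX=/path/to/install ..
portage/build $ make install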

To build the documentation, configure with the -DENABLE_DOXYGEN=True flag and then run make doxygen.
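For example, from the build directory:

portage/build $ cmake -DENABLE_DOXYGEN=True ..
portage/build $ make doxygen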

See the examples below or the documentation page for more build instructions.

License

This project is licensed under a modified 3-clause BSD license - see the LICENSE file for details.

Release

This software has been approved for open source release and has been assigned LA-CC-16-084.



Example builds

Below we list copy-and-paste build instructions for several local machines; we have a script that parses this README file and executes these examples to ensure they build.

Darwin

Execute the following from the portage root directory:

# machine=darwin-fe

# VERSION NUMBERS
INTEL_VERSION=18.0.3
MPI_VERSION=3.1.3
JALI_VERSION=1.0.5
TANGRAM_VERSION=0.9.8
XMOF2D_VERSION=0.9.5
BOOST_VERSION=1.68.0

BUILD_PREFIX=/usr/projects/ngc/private

# load the correct boost, compiler, and openmpi
module purge
module load cmake openmpi/${MPI_VERSION}-intel_${INTEL_VERSION} boost/${BOOST_VERSION}

cmake \
    -D CMAKE_BUILD_TYPE=Release \
    -D ENABLE_UNIT_TESTS=True \
    -D ENABLE_APP_TESTS=True \
    -D ENABLE_MPI=True \
    -D Jali_DIR:FILEPATH=${BUILD_PREFIX}/jali/${JALI_VERSION}-intel-${INTEL_VERSION}-openmpi-${MPI_VERSION}/lib \
    -D TANGRAM_DIR:FILEPATH=${BUILD_PREFIX}/tangram/${TANGRAM_VERSION}-intel-${INTEL_VERSION}-openmpi-${MPI_VERSION} \
    -D XMOF2D_DIR:FILEPATH=${BUILD_PREFIX}/xmof2d/${XMOF2D_VERSION}-intel-${INTEL_VERSION}/share/cmake \
    -D LAPACKE_DIR=${BUILD_PREFIX}/lapack/3.8.0-patched-intel-${INTEL_VERSION} \
    ..

make -j16
ctest -j16 --output-on-failure

Snow

Execute the following from the portage root directory:

# machine=sn-fey
. /usr/share/lmod/lmod/init/sh
module load intel/18.0.5 openmpi/2.1.2 cmake
JALI_INSTALL_PREFIX=/usr/projects/ngc/private/jali/1.0.5-intel-18.0.5-openmpi-2.1.2
TANGRAM_INSTALL_PREFIX=/usr/projects/ngc/private/tangram/0.9.8-intel-18.0.5-openmpi-2.1.2
XMOF2D_INSTALL_PREFIX=/usr/projects/ngc/private/xmof2d/0.9.5-intel-18.0.5
LAPACKE_DIR=/usr/projects/ngc/private/lapack/3.8.0-patched-intel-18.0.5
mkdir build
cd build
cmake \
    -D CMAKE_BUILD_TYPE=Release \
    -D ENABLE_UNIT_TESTS=True \
    -D ENABLE_APP_TESTS=True \
    -D ENABLE_MPI=True \
    -D Jali_DIR:FILEPATH=$JALI_INSTALL_PREFIX/lib \
    -D TANGRAM_DIR:FILEPATH=$TANGRAM_INSTALL_PREFIX \
    -D XMOF2D_DIR:FILEPATH=$XMOF2D_INSTALL_PREFIX/share/cmake \
    -D LAPACKE_DIR=$LAPACKE_DIR \
    ..
make -j4
ctest -j4 --output-on-failure

If you want to build an app for performance testing, you should include Thrust and TCMalloc in your build. The CMake command for this is:

# machine=sn-fey::thrust
. /usr/share/lmod/lmod/init/sh
module load intel/18.0.5 openmpi/2.1.2 cmake
JALI_INSTALL_PREFIX=/usr/projects/ngc/private/jali/1.0.5-intel-18.0.5-openmpi-2.1.2
TANGRAM_INSTALL_PREFIX=/usr/projects/ngc/private/tangram/0.9.8-intel-18.0.5-openmpi-2.1.2-thrust
XMOF2D_INSTALL_PREFIX=/usr/projects/ngc/private/xmof2d/0.9.5-intel-18.0.5
LAPACKE_DIR=/usr/projects/ngc/private/lapack/3.8.0-patched-intel-18.0.5
mkdir build-thrust
cd build-thrust
cmake \
   -D CMAKE_BUILD_TYPE=Release \
   -D ENABLE_UNIT_TESTS=True \
   -D ENABLE_APP_TESTS=True \
   -D ENABLE_MPI=True \
   -D Jali_DIR:FILEPATH=$JALI_INSTALL_PREFIX/lib \
   -D TANGRAM_DIR:FILEPATH=$TANGRAM_INSTALL_PREFIX \
   -D XMOF2D_DIR:FILEPATH=$XMOF2D_INSTALL_PREFIX/share/cmake \
   -D ENABLE_THRUST=True \
   -D THRUST_DIR:FILEPATH=/usr/projects/ngc/private/include \
   -D ENABLE_TCMALLOC=True \
   -D TCMALLOC_LIB:FILEPATH=/usr/lib64/libtcmalloc.so \
   -D LAPACKE_DIR=$LAPACKE_DIR \
   ..
make -j4
ctest -j4 --output-on-failure

Varan

Execute the following from the portage root directory:

# machine=varan
export MODULEPATH=""
. /opt/local/packages/Modules/default/init/sh
module load intel/18.0.1 openmpi/2.1.2 cmake
JALI_INSTALL_PREFIX=/usr/local/codes/ngc/private/jali/1.0.5-intel-18.0.1-openmpi-2.1.2
TANGRAM_INSTALL_PREFIX=/usr/local/codes/ngc/private/tangram/0.9.8-intel-18.0.1-openmpi-2.1.2
XMOF2D_INSTALL_PREFIX=/usr/local/codes/ngc/private/xmof2d/0.9.5-intel-18.0.1
LAPACKE_DIR=/usr/local/codes/ngc/private/lapack/3.8.0-patched-intel-18.0.1/
LAPACKE_INCLUDE_DIR=$LAPACKE_DIR/include
LAPACKE_LIBRARY_DIR=$LAPACKE_DIR
mkdir build
cd build
cmake \
    -D CMAKE_BUILD_TYPE=Debug \
    -D ENABLE_UNIT_TESTS=True \
    -D ENABLE_APP_TESTS=True \
    -D ENABLE_MPI=True \
    -D Jali_DIR:FILEPATH=$JALI_INSTALL_PREFIX/lib \
    -D TANGRAM_DIR:FILEPATH=$TANGRAM_INSTALL_PREFIX \
    -D XMOF2D_DIR:FILEPATH=$XMOF2D_INSTALL_PREFIX/share/cmake \
    -D LAPACKE_DIR=$LAPACKE_DIR \
    ..
make -j2
ctest -j2 --output-on-failure

If you want to build an app that uses FleCSI, you can link against a built version of FleCSI on Varan. An example is below:

# machine=varan::flecsi
export MODULEPATH=""
. /opt/local/packages/Modules/default/init/sh
module load gcc/6.4.0 openmpi/2.1.2 cmake
FLECSI_INSTALL_PREFIX=/usr/local/codes/ngc/private/flecsi/374b56b-gcc-6.4.0
FLECSISP_INSTALL_PREFIX=/usr/local/codes/ngc/private/flecsi-sp/e78c594-gcc-6.4.0
TANGRAM_INSTALL_PREFIX=/usr/local/codes/ngc/private/tangram/0.9.8-gcc-6.4.0-openmpi-2.1.2
XMOF2D_INSTALL_PREFIX=/usr/local/codes/ngc/private/xmof2d/0.9.5-gcc-6.4.0
LAPACKE_DIR=/usr/local/codes/ngc/private/lapack/3.8.0-patched-gcc-6.4.0
LAPACKE_INCLUDE_DIR=$LAPACKE_DIR/include
LAPACKE_LIBRARY_DIR=$LAPACKE_DIR
mkdir build-flecsi
cd build-flecsi
cmake \
    -D CMAKE_C_COMPILER=`which mpicc` \
    -D CMAKE_CXX_COMPILER=`which mpiCC` \
    -D CMAKE_BUILD_TYPE=Debug \
    -D ENABLE_UNIT_TESTS=True \
    -D ENABLE_APP_TESTS=True \
    -D ENABLE_MPI=True \
    -D ENABLE_FleCSI=True \
    -D CMAKE_PREFIX_PATH="$FLECSI_INSTALL_PREFIX;$FLECSISP_INSTALL_PREFIX" \
    -D TANGRAM_DIR:FILEPATH=$TANGRAM_INSTALL_PREFIX \
    -D XMOF2D_DIR:FILEPATH=$XMOF2D_INSTALL_PREFIX/share/cmake \
    -D LAPACKE_DIR=$LAPACKE_DIR \
    ..
make -j2
ctest -j2 --output-on-failure
