Compiling ParaView3 for Cray supercomputers


This wiki page is currently a work in progress.

Objectives and Overview

Our goal is to run pvbatch on Cray massively parallel processor systems. pvbatch is ParaView's MPI-enabled batch application: it reads batch scripts written in Python and distributes the work across many processors. pvbatch is built as part of the ParaView 3 build. Before you can compile ParaView 3 you must compile CMake, OSMesa, and Python. The entire process takes about three hours and requires at least one gigabyte of disk space.

These instructions are intended for Cray MPP systems running the Catamount operating system. Specifically, these instructions have been tested on the Bigben XT3 supercomputer at Pittsburgh Supercomputing Center.

Terminology

These terms are probably self-explanatory, but just to clarify:

  • front end node - the computer/shell you log into and work on
  • native operating system - the operating system that runs on the front end nodes (SuSE Linux, for example)
  • native build - software that executes on the front end nodes
  • compute node - the computers/processors that run the scientific computation
  • Catamount - the operating system that runs on the compute nodes
  • catamount build - software that has been cross compiled to execute on the compute nodes

Build steps

You will log into a shell on a front end node, download the source code, and then compile CMake, OSMesa, Python, and ParaView 3. Some of these packages must be compiled twice: once as a native build and once as a cross-compiled catamount build. The steps are:

  1. Compile a CMake native build.
  2. Compile an OSMesa native build.
  3. Compile an OSMesa catamount build.
  4. Compile a Python native build.
  5. Compile a Python catamount build.
  6. Compile a ParaView 3 native build.
  7. Compile a ParaView 3 catamount build.
  • Step 2 is optional if your front end system already has OSMesa installed.
  • Step 4 is optional if your front end system already has Python installed.

Why are the native builds required?

During the ParaView build process, helper binaries are compiled and executed to generate source files for later build targets. When you cross compile ParaView, these helper binaries are compute-node executables and therefore cannot run on the front end node you are working on. The solution is to build a native version of ParaView first, then tell CMake to use the native helper binaries while cross compiling.
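
Concretely, the catamount configure step points CMake at the native build tree with a single cache flag (the full command appears in the ParaView 3 catamount build section below):

## import native helper binaries from the native build while cross compiling
-DParaView3CompileTools_DIR=<paraview-native-build-dir>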

Additional information

The instructions on this wiki page detail the steps required to build the software but do not provide much background. Two concepts used but not explained here are TryRunResults files and toolchain files; the CMake cross-compiling documentation and the CMake/CrayXT3 page cover both.
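
For orientation, a TryRunResults file is just a CMake script that pre-seeds cache entries for TRY_RUN tests, since those test binaries cannot execute on the front end node. A minimal sketch of one entry (the variable name here is hypothetical):

# one entry from a TryRunResults file; SOME_TRY_RUN_RESULT is a hypothetical name
SET( SOME_TRY_RUN_RESULT
     "0"
     CACHE STRING "Result from TRY_RUN" FORCE)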

Compilers

The front end nodes have more than one compiler installed. We will use the PGI and GNU compilers. At Bigben, the PGI compiler is the default compiler when you log in. You can switch compilers like this:

## switch from PGI to GNU compiler
module switch PrgEnv-pgi PrgEnv-gnu

## switch from GNU to PGI compiler
module switch PrgEnv-gnu PrgEnv-pgi

Toolchains

When you cross compile with CMake you supply a toolchain file. You use one toolchain file with the PGI compiler and a different one with GNU.

For more information see the CMake/CrayXT3 page.
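
A minimal toolchain file for the GNU environment might look like the sketch below. The compiler names and search paths are assumptions that vary by site; treat the CMake/CrayXT3 page as authoritative.

# Toolchain-Catamount-gcc.cmake -- a sketch only; verify against CMake/CrayXT3
SET(CMAKE_SYSTEM_NAME Catamount)

# the Cray compiler driver wrappers cross compile for the compute nodes
SET(CMAKE_C_COMPILER cc)
SET(CMAKE_CXX_COMPILER CC)

# look for headers and libraries in the catamount install tree,
# but never for programs (those must be native)
SET(CMAKE_FIND_ROOT_PATH $ENV{HOME}/install-catamount)
SET(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
SET(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
SET(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)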

Directory structure

Set up your directories however you'd like. Path names on this wiki page are usually given in two forms, a general form and an example form, where ~/ is your home directory:

General form                      Example form

<install-dir>                     ~/install
<catamount-install-dir>           ~/install-catamount
<toolchain-dir>                   ~/toolchains

<paraview-source-dir>             ~/projects/paraview/ParaView3
<paraview-native-build-dir>       ~/projects/paraview/build-native

...                               ...

Here is how my directory tree looks:

~/
  install/
    bin/
    include/
    lib/

  install-catamount/
    bin/
    include/
    lib/

  toolchains/

  projects/
    cmake/
      CMake/
      build/

    mesa/
      mesa-native/
      mesa-catamount/

    python/
      python-with-cmake/
      build-native/
      build-catamount/

    paraview/
      ParaView3/
      build-native/
      build-catamount/

Note that some of these directories are created automatically when you extract archives or check out code from CVS/Subversion. The install directories and their subdirectories are created when you run the "make install" commands. Here is a command you could use to set up the rest of the directory tree:

cd ~/
mkdir -p toolchains \
         projects/cmake/build \
         projects/mesa \
         projects/python/build-native projects/python/build-catamount \
         projects/paraview/build-native projects/paraview/build-catamount

Compiling CMake

CMake home page

Getting the source

You will need the latest version of CMake from CVS.

cd ~/projects/cmake
cvs -d :pserver:anonymous@www.cmake.org:/cvsroot/CMake login
## respond with password: cmake
cvs -d :pserver:anonymous@www.cmake.org:/cvsroot/CMake co CMake

Native build

It shouldn't matter which compiler you use to build CMake. I used the default PGI compiler.

General build command:

cd <cmake-build-dir>
<cmake-src-dir>/bootstrap --prefix=<native-install-dir>
make
make install

Example build command:

cd ~/projects/cmake/build
~/projects/cmake/CMake/bootstrap --prefix=~/install
make
make install

Compiling OSMesa

You will download the Mesa source code and compile the OSMesa target. OSMesa (off-screen Mesa) renders through the OpenGL API directly into main memory instead of using system display memory. The native build is only required if your front end system does not already have OSMesa installed. On Bigben, OSMesa was found at /usr/lib64/libOSMesa.so with headers in /usr/include.
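
You can check for an existing installation from the front end shell; the paths below are where Bigben keeps the library and header, and may differ on your system:

## look for an existing OSMesa library and header on the front end node
ls /usr/lib64/libOSMesa*
ls /usr/include/GL/osmesa.h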

Mesa home page.

Getting the source

You can download the Mesa source directly using wget. If the URL changes, check the Mesa download page.

cd ~/projects/mesa
wget http://easynews.dl.sourceforge.net/sourceforge/mesa3d/MesaLib-7.0.2.tar.gz
tar -zxf MesaLib-7.0.2.tar.gz

Native build

Use the PGI compiler. Since Mesa uses an in-source build you might want to copy the source dir before you start.

cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-native

## edit mesa-native/configs/default
##
## replace line:   INSTALL_DIR = /usr/local
## with:           INSTALL_DIR = ~/install
## or:             INSTALL_DIR = <native-install-dir>

cd mesa-native
make linux-osmesa
make install


Catamount build

Use the PGI compiler. Since Mesa uses an in-source build you might want to copy the source dir before you start.

cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-catamount

## edit mesa-catamount/configs/default
##
## replace line:   INSTALL_DIR = /usr/local
## with:           INSTALL_DIR = ~/install-catamount
## or:             INSTALL_DIR = <catamount-install-dir>

cd mesa-catamount
make catamount-osmesa-pgi
make install


Compiling Python

CMake files for building Python can be checked out from the ParaView repository. The native Python build is only required if your system doesn't already have Python libraries and binaries installed. On Bigben, Python was located at /usr/lib64/libpython2.3.so and /usr/bin/python2.3.
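
To see whether the front end node already has a usable Python, check for the binary and shared library; the paths below match Bigben:

## look for an existing python binary and shared library
which python
ls /usr/lib64/libpython*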

Getting the source

These instructions use Python from the Subversion repository. It is also possible to use the Python 2.5.1 release and apply a patch; more details are here. The following commands grab the Python source from Subversion and place it into a directory named python-with-cmake, then use CVS to download the CMake build files directly into that same directory.

cd ~/projects/python
svn co http://svn.python.org/projects/python/trunk python-with-cmake

cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co -d python-with-cmake ParaView3/Utilities/CMakeBuildForPython

Native build

You can use the PGI compiler.

General build command:

cd <python-build-native-dir>
<native-install-dir>/bin/ccmake <python-source-dir> -DCMAKE_INSTALL_PREFIX=<native-install-dir>

## configure with ccmake

make
make install

Example build command:

cd ~/projects/python/build-native
~/install/bin/ccmake ~/projects/python/python-with-cmake -DCMAKE_INSTALL_PREFIX=~/install

## configure with ccmake

make
make install


Catamount build

You can use the PGI compiler. When configuring with CMake:

  • Confirm all MODULE__*_SHARED options are off
  • Turn off MODULE__pwd_ENABLE
  • Turn off ENABLE_IPV6
  • Turn off WITH_THREAD

General build command:

cd <python-build-catamount-dir>
<native-install-dir>/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-gcc.cmake \
  -DCMAKE_INSTALL_PREFIX=<catamount-install-dir> \
  -C <python-source-dir>/CMake/TryRunResults-Python-catamount-gcc.cmake \
  <python-source-dir>

## configure with ccmake

make
make install

Example build command:

cd ~/projects/python/build-catamount

~/install/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-gcc.cmake \
  -DCMAKE_INSTALL_PREFIX=~/install-catamount \
  -C ~/projects/python/python-with-cmake/CMake/TryRunResults-Python-catamount-gcc.cmake \
  ~/projects/python/python-with-cmake

## configure with ccmake

make
make install
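
Most of the options above can also be set non-interactively as -D flags (the MODULE__*_SHARED family is still easiest to check in ccmake); a sketch, assuming the option names match what ccmake displays:

## non-interactive alternative to the ccmake session above
~/install/bin/cmake \
  -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-gcc.cmake \
  -DCMAKE_INSTALL_PREFIX=~/install-catamount \
  -DMODULE__pwd_ENABLE=OFF \
  -DENABLE_IPV6=OFF \
  -DWITH_THREAD=OFF \
  -C ~/projects/python/python-with-cmake/CMake/TryRunResults-Python-catamount-gcc.cmake \
  ~/projects/python/python-with-cmake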

Compiling ParaView3

Getting the source

cd ~/projects/paraview
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co ParaView3

Native build

You will build a native version of ParaView but do not need to install it. Use the PGI compiler. When configuring with ccmake:

  • turn on BUILD_SHARED_LIBS
  • turn on PARAVIEW_ENABLE_PYTHON
  • watch out for the X11 pitfalls described in the Pitfalls section below

General build command:

cd <paraview-native-build-dir>
<native-install-dir>/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 <paraview-source-dir>

## configure with ccmake

make

Example build command:

cd ~/projects/paraview/build-native

~/install/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 ~/projects/paraview/ParaView3

## configure with ccmake

make

Catamount build

When configuring with CMake:

  • turn on PARAVIEW_ENABLE_PYTHON
  • turn on PARAVIEW_USE_MPI
  • turn off VTK_USE_METAIO
  • confirm VTK_OPENGL_HAS_OSMESA is ON
  • confirm VTK_NO_PYTHON_THREADS is ON
  • confirm BUILD_SHARED_LIBS is OFF
  • confirm OSMESA_LIBRARY is the one you cross compiled and installed locally
  • confirm PYTHON_LIBRARY is the one you cross compiled and installed locally
  • set PYTHON_EXECUTABLE to a native Python binary, NOT a cross compiled one

General build command:

cd <paraview-catamount-build-dir>
<native-install-dir>/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-pgi.cmake \
  -DParaView3CompileTools_DIR=<paraview-native-build-dir> \
  -DPARAVIEW_BUILD_QT_GUI=0 \
  -C <paraview-source-dir>/CMake/TryRunResults-ParaView3-catamount-gcc.cmake \
  <paraview-source-dir>

## configure with ccmake

make

Example build command:

cd ~/projects/paraview/build-catamount
~/install/bin/ccmake \
  -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-pgi.cmake \
  -DParaView3CompileTools_DIR=~/projects/paraview/build-native \
  -DPARAVIEW_BUILD_QT_GUI=0 \
  -C ~/projects/paraview/ParaView3/CMake/TryRunResults-ParaView3-catamount-gcc.cmake \
  ~/projects/paraview/ParaView3

## configure with ccmake

make
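
As a quick sanity check, inspect the resulting binary from the front end shell. Catamount does not support shared libraries (which is why BUILD_SHARED_LIBS is OFF above), so the cross compiled pvbatch should be a statically linked executable:

## the catamount pvbatch should be reported as statically linked
file ~/projects/paraview/build-catamount/bin/pvbatch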

Testing

Here is a simple Python script, coloredSphere.py, to test pvbatch:

from paraview.servermanager import *

Connect()

sphere = sources.SphereSource()
sphere.ThetaResolution = 100
sphere.PhiResolution = 100
filter = filters.ProcessIdScalars()
filter.Input = sphere
view = CreateRenderView()

display = CreateRepresentation(filter, view)
lt = rendering.PVLookupTable()
display.LookupTable = lt
display.ColorAttributeType = 0  # Point Data
display.ColorArrayName = "ProcessId"
lt.RGBPoints = [0.0, 0, 0, 1, 1, 1, 0, 0]
lt.ColorSpace = 1  # HSV
view.StillRender()
view.ResetCamera()
view.StillRender()

view.WriteImage("/usr/users/6/bgeveci/coloredSphere.png", "vtkPNGWriter")

Note that the script writes its output file, coloredSphere.png, to an absolute path; change that path to a directory you can write to. The script could be run with the command:

mpirun -np 2 pvbatch coloredSphere.py

But on Bigben you do not enter the mpirun command directly. Instead, Bigben wraps all jobs in a job script. On Bigben, a job script coloredSphere.job might look like this:

#!/bin/sh
#PBS -l size=2
#PBS -l walltime=30
#PBS -j oe
#PBS -q debug

set echo

pbsyod -size $PBS_O_SIZE ${HOME}/projects/paraview/build-catamount/bin/pvbatch ${HOME}/coloredSphere.py

The script is submitted by typing:

qsub coloredSphere.job

You can check the status of submitted jobs by typing:

qstat -a

More information about running jobs on Bigben can be found in PSC's Bigben documentation.

Pitfalls

X11 Pitfalls

The native operating system may or may not have X11 installed. If X11 is not found, you need to compile the native ParaView build with OSMesa support.

To enable OSMesa:

  • erase the contents of OPENGL_gl_LIBRARY, make it an empty string
  • confirm OSMESA_LIBRARY is found
  • confirm OSMESA_INCLUDE_DIR is found

Even if you think you are using OSMesa and not X11, CMake might still try to use X11 libraries if it finds them. Even when the cache lists the X11 libraries as not found, the CHECK_FUNCTION_EXISTS macro might locate the functions and decide to use them.

You can fix this by:

  • Set internal CMakeCache variable CMAKE_USE_GLX_PROC_ADDRESS to 0
  • Fix the checks in <paraview-source-dir>/VTK/Rendering/CMakeLists.txt so that VTK_NO_EXTENSION_LOADING gets set to 1.
  • Confirm that <paraview-build-dir>/VTK/Rendering/vtkOpenGLExtensionManagerConfigure.h has VTK_NO_EXTENSION_LOADING defined and all other definitions are commented out or undefined.
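
The first fix, together with emptying OPENGL_gl_LIBRARY as described earlier, can be applied on the ccmake command line when configuring the native build; a sketch, with cache type suffixes shown explicitly and paths assuming the example layout:

## force an OSMesa-only native configure; the other fixes require
## editing files as described above
~/install/bin/ccmake \
  -DOPENGL_gl_LIBRARY:STRING="" \
  -DCMAKE_USE_GLX_PROC_ADDRESS:INTERNAL=0 \
  -DPARAVIEW_BUILD_QT_GUI=0 \
  ~/projects/paraview/ParaView3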

TODO

Add toolchain files.

Explain why TryRunResults-ParaView3-catamount-gcc.cmake has 'gcc' in the filename and not 'pgi'.

Add these lines to TryRunResults-ParaView3-catamount-gcc.cmake:

SET(HAVE_PTHREAD_H FALSE CACHE BOOL "No usable pthread.h" FORCE)
SET(CMAKE_HAVE_PTHREAD_H FALSE CACHE BOOL "No usable pthread.h" FORCE)

Fix problem: a native Python binary built with gcc cannot execute while the PGI compiler environment is loaded (that is, while you are building ParaView).

Perhaps the X11 pitfalls section needs more details.