BLAS, LAPACK not found while compiling NWchem 6.5 against Intel MKL on a LINUX cluster
From NWChem
Viewed 448 times, With a total of 12 Posts
Clicked A Few Times (Threads 1, Posts 11)
5:02:23 AM PDT - Mon, Jul 6th 2015
Hi there,
I am struggling to get NWChem compiled on a Linux cluster using Intel's MKL (Composer XE 2013 SP1) because it does not find the BLAS/LAPACK libraries.
Here are some of the Intel/MPI environment variables as seen by the compilation script:
%%%
python -V
Python 2.7.8 :: Anaconda 2.1.0 (64-bit)
ifort --version
ifort (IFORT) 14.0.0 20130728
Copyright (C) 1985-2013 Intel Corporation. All rights reserved.
echo $MKLROOT
/opt/intel/composer_xe_2013_sp1.0.080/mkl
mpirun --version
Intel(R) MPI Library for Linux* OS, Version 4.1 Update 1 Build 20130522
Copyright (C) 2003-2013, Intel Corporation. All rights reserved.
echo $I_MPI_ROOT
/opt/intel//impi/4.1.1.036
echo $PATH
/opt/intel//impi/4.1.1.036/intel64/bin:/opt/intel/composer_xe_2013_sp1.0.080/bin/intel64:/opt/intel/composer_xe_2013_sp1.0.080/mpirt/bin/intel64:/opt/intel/composer_xe_2013_sp1.0.080/debugger/gdb/intel64_mic/py27/bin:/opt/intel/composer_xe_2013_sp1.0.080/debugger/gdb/intel64/py27/bin:/opt/intel/composer_xe_2013_sp1.0.080/bin/intel64:/opt/intel/composer_xe_2013_sp1.0.080/bin/intel64_mic:/opt/intel/composer_xe_2013_sp1.0.080/debugger/gui/intel64:/usr/local/applications/nanoscopium/anaconda/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/applications/ansys_inc/v145/ansys/bin:/usr/local/applications/Fiji.app:/usr/local/applications/Wolfram/Mathematica/9.0/bin:/usr/local/applications/Matlab/2013a/bin:/usr/local/applications/opera/Opera_17/bin:/usr/local/applications/wien2k/14.2:/usr/local/applications/wien2k/14.2/SRC_structeditor/bin:/usr/local/applications/wien2k/14.2/SRC_IRelast/script-elastic:.:/usr/local/applications/wien2k/14.2:.:/usr/local/applications/quantumwise/VNL-ATK-2014.2/bin
echo $LD_LIBRARY_PATH
/opt/intel//impi/4.1.1.036/intel64/lib:/opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/mpirt/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/ipp/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/mkl/lib/intel64:/opt/intel/composer_xe_2013_sp1.0.080/tbb/lib/intel64/gcc4.4:/usr/local/applications/opera/mesa-7.10.1/lib64::/usr/local/applications/ansys_inc/v145/ansys/lib/linx64:/usr/local/applications/opera/Opera_17/lib:/usr/lib/
%%%
The linking options were provided by the Intel® Math Kernel Library Link Line Advisor:
export BLASOPT="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
export SCALAPACK=" -L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
and if I look at the ${MKLROOT}/lib/intel64 directory, I see the libraries in question:
[diaz-ortiz@idai nwchem-6.5]$ls -1 ${MKLROOT}/lib/intel64/ | grep ilp64
libfftw3x_cdft_ilp64.a
libmkl_blacs_ilp64.a
libmkl_blacs_intelmpi_ilp64.a
libmkl_blacs_intelmpi_ilp64.so*
libmkl_blacs_openmpi_ilp64.a
libmkl_blacs_sgimpt_ilp64.a
libmkl_blas95_ilp64.a
libmkl_gf_ilp64.a
libmkl_gf_ilp64.so*
libmkl_intel_ilp64.a
libmkl_intel_ilp64.so*
libmkl_lapack95_ilp64.a
libmkl_scalapack_ilp64.a
libmkl_scalapack_ilp64.so*
with DDOT in
[diaz-ortiz@idai nwchem-6.5]$nm ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.so |grep DDOT
00000000000e7904 T DDOT
00000000000e3450 T DDOTI
However, while compiling I keep getting this message:
configure: WARNING: BLAS library not found, using internal BLAS
configure: WARNING: LAPACK library not found, using internal LAPACK
with the corresponding compile-time error:
tce_residual_t1.F(176): error #6404: This name does not have a type, and must have an explicit type. [DDOT]
residual = ddot(size,dbl_mb(k_r1),1,dbl_mb(k_r1),1)
compilation aborted for tce_residual_t1.F (code 1)
Any ideas what might be the cause of my troubles?
Thanks.
Alejandro
5:13:21 AM PDT - Mon, Jul 6th 2015
Compilation script
Here is the script I am using (in my failed attempts) to compile NWChem 6.5:
export NWCHEM_TOP=/work/informatique/sr/diaz-ortiz/Programs_Users/nwchem-6.5
export NWCHEM_TARGET=LINUX64
export ARMCI_NETWORK="MPI-MT"
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
export MPI_LOC="${I_MPI_ROOT}"
export MPI_LIB="${I_MPI_ROOT}/intel64/lib -L${I_MPI_ROOT}/intel64/lib"
export MPI_INCLUDE="${I_MPI_ROOT}/intel64/include"
export LIBMPI="-lmpigf -lmpi -lmpigi -ldl -lrt -lpthread"
export NWCHEM_MODULES="all python"
export USE_NOFSCHECK=TRUE
export USE_NOIO=TRUE
export LARGE_FILES=TRUE
export MRCC_THEORY=TRUE
export PYTHONHOME=/usr/local/applications/nanoscopium/anaconda
export PYTHONVERSION=2.7
export USE_PYTHON64=y
export PYTHONLIBTYPE=so
export HAS_BLAS=yes
export USE_SCALAPACK=y
export BLASOPT="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
export SCALAPACK=" -L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
export FC=ifort
export CC=icc
cd $NWCHEM_TOP/src
make realclean
cd $NWCHEM_TOP/src
make nwchem_config
make FC=ifort CC=icc FOPTIONS="-i8 -I${MKLROOT}/include" FOPTIMIZE=-O3
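A side check worth sketching here: with -i8 in FOPTIONS, the BLASOPT line must reference the 8-byte-integer (ilp64) MKL libraries, not lp64, or BLAS calls such as DDOT will be compiled against the wrong integer width. The helper below is hypothetical (the function name `ilp64_consistent` is made up, and it only does string matching on the two variables); it is not part of the NWChem build system.

```shell
# ilp64_consistent: returns 0 (true) when the default-integer compiler
# flag and the MKL interface layer agree, 1 otherwise. A hypothetical
# helper (pure string matching), not part of the NWChem build system.
ilp64_consistent() {
  foptions=$1
  blasopt=$2
  case "$foptions" in
    *-i8*)  # 8-byte integers: the ilp64 MKL interface is required
      case "$blasopt" in *ilp64*) return 0 ;; *) return 1 ;; esac ;;
    *)      # 4-byte integers: the lp64 MKL interface is required
      case "$blasopt" in *ilp64*) return 1 ;; *) return 0 ;; esac ;;
  esac
}

# Example with values matching the script above:
if ilp64_consistent "-i8 -I/opt/intel/mkl/include" "-lmkl_intel_ilp64 -lmkl_core"; then
  echo "integer width and MKL interface agree"
fi
```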
Edoapra (Forum Admin, Forum Mod, bureaucrat, sysop; Forum Vet, Threads 4, Posts 936)
10:33:20 AM PDT - Mon, Jul 6th 2015
Please upload the file $NWCHEM_TOP/src/tools/build/config.log to a website where we can read it
7:03:09 AM PDT - Fri, Jul 10th 2015
Also, I forgot to add that the DDOT error disappeared once I started with a fresh 64-bit installation. However, the MKL BLAS/LAPACK are still never found during the build.
Edoapra
10:27:25 AM PDT - Fri, Jul 10th 2015
The configure script cannot find the BLAS library because your BLASOPT definition includes ScaLAPACK, and that definition requires MPI libraries for a successful link.
export BLASOPT="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
My suggestion is to use the following BLASOPT definition (which I got from the same Intel Link Line Advisor website you mentioned, by unselecting the ScaLAPACK component):
export BLASOPT="-Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_sequential.a -Wl,--end-group -lpthread -lm"
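The reasoning can be sketched like this: configure links a small BLAS test program against $BLASOPT alone, and any library in BLASOPT that itself needs MPI symbols (BLACS, ScaLAPACK) makes that probe fail to link, so configure falls back to the internal BLAS. The helper below is a hypothetical diagnostic (the name `blasopt_needs_mpi` is made up), not something configure actually runs.

```shell
# blasopt_needs_mpi: true when a BLASOPT string references libraries
# that themselves depend on MPI (BLACS, ScaLAPACK). A hypothetical
# diagnostic sketch; configure does nothing this explicit.
blasopt_needs_mpi() {
  case "$1" in
    *blacs*|*scalapack*) return 0 ;;
    *) return 1 ;;
  esac
}

# The original, failing BLASOPT trips the check:
if blasopt_needs_mpi "-lmkl_scalapack_ilp64 -lmkl_blacs_intelmpi_ilp64"; then
  echo "BLASOPT pulls in MPI-dependent libraries; keep them in SCALAPACK only"
fi
```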
2:29:58 AM PDT - Mon, Jul 13th 2015
Hi,
Thanks for your reply and the suggestion for BLASOPT. I will try it immediately.
I will let you know how it goes.
Alejandro
2:34:33 AM PDT - Mon, Jul 13th 2015
Hi,
I selected ScaLAPACK because I want to build NWChem with MPI (Intel MPI, to be precise). Should I understand that unselecting ScaLAPACK, while still defining all the MPI-related variables, will build an NWChem binary with MPI support?
Best,
Alejandro
Quote: Edoapra (Jul 10th 5:27 pm): The configure script cannot find the BLAS library because your BLASOPT definition includes ScaLAPACK, and that definition requires MPI libraries for a successful link.
export BLASOPT="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_core -lmkl_sequential -lmkl_blacs_intelmpi_ilp64 -lpthread -lm"
My suggestion is to use the following BLASOPT definition (which I got from the same Intel Link Line Advisor website you mentioned, by unselecting the ScaLAPACK component):
export BLASOPT="-Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_sequential.a -Wl,--end-group -lpthread -lm"
2:38:29 AM PDT - Mon, Jul 13th 2015
While building NWChem with my initial options, it failed to find the MKL BLAS, used the internal NWChem BLAS instead, and finished the compilation without errors. The binaries it produced are
$NWCHEM_TOP/bin/$NWCHEM_TARGET/nwchem
$NWCHEM_TOP/bin/$NWCHEM_TARGET/depend.x
Notice that there is no "parallel" file.
The following static libraries were produced in
$NWCHEM_TOP/lib/$NWCHEM_TARGET/
libanalyze.a libcphf.a libetrans.a libnwctask.a libnwxc.a libproperty.a libspace.a
libband.a libdangchang.a libgradients.a libnwcutil.a libofpw.a libpspw.a libstepper.a
libblas.a libddscf.a libguess.a libnwdft.a liboptim.a libqhop.a libtce.a
libbq.a libdntmc.a libhessian.a libnwints.a libpaw.a libqmd.a libvib.a
libcafe.a libdplot.a liblapack.a libnwmd.a libpeigs.a libqmmm.a libvscf.a
libccca.a libdrdy.a libmcscf.a libnwpw.a libperfm.a librimp2.a
libccsd.a libdriver.a libmoints.a libnwpwlib.a libpfft.a libselci.a
libcons.a libesp.a libmp2.a libnwpython.a libprepar.a libsolvation.a
Is this the expected output for both bin and lib?
Best,
Alejandro
Edoapra
11:52:26 AM PDT - Mon, Jul 13th 2015
|
Yes,
The files you report are the ones one expects to see at the end of a successful compilation.
The "parallel" file is no longer used, since NWChem is launched with mpirun like any other MPI-based application.
Quote: Alexdiazortiz (Jul 13th 1:38 am): While building NWChem with my initial options, it failed to find the MKL BLAS, used the internal NWChem BLAS instead, and finished the compilation without errors. The binaries it produced are
$NWCHEM_TOP/bin/$NWCHEM_TARGET/nwchem
$NWCHEM_TOP/bin/$NWCHEM_TARGET/depend.x
Notice that there is no "parallel" file.
The following static libraries were produced in
$NWCHEM_TOP/lib/$NWCHEM_TARGET/
libanalyze.a libcphf.a libetrans.a libnwctask.a libnwxc.a libproperty.a libspace.a
libband.a libdangchang.a libgradients.a libnwcutil.a libofpw.a libpspw.a libstepper.a
libblas.a libddscf.a libguess.a libnwdft.a liboptim.a libqhop.a libtce.a
libbq.a libdntmc.a libhessian.a libnwints.a libpaw.a libqmd.a libvib.a
libcafe.a libdplot.a liblapack.a libnwmd.a libpeigs.a libqmmm.a libvscf.a
libccca.a libdrdy.a libmcscf.a libnwpw.a libperfm.a librimp2.a
libccsd.a libdriver.a libmoints.a libnwpwlib.a libpfft.a libselci.a
libcons.a libesp.a libmp2.a libnwpython.a libprepar.a libsolvation.a
Is this the expected output for both bin and lib?
Best,
Alejandro
6:33:27 AM PDT - Fri, Jul 24th 2015
Hi Edoapra,
I re-built NWChem with the MKL options you suggested. This time it found BLAS and ScaLAPACK, but not LAPACK (!):
configure: Checks for BLAS,LAPACK,ScaLAPACK
configure:
configure: Attempting to locate BLAS library
checking for BLAS with user-supplied flags... yes
configure: Attempting to locate LAPACK library
configure: WARNING: LAPACK library not found, using internal LAPACK
configure: Attempting to locate SCALAPACK library
checking for SCALAPACK with user-supplied flags... yes
checking whether SCALAPACK implements pdsyevr... yes
configure:
I have put the install script, the output of make, and the config.log at the links below.
If you would be so kind as to look at them, maybe you can spot what's wrong with my MKL parametrization.
Thanks.
Alejandro
https://owncloud.synchrotron-soleil.fr/public.php?service=files&t=dfecb4e425055a062229...
https://owncloud.synchrotron-soleil.fr/public.php?service=files&t=a662f4c312908cb2cee9...
https://owncloud.synchrotron-soleil.fr/public.php?service=files&t=072118b75713903466dd...
Edoapra
9:45:21 AM PDT - Fri, Jul 24th 2015
|
Don't worry about lapack. This failure is not going to have any impact on the rest of the NWChem compilation.
4:59:24 AM PDT - Mon, Jul 27th 2015
Hi Edoapra,
Thanks for the feedback on LAPACK. Indeed, the compilation of NWChem passed OK even with the LAPACK warning.
However, the build seems to have some problems, depending on the host (maybe it is only one problem that manifests differently depending on the host configuration).
If I try to run NWChem on a host with RHEL5
Linux isei 2.6.18-194.11.4.el5 #1 SMP Fri Sep 17 04:57:05 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux
the run-time error I get is
[diaz-ortiz@isei mytests]$nwchem dft_feco5.nw
nwchem: /lib64/libc.so.6: version `GLIBC_2.7' not found (required by nwchem)
If on the other hand I submit the job to a host with CentOS6
Linux isei113.hpc 2.6.32-504.1.3.el6.centos.plus.x86_64 #1 SMP Tue Nov 11 23:01:20 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
the run-time error I get is
[diaz-ortiz@isei113 mytests]$ nwchem dft_feco5.nw
argument 1 = dft_feco5.nw
0:ARMCI is built w/ ARMCI_NETWORK=MPI_MT but the provided MPI threading level is MPI_THREAD_SINGLE not MPI_THREAD_MULTIPLE: 1
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
nwchem 0000000003642769 Unknown Unknown Unknown
nwchem 00000000036410E0 Unknown Unknown Unknown
nwchem 00000000035F0212 Unknown Unknown Unknown
nwchem 00000000035986B3 Unknown Unknown Unknown
nwchem 000000000359E81B Unknown Unknown Unknown
libpthread.so.0 00002AFDA7F21710 Unknown Unknown Unknown
libc.so.6 00002AFDA94B1E2C Unknown Unknown Unknown
libc.so.6 00002AFDA94B35E0 Unknown Unknown Unknown
libc.so.6 00002AFDA94AE21E Unknown Unknown Unknown
libc.so.6 00002AFDA94B918A Unknown Unknown Unknown
nwchem 0000000003432002 Unknown Unknown Unknown
nwchem 000000000343315D Unknown Unknown Unknown
nwchem 000000000341CF90 Unknown Unknown Unknown
nwchem 000000000050A119 Unknown Unknown Unknown
nwchem 0000000000509F46 Unknown Unknown Unknown
libc.so.6 00002AFDA9488D5D Unknown Unknown Unknown
nwchem 0000000000509E39 Unknown Unknown Unknown
In the case of the RHEL5 host, I understand the error message, because
[diaz-ortiz@isei001 mytests]$ldd --version
ldd (GNU libc) 2.5
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
However, in the case of the CentOS6 host:
[diaz-ortiz@isei113 mytests]$ ldd --version
ldd (GNU libc) 2.12
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
The GLIBC should be OK.
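A small sketch for double-checking this kind of mismatch: extract the highest GLIBC_x.y symbol version a binary requires and compare it with the host's libc version from ldd. `max_glibc_required` is a made-up helper name; the parsing is plain text processing on `objdump -T` output.

```shell
# max_glibc_required: read objdump -T output on stdin and print the
# highest GLIBC_x.y version tag referenced. A hypothetical helper.
max_glibc_required() {
  grep -o 'GLIBC_[0-9][0-9.]*' | sed 's/^GLIBC_//' \
    | sort -t. -k1,1n -k2,2n -k3,3n | tail -n 1
}

# Usage on the failing binary:
#   objdump -T nwchem | max_glibc_required
# The RHEL5 error message above implies this prints at least 2.7,
# which is above RHEL5's glibc 2.5 but below CentOS6's 2.12.
```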
Is this error on the CentOS6 host telling me that I should redefine the OpenMP variables?
For instance, like this
export MKL_NUM_THREADS=4
export MKL_DYNAMIC=FALSE
export OMP_NUM_THREADS=1
I look forward to your reply. Thanks.
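On the CentOS6 side, the ARMCI message itself points away from OpenMP: ARMCI was built with ARMCI_NETWORK=MPI-MT, which needs the MPI library to grant MPI_THREAD_MULTIPLE, but the linked Intel MPI only provided MPI_THREAD_SINGLE. One possible direction (an assumption based on Intel MPI shipping a separate thread-safe library, not verified against this build) is to link the thread-safe libmpi_mt instead of libmpi in LIBMPI:

```shell
# Hypothetical variant of the LIBMPI setting from the build script:
# Intel MPI ships a thread-safe library (libmpi_mt) intended to support
# MPI_THREAD_MULTIPLE, whereas plain libmpi may only offer
# MPI_THREAD_SINGLE. Unverified for this particular installation.
export LIBMPI="-lmpigf -lmpi_mt -lmpigi -ldl -lrt -lpthread"
```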