From NWChem
Viewed 3647 times, with a total of 2 posts

Marcin (2:56:57 PM PST - Tue, Feb 4th 2014)
I'm trying to build a non-MPI version of NWChem on CentOS 6 i386:
 $ rpm -q gcc-gfortran openmpi atlas
 gcc-gfortran-4.4.7-4.el6.i686
 openmpi-1.5.4-2.el6.i686
 atlas-3.8.4-2.el6.i686
 $  uname -rvm
 2.6.32-431.3.1.el6.i686 #1 SMP Fri Jan 3 18:53:30 UTC 2014 i686
 
 
 cd /tmp
su -c "yum -y install http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm"
su -c "yum -y install wget gcc-gfortran openmpi-devel python-devel atlas-devel"
wget http://www.nwchem-sw.org/download.php?f=Nwchem-src-2014-01-28.tar.gz -O Nwchem-src-2014-01-28.tar.gz
tar zxf Nwchem-src-2014-01-28.tar.gz
cd nwchem-src-2014-01-28/src
sh ../../compile.sh 2>&1| tee ../build.log 
 with the following /tmp/compile.sh (works with nwchem-6.3.revision2-src.2013-10-17):
 
 
 export NWCHEM_TOP=/tmp/nwchem-src-2014-01-28
export NWCHEM_TARGET=LINUX
export CC=gcc
export FC=gfortran
export LARGE_FILES=TRUE
export USE_NOFSCHECK=TRUE
export PYTHONHOME=/usr
export PYTHONVERSION=2.7
export PYTHONLIBTYPE=so
export HAS_BLAS=yes
export BLASOPT='-L/usr/lib/atlas -lf77blas -lcblas -latlas'
export MAKE=/usr/bin/make
export ARMCI_NETWORK=SOCKETS
export TCGSSH=ssh
$MAKE nwchem_config NWCHEM_MODULES="all python" 2>&1 | tee ../make_nwchem_config_serial.log
export MAKEOPTS=""
$MAKE ${MAKEOPTS} 2>&1
 Result:
 
 
 ./util/util_nwchem_version.bash
which: no svn in (/usr/lib/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
Making libraries in tools
   You must set USE_MPI and
   have a working MPI installation
   to compile NWChem

Huub (NWChem developer; forum admin/moderator) - 4:30:09 PM PDT - Tue, Apr 8th 2014

Hi Marcin,
 Indeed, non-MPI builds are no longer supported. At the moment there are only a few changes that genuinely require MPI, and they could easily be undone manually. Going forward, however, we will not support any non-MPI builds. NWChem is explicitly meant to be a parallel code, and MPI is the de facto standard library for distributed-memory parallel programming, so we now explicitly require this library to be present.
 
 Best wishes,
 
 Huub

Reply (5:27:27 AM PST - Sun, Feb 15th 2015)

Quote (Huub, Apr 8th 3:30 pm):
 Indeed non-MPI builds are no longer supported. At the moment there are only a few changes that really require MPI and they could easily be undone manually. However, going forward we will not support any non-MPI builds.
 Well, the documentation page for the latest version, NWChem 6.5 (as of Feb 15th, 2015), seriously needs updating.
 
 Excerpt from the current documentation page:
 
     MPI variables needed to compile. For a single processor system, these environment variables do not have to be defined.

 USE_MPI      Set to "y" to indicate that NWChem should be compiled with MPI
 USE_MPIF     Set to "y" for the NWPW module to use Fortran bindings of MPI (generally set when USE_MPI is set)
 USE_MPIF4    Set to "y" for the NWPW module to use Integer*4 Fortran bindings of MPI (generally set when USE_MPI is set on most platforms)
 LIBMPI       Name of the MPI library that should be linked with -l (e.g. -lmpich)
 MPI_LIB      Directory where the MPI library resides
 MPI_INCLUDE  Directory where the MPI include files reside
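As a minimal sketch only, the variables listed in that excerpt might be filled in as follows for an OpenMPI build like the one in the first post. The install prefix `/usr/lib/openmpi` and the OpenMPI 1.5 library names are assumptions about the CentOS 6 openmpi-devel package layout, not values from the documentation; adjust them for your system.

```shell
# Hedged example values for the documented MPI variables.
# ASSUMPTIONS: /usr/lib/openmpi prefix and OpenMPI 1.5 Fortran
# library names; verify both against your own installation.
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
export LIBMPI="-lmpi_f90 -lmpi_f77 -lmpi"   # assumed OpenMPI library names
export MPI_LOC=/usr/lib/openmpi             # assumed install prefix
export MPI_LIB=$MPI_LOC/lib
export MPI_INCLUDE=$MPI_LOC/include
```

These would be added to a wrapper script such as the compile.sh shown in the first post, before `$MAKE` is invoked.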
 
 I am on a single-processor system, and I have the full Intel Cluster Studio stack (which provides icc, icpc, ifort, MKL, etc.) installed.
 
 Since the way forward is parallel, the USE_MPI variable should be scrapped. Instead, a working MPI installation should be a base requirement that is checked when executing make.
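A pre-flight check of the kind suggested here could be sketched as below. The function name `check_mpi` and the list of wrapper names are my own invention, not anything NWChem's build actually runs; it simply fails early when no MPI Fortran compiler wrapper is on PATH.

```shell
# Hypothetical pre-flight check (not part of NWChem's makefiles):
# succeed if any common MPI Fortran wrapper is found on PATH,
# otherwise print an error and return nonzero so make can stop early.
check_mpi() {
    for wrapper in mpif90 mpifort mpiifort; do
        if command -v "$wrapper" >/dev/null 2>&1; then
            echo "found MPI Fortran wrapper: $wrapper"
            return 0
        fi
    done
    echo "error: no MPI Fortran wrapper (mpif90/mpifort/mpiifort) on PATH" >&2
    return 1
}
```

In a wrapper script like the compile.sh above, one could then run `check_mpi || exit 1` before invoking `$MAKE`.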