From: Jason Duell <jcduell_at_lbl_dot_gov>
Date: Thu May 27 2004 - 10:26:02 PDT
On Wed, May 26, 2004 at 04:16:23PM +0000, Ben Allan wrote:
> setenv MPI_CC /usr/bin/gcc
> setenv MPI_CFLAGS "-I/usr/local/lampi/include"
> setenv MPI_LIBS /usr/local/lampi/lib/libmpi.a
>
> we get
> "checking for working MPI configuration... no"
Hi Ben,
Could you do a couple of quick things for me?
1) Download the 'stable' build from our website, and see if it builds:
    http://mantis.lbl.gov/upc/berkeley_upc-stable.tar.gz
   You're better off with our stable build than the 1.1.0 release
   anyway: at this point it handles a wider range of UPC code more
   reliably than our official release (which we're going to update
   soon).
2) Email us the 'config.log' file in the 'gasnet' subdirectory of your
   build tree (NOT the one in the top-level directory).  It will contain
   any compiler/linker errors from the failed attempt to use your
   MPI_CC, etc., to compile a trivial MPI program (just MPI_Init and
   MPI_Finalize; see the sketch after this list).  If you're feeling
   helpful, you could even take a
   quick peek at the error message (just search for 'working MPI' and
   the error message(s) should follow soon after), and tell me if it
   looks like anything obvious, given your MPI compiler and system.
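For reference, here's a minimal sketch of the kind of trivial test
program configure tries to build (the exact source it generates may
differ slightly, but it's just MPI_Init plus MPI_Finalize):

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);   /* bring up the MPI runtime */
        MPI_Finalize();           /* shut it down cleanly */
        return 0;
    }

You can also reproduce the check by hand with your settings, roughly
(mpitest.c here is just a stand-in name for the file above):

    /usr/bin/gcc -I/usr/local/lampi/include mpitest.c \
        /usr/local/lampi/lib/libmpi.a

Any compile or link errors you see should match what ends up in
config.log -- one common culprit is an MPI library that needs extra
libraries on the link line beyond libmpi.a itself.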
I don't know if we've built against LAMPI before, so this is probably
just a minor glitch somewhere.
Cheers,
-- 
Jason Duell             Future Technologies Group
<jcduell_at_lbl_dot_gov>       Computational Research Division
Tel: +1-510-495-2354    Lawrence Berkeley National Laboratory