(no subject)

From: luxingjing (xingjinglu_at_gmail_dot_com)
Date: Tue Oct 28 2008 - 15:42:02 PST

  • Next message: Brett Worth: "Re: Problem with "--without-translator""
    Paul,
    
      Thank you for your reply. I have tried your advice, but the problem is
    still there.
    
    For 1), I added GASNET_MPI_SPAWNER=ssh in upc.sh, but it did not solve the
    problem; and if I use vapi-conduit it causes another problem: the vapi
    library is missing.
    
     
    
    For 2), I changed MPIRUN_CMD=/home/autopar/mpich/mpirun -N %M %P ..... in
    GASNET_MPI.pl under the directories /upc/dbg/bin and /upc/opt/bin, but it
    made no difference.
    
        And I have also tried installing UPC with the vapi network, but the
    same problem remains.
    
         
    
         Now, I guess the problem may be caused by MPICH.
    
     
    
    When I install UPC, these are the steps:
    
    #!/bin/sh
    
    export PATH=/home/autopar/mpich2/bin:$PATH
    
    export LD_LIBRARY_PATH=/home/autopar/mpich2/lib:$LD_LIBRARY_PATH
    
    bash
    
     
    
    ./configure CC=gcc CXX=g++ MPI_CC=mpicc --enable-mpi --prefix=/home/autopar/upc
    
     
    
     
    
    Then make and make install.
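
    Put together, the sequence above as a single script is roughly the
    following (a sketch that just restates the lines above, nothing new):

    #!/bin/sh
    # make the MPICH2 installation visible to the build
    export PATH=/home/autopar/mpich2/bin:$PATH
    export LD_LIBRARY_PATH=/home/autopar/mpich2/lib:$LD_LIBRARY_PATH
    # configure, build, and install Berkeley UPC with MPI support
    ./configure CC=gcc CXX=g++ MPI_CC=mpicc --enable-mpi --prefix=/home/autopar/upc
    make
    make install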
    
    Looking forward to your reply.
    
    ----------------------------------------------------------------------------------------
    
    Eric,
    
     
    
     
    
    1) Lines 3-6 of the warnings tell you that you should be looking to use the
    native "vapi-conduit" for communication, not MPI. I am not sure why it is
    not getting used by default, but passing "-network vapi" to upcc should
    ensure it gets used. You may also want/need to set GASNET_VAPI_SPAWNER=ssh
    in your environment, because the rest of your problem seems linked to
    mpirun (which is the default way to launch processes with vapi-conduit).
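
    For example, something along these lines (untested here, and the exact
    spelling of the network flag, -network vapi vs. -network=vapi, may differ
    on your installation):

    export GASNET_VAPI_SPAWNER=ssh        # have vapi-conduit spawn via ssh rather than mpirun
    upcc -network=vapi -T=32 hello.c -o hello
    upcrun -N 4 hello                     # UPC_NODES in your environment names the candidate nodes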
    
     
    
    2) The first warning tells you we can't control process layout with your
    mpirun. You should consider leaving off the '-N 4' when you run, to rid
    yourself of the warning. The problem of getting the processes to run on the
    right nodes should be solved in one of two ways:
    
    2a) Switch to vapi-conduit and set GASNET_VAPI_SPAWNER=ssh in your
    environment, as mentioned above,
    
    or
    
    2b) Figure out what environment variables or command-line flags are needed
    to get a "hello world" MPI application to run the way you want, then set
    MPIRUN_CMD as needed. Once you know how to run MPI apps the way you want,
    we can help with the MPIRUN_CMD details.
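
    As a rough sketch only (the exact mpirun flags depend on your MPICH build;
    the machinefile name "machines" and the program "mpi_hello" are just
    placeholders, and %N/%P/%A are assumed to be the usual upcrun substitutions
    for process count, program, and arguments):

    # step 1: get a plain MPI hello world onto the right nodes,
    # e.g. with a machinefile listing node17 node18 node19 node20
    /home/autopar/mpich/mpirun -np 4 -machinefile machines ./mpi_hello

    # step 2: once that works, mirror the same command in MPIRUN_CMD
    export MPIRUN_CMD='/home/autopar/mpich/mpirun -np %N -machinefile machines %P %A'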
    
     
    
    Let us know if you need more help.
    
     
    
    -Paul
    
     
    
     
    
    luxingjing wrote:
    
    > 
    
    > Hi,
    
    > 
    
    > After I installed UPC 2.6.0, with MPICH 1.2.7 as the network,
    
    > 
    
    > I compile and run the program as follows:
    
    > 
    
    > upcc -T=32 hello.c -o hello
    
    > 
    
    > upcrun -N 4 hello
    
    > 
    
    > But in fact, all the threads are laid out on one node only. My
    > environment settings are shown below:
    
    > 
    
    > export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/autopar/mpich1.2.7/lib:/home/autopar/upc_mpi1/dbg/lib:/home/autopar/upc_mpi1/opt/lib
    
    > 
    
    > export PATH=/home/autopar/mpich1.2.7/bin:/home/autopar/upc_mpi1/bin:$PATH
    
    > 
    
    > export UPC_NODES="node17 node18 node19 node20"
    
    > 
    
    > And the warnings are:
    
    > 
    
    > WARNING: Don't know how to control process->node layout with your mpirun
    
    > 
    
    > WARNING: PROCESS LAYOUT MIGHT NOT MATCH YOUR REQUEST
    
    > 
    
    > WARNING: Using GASNet's mpi-conduit, which exists for portability
    
    > convenience.
    
    > 
    
    > WARNING: Support was detected for native GASNet conduits: vapi
    
    > 
    
    > WARNING: You should *really* use the high-performance native GASNet
    
    > conduit
    
    > 
    
    > WARNING: if communication performance is at all important in this
    
    > program run.
    
    > 
    
    > UPCR: UPC thread 31 of 32 on gnode20 (process 31 of 32, pid=19952)
    
    > 
    
    > UPCR: UPC thread 2 of 32 on gnode20 (process 2 of 32, pid=19718)
    
    > 
    
    > UPCR: UPC thread 7 of 32 on gnode20 (process 7 of 32, pid=19758)
    
    > 
    
    > UPCR: UPC thread 15 of 32 on gnode20 (process 15 of 32, pid=19822)
    
    > 
    
    > UPCR: UPC thread 27 of 32 on gnode20 (process 27 of 32, pid=19920)
    
    > 
    
    > UPCR: UPC thread 29 of 32 on gnode20 (process 29 of 32, pid=19936)
    
    > 
    
    > UPCR: UPC thread 25 of 32 on gnode20 (process 25 of 32, pid=19904)
    
    > 
    
    > UPCR: UPC thread 13 of 32 on gnode20 (process 13 of 32, pid=19806)
    
    > 
    
    > UPCR: UPC thread 24 of 32 on gnode20 (process 24 of 32, pid=19894)
    
    > 
    
    > UPCR: UPC thread 22 of 32 on gnode20 (process 22 of 32, pid=19878)
    
    > 
    
    > UPCR: UPC thread 23 of 32 on gnode20 (process 23 of 32, pid=19886)
    
    > 
    
    > UPCR: UPC thread 21 of 32 on gnode20 (process 21 of 32, pid=19870)
    
    > 
    
    > UPCR: UPC thread 9 of 32 on gnode20 (process 9 of 32, pid=19774)
    
    > 
    
    > UPCR: UPC thread 19 of 32 on gnode20 (process 19 of 32, pid=19854)
    
    > 
    
    > UPCR: UPC thread 11 of 32 on gnode20 (process 11 of 32, pid=19790)
    
    > 
    
    > UPCR: UPC thread 8 of 32 on gnode20 (process 8 of 32, pid=19766)
    
    > 
    
    > UPCR: UPC thread 20 of 32 on gnode20 (process 20 of 32, pid=19862)
    
    > 
    
    > UPCR: UPC thread 26 of 32 on gnode20 (process 26 of 32, pid=19912)
    
    > 
    
    > UPCR: UPC thread 18 of 32 on gnode20 (process 18 of 32, pid=19846)
    
    > 
    
    > UPCR: UPC thread 28 of 32 on gnode20 (process 28 of 32, pid=19928)
    
    > 
    
    > UPCR: UPC thread 0 of 32 on gnode20 (process 0 of 32, pid=19706)
    
    > 
    
    > UPCR: UPC thread 12 of 32 on gnode20 (process 12 of 32, pid=19798)
    
    > 
    
    > UPCR: UPC thread 1 of 32 on gnode20 (process 1 of 32, pid=19710)
    
    > 
    
    > UPCR: UPC thread 17 of 32 on gnode20 (process 17 of 32, pid=19838)
    
    > 
    
    > UPCR: UPC thread 30 of 32 on gnode20 (process 30 of 32, pid=19944)
    
    > 
    
    > UPCR: UPC thread 6 of 32 on gnode20 (process 6 of 32, pid=19750)
    
    > 
    
    > UPCR: UPC thread 5 of 32 on gnode20 (process 5 of 32, pid=19742)
    
    > 
    
    > UPCR: UPC thread 14 of 32 on gnode20 (process 14 of 32, pid=19814)
    
    > 
    
    > UPCR: UPC thread 10 of 32 on gnode20 (process 10 of 32, pid=19782)
    
    > 
    
    > UPCR: UPC thread 4 of 32 on gnode20 (process 4 of 32, pid=19734)
    
    > 
    
    > UPCR: UPC thread 16 of 32 on gnode20 (process 16 of 32, pid=19830)
    
    > 
    
    > UPCR: UPC thread 3 of 32 on gnode20 (process 3 of 32, pid=19726)
    
    > 
    
    > Hello World from Thread 0 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 30 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 2 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 25 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 23 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 1 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 24 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 10 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 4 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 14 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 26 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 9 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 31 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 5 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 7 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 8 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 15 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 20 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 18 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 17 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 27 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 16 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 6 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 22 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 21 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 12 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 11 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 13 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 3 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 28 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 29 (of 32 THREADS)
    
    > 
    
    > Hello World from Thread 19 (of 32 THREADS)
    
    > 
    
    > Hoping for your help!
    
    > 
    
    > Thank you!
    
    > 
    
    > Yours, Eric.
    
    > 
    
     
    
     
    
    -- 
    
    Paul H. Hargrove                          PHHargrove_at_lbl_dot_gov
    
    Future Technologies Group                 
    
    HPC Research Department                   Tel: +1-510-495-2352
    
    Lawrence Berkeley National Laboratory     Fax: +1-510-486-6900
    
     
    
     
    
     
    
