######################################
# NAS Parallel Benchmarks 2          #
# MPI/F77/C                          #
# Revision 2.3        10/30/97       #
# NASA Ames Research Center          #
# [email protected]                   #
# http://www.nas.nasa.gov/NAS/NPB/   #
######################################

==========================================
INSTALLATION

    For documentation on installing and running the NAS Parallel
    Benchmarks, refer to subdirectory Doc.

==========================================
BACKGROUND

Information on NPB 2.3, including the technical report, the          
original specifications, source code, results and information        
on how to submit new results, is available at:                       
                                                                     
       http://www.nas.nasa.gov/NAS/NPB/                              
                                                                     
Send comments or suggestions to  [email protected]                    
Send bug reports to              [email protected]               
                                                                     
      NAS Parallel Benchmarks Group                                  
      NASA Ames Research Center                                      
      Mail Stop: T27A-1                                              
      Moffett Field, CA   94035-1000                                 
                                                                     
      E-mail:  [email protected]                                      
      Fax:     (415) 604-3957                                        
                                                                     

============================================

NPB 2.X PLANS

The NAS Parallel Benchmark 2 Team plans a staged release of modified
and new benchmarks in the coming months. These new releases correct
deficiencies in the preliminary release, 2.0. We do not intend to make
more releases beyond those mentioned here unless there is a
substantial problem.

Summary of (planned) releases:

     2.1 - New LU and BT.
     2.2 - IS in C.
         - EP.
         - New FT with a 2-D decomposition, plus additional optimizations.
     2.3 - CG Benchmark.

Status of NPB 2.3 codes:
 - We consider SP, BT, LU, MG, EP, FT, and IS completely stable.
 - CG is now in "Beta" release.

The accompanying report specifies the CLASS C test cases for both 
NPB 1.0 and NPB 2.x, including verification numbers.

=============================

Changes in 2.3

- Added CG

Changes in 2.3b3 (Oct '97): rhs.f changed in BT, so that the inner
loops run in the direction of the first index.
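The motivation is Fortran's column-major array layout: when the
innermost loop runs over the first array index, memory is accessed
with unit stride.  Below is a minimal, self-contained sketch of that
idea only; it is not the actual rhs.f, and the array names and sizes
are invented for illustration.

c     sketch only -- not the NPB rhs.f; names and sizes are invented
      program loopdir
      implicit none
      integer nx, ny, nz
      parameter (nx=16, ny=16, nz=16)
      double precision u(nx,ny,nz), rhs(nx,ny,nz)
      integer i, j, k

c     initialize the working array
      do k = 1, nz
         do j = 1, ny
            do i = 1, nx
               u(i,j,k) = dble(i + j + k)
            end do
         end do
      end do

c     innermost loop over the first index i: with Fortran's
c     column-major storage this gives unit-stride (contiguous) access
      do k = 1, nz
         do j = 1, ny
            do i = 1, nx
               rhs(i,j,k) = 2.0d0 * u(i,j,k)
            end do
         end do
      end do

      write(*,*) 'rhs(1,1,1) = ', rhs(1,1,1)
      end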

=============================

Changes in 2.2

- Added IS in C
- Modified FT so that it can be run on a number of processors
  larger than the last array dimension (see the sketch after this list)
- Added EP 
- Encoded CLASS C size and verification numbers
- Removed include file mpifrag.f containing executable statements
- Added different versions of random number generators
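
For context on the FT change above: with a 1-D (slab) decomposition
over the last array dimension, at most nz processes can be used; a
2-D (pencil) decomposition over the last two dimensions raises that
limit to ny*nz.  The following sketches only the process-grid
arithmetic behind that idea; it is not the actual FT source, and all
names and sizes are invented for illustration.

c     sketch only -- not the NPB FT source; names and sizes invented
      program pencil
      implicit none
      integer nx, ny, nz, np, p1, p2
      parameter (nx=64, ny=64, nz=64)
      parameter (np=128)

c     np exceeds nz, so a 1-D decomposition over nz is impossible;
c     split np into a 2-D grid p1 x p2 instead (p1 near sqrt(np))
      p1 = int(sqrt(dble(np)))
 10   if (mod(np, p1) .ne. 0) then
         p1 = p1 - 1
         goto 10
      end if
      p2 = np / p1

c     each process then owns a pencil of the full array
      write(*,*) 'process grid:', p1, ' x', p2
      write(*,*) 'local pencil:', nx, ' x', ny/p1, ' x', nz/p2
      end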

=============================

Changes in 2.1

- Modified SP to report more accurate flop count
- Modified BT to report correct flop count (bt.f)
- Modified BT (x_solve.f) so that the 5x5 block tridiagonal routines
  are unrolled
- Modified LU routines blts.f and buts.f to do column-based relaxation
  (rather than diagonal-based), and routine exchange_1.f to communicate at
  the edges of each processor's subgrid (rather than at the ends of each
  subsequent diagonal).  The result is significantly less communication.
- Modified LU routines blts.f and buts.f by unrolling the loops of the
  5x5 block tridiagonal sections (see the sketch after this list)
- Added comments in config/NAS examples
- Removed an unnecessary echo from sys/make.common
- Added mailing and e-mail addresses for benchmark output in print_results.f
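
To illustrate the kind of unrolling referred to above, here is a
minimal sketch of a 5x5 block operation written first with an inner
loop and then with that loop written out explicitly.  It is not taken
from blts.f or buts.f; the names are invented for illustration.

c     sketch only -- not the NPB LU source; names are invented
      program unroll
      implicit none
      double precision a(5,5), v(5), r(5), rolled(5)
      integer i, j

c     fill a small 5x5 system
      do j = 1, 5
         do i = 1, 5
            a(i,j) = dble(i + 5*j)
         end do
         v(j) = dble(j)
      end do

c     rolled form: inner loop over the 5 block components
      do i = 1, 5
         rolled(i) = 0.0d0
         do j = 1, 5
            rolled(i) = rolled(i) + a(i,j) * v(j)
         end do
      end do

c     unrolled form: the j loop written out explicitly, as was done
c     for the 5x5 block tridiagonal sections
      do i = 1, 5
         r(i) = a(i,1)*v(1) + a(i,2)*v(2) + a(i,3)*v(3)
     >        + a(i,4)*v(4) + a(i,5)*v(5)
      end do

      write(*,*) 'rolled(1)   = ', rolled(1)
      write(*,*) 'unrolled(1) = ', r(1)
      end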

============================