Re: [charm] Building PETSc on top of Charm++'s AMPI


  • From: Dominik Heller <dominik.heller1 AT gmail.com>
  • To: Phil Miller <mille121 AT illinois.edu>
  • Cc: "charm AT cs.uiuc.edu" <charm AT cs.uiuc.edu>, Andrew Seidl <aseidl AT gmail.com>
  • Subject: Re: [charm] Building PETSc on top of Charm++'s AMPI
  • Date: Fri, 30 Jan 2015 14:53:43 -0600
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm/>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

Hi Phil,
I'm ccing Andrew, who is working on this as well; it'd be great to work on this together.

I've tried a few things, but I'm not that familiar with the whole build process and seem to be struggling with some basic steps.

Running
./configure --with-cc=$CB/ampicc --with-cxx=$CB/ampicxx --without-fc --CFLAGS='-default-to-aout -G' --CXXFLAGS='-default-to-aout -G' --CC_LINKER_FLAGS="-default-to-aout,-G"
fails on checkSharedLinker (even with --known-mpi-shared-libraries=0)
===============================================================================
             Configuring PETSc to compile on your system                       
===============================================================================
TESTING: checkSharedLinker from config.setCompilers(config/BuildSystem/config/se
*******************************************************************************
                    UNABLE to EXECUTE BINARIES for ./configure
-------------------------------------------------------------------------------
[Errno 2] No such file or directory: '/tmp/petsc-vyJhGC/config.setCompilers/libconftest.so'
*******************************************************************************

I also tried pointing --with-mpi-lib/--with-mpi-include at the charm directories, but to no avail.
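Perhaps skipping the shared-library build and the binary-execution checks entirely would get past this; I haven't tried it yet, but something along the lines of

./configure --with-cc=$CB/ampicc --with-cxx=$CB/ampicxx --without-fc --with-shared-libraries=0 --with-batch=1

(--with-shared-libraries=0 and --with-batch are standard PETSc configure options; whether they interact sanely with AMPI's compiler and linker wrappers is exactly what I'm unsure about.)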

Thanks,
Dominik


On 30.01.2015 at 12:36, Phil Miller wrote:
I tried to do this several years ago, as part of a side project, but put it on the back burner after only a little bit of progress. Here are the (very terse) notes I made at the time:
  • ./configure --with-cc=$CB/ampicc --with-cxx=$CB/ampicxx --without-fc --CFLAGS='-default-to-aout -G' --CXXFLAGS='-default-to-aout -G' --CC_LINKER_FLAGS="-default-to-aout,-G"
  • Modified the configure system to use int main(int argc, char** argv) instead of int main(void), for compatibility with AMPI_Main's prototype (see the sketch after this list)
  • Huge volume of warnings about redefinition of MPI_* functions between our mpi.h and PETSc's headers. Disabled the call-counting instrumentation in the ifdef to avoid this
  • Disabled MPI_Init_thread usage due to lack of definition of MPI_THREAD_FUNNELED
  • Disabled MPI_Win_create in packages/MPI.py, since it's not fully implemented
  • Basic configuration test in MPI.py modified to look for AMPI_Init and AMPI_Comm_create, with no options for additional libraries. Otherwise, it would try to pick up the system's libmpi.a, which was causing all sorts of horrible things.
  • A single-rank test passes; the first 2-rank test fails.
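For the main-signature and MPI_Init_thread items above, the shape of the change was roughly the following (a simplified sketch, not the actual edits I made to PETSc's configure tests):

#include <mpi.h>

/* AMPI runs each rank's user code through AMPI_Main, whose prototype takes
 * argc/argv, so the configure test programs need the full signature rather
 * than int main(void). */
int main(int argc, char **argv)
{
#if defined(MPI_THREAD_FUNNELED)
    /* Only take the MPI_Init_thread path when the implementation defines
     * MPI_THREAD_FUNNELED as a macro (AMPI did not at the time; the MPI
     * standard does not require it to be a macro at all). */
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
#else
    MPI_Init(&argc, &argv);
#endif
    MPI_Finalize();
    return 0;
}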

If this is of substantial interest to you, we can probably arrange to have a member of our lab work with you a bit further to get PETSc working on AMPI.



On Fri, Jan 30, 2015 at 11:20 AM, Dominik Heller <dominik.heller1 AT gmail.com> wrote:
Hi,

I'm trying to get PETSc to build on top of Charm++'s Adaptive MPI.
So far I've had no success trying a variety of parameters for the PETSc configuration script.
Will this be possible at all, given that AMPI implements only a subset of MPI 1.1 and brings its own suite of compile scripts as well as charmrun?
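For reference, a plain AMPI hello-world builds and runs for me roughly like this ($CB is my shorthand for Charm++'s bin directory, and +vp sets the number of virtual MPI ranks):

$CB/ampicc -o hello hello.c
./charmrun +p4 ./hello +vp16

so presumably PETSc's whole configure/make/test cycle would have to go through these wrappers and through charmrun.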

Thanks,
Dominik
_______________________________________________
charm mailing list
charm AT cs.uiuc.edu
http://lists.cs.uiuc.edu/mailman/listinfo/charm




