- From: Phil Miller <mille121 AT illinois.edu>
- To: Dominik Heller <dominik.heller1 AT gmail.com>
- Cc: "charm AT cs.uiuc.edu" <charm AT cs.uiuc.edu>
- Subject: Re: [charm] Building PETSc on top of Charm++'s AMPI
- Date: Fri, 30 Jan 2015 12:36:10 -0600
- List-archive: <http://lists.cs.uiuc.edu/pipermail/charm/>
- List-id: CHARM parallel programming system <charm.cs.uiuc.edu>
I tried to do this several years ago, as part of a side project, but put it on the back burner after only a little bit of progress. Here are the (very terse) notes I made at the time:
./configure --with-cc=$CB/ampicc --with-cxx=$CB/ampicxx --without-fc --CFLAGS='-default-to-aout -G' --CXXFLAGS='-default-to-aout -G' --CC_LINKER_FLAGS="-default-to-aout,-G"
- Modified the configure system to use int main(int argc, char** argv) instead of int main(void), for compatibility with AMPI_Main's prototype (a minimal sketch of the two-argument form follows these notes)
- Huge volume of warnings about redefinition of MPI_* functions between our mpi.h and PETSc's headers. Disabled the call-counting instrumentation in the ifdef to avoid this (an illustrative sketch of that kind of macro collision also follows the notes)
- Disabled MPI_Init_thread usage due to lack of a definition of MPI_THREAD_FUNNELED (a possible fallback guard is sketched after the notes)
- Disabled MPI_Win_create in packages/MPI.py, since it's not fully implemented
- Basic configuration test in MPI.py modified to look for AMPI_Init and AMPI_Comm_create, with no options for additional libraries. Otherwise, it would try to pick up the system's libmpi.a, which was causing all sorts of horrible things.
- A single-rank test passes, the first 2-rank test fails.
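To illustrate the first note, here is a minimal sketch of the two-argument entry point that configure's test programs need; the body is just a placeholder, not PETSc's actual test source:

    #include <mpi.h>

    /* AMPI maps main onto AMPI_Main, whose prototype takes argc/argv,
       so configure's test programs need the two-argument form. */
    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        /* ... body of whatever the configure test checks ... */
        MPI_Finalize();
        return 0;
    }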
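The redefinition warnings are, presumably, the usual clash between two headers that both define MPI_* names as macros. Roughly this shape, with made-up names rather than PETSc's or AMPI's actual definitions:

    /* Hypothetical call-counting wrapper of the sort a logging layer might
       use. A function-like macro may mention the function of the same name
       in its replacement without recursing, so this bumps a counter and then
       performs the real call. If mpi.h has already defined MPI_Send as a
       macro, this second definition triggers a redefinition warning. */
    #define MPI_Send(buf, count, type, dest, tag, comm) \
        (send_counter++, MPI_Send((buf), (count), (type), (dest), (tag), (comm)))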
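For the MPI_Init_thread point, rather than removing the call outright, a guard along these lines would also work. This is only a sketch, and it assumes MPI_THREAD_FUNNELED is exposed as a preprocessor macro (some MPI headers define it as an enum value, in which case a configure-time test is needed instead):

    #if defined(MPI_THREAD_FUNNELED)
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    #else
        /* AMPI (at the time) did not define MPI_THREAD_FUNNELED, so fall
           back to plain MPI_Init. */
        MPI_Init(&argc, &argv);
    #endif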
If this is of substantial interest to you, we can probably arrange to have a member of our lab work with you a bit further to get PETSc working on AMPI.
On Fri, Jan 30, 2015 at 11:20 AM, Dominik Heller <dominik.heller1 AT gmail.com> wrote:
Hi,
I'm trying to get PETSc to build on top of Charm++'s Adaptive MPI.
So far I've had no success trying a variety of parameters for the PETSc configuration script.
Will this be possible at all, given that AMPI implements only a subset of MPI 1.1 and brings its own suite of compile scripts as well as charmrun?
Thanks,
Dominik