
charm - Re: [charm] Is AMPI support MPI_Waitall?



  • From: 张凯 <zhangk1985 AT gmail.com>
  • To: Phil Miller <mille121 AT illinois.edu>
  • Cc: charm AT cs.uiuc.edu
  • Subject: Re: [charm] Is AMPI support MPI_Waitall?
  • Date: Fri, 29 Jan 2010 01:30:29 +0800
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

I think you have reproduced the problem I ran into.

Thanks for your reply.

Best regards,

Zhang Kai

2010/1/29 Phil Miller <mille121 AT illinois.edu>

> On Thu, Jan 28, 2010 at 07:50, 张凯 <zhangk1985 AT gmail.com> wrote:
> > Hi:
> >
> > I am a beginner with AMPI and am trying to run an MPI program using it,
> > but I have found a small problem.
> >
> > Here
> > (http://www.mcs.anl.gov/research/projects/mpi/usingmpi/examples/advmsg/nbodypipe_c.htm)
> > you can find an example MPI program. I have successfully built and run
> > it using both MPICH and Intel MPI.
> >
> > However, when I run it with AMPI, the program blocks in the MPI_Waitall
> > call and never returns.
> >
> > I just ran it with the ++local +p2 +vp2 options. Did I miss other
> > options, or misconfigure AMPI?
>
> I'm seeing the same effect as you describe on a net-linux-x86_64 build
> of AMPI from the latest charm sources. We'll look into this and get
> back to you.
>
> For reference, the attached code (with added prints) produces the
> following:
>
> $ ./charmrun nbp +vp 4 20 +p4
> Charm++: scheduler running in netpoll mode.
> Charm++> cpu topology info is being gathered.
> Charm++> Running on 1 unique compute nodes (8-way SMP).
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9
> Iteration 9:0 a
> Iteration 9:0 b
> Iteration 9:0 b
> Iteration 9:0 b
> Iteration 9:0 b
>
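For reference, the communication pattern in the linked nbodypipe example boils down to a nonblocking ring exchange completed by MPI_Waitall. Below is a minimal sketch of that pattern; the buffer size, tag, and variable names are illustrative and this is not the attached code, but it exercises the same MPI_Irecv/MPI_Isend/MPI_Waitall sequence in which the hang was observed.

    /* Minimal sketch of a pipelined ring exchange completed by MPI_Waitall.
     * Buffer size, tag, and names are illustrative, not the attached code. */
    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    #define N 64  /* illustrative buffer size */

    int main(int argc, char *argv[])
    {
        int rank, size, pipe;
        double sendbuf[N], recvbuf[N];
        MPI_Request reqs[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int right = (rank + 1) % size;
        int left  = (rank + size - 1) % size;

        for (int i = 0; i < N; i++) sendbuf[i] = rank;

        /* Circulate the buffer around the ring; the hang reported in this
         * thread shows up in the MPI_Waitall call below. */
        for (pipe = 0; pipe < size - 1; pipe++) {
            MPI_Irecv(recvbuf, N, MPI_DOUBLE, left,  0, MPI_COMM_WORLD, &reqs[0]);
            MPI_Isend(sendbuf, N, MPI_DOUBLE, right, 0, MPI_COMM_WORLD, &reqs[1]);
            MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
            memcpy(sendbuf, recvbuf, N * sizeof(double));
        }

        printf("rank %d done\n", rank);
        MPI_Finalize();
        return 0;
    }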



