charm AT lists.siebelschool.illinois.edu
Subject: Charm++ parallel programming system
List archive
- From: Phil Miller <mille121 AT illinois.edu>
- To: Mustafa Abdul Jabbar <musbar AT gmail.com>
- Cc: "charm AT cs.uiuc.edu" <charm AT cs.uiuc.edu>
- Subject: Re: [charm] Synchronized SDAG entry
- Date: Thu, 20 Mar 2014 07:45:29 -0500
- List-archive: <http://lists.cs.uiuc.edu/pipermail/charm/>
- List-id: CHARM parallel programming system <charm.cs.uiuc.edu>
As a quick expedient, you could use CkCallbackResumeThread, either directly with the all-to-all routine (possibly via a reduction from that array once all elements are done) or with quiescence detection (QD).
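A minimal sketch of the two options (the names bodyArray and the CkCallback parameter on allToAllBodyCount are assumptions, not from the quoted code; the caller must itself be a [threaded] entry method so it can suspend):

```cpp
// Inside a [threaded] entry method. The temporary CkCallbackResumeThread
// blocks this thread, in its destructor, until the callback it hands out
// has been invoked (e.g. by a reduction over the whole array).
bodyArray.allToAllBodyCount(CkCallbackResumeThread());
// ...execution resumes here once every element has contributed...

// Quiescence-detection variant: resume once no messages are in flight
// anywhere in the system (coarser, but needs no per-array bookkeeping):
CkStartQD(CkCallbackResumeThread());
```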
On Thu, Mar 20, 2014 at 4:07 AM, Mustafa Abdul Jabbar <musbar AT gmail.com> wrote:
Hello,

I am porting an MPI-based code to a Charm++-based one without major code reforms. The problem is that, from within the array element, I want to do a blocking all-to-all communication without using callbacks to signal completion, because those would make the code a big mess given its current structure. I have the following SDAG entry:

entry void allToAllBodyCount() {
  atomic {
    transportBodyCountToAll(); // will broadcast to all nodes
  }
  for (count = 0; count < numChares; ++count) {
    when transportBodyCount(int bodyCount, int sender) atomic {
      processBodyCount(bodyCount, sender);
    }
  }
}

I am calling allToAllBodyCount() locally from the array element, and I want to wait until I have processed the incoming messages. I have read about [sync] methods, but there is no evidence that an SDAG method can be made sync, since SDAG translates the code above into a sort of future completion, as I noticed in the generated header file.

Is there any workaround?
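One way to realize the reply's suggestion on this entry (a sketch, not tested code: the CkCallback parameter and the final contribute are additions to the quoted SDAG) is to pass a completion callback in and fire it via a reduction after the when-loop, so a [threaded] caller blocking on CkCallbackResumeThread resumes exactly when all messages have been processed:

```
// .ci sketch: same SDAG body as quoted above, plus a completion callback.
entry void allToAllBodyCount(CkCallback done) {
  atomic {
    transportBodyCountToAll();          // broadcast to all elements
  }
  for (count = 0; count < numChares; ++count) {
    when transportBodyCount(int bodyCount, int sender) atomic {
      processBodyCount(bodyCount, sender);
    }
  }
  atomic {
    contribute(done);                   // all messages processed on this element
  }
}
```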
_______________________________________________
charm mailing list
charm AT cs.uiuc.edu
http://lists.cs.uiuc.edu/mailman/listinfo/charm
- [charm] Synchronized SDAG entry, Mustafa Abdul Jabbar, 03/20/2014
- Re: [charm] Synchronized SDAG entry, Phil Miller, 03/20/2014
- Re: [charm] Synchronized SDAG entry, Nikhil Jain, 03/20/2014
- Re: [charm] Synchronized SDAG entry, Phil Miller, 03/20/2014