charm AT lists.siebelschool.illinois.edu
Subject: Charm++ parallel programming system
List archive
- From: "Ortega, Bob" <bobo AT mail.smu.edu>
- To: Ronak Buch <rabuch2 AT illinois.edu>
- Cc: "charm AT lists.cs.illinois.edu" <charm AT lists.cs.illinois.edu>, Nitin Bhat <nitin AT hpccharm.com>
- Subject: Re: [charm] FW: Projections
- Date: Thu, 10 Dec 2020 20:31:43 +0000
- Accept-language: en-US
Ronak,
Thank you for the quick reply.
Well, I’m using srun to run NAMD. Here’s the command:
date;time srun -n 36 -N 2 -p fp-gpgpu-3 --mem=36GB ./namd2.prj stmv/stmv.namd >namd2.prj.fp-gpgpu-3.6.log;date
How can I submit a similar charmrun command targeting 36 processors, 2 nodes, the fp-gpgpu-3 partition, 36GB of memory, and a +logsize of 10000000?
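To make the question concrete, here is roughly the batch script I have in mind; the charmrun line is just my guess at the equivalent, and I’m not sure whether charmrun picks up the Slurm allocation automatically with this build:
#!/bin/bash
#SBATCH -N 2                 # 2 nodes
#SBATCH -n 36                # 36 tasks total
#SBATCH -p fp-gpgpu-3        # partition
#SBATCH --mem=36GB           # memory per node
date
time ./charmrun +p36 ./namd2.prj stmv/stmv.namd +logsize 10000000 > namd2.prj.fp-gpgpu-3.6.log
date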
Oh, I’m not getting the exception anymore, and unfortunately I didn’t log that run’s output to a file.
If it occurs again, I’ll forward the log file.
Thanks, Bob
From: Ronak Buch <rabuch2 AT illinois.edu>
Hi Bob,
Regarding the +logsize parameter: it is a runtime parameter, not a compile-time parameter, so you shouldn't add it to the Makefile; instead, add it to your run command (e.g. ./charmrun +p2 ./namd <namd input file name> +logsize 10000000).
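(As I recall, +logsize just controls how many Projections trace entries each PE buffers in memory before flushing them to its .log file, so a larger value mostly trades memory for fewer flushes during the run.)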
Regarding the exception you're seeing, I'm not sure why that's happening; it's likely due to some issue during initialization. Would it be possible for you to share the generated logs for debugging?
Thanks, Ronak
On Thu, Dec 10, 2020 at 12:36 PM Ortega, Bob <bobo AT mail.smu.edu> wrote:
- [charm] FW: Projections, Ortega, Bob, 12/10/2020
- Re: [charm] FW: Projections, Ronak Buch, 12/10/2020
- Re: [charm] FW: Projections, Ortega, Bob, 12/10/2020
- Re: [charm] FW: Projections, Ronak Buch, 12/11/2020
- Re: [charm] FW: Projections, Ortega, Bob, 12/14/2020
- Re: [charm] FW: Projections, Ronak Buch, 12/11/2020
- Re: [charm] FW: Projections, Ronak Buch, 12/14/2020
- Re: [charm] FW: Projections, Ortega, Bob, 12/14/2020
- [charm] FW: FW: Projections, Ortega, Bob, 12/15/2020