
charm - Re: [charm] AMPI porting I/O issue



  • From: Nikhil Jain <nikhil AT illinois.edu>
  • To: <m.pavanakumar AT gmail.com>
  • Cc: "charm AT cs.illinois.edu" <charm AT cs.illinois.edu>, "Kale, Laxmikant V" <kale AT illinois.edu>
  • Subject: Re: [charm] AMPI porting I/O issue
  • Date: Wed, 20 Feb 2013 14:58:33 -0600
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/charm/>
  • List-id: CHARM parallel programming system <charm.cs.uiuc.edu>

Hi Pavanakumar,

Can you provide more details on the compilation issues you are facing when building HDF5 with AMPI? I just tried building HDF5 with AMPI and saw some of the issues you may have encountered; however, with some tweaks to HDF5's build script/Makefile, I think HDF5 can be built with AMPI.

At the very least, I will mention that AMPI ships with a version of ROMIO that can be built by passing --with-romio as a build flag. I just checked in a fix for a scenario that was previously preventing this flag from taking effect.
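[Editor's note: a minimal sketch of the build sequence implied above. The platform triple, paths, and the idea of pointing HDF5's configure at AMPI's ampicc wrapper are illustrative assumptions, not from the original mail; only the --with-romio flag is taken from it.]

```shell
# Build AMPI with its bundled ROMIO (platform triple and -j8 are
# examples; substitute your own target and options).
./build AMPI netlrts-linux-x86_64 --with-romio -j8

# Assumption: HDF5's parallel build can then be pointed at AMPI's
# compiler wrapper, e.g.:
#   CC=/path/to/charm/bin/ampicc ./configure --enable-parallel
```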

--Nikhil


--
Nikhil Jain, nikhil AT illinois.edu, http://charm.cs.uiuc.edu/people/nikhil
Doctoral Candidate @ CS, UIUC
 


From: "Kale, Laxmikant V" <kale AT illinois.edu>
Date: Tuesday, February 19, 2013 12:53 PM
To: Nikhil Jain <nikhil AT illinois.edu>
Subject: FW: [charm] AMPI porting I/O issue
Resent-From: Nikhil Jain <nikhil AT illinois.edu>
Resent-Date: Tue, 19 Feb 2013 12:54:40 -0600


-- 
Laxmikant (Sanjay) Kale         http://charm.cs.uiuc.edu
Professor, Computer Science     kale AT illinois.edu
201 N. Goodwin Avenue           Ph:  (217) 244-0094
Urbana, IL  61801-2302          FAX: (217) 265-6582

On 2/19/13 12:35 AM, "Pavanakumar Mohanamuraly" <m.pavanakumar AT gmail.com> wrote:

Hi

1) I am porting one of our in-house CFD solvers (MPI-based) to AMPI.

2) The solver uses a single network-mounted file for mesh and solution data.

3) The individual MPI processes (based on local decomposition information) read/write to this single file in parallel.

4) I am unable to successfully compile the HDF5 library (v1.8.10-p1) with AMPI and hence cannot use HDF5 file I/O.

Can someone suggest a workaround?
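[Editor's note: a hypothetical sketch of the I/O pattern described in points 2-3, using plain MPI-IO (which AMPI's bundled ROMIO provides once built with --with-romio). The file name "mesh.dat" and the equal-split layout are illustrative assumptions; the real solver derives each rank's offset from its decomposition data.]

```c
/* Each rank reads its own disjoint slice of one shared file via
 * MPI-IO. File name and layout are illustrative, not from the
 * original mail. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "mesh.dat",
                  MPI_MODE_RDONLY, MPI_INFO_NULL, &fh);

    /* Stand-in for the solver's decomposition: an equal byte split. */
    MPI_Offset fsize;
    MPI_File_get_size(fh, &fsize);
    MPI_Offset chunk  = fsize / nranks;
    MPI_Offset offset = (MPI_Offset)rank * chunk;

    char *buf = malloc((size_t)chunk);

    /* Collective read: all ranks participate, each at its own offset. */
    MPI_File_read_at_all(fh, offset, buf, (int)chunk,
                         MPI_BYTE, MPI_STATUS_IGNORE);

    MPI_File_close(&fh);
    free(buf);
    MPI_Finalize();
    return 0;
}
```

Under AMPI this compiles with the ampicc wrapper instead of mpicc; the MPI-IO calls themselves are unchanged.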

Regards,

--
Pavanakumar Mohanamuraly



