
RE: [charm] Scalable Deep Learning using Charm++


  • From: "Galvez Garcia, Juan Jose" <jjgalvez AT illinois.edu>
  • To: Abid Muslim Malik <abidmuslim AT gmail.com>
  • Cc: "charm AT lists.cs.illinois.edu" <charm AT lists.cs.illinois.edu>
  • Subject: RE: [charm] Scalable Deep Learning using Charm++
  • Date: Thu, 19 Jul 2018 02:03:28 +0000

Yes, it should be possible to integrate with TensorFlow, at least in the way that others have done it so far (like MPI for Python and Ray), because CharmPy provides the capabilities that those frameworks rely on.
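
For reference, one common way mpi4py-based distributed TensorFlow setups do this is a per-step allreduce that averages gradients across workers. Here is a minimal sketch of that pattern with mpi4py; the NumPy array is just a placeholder for the gradients a TensorFlow optimizer would produce, and the script name is hypothetical:

    # Minimal sketch of data-parallel gradient averaging with mpi4py.
    # A NumPy array stands in for the gradients a TensorFlow optimizer
    # would compute on each worker's shard of data.
    # Run with e.g.: mpirun -np 4 python allreduce_sketch.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Placeholder for locally computed gradients.
    local_grads = np.random.rand(1024)

    # Sum the gradients from all workers, then divide by the number of
    # workers; every replica ends up applying the same averaged update.
    avg_grads = np.empty_like(local_grads)
    comm.Allreduce(local_grads, avg_grads, op=MPI.SUM)
    avg_grads /= size

    if rank == 0:
        print("averaged gradient norm:", np.linalg.norm(avg_grads))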

 

Compared to MPI, CharmPy has added benefits, especially its asynchronous execution model and adaptive runtime features, which allow for things like dynamic load balancing and fault tolerance. So potentially more interesting things could be done.
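
To illustrate the asynchronous execution model, here is a minimal sketch written against the charm4py API (the name the project later took); the charmpy API of the time may differ slightly. Each call on a chare proxy is an asynchronous remote invocation, and ret=True hands back a future:

    # Minimal sketch of asynchronous remote invocation with charm4py
    # (formerly charmpy). One Worker chare is created per processor;
    # method calls on the proxy are asynchronous, and ret=True returns
    # a future instead of blocking.
    # Run with e.g.: python -m charmrun.start +p4 async_sketch.py
    from charm4py import charm, Chare, Group

    class Worker(Chare):
        def work(self, x):
            # place per-worker computation (e.g. a local training step) here
            return x * x

    def main(args):
        workers = Group(Worker)  # one Worker chare per PE
        futures = [workers[i].work(i, ret=True) for i in range(charm.numPes())]
        print([f.get() for f in futures])  # wait for all results
        charm.exit()

    charm.start(main)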

 

CharmPy also has better performance and scalability than the other high-level Python frameworks we have looked at. I don't know if you have explored Dask or Ray. We recently ran simple benchmarks, and CharmPy was substantially faster and more scalable than both. I can give you more details/numbers if you are interested.

 

-Juan

 

From: Abid Muslim Malik
Sent: Wednesday, July 18, 2018 8:16 PM
To: Galvez Garcia, Juan Jose
Cc: charm AT lists.cs.illinois.edu
Subject: Re: [charm] Scalable Deep Learning using Charm++

 

Hi Juan:
Thanks for the e-mail. We are exploring different parallel programming models for scalable machine learning. I think Charm++ has better runtime support and the ability to scale well.
Can we integrate it with TensorFlow? People use mpi4py for distributed TensorFlow. Can I use CharmPy for distributed TensorFlow training?

Thanks-

On Wed, Jul 18, 2018 at 4:32 PM, Galvez Garcia, Juan Jose <jjgalvez AT illinois.edu> wrote:

Hello Abid,

 

That is something we are interested in, and which we want to explore with CharmPy (https://github.com/UIUC-PPL/charmpy) and Charm++, but we don’t currently have a project. If you are interested, we would be happy to collaborate.

 

-Juan

 

From: Abid Muslim Malik
Sent: Tuesday, July 17, 2018 9:05 AM
To: charm AT lists.cs.illinois.edu
Subject: [charm] Scalable Deep Learning using Charm++

 

Hello,

 

Is there any work on distributed deep learning using Charm++?

 

 

Thanks,

 

Abid

Staff Engineer

BNL

 

 

 




--
Abid M. Malik
Brookhaven National Lab

******************************************************
"I have learned silence from the talkative, toleration from the intolerant, and kindness from the unkind"---Gibran
"Success is not for the chosen few, but for the few who choose" --- John Maxwell
"Being a good person does not depend on your religion or status in life, your race or skin color, political views or culture. IT DEPENDS ON HOW GOOD YOU TREAT OTHERS"--- Abid
"The Universe is talking to us, and the language of the Universe is mathematics."----Abid
 


