Bug#790803: ITP: Amp -- atomistic machine-learning potentials

2017-06-10 Thread Muammar El Khatib
Control: retitle 790803 ITP: Amp -- atomistic machine-learning potentials
Control: owner 790803 !

thanks

I am trying for the third time because for some reason the BTS is not recording
my changes.

--
Muammar El Khatib.
Linux user: 403107.
GPG Key = 71246E4A.
http://muammar.me | http://proyectociencia.org
  ,''`.
 : :' :
 `. `'
   `-



Bug#790803: ITP: Amp -- atomistic machine-learning potentials

2017-06-09 Thread Muammar El Khatib
retitle 790803 ITP: Amp -- atomistic machine-learning potentials
owner 790803 muam...@debian.org
thanks

-- 
Muammar El Khatib.
Linux user: 403107.
GPG Key = 71246E4A.
http://muammar.me | http://proyectociencia.org
  ,''`.
 : :' :
 `. `'
   `-



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-24 Thread Muammar El Khatib


On 10/24/2016 04:09 AM, Graham Inggs wrote:
>> That would be great!
> I have sent it; let me know if you didn't receive it.
> 

I have received it correctly.

>> > I forgot to answer that. I would love to team-maintain scalapack in
>> > debian-science! I do not have as much time to maintain it as it
>> > deserves. I will read the Debian Science wiki and request to be added to
>> > the group.
> Thanks!  Would you consider doing the same for blacs-mpi?

Yes, for the blacs libraries as well. I will try to do everything by
Wednesday. If I run into any trouble with the transition to team-maintaining
scalapack+blacs, I will contact you.

Regards,

-- 
Muammar El Khatib.
Linux user: 403107.
GPG Key = 71246E4A.
http://muammar.me | http://proyectociencia.org
  ,''`.
 : :' :
 `. `'
   `-



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-24 Thread Graham Inggs
On 24 October 2016 at 05:36, Muammar El Khatib  wrote:
> On 10/20/2016 03:28 AM, Graham Inggs wrote:
>> No, but I do have a local packaging of neural (before the name changed
>> to amp) which was working, but since the project changed to amp and
>> was re-organized, it no longer works and I don't know if any of it is
>> still relevant.  I can mail it to you privately, if you wish.
>>
>
> That would be great!

I have sent it; let me know if you didn't receive it.

> I forgot to answer that. I would love to team-maintain scalapack in
> debian-science! I do not have as much time to maintain it as it
> deserves. I will read the Debian Science wiki and request to be added to
> the group.

Thanks!  Would you consider doing the same for blacs-mpi?



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-23 Thread Muammar El Khatib



On 10/20/2016 03:28 AM, Graham Inggs wrote:
> On 20 October 2016 at 02:50, Muammar El Khatib  wrote:
>> Are you working from a git repository hosted on Debian infrastructure? If
>> so, could you point me to it? I will be playing around with amp in the
>> following days and I could help with the packaging.
>
> No, but I do have a local packaging of neural (before the name changed
> to amp) which was working, but since the project changed to amp and
> was re-organized, it no longer works and I don't know if any of it is
> still relevant.  I can mail it to you privately, if you wish.

That would be great!

> While preparing to package amp, with the help of Marcin Dulak and Ask
> Hjorth Larson, I did manage to get the prerequisites python-ase, gpaw
> and gpaw-setups updated and into the archive.
>
> BTW, I thought I recognized your name from somewhere; would you mind
> taking a look at #671380?

I forgot to answer that. I would love to team-maintain scalapack in
debian-science! I do not have as much time to maintain it as it
deserves. I will read the Debian Science wiki and request to be added
to the group.


Regards,
--
Muammar El Khatib.
Linux user: 403107.
GPG Key = 71246E4A.
http://muammar.me | http://proyectociencia.org
  ,''`.
 : :' :
 `. `'
   `-



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-20 Thread Graham Inggs
On 20 October 2016 at 02:50, Muammar El Khatib  wrote:
> I discussed this with Peterson and Alireza, and there is a new version on
> the way (v0.5.0, perhaps in a month or so). What we could do is work on
> snapshots from master (the development branch). What do you think?

Sounds good!

> Are you working from a git repository hosted on Debian infrastructure? If
> so, could you point me to it? I will be playing around with amp in the
> following days and I could help with the packaging.

No, but I do have a local packaging of neural (before the name changed
to amp) which was working, but since the project changed to amp and
was re-organized, it no longer works and I don't know if any of it is
still relevant.  I can mail it to you privately, if you wish.

While preparing to package amp, with the help of Marcin Dulak and Ask
Hjorth Larson, I did manage to get the prerequisites python-ase, gpaw
and gpaw-setups updated and into the archive.

BTW, I thought I recognized your name from somewhere; would you mind
taking a look at #671380?



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-19 Thread Muammar El Khatib
Hi Graham,

On Tue, Oct 18, 2016 at 05:33:42PM +0200, Graham Inggs wrote:
> Hi Muammar
>
> Sadly, this fell off my radar. :(


It's ok :).

>
> On 18 October 2016 at 17:00, Muammar El Khatib  wrote:
> > I am interested in participating in the packaging of amp. I recently
> > joined Prof. Peterson's group as a postdoctoral research associate at
> > Brown, and thus I will be involved in amp (use and development). I would
> > be glad if you let me know how I can help.
>
> I did have a problem with relative imports when running the tests.  I
> ended up repacking the tarball and moving some of the files and
> directories into a directory named amp, which seemed to improve things.

I think there are some problems related to those imports; I will take a look
at them from here as well.

>
> Is v0.4.1 the version we should be working on, or can you tag
> something more recent?
>

I discussed this with Peterson and Alireza, and there is a new version on the
way (v0.5.0, perhaps in a month or so). What we could do is work on snapshots
from master (the development branch). What do you think?

Are you working from a git repository hosted on Debian infrastructure? If so,
could you point me to it? I will be playing around with amp in the following
days and I could help with the packaging.

Regards,
--
Muammar El Khatib.
http://muammar.me | http://proyectociencia.org



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-18 Thread Graham Inggs
Hi Muammar

Sadly, this fell off my radar. :(

On 18 October 2016 at 17:00, Muammar El Khatib  wrote:
> I am interested in participating in the packaging of amp. I recently
> joined Prof. Peterson's group as a postdoctoral research associate at
> Brown, and thus I will be involved in amp (use and development). I would
> be glad if you let me know how I can help.

I did have a problem with relative imports when running the tests.  I
ended up repacking the tarball and moving some of the files and
directories into a directory named amp, which seemed to improve things.

Is v0.4.1 the version we should be working on, or can you tag
something more recent?

Regards
Graham



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2016-10-18 Thread Muammar El Khatib
Dear All,

On Fri, Nov 20, 2015 at 5:17 AM, Graham Inggs  wrote:
> retitle 790803 amp -- atomistic machine-learning potentials
> owner 790803 gin...@debian.org
> thanks
>
> Upstream have relaunched Neural as Amp.
>
> * Package name: amp
>   Version : 0.3
>   Upstream Author : Andrew Peterson, Alireza Khorshidi
> * URL : https://bitbucket.org/andrewpeterson/amp
> * License : GPL-3.0+
>   Programming Lang: Python
>   Description : Atomistic Machine-learning Potentials
> Amp is an open-source package designed to easily bring machine learning to
> atomistic calculations. This allows one to predict (or really, interpolate)
> calculations on the potential energy surface by first building up a
> regression representation of a “training set” of atomic images. The Amp
> calculator works by first learning from any other calculator (usually quantum
> mechanical calculations) that can provide energy and forces as a function of
> atomic coordinates. In theory, these predictions can take place with
> arbitrary accuracy, approaching that of the original calculator.
> .
> Amp is designed to integrate closely with the Atomic Simulation Environment
> (ASE). As such, the interface is in pure Python, although several
> compute-heavy parts of the underlying codes also have Fortran versions to
> accelerate the calculations. The close integration with ASE means that any
> calculator that works with ASE - including EMT, GPAW, DACAPO, VASP, NWChem,
> and Gaussian - can easily be used as the parent method.
>
> I intend to maintain this package as part of the DebiChem team.
>
> I found there was a package named amp in Debian circa 2000 (the Audio MPEG
> Player, in non-free), but I don't believe this is a problem.
>

I am interested in participating in the packaging of amp. I recently
joined Prof. Peterson's group as a postdoctoral research associate at
Brown, and thus I will be involved in amp (use and development). I would
be glad if you let me know how I can help.


Regards,
-- 
Muammar El Khatib.
Linux user: 403107.
GPG Key = 71246E4A.
http://muammar.me | http://proyectociencia.org
  ,''`.
 : :' :
 `. `'
   `-



Bug#790803: ITP: amp -- atomistic machine-learning potentials

2015-11-20 Thread Graham Inggs

retitle 790803 amp -- atomistic machine-learning potentials
owner 790803 gin...@debian.org
thanks

Upstream have relaunched Neural as Amp.

* Package name: amp
  Version : 0.3
  Upstream Author : Andrew Peterson, Alireza Khorshidi
* URL : https://bitbucket.org/andrewpeterson/amp
* License : GPL-3.0+
  Programming Lang: Python
  Description : Atomistic Machine-learning Potentials
Amp is an open-source package designed to easily bring machine learning
to atomistic calculations. This allows one to predict (or really,
interpolate) calculations on the potential energy surface by first
building up a regression representation of a “training set” of atomic
images. The Amp calculator works by first learning from any other
calculator (usually quantum mechanical calculations) that can provide
energy and forces as a function of atomic coordinates. In theory, these
predictions can take place with arbitrary accuracy, approaching that of
the original calculator.
.
Amp is designed to integrate closely with the Atomic Simulation
Environment (ASE). As such, the interface is in pure Python, although
several compute-heavy parts of the underlying codes also have Fortran
versions to accelerate the calculations. The close integration with ASE
means that any calculator that works with ASE - including EMT, GPAW,
DACAPO, VASP, NWChem, and Gaussian - can easily be used as the parent
method.


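As a rough illustration of the workflow in the description above (not the
package's documented example): an ASE calculator (here EMT, purely as a cheap
stand-in for a real parent method) generates a few training images, and Amp
fits a neural-network potential to them. This sketch assumes a reasonably
recent python-ase and the re-organized amp module layout (around v0.5)
mentioned later in this bug; the file name, lattice constants and network
size are arbitrary placeholders.

    from ase.build import bulk
    from ase.calculators.emt import EMT
    from ase.io import Trajectory

    # Import paths assume the re-organized (~v0.5) amp layout.
    from amp import Amp
    from amp.descriptor.gaussian import Gaussian
    from amp.model.neuralnetwork import NeuralNetwork

    # Generate a handful of training images with the "parent" calculator.
    traj = Trajectory('train.traj', 'w')
    for a in (3.5, 3.6, 3.7):
        atoms = bulk('Cu', 'fcc', a=a)
        atoms.set_calculator(EMT())
        atoms.get_potential_energy()
        atoms.get_forces()
        traj.write(atoms)
    traj.close()

    # Fit a neural-network potential to the images; the trained object is
    # itself an ASE calculator and can be attached to new Atoms objects.
    calc = Amp(descriptor=Gaussian(), model=NeuralNetwork(hiddenlayers=(5, 5)))
    calc.train(images='train.traj')
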
I intend to maintain this package as part of the DebiChem team.

I found there was a package named amp in Debian circa 2000 (the Audio MPEG
Player, in non-free), but I don't believe this is a problem.