On Jan 3, 2014, at 8:15 AM, Muhammad Wahaj Sethi <se...@hlrs.de> wrote:
> Thanks for the quick reply.
>
> Right now I am making use of option no 1.
>
> As per my understanding, mtt runs tests only if new versions/updates are
> available. Info about last run is stored in the scratch dir.
No, it is stored in the mtt-versions directory for just that reason. You should
have an entry like the following in your .ini file:
#==
# MPI get phase
#==
[MPI get: ompi-nightly-trunk]
mpi_details = Open MPI
module = OMPI_Snapshot
ompi_snapshot_url = http://www.open-mpi.org/nightly/trunk
ompi_snapshot_version_file = /home/common/mtt-versions/trunk("MTT_VERSION_FILE_SUFFIX").txt
#--
# Get the 1.7 nightly
[MPI get: ompi-nightly-v1.7]
mpi_details = Open MPI
module = OMPI_Snapshot
ompi_snapshot_url = http://www.open-mpi.org/nightly/v1.7
ompi_snapshot_version_file = /home/common/mtt-versions/v1.7("MTT_VERSION_FILE_SUFFIX").txt
#--
Note the "ompi_snapshot_version_file" entry; that is where the last-run version gets stored.
>
> By removing/changing workspace dir this info will be lost. And next mtt
> execution may submit old results again.
>
> ----- Original Message -----
> From: "Ralph Castain" <r...@open-mpi.org>
> To: "MTT Users" <mtt-us...@open-mpi.org>
> Sent: Friday, January 3, 2014 4:11:18 PM
> Subject: Re: [MTT users] mtt fails, error: identical key already exists
>
> You have two options:
>
> 1. After each mtt execution, run a "cleanup" script that simply does "rm -rf
> "
>
> 2. Give the mtt scratch location a different name for each execution.
> However, be careful, as you will fill your disk this way - so perhaps run a
> cleanup script every N runs that whacks the oldest location
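A minimal sketch of option 2, suitable for the cron job itself. The base path, the KEEP count, and the commented-out mtt invocation (with its `--scratch` flag) are illustrative assumptions, not a verified setup:

```shell
#!/bin/sh
# Sketch: give each mtt run its own scratch tree, then prune old ones.
SCRATCH_BASE="${SCRATCH_BASE:-$(mktemp -d)}"  # assumption: base dir for scratch trees
KEEP="${KEEP:-5}"                             # keep only the 5 newest trees

# Unique scratch name per execution, e.g. mtt-scratch-20140103-081500
SCRATCH="$SCRATCH_BASE/mtt-scratch-$(date +%Y%m%d-%H%M%S)"
mkdir -p "$SCRATCH"

# ... invoke the mtt client here, pointing it at "$SCRATCH",
#     e.g. via its scratch-directory option ...

# Whack the oldest scratch trees, keeping only the KEEP newest
ls -1dt "$SCRATCH_BASE"/mtt-scratch-* 2>/dev/null |
    tail -n +$((KEEP + 1)) |
    while read -r old; do rm -rf "$old"; done
```

Because each run writes into a fresh tree, the "identical key already exists" collision cannot occur, while the pruning step bounds disk usage.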
>
> Ralph
>
> On Jan 3, 2014, at 4:42 AM, Muhammad Wahaj Sethi <se...@hlrs.de> wrote:
>
>> Hello!
> I am running mtt daily via a cron job. After every successful execution,
> the next mtt execution fails with the following error message.
>>
>> *** ERROR: An identical key already exists in memory when MTT tried to read
>> the file
>> /lustre/ws1/ws/hpcmtt-mtt-0/mtt-scratch/sources/mpi_sources-ompi-nightly-trunk.1.9a1r30043.dump
>> (key=1.9a1r30042). This should not happen. It likely indicates that
>> multiple MTT clients are incorrectly operating in the same scratch tree. at
>> /zhome/academic/HLRS/hlrs/hpcmtt/mtt-trunk/lib/MTT/Messages.pm line 131.
>>
>> I have checked the log file from the successful mtt run and was not able
>> to notice anything unusual.
>>
>> Kindly let me know if additional info is required.
>>
>> Any help will be greatly appreciated.
>>
>> regards,
>> Wahaj
>> ___
>> mtt-users mailing list
>> mtt-us...@open-mpi.org
>> http://www.open-mpi.org/mailman/listinfo.cgi/mtt-users
>