Rayson Ho wrote:
>
> If you can share the changes, I believe Chris and I will be able to
> merge it to the official tree, as many other members of Open64 are
> interested in the Solaris/AMD64 port.
>
>
All of the changes I made were to code I left at the vendor site. [The vendor does monitor
Hi John,
The "thousands of source files" I mentioned were mostly related to
header files that are missing on Solaris, or system headers that live
in different locations. I did a gmake -i >& /tmp/error.txt but I
haven't gone through the error messages yet. However, one of the
missing files was endian
Rayson Ho wrote:
> Just want to see if anyone on the lists is interested in getting the
> Open64 compiler ported to Solaris/OpenSolaris on the AMD64/EM64T
> platform...
>
> (And FYI: PathScale is based on Open64...)
>
> Please join the Open64 project if you are interested!!
>
I have ported open6
Hello Frank:
Thanks for the report. My first thought is that you are running into a
known problem when running on a single node. If the code has one rank
doing mostly sends while the other does mostly receives, it may hang.
There are details on this problem at
https://svn.open-mpi.org/trac/
Frank, can you confirm for us what sort of message traffic your program
has? E.g., does the program consist exclusively of broadcasts from the
same root? Many MPI programs are "balanced" in that each process both
sends and receives. The problem Rolf mentioned comes about when a
process sends e
Hello all,
I noticed something weird with the MPI in
ClusterTools 8.
Our application, which runs correctly with half a dozen
other MPI implementations and also with ClusterTools 6 & 7,
hangs reproducibly in MPI_Bcast (on one node, multiple processes).
There are definitely no dangling barriers or
Just want to see if anyone on the lists is interested in getting the
Open64 compiler ported to Solaris/OpenSolaris on the AMD64/EM64T
platform...
(And FYI: PathScale is based on Open64...)
Please join the Open64 project if you are interested!!
Thanks,
Rayson
--- On Mon, 10/27/08, "C. Bergström