Re: [hwloc-users] ? Finding cache & pci info on SPARC/Solaris 11.3

2017-06-09 Thread Brice Goglin
Thanks a lot for the input.
I opened https://github.com/open-mpi/hwloc/issues/243
I have access to a T5 but this will need investigation to actually find
where to get the info from.
Feel free to comment on the issue if you find more. I am going to modify
Pg.pm to better understand where the caches come from.

Brice




Le 09/06/2017 09:11, Maureen Chew a écrit :
> Re: cache relationship… ah… so you’d need to parse both
> prtpicl(8) (to get sizes)  and something like pginfo(8) (perl script)
> to get
> relationship…..
>
> [... pginfo -v output elided; quoted in full in the message below ...]

Re: [hwloc-users] ? Finding cache & pci info on SPARC/Solaris 11.3

2017-06-09 Thread Maureen Chew

> Le 08/06/2017 16:58, Samuel Thibault a écrit :
>> Hello,
>> 
>> Maureen Chew, on jeu. 08 juin 2017 10:51:56 -0400, wrote:
>>> Should finding cache & pci info work?
>> AFAWK, there is no user-available way to get cache information on
>> Solaris, so it's not implemented in hwloc.
> 
> And even if prtpicl reports some information using the PICL API, I don't
> think it says how caches are shared between cores.
> 
>> Concerning pci, you need libpciaccess to get PCI information.
>> 
> 
> And usually you need root access (I think it looks inside /devices/pci*
> where files are root-only).
> 
> Brice


Thanks so much Samuel and Brice.

Did have libpciaccess, and am now able to get PCI info as root.
Will see if an RBAC/pfexec combo could be reasonable.
# /usr/local/bin/hwloc-info -v
depth 0:           1 Machine (type #1)
 depth 1:          2 NUMANode (type #2)
  depth 2:         2 Package (type #3)
   depth 3:        64 Core (type #5)
    depth 4:       512 PU (type #6)
Special depth -3:  16 Bridge (type #9)
Special depth -4:  10 PCI Device (type #10)
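For the RBAC/pfexec idea, something along these lines might work (a sketch only: the profile name "hwloc PCI" and the user "someuser" are placeholders, and the attribute lines should be checked against prof_attr(4), exec_attr(4), and usermod(1M)):

```shell
# Sketch: define an RBAC profile that lets a normal user run hwloc-info
# with an effective uid of 0, so PCI discovery can read /devices/pci*.

# Describe the profile (prof_attr(4) format: name:::description:attrs).
echo 'hwloc PCI:::Run hwloc with PCI access:' >> /etc/security/prof_attr

# Map the profile to the binary with euid=0 (exec_attr(4) format).
echo 'hwloc PCI:solaris:cmd:::/usr/local/bin/hwloc-info:euid=0' >> /etc/security/exec_attr

# Assign the profile to the user (note: -P replaces the profile list).
usermod -P 'hwloc PCI' someuser

# The user then runs the tool under pfexec:
#   pfexec /usr/local/bin/hwloc-info -v
```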

Re: the cache relationship… ah… so you’d need to parse both
prtpicl(8) (to get sizes) and something like pginfo(8) (a Perl script) to get
the relationships…

bash-4.3$ pginfo -v | more
0 (System [system]) CPUs: 0-511
|-- 5 (Data_Pipe_to_memory [socket 0]) CPUs: 0-255
|   |-- 4 (L3_Cache) CPUs: 0-31
|   |   `-- 6 (CPU_PM_Active_Power_Domain) CPUs: 0-31
|   |   |-- 3 (L2_Cache) CPUs: 0-15
|   |   |   |-- 2 (Floating_Point_Unit [core 0]) CPUs: 0-7
|   |   |   |   `-- 1 (Integer_Pipeline [core 0]) CPUs: 0-7
|   |   |   `-- 8 (Floating_Point_Unit [core 1]) CPUs: 8-15
|   |   |   `-- 7 (Integer_Pipeline [core 1]) CPUs: 8-15
|   |   `-- 11 (L2_Cache) CPUs: 16-31
|   |   |-- 10 (Floating_Point_Unit [core 2]) CPUs: 16-23
|   |   |   `-- 9 (Integer_Pipeline [core 2]) CPUs: 16-23
|   |   `-- 13 (Floating_Point_Unit [core 3]) CPUs: 24-31
|   |   `-- 12 (Integer_Pipeline [core 3]) CPUs: 24-31
|   |-- 17 (L3_Cache) CPUs: 32-63
|   |   `-- 18 (CPU_PM_Active_Power_Domain) CPUs: 32-63
|   |   |-- 16 (L2_Cache) CPUs: 32-47
|   |   |   |-- 15 (Floating_Point_Unit [core 4]) CPUs: 32-39
|   |   |   |   `-- 14 (Integer_Pipeline [core 4]) CPUs: 32-39
|   |   |   `-- 20 (Floating_Point_Unit [core 5]) CPUs: 40-47
|   |   |   `-- 19 (Integer_Pipeline [core 5]) CPUs: 40-47
|   |   `-- 23 (L2_Cache) CPUs: 48-63
|   |   |-- 22 (Floating_Point_Unit [core 6]) CPUs: 48-55
|   |   |   `-- 21 (Integer_Pipeline [core 6]) CPUs: 48-55
|   |   `-- 25 (Floating_Point_Unit [core 7]) CPUs: 56-63
|   |   `-- 24 (Integer_Pipeline [core 7]) CPUs: 56-63
|   |-- 29 (L3_Cache) CPUs: 64-95
|   |   `-- 30 (CPU_PM_Active_Power_Domain) CPUs: 64-95
|   |   |-- 28 (L2_Cache) CPUs: 64-79
|   |   |   |-- 27 (Floating_Point_Unit [core 8]) CPUs: 64-71
|   |   |   |   `-- 26 (Integer_Pipeline [core 8]) CPUs: 64-71
|   |   |   `-- 32 (Floating_Point_Unit [core 9]) CPUs: 72-79
|   |   |   `-- 31 (Integer_Pipeline [core 9]) CPUs: 72-79
|   |   `-- 35 (L2_Cache) CPUs: 80-95
|   |   |-- 34 (Floating_Point_Unit [core 10]) CPUs: 80-87
|   |   |   `-- 33 (Integer_Pipeline [core 10]) CPUs: 80-87
|   |   `-- 37 (Floating_Point_Unit [core 11]) CPUs: 88-95
|   |   `-- 36 (Integer_Pipeline [core 11]) CPUs: 88-95
|   |-- 41 (L3_Cache) CPUs: 96-127
|   |   `-- 42 (CPU_PM_Active_Power_Domain) CPUs: 96-127
|   |   |-- 40 (L2_Cache) CPUs: 96-111
|   |   |   |-- 39 (Floating_Point_Unit [core 12]) CPUs: 96-103
|   |   |   |   `-- 38 (Integer_Pipeline [core 12]) CPUs: 96-103
|   |   |   `-- 44 (Floating_Point_Unit [core 13]) CPUs: 104-111
|   |   |   `-- 43 (Integer_Pipeline [core 13]) CPUs: 104-111
|   |   `-- 47 (L2_Cache) CPUs: 112-127
|   |   |-- 46 (Floating_Point_Unit [core 14]) CPUs: 112-119
|   |   |   `-- 45 (Integer_Pipeline [core 14]) CPUs: 112-119
|   |   `-- 49 (Floating_Point_Unit [core 15]) CPUs: 120-127
|   |   `-- 48 (Integer_Pipeline [core 15]) CPUs: 120-127
|   |-- 53 (L3_Cache) CPUs: 128-159
|   |   `-- 54 (CPU_PM_Active_Power_Domain) CPUs: 128-159
|   |   |-- 52 (L2_Cache) CPUs: 128-143
|   |   |   |-- 51 (Floating_Point_Unit [core 16]) CPUs: 128-135
|   |   |   |   `-- 50 (Integer_Pipeline [core 16]) CPUs: 128-135
|   |   |   `-- 56 (Floating_Point_Unit [core 17]) CPUs: 136-143
|   |   |   `-- 55 (Integer_Pipeline [core 17]) CPUs: 136-143
|   |   `-- 59 (L2_Cache) CPUs: 144-159
|   |   |-- 58 (Floating_Point_Unit [core 18]) CPUs: 144-151
|   |   |   `-- 57 (Integer_Pipeline [core 18]) CPUs: 144-151
|   |   `-- 61 (Floating_Point_Unit [core 19]) CPUs: 152-159
|   |   `-- 60 (Integer_Pipeline [core 19]) CPUs: 152-159
|   |-- 65 (L3_Cache) CPUs: 160-191
|   |   `-- 66 (CPU_PM_Active_Power_Domain) CPUs: 160-191
|   |   
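To recover the sharing relationships programmatically, the pginfo tree above (truncated here by the archive) could be parsed with a short script. A hypothetical Python sketch — the regex and function names are assumptions based on the output shown here, not a documented pginfo format:

```python
# Parse `pginfo -v` lines like:
#   |   |-- 4 (L3_Cache) CPUs: 0-31
# into (pg-id, group-name, first-cpu, last-cpu) tuples.
import re

LINE_RE = re.compile(r'(\d+) \(([^)\[]+?)\s*(?:\[[^\]]*\])?\) CPUs: (\d+)-(\d+)')

def parse_pginfo(text):
    groups = []
    for line in text.splitlines():
        m = LINE_RE.search(line)
        if m:
            pgid, name, lo, hi = m.groups()
            groups.append((int(pgid), name, int(lo), int(hi)))
    return groups

def cache_cpus(groups, level="L2_Cache"):
    # CPU range covered by each cache group at the given level,
    # i.e. which CPUs share that cache.
    return {pgid: (lo, hi) for pgid, name, lo, hi in groups if name == level}
```

Combining these CPU ranges with the sizes reported by prtpicl(8) would give both the size and the sharing of each cache level.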

Re: [hwloc-users] ? Finding cache & pci info on SPARC/Solaris 11.3

2017-06-08 Thread Brice Goglin


Le 08/06/2017 16:58, Samuel Thibault a écrit :
> Hello,
>
> Maureen Chew, on jeu. 08 juin 2017 10:51:56 -0400, wrote:
>> Should finding cache & pci info work?
> AFAWK, there is no user-available way to get cache information on
> Solaris, so it's not implemented in hwloc.

And even if prtpicl reports some information using the PICL API, I don't
think it says how caches are shared between cores.

> Concerning pci, you need libpciaccess to get PCI information.
>

And usually you need root access (I think it looks inside /devices/pci*
where files are root-only).
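A quick way to verify that guess (a sketch; the output will vary by machine):

```shell
# Sketch: show owner/permissions of the PCI nexus nodes hwloc's pci
# component would read; root-only modes would confirm the need for root.
ls -ld /devices/pci*
```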

Brice

___
hwloc-users mailing list
hwloc-users@lists.open-mpi.org
https://rfd.newmexicoconsortium.org/mailman/listinfo/hwloc-users


Re: [hwloc-users] ? Finding cache & pci info on SPARC/Solaris 11.3

2017-06-08 Thread Samuel Thibault
Hello,

Maureen Chew, on jeu. 08 juin 2017 10:51:56 -0400, wrote:
> Should finding cache & pci info work?

AFAWK, there is no user-available way to get cache information on
Solaris, so it's not implemented in hwloc.

Concerning pci, you need libpciaccess to get PCI information.

Samuel


[hwloc-users] ? Finding cache & pci info on SPARC/Solaris 11.3

2017-06-08 Thread Maureen Chew
Just built hwloc-1.11.7 on Solaris 11.3 & SPARC T7-2.  Should finding
cache & pci info work? Not sure if I missed some build flags, as I just did a
vanilla build.
Poked around in the mailing list user and devel archives but didn't find anything.

Build script
#!/bin/sh
export CC=/opt/developerstudio12.5/bin/cc
export CFLAGS="-m64 -O3"

./configure 2>&1 | tee build.out
make install 2>&1 | tee -a build.out


bash-4.3$ HWLOC_COMPONENTS_VERBOSE=1 /usr/local/bin/hwloc-info -v
Registered cpu discovery component `no_os' with priority 40 (statically build)
Registered global discovery component `xml' with priority 30 (statically build)
Registered global discovery component `synthetic' with priority 30 (statically build)
Registered global discovery component `custom' with priority 30 (statically build)
Registered cpu discovery component `solaris' with priority 50 (statically build)
Registered misc discovery component `pci' with priority 20 (statically build)
Enabling cpu discovery component `solaris'
Enabling cpu discovery component `no_os'
Excluding global discovery component `xml', conflicts with excludes 0x2
Excluding global discovery component `synthetic', conflicts with excludes 0x2
Excluding global discovery component `custom', conflicts with excludes 0x2
Enabling misc discovery component `pci'
Final list of enabled discovery components: solaris,no_os,pci
depth 0:           1 Machine (type #1)
 depth 1:          2 NUMANode (type #2)
  depth 2:         2 Package (type #3)
   depth 3:        64 Core (type #5)
    depth 4:       512 PU (type #6)
Disabling cpu discovery component `solaris'
Disabling cpu discovery component `no_os'
Disabling misc discovery component `pci'

bash-4.3$ /usr/sbin/prtpicl -v -c cpu | grep -i cache-size | sort | uniq
  :l1-dcache-size    16384
  :l1-icache-size    16384
  :l2-dcache-size    262144
  :l2-icache-size    262144
  :l3-cache-size     8388608
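Those per-CPU properties collapse to one line per cache level; a hypothetical Python sketch to turn them into a dict (assuming each property line has the shape `:<name> <bytes>`, with the name and value whitespace-separated):

```python
# Turn `prtpicl -v -c cpu` cache-size property lines into {name: bytes}.
def parse_cache_sizes(text):
    sizes = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(':') and 'cache-size' in line:
            name, value = line.split(None, 1)
            sizes[name.lstrip(':')] = int(value)
    return sizes
```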


TIA-
—maureen