"Failed to initialize GAHP" means that, for some reason, you don't have
a proxy that is visible to Condor.
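
A quick sanity check (these are the standard Globus proxy commands; the
x509userproxy line and the /tmp path below are just the usual defaults,
not something taken from your setup):

grid-proxy-info     # show the current proxy and its remaining lifetime
grid-proxy-init     # create a fresh proxy if none exists or it has expired

You can also point Condor at the proxy explicitly in the submit file:

x509userproxy = /tmp/x509up_u<uid>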
Also, the syntax you are using in that submit file is outdated.

You should have:
universe = grid
grid_resource = gt4 https://b.xx.xx.xx:8443 Fork
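
Folded into your submit file, that would look something like this (a
sketch, assuming the default GT4 container port 8443 on host B):

executable = /bin/date
transfer_executable = false
universe = grid
grid_resource = gt4 https://B.xx.xx.xx:8443 Fork
output = date.out
error = date.error
log = date.log
queue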

The "globusscheduler" syntax you are using is only for GT2 resources,
and it is deprecated besides.
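
You can also test the GT4 container on B directly, bypassing Condor-G
entirely (this assumes the GT4 client tools are installed on A; the
contact string is the same one as above):

globusrun-ws -submit -F https://B.xx.xx.xx:8443 -Ft Fork -c /bin/date

If that fails as well, the problem is on the Globus side rather than in
Condor.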
Steve


On Tue, 10 Mar 2009, Martin Feller wrote:

AFAIK, GAHP initialization is pure Condor, so I think this
question is for the Condor group.

-Martin

Samir Khanal wrote:
Hi all,
I don't know where to ask this question (Condor or Globus).

I had set up a Globus/Condor-G grid.

A had the gatekeeper, and B submitted jobs to A.
Everything was going smoothly, and I could submit both PBS and Condor types of jobs.

Then I was asked to reverse the situation: B had to be the gatekeeper (as it
had larger resources), and A now had to submit jobs to B's resources.
I used the GT4 quickstart guide, and the setup went well, except that now,
when I submit grid jobs via Condor-G, the jobs get held.

executable = /bin/date
Transfer_Executable = false
globusscheduler = B.xx.xx.xx/jobmanager-fork
universe = grid
output = date.out
error = date.error
log = date.log
queue


The same script worked the other way around.
The myproxy login and everything else work, apart from this problem.

And I looked into the submit log; it says:

012 (086.000.000) 03/05 18:28:53 Job was held.
        Failed to initialize GAHP
        Code 0 Subcode 0
...
I then tried:

[~]$ /opt/condor/sbin/gt4_gahp
$GahpVersion: 1.7.1 Apr 23 2008 GT4\ GAHP\ (GT-4.0.4) $


and it does start (so Java is set up correctly).

What could the problem be? I am a bit stuck on this.
I am using Rocks 5.1, GT 4.2.1, and the Condor roll that came with Rocks 5.1.


Thanks
Samir




--
------------------------------------------------------------------
Steven C. Timm, Ph.D  (630) 840-8525
[email protected]  http://home.fnal.gov/~timm/
Fermilab Computing Division, Scientific Computing Facilities,
Grid Facilities Department, FermiGrid Services Group, Assistant Group Leader.
