Hi, Jonathan,
TestOrderBy is an excluded test, and it does not seem to be written
correctly: it runs against MiniCluster, but it uses a local file as input,
which is not supported even in the current code. All the regular tests run
fine for me.

Daniel
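[Editor's note] For readers following the "Wrong FS" error chased below: Hadoop raises it when the scheme of a path does not match the filesystem it is resolved against, which is why a MiniCluster test handed a local path (or a local-mode run handed an hdfs:// path) fails. A minimal sketch of that check, using only java.net.URI — this is a simplified illustration, not Hadoop's actual FileSystem.checkPath code:

```java
import java.net.URI;

public class WrongFsDemo {
    // Simplified stand-in for Hadoop's FileSystem.checkPath: reject a path
    // whose scheme differs from the filesystem's own scheme.
    public static void checkPath(URI fsUri, URI path) {
        String expected = fsUri.getScheme();
        String actual = path.getScheme();
        // A scheme-less path (e.g. "/user/jcoveney") is accepted as relative
        // to the current filesystem; a mismatched scheme is rejected.
        if (actual != null && !actual.equalsIgnoreCase(expected)) {
            throw new IllegalArgumentException(
                "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        URI localFs = URI.create("file:///");
        URI hdfsPath = URI.create("hdfs://localhost:60145/user/jcoveney");
        try {
            checkPath(localFs, hdfsPath);
        } catch (IllegalArgumentException e) {
            // prints: Wrong FS: hdfs://localhost:60145/user/jcoveney, expected: file:///
            System.out.println(e.getMessage());
        }
    }
}
```

The point of the thread's debugging is exactly which side of this comparison is wrong: the tests were constructing an HDFS path while the execution engine expected file:///.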

On Mon, Jan 9, 2012 at 9:54 PM, Jon Coveney <[email protected]> wrote:

> Err, I mistyped. I did see the file permission error mentioned in the Jira
> ticket, but setting umask 0022 fixed that and led to the WrongFS exception.
>
> Sent from my iPhone
>
> On Jan 9, 2012, at 9:42 PM, Daniel Dai <[email protected]> wrote:
>
> > Hi, Jonathan,
> > What is the file permissions error you saw? Your previous log only shows
> > WrongFS exception.
> >
> > Daniel
> >
> > On Mon, Jan 9, 2012 at 8:01 PM, Jonathan Coveney <[email protected]> wrote:
> >
> >> I'm not getting the file permissions error anymore. I tried TestOrderBy
> >> (which still has some failing tests, but not related to this) and TestUDF.
> >>
> >> 2012/1/9 Jonathan Coveney <[email protected]>
> >>
> >>> Running now, will let you know.
> >>>
> >>>
> >>> 2012/1/9 Daniel Dai <[email protected]>
> >>>
> >>>> I just rolled back the 205 changes. Can you check whether the tests pass?
> >>>>
> >>>> Thanks,
> >>>> Daniel
> >>>>
> >>>> On Mon, Jan 9, 2012 at 6:57 PM, Jonathan Coveney <[email protected]> wrote:
> >>>>
> >>>>> In my /etc/hosts, oauth.localhost.twitter.com is redirected to
> >>>>> 127.0.0.1... but maybe the presence of this mapping makes it think that
> >>>>> the cluster is remote, and not local? I should note that I've always had
> >>>>> this set in my /etc/hosts, but only on Friday did I start having issues.
> >>>>>
> >>>>> I removed just that line, and it chose a random other one. I removed all
> >>>>> of the mappings to 127.0.0.1 except localhost, and here is the error I
> >>>>> got:
> >>>>>
> >>>>> org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Wrong FS: hdfs://localhost:60145/user/jcoveney, expected: file:///
> >>>>>
> >>>>> Just to see, I got rid of all of my mappings in /etc/hosts (though
> >>>>> mapping localhost to 127.0.0.1 is totally valid), and got this error:
> >>>>>
> >>>>> org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Wrong FS: hdfs://127.0.0.1:60420/user/jcoveney, expected: file:///
> >>>>>
> >>>>> So it's not related to that. For whatever reason, it's treating it like
> >>>>> an HDFS file system instead of just going for file:///.
> >>>>>
> >>>>> 2012/1/9 Daniel Dai <[email protected]>
> >>>>>
> >>>>>> Thanks Jonathan,
> >>>>>> I saw this in the log:
> >>>>>> Wrong FS: hdfs://oauth.localhost.twitter.com:55237/user/jcoveney, expected: file:///
> >>>>>>
> >>>>>> Where does this host come from? Is there a hadoop config file in your
> >>>>>> CLASSPATH?
> >>>>>>
> >>>>>> Daniel
> >>>>>>
> >>>>>> On Mon, Jan 9, 2012 at 5:46 PM, Jonathan Coveney <[email protected]> wrote:
> >>>>>>
> >>>>>>> Attached is the log for ant -Dtestcase=TestOrderBy test on a clean
> >>>>>>> clone of trunk.
> >>>>>>>
> >>>>>>> 2012/1/9 Daniel Dai <[email protected]>
> >>>>>>>
> >>>>>>>> Hi, Jonathan, can you paste your error message?
> >>>>>>>>
> >>>>>>>> Daniel
> >>>>>>>>
> >>>>>>>> On Mon, Jan 9, 2012 at 4:47 PM, Jonathan Coveney <[email protected]> wrote:
> >>>>>>>>
> >>>>>>>>> Aha! I've been having a bunch of unit tests fail mysteriously, and
> >>>>>>>>> it started Friday... and they've been giving local filesystem
> >>>>>>>>> permissions errors. Seems like it is related to that. I was going
> >>>>>>>>> crazy.
> >>>>>>>>>
> >>>>>>>>> 2012/1/9 Daniel Dai <[email protected]>
> >>>>>>>>>
> >>>>>>>>>> Is that something new? Are you testing trunk? What's the revision
> >>>>>>>>>> number? Could it relate to the 205 upgrade
> >>>>>>>>>> (PIG-2431 <https://issues.apache.org/jira/browse/PIG-2431>)?
> >>>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>> Daniel
> >>>>>>>>>>
> >>>>>>>>>> On Mon, Jan 9, 2012 at 3:54 PM, Bill Graham <[email protected]> wrote:
> >>>>>>>>>>
> >>>>>>>>>>> Is anyone else seeing a bunch of errors when trying to run the
> >>>>>>>>>>> pig or piggybank unit tests? I had to change ivy to use
> >>>>>>>>>>> xercesImpl 2.9.1 instead of xerces 2.4.4 to make this exception
> >>>>>>>>>>> go away:
> >>>>>>>>>>>
> >>>>>>>>>>> 12/01/09 15:47:52 ERROR conf.Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@358b3364 :java.lang.UnsupportedOperationException: This parser does not support specification "null" version "null"
> >>>>>>>>>>> java.lang.UnsupportedOperationException: This parser does not support specification "null" version "null"
> >>>>>>>>>>>       at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:590)
> >>>>>>>>>>>       at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1117)
> >>>>>>>>>>>       at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1103)
> >>>>>>>>>>>       at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1037)
> >>>>>>>>>>>       at org.apache.hadoop.conf.Configuration.iterator(Configuration.java:1079)
> >>>>>>>>>>>       at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.recomputeProperties(HExecutionEngine.java:366)
> >>>>>>>>>>>       at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:177)
> >>>>>>>>>>>       at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:119)
> >>>>>>>>>>>       at org.apache.pig.impl.PigContext.connect(PigContext.java:206)
> >>>>>>>>>>>       at org.apache.pig.PigServer.<init>(PigServer.java:246)
> >>>>>>>>>>>       at org.apache.pig.PigServer.<init>(PigServer.java:231)
> >>>>>>>>>>>       at org.apache.pig.PigServer.<init>(PigServer.java:227)
> >>>>>>>>>>>       at org.apache.pig.PigServer.<init>(PigServer.java:223)
> >>>>>>>>>>> ...
> >>>>>>>>>>>
> >>>>>>>>>>> thanks,
> >>>>>>>>>>> Bill
> >>>>>>>>>>>
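[Editor's note] The ivy change Bill describes would look roughly like this in Pig's ivy.xml. The org/module/rev values are from his message; the exact file location and surrounding attributes are assumptions:

```xml
<!-- before (the dependency that triggered the xerces parser error): -->
<!-- <dependency org="xerces" name="xerces" rev="2.4.4"/> -->

<!-- after (swap to the standalone xercesImpl artifact): -->
<dependency org="xerces" name="xercesImpl" rev="2.9.1"/>
```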