Hello,

I installed Jython and JRuby on Debian.

export PIG_CLASSPATH=/path/cassandra-0.7/contrib/pig:/usr/share/java/jython.jar

My UDF, myFunc.py, is in /path/cassandra-0.7/contrib/pig:


#!/usr/bin/python
@outputSchema("t:tuple(domain:chararray, spam:int, size:int, time:int)")
def toTuple(bag):
  # bag looks like {(colname, value), (...), ...}
  for word in bag:
    if word[0] == 'domain':
      domain = word[1]
    elif word[0] == 'spam':
      spam = word[1]
    elif word[0] == 'size':
      size = word[1]
    elif word[0] == 'time':
      time = word[1]
  return (domain, spam, size, time)
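
For what it's worth, the function's logic runs fine on its own under plain CPython. Here is a minimal local sketch with a made-up sample bag (the @outputSchema decorator comes from Pig's Jython runtime, so it is left out, and defaults are added in case a column is missing):

```python
# Standalone sketch of the UDF logic, outside Pig.
def toTuple(bag):
    # Defaults guard against a column name that never appears in the bag.
    domain = spam = size = time = None
    for word in bag:
        if word[0] == 'domain':
            domain = word[1]
        elif word[0] == 'spam':
            spam = word[1]
        elif word[0] == 'size':
            size = word[1]
        elif word[0] == 'time':
            time = word[1]
    return (domain, spam, size, time)

# Hypothetical sample bag, shaped like {(colname, value), ...}
sample = [('domain', 'example.com'), ('spam', 1), ('size', 2048), ('time', 1303595113)]
print(toTuple(sample))  # ('example.com', 1, 2048, 1303595113)
```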


After starting grunt, I type:

register '/path/cassandra-0.7/contrib/pig/myFunc.py' using jython as myUDF;

rows = LOAD 'cassandra://emailArchive/messagesMetaData'
       USING CassandraStorage()
       AS (key, columns: bag {T: tuple(name:chararray, value:int)});
d = FOREACH rows GENERATE myUDF.toTuple($1);

When I run `illustrate d;` or `dump d;`, the following error occurs:


Any idea where the problem might be?


Thanks.



2011-04-23 23:40:13,428 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org/jruby/ext/posix/util/Platform

2011-04-23 23:26:02,211 [Thread-14] WARN org.apache.hadoop.mapred.LocalJobRunner - job_local_0001
java.lang.NoClassDefFoundError: org/jruby/ext/posix/util/Platform
  at org.python.core.PySystemState.getPath(PySystemState.java:513)
  at org.python.core.PySystemState.getPathLazy(PySystemState.java:502)
  at org.python.core.util.RelativeFile.<init>(RelativeFile.java:17)
  at org.python.core.PyTraceback.getLine(PyTraceback.java:52)
  at org.python.core.PyTraceback.tracebackInfo(PyTraceback.java:37)
  at org.python.core.PyTraceback.dumpStack(PyTraceback.java:108)
  at org.python.core.PyTraceback.dumpStack(PyTraceback.java:119)
  at org.python.core.Py.displayException(Py.java:1007)
  at org.python.core.PyException.printStackTrace(PyException.java:79)
  at org.python.core.PyException.toString(PyException.java:98)
  at java.lang.String.valueOf(String.java:2826)
  at java.lang.StringBuilder.append(StringBuilder.java:115)
  at org.apache.pig.scripting.jython.JythonFunction.exec(JythonFunction.java:107)
  at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:229)
  at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:273)
  at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(POForEach.java:343)
  at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(POForEach.java:291)
  at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapBase.runPipeline(PigMapBase.java:236)
  at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapBase.map(PigMapBase.java:231)
  at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapBase.map(PigMapBase.java:53)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
  at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
2011-04-23 23:26:02,331 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_local_0001
