Hi,
Please see the code snippet below:

register pig.jar;
register piggybank.jar;

o1 = load 'observations.csv' as (obs_id, encounter_id, sub_form_id,
observed_by, verified_by, remark);

oc1 = load 'observation_concept.csv' as (obs_id, concept_id, concept_val);

l1 = load 'locations.csv' as (location_id, longitude, latitude, address1,
address2, village, town, city, state_province, postal_code, country,
is_person_address);

e1 = load 'encounters.csv' as (encounter_id, person_id, location_id,
encounter_date_time);

p1 = load 'persons.csv' as (person_id, gender, given_name, middle_name,
family_name, birth_date, birth_date_estimated, birth_place, mothers_name,
spouses_name, death_date, death_date_estimated, location_id,
marriage_date, marriage_date_estimated, entry_date, marriage_status,
contact_number, father_name);

pid = load 'person__patient_id_type.csv' as (patient_id_type_id, person_id,
patient_id);

oc1 = filter oc1 by concept_id == 317;

temp = join oc1 by obs_id, o1 by obs_id;

temp = join temp by o1::encounter_id, e1 by encounter_id;

temp = join p1 by person_id, temp by e1::person_id;

temp = join l1 by location_id, temp by p1::location_id;

temp = join pid by person_id, temp by p1::person_id;


temp = group temp by (p1::person_id);
temp = foreach temp generate flatten(temp), MAX(temp.oc1::concept_val) as
DeliveryDate;
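For context: none of the load statements declare field types, so every field comes in as a bytearray, and my understanding is that Pig's built-in MAX then falls back to casting the values to double (which would explain the "max of doubles" wording in the error). Since concept_val here is meant to be a delivery date, a typed load might behave differently; a sketch of what I mean (assuming concept_val is a date string, not a number):

```pig
-- Sketch only: declare types in the schema so MAX(bag) compares chararrays
-- lexically instead of trying to cast an untyped bytearray to double.
oc1 = load 'observation_concept.csv'
      as (obs_id:chararray, concept_id:int, concept_val:chararray);
```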

Every time I try to execute it, I get the following error:

2011-08-19 04:11:29,561 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Some jobs have failed! Stop running all dependent jobs
2011-08-19 04:11:29,572 [main] ERROR org.apache.pig.tools.grunt.Grunt -
ERROR 2997: Unable to recreate exception from backed error:
org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem
while computing max of doubles.

Any clue why this happens?

Thanks,
