Hi Kabeer, Thanks for the test. Really appreciate the effort you put into
this. I will check that and report back to you.
Regards,
Taher Koitawala
On Tue, Sep 24, 2019 at 5:54 PM Kabeer Ahmed wrote:
Taher,
Sorry I got a bit delayed. I have now put everything you may need in a gist at:
https://gist.github.com/smdahmed/3af0e3110e07cf76772bb73d5e9b65e2
We can also try to find out whether there is any illegal character in the column that could break the Avro schema, such as a standalone "/" or ".".
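To make that check concrete, here is a small standalone sketch (plain Java, not using the Avro library itself; the column names are hypothetical stand-ins for the real Hive schema) of the naming rule Avro enforces: names must start with a letter or underscore and contain only letters, digits, and underscores, so a stray "/" or "." is rejected:

```java
import java.util.List;
import java.util.regex.Pattern;

public class AvroNameCheck {
    // Avro names must match [A-Za-z_][A-Za-z0-9_]* ; characters such as
    // "/" and "." are therefore illegal in record/field names.
    static final Pattern AVRO_NAME = Pattern.compile("[A-Za-z_][A-Za-z0-9_]*");

    static boolean isValidAvroName(String name) {
        return AVRO_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        // Hypothetical column names standing in for the actual table.
        for (String c : List.of("contact_id", "country", "last_update", "bad/col", "a.b")) {
            System.out.println(c + " -> " + (isValidAvroName(c) ? "ok" : "ILLEGAL"));
        }
    }
}
```

Running a check like this over the output of DESCRIBE on the table would quickly surface any offending column name.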
On Tue, Sep 17, 2019 at 8:35 PM Vinoth Chandar wrote:
[Orthogonal comment] It's so awesome to see us troubleshooting together. Thanks everyone on this thread!
On Tue, Sep 17, 2019 at 8:04 PM Taher Koitawala wrote:
No, there are no nulls in the data and I am getting the same error.
On Wed, Sep 18, 2019, 3:33 AM Kabeer Ahmed wrote:
Taher - did you find any NULLs in the data? If you are still not able to make
progress, let us know.
On Sep 17 2019, at 8:30 am, Taher Koitawala wrote:
Sure Gary, let me check if I can find any nulls in there.
On Tue, Sep 17, 2019 at 1:28 AM Gary Li wrote:
Hello, I have seen this exception before. In my case, if the precombine key
of one entry is null, then I get this error. I'd recommend checking
whether any row has a null in *last_update*.
Best,
Gary
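In Spark the check Gary suggests is a one-line query, e.g. `SELECT COUNT(*) FROM <table> WHERE last_update IS NULL`. Since a Spark session is not always at hand, here is a self-contained plain-Java stand-in for the same idea (the row data is hypothetical):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NullCheck {
    // Counts rows whose given field is null. The Spark SQL equivalent:
    //   SELECT COUNT(*) FROM <table> WHERE last_update IS NULL
    static long countNulls(List<Map<String, Object>> rows, String field) {
        return rows.stream().filter(r -> r.get(field) == null).count();
    }

    public static void main(String[] args) {
        // Two hypothetical rows, one with a null precombine key.
        Map<String, Object> ok = new HashMap<>();
        ok.put("contact_id", "c1");
        ok.put("last_update", "2019-09-16 14:09:00");
        Map<String, Object> bad = new HashMap<>();
        bad.put("contact_id", "c2");
        bad.put("last_update", null);
        System.out.println(countNulls(Arrays.asList(ok, bad), "last_update")); // 1
    }
}
```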
On Mon, Sep 16, 2019 at 12:32 PM Kabeer Ahmed wrote:
Taher,
Let me spin up a test for a similar scenario and get back to you.
On Sep 16 2019, at 2:09 pm, Taher Koitawala wrote:
Hi Kabeer, the Hive table has everything as a string. However, when fetching the
data, the Spark query is
.sql(String.format("select contact_id,country,cast(last_update as
TIMESTAMP) as last_update from %s",hiveTable))
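One detail worth noting about the cast above (worth verifying on your Spark version): in Spark SQL's default lenient mode, CAST(string AS TIMESTAMP) silently returns NULL for any string it cannot parse, so a column that casts cleanly for most rows can still inject NULLs into last_update. A plain-Java sketch of that behaviour:

```java
import java.sql.Timestamp;

public class CastSketch {
    // Mimics Spark SQL's lenient CAST(... AS TIMESTAMP): unparseable
    // input becomes null instead of raising an error.
    static Timestamp toTimestampOrNull(String s) {
        try {
            return Timestamp.valueOf(s); // expects "yyyy-mm-dd hh:mm:ss[.f...]"
        } catch (RuntimeException e) {
            return null; // what Spark's CAST does with bad input
        }
    }

    public static void main(String[] args) {
        System.out.println(toTimestampOrNull("2019-09-16 14:09:00")); // parses
        System.out.println(toTimestampOrNull("16/09/2019"));          // null
    }
}
```

If some last_update strings are in an unexpected format, this is exactly how a NULL precombine key could appear even though the raw Hive column contains no NULLs.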
On Mon, Sep 16, 2019 at 6:18 PM Kabeer Ahmed wrote:
Is last_update a timestamp? Can you please share the Hive schema you are
using to create the table? You could run SHOW CREATE TABLE and send us
the output.
On Sep 16 2019, at 1:32 pm, Taher Koitawala wrote:
Hi Kabeer, same issue when last_update is converted to long.
HoodieSparkSQLWriter: Registered avro schema : {
"type" : "record",
"name" : "s3_master_contacts_list_hudi_record",
"namespace" : "hoodie.s3_master_contacts_list_hudi",
"fields" : [ {
"name" : "contact_id",
"type" : [
Taher,
This "field not found" error with Hudi is mostly down to one of 2 cases:
1. The data types of the fields do not match the types listed in the Hive table.
2. The field really is not present - which doesn't seem to be your case.
I looked into the schema in your log which is below.
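For case 1, one quick way to narrow things down is to diff the declared Hive types (from SHOW CREATE TABLE) against the types Spark actually infers for the incoming DataFrame. A hand-rolled sketch, with hypothetical type maps standing in for both sides:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SchemaDiff {
    // Returns a description of the first column whose declared Hive type
    // differs from the type observed in the data, or null if none differ.
    static String firstMismatch(Map<String, String> hiveTypes, Map<String, String> dataTypes) {
        for (Map.Entry<String, String> e : hiveTypes.entrySet()) {
            String actual = dataTypes.get(e.getKey());
            if (actual == null) {
                return e.getKey() + ": missing from incoming data";
            }
            if (!actual.equalsIgnoreCase(e.getValue())) {
                return e.getKey() + ": hive=" + e.getValue() + ", data=" + actual;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, String> hive = new LinkedHashMap<>();
        hive.put("contact_id", "string");
        hive.put("last_update", "string");
        Map<String, String> data = new LinkedHashMap<>();
        data.put("contact_id", "string");
        data.put("last_update", "timestamp"); // the cast changed the type
        System.out.println(firstMismatch(hive, data)); // last_update: hive=string, data=timestamp
    }
}
```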