maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2167344888
Thank you very much, @michael1991!
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2167291939
Hey @maheshguptags, I just got inspired by the GCP Dataproc doc here:
https://cloud.google.com/dataproc/docs/concepts/components/hudi
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2167275346
Hi @michael1991, thank you for solving this; I can run the Delta Streamer with
RLI now. Out of curiosity, how did you figure out that we need to pass the jar
in the extra classpath?
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2158376225
> @michael1991 can you add the value that you pass
`spark.executor.extraClassPath` and `spark.driver.extraClassPath`? so that I
can try at my end as well.
Sure
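(Editor's note: the actual values shared here were truncated out of this archive. Going by the GCP Dataproc doc linked above, the fix amounts to adding the Hudi bundle jar to both classpath properties; the jar path below is a hypothetical placeholder, not the one actually shared.)
```properties
# Hypothetical sketch -- substitute the Hudi bundle location on your cluster
spark.driver.extraClassPath=/usr/lib/hudi/lib/hudi-spark-bundle.jar
spark.executor.extraClassPath=/usr/lib/hudi/lib/hudi-spark-bundle.jar
```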
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2157672634
@michael1991 can you add the values that you pass for
`spark.executor.extraClassPath` and `spark.driver.extraClassPath`, so that I
can try at my end as well?
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2156683190
> I hit the same error when I try to use record indexing:
>
> ```
> hoodie.metadata.record.index.enable=true
> hoodie.index.type=RECORD_INDEX
> ```
>
> Are
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2155893143
Hi @ad1happy2go, I can reproduce this error with the following environment and
Scala code; hope it helps.
Environment: Dataproc 2.1 (Spark 3.3.2) with Hudi 0.14.x / Dataproc
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2154535715
Tried Spark 3.5 and Hudi 0.15.0; the same issue is hit there too.
jayakasadev commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2050767899
I hit the same error when I try to use record indexing:
```
hoodie.metadata.record.index.enable=true
hoodie.index.type=RECORD_INDEX
```
Are there additional
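(Editor's note: the rest of this comment is truncated. For context, the record-level index lives in Hudi's metadata table, so the metadata table must be enabled alongside the two options quoted above; a minimal sketch of the combined configuration, assuming Hudi 0.14+, where `hoodie.metadata.enable` defaults to true but is shown explicitly:)
```properties
# Record-level index is stored in the metadata table, which must be enabled
hoodie.metadata.enable=true
hoodie.metadata.record.index.enable=true
hoodie.index.type=RECORD_INDEX
```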
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2046560389
@nsivabalan We were not able to reproduce this error in our setup. I went
on multiple calls with @maheshguptags and set up the exact same configuration
locally. But he is
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2046555120
@nsivabalan We haven't resolved the original issue and it is still open.
nsivabalan commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2043998416
And @ad1happy2go: if you encounter any bugs w.r.t. MDT or RLI, do keep me
posted.
nsivabalan commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-2043998156
Hey @bksrepo: can you file a new issue?
Hey @ad1happy2go: if the original issue is resolved, can we close it out?
bksrepo commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1991619367
I am using Spark 3.4.1 with the Hudi bundle
`hudi-spark3.4-bundle_2.12-0.14.0.jar`; Hadoop is 3.3.6 and the source database
is MySQL 8.0.36.
The reported ERROR comes at the
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1988564921
@bksrepo which version did you use to load the data? Is it an upgraded table?
The original issue here is different from your stack trace. Can you share
all the table/writer
bksrepo commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1988119816
Any conclusion on this issue? I am facing the same issue too.
10:29:32.481 [qtp264384338-719] ERROR
org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader - Got exception
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1972504593
Sure let me schedule some time and we will discuss it.
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1971610460
@maheshguptags I noticed in your timeline there is a multi-writer kind of
scenario -
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1968547164
> @michael1991 the above one is `hoodie.properties` and @ad1happy2go is
asking for the table properties you used during table creation. thanks
Thanks for the reminder, I'm
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1968465772
@michael1991 the above one is `hoodie.properties` and @ad1happy2go is asking
for the table properties you used during table creation.
thanks
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1968072665
> @michael1991 just to check , Are you also using composite key? Can you
post table configuration
#Updated at 2024-02-27T07:34:03.809265Z
#Tue Feb 27 07:34:03 UTC 2024
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1966687166
[hoodie.zip](https://github.com/apache/hudi/files/14421090/hoodie.zip)
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1966354050
@michael1991 just to check, are you also using a composite key? Can you post
the table configuration?
michael1991 commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1963611302
Facing the same issue; waiting for updates.
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1955927720
@ad1happy2go and @yihua any update on this?
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1947804747
Had a working session with @maheshguptags. We were able to consistently
reproduce with a composite key in his setup, although I couldn't reproduce it
in my setup. So this issue is
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1942095222
@maheshguptags I tried to reproduce the issue but couldn't do it. Following
are the artefacts.
Kafka-source.props
```
hoodie.datasource.write.recordkey.field=volume
```
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1931327718
@ad1happy2go as discussed, I have tried the Hudi Delta Streamer, but
unfortunately I could not execute it due to heap-space issues, even without
sending any data.
**Command**
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1930410768
Thanks @maheshguptags. As discussed, are you getting the same error with Hudi
Streamer?
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1926535894
@ad1happy2go I tried without RLI and it works fine. However, when I add
the `RLI` index to the table, it starts failing.
I am not sure why RLI is causing errors while
ad1happy2go commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1926275891
Had a discussion with @maheshguptags. The issue could be related either to
deserialiser configs or to some bug in RLI. He is trying without RLI and will
let us know his findings. Thanks.
maheshguptags commented on issue #10609:
URL: https://github.com/apache/hudi/issues/10609#issuecomment-1923978525
cc: @codope @ad1happy2go @bhasudha
maheshguptags opened a new issue, #10609:
URL: https://github.com/apache/hudi/issues/10609
I am trying to ingest data using Spark + Kafka streaming into a Hudi table
with the RLI index, but unfortunately ingesting just 5-10 records throws the
issue below.
Steps to reproduce the
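(Editor's note: the reproduction steps are cut off in this archive. Purely as an illustration of the kind of job described, not the reporter's actual code, a Spark Structured Streaming write from Kafka into a Hudi table with RLI enabled looks roughly like the sketch below; topic, field names, and paths are hypothetical.)
```scala
// Illustrative sketch only -- topic, record-key/precombine fields, and paths
// are hypothetical placeholders, not the reporter's actual configuration.
val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("subscribe", "events")
  .load()
  .selectExpr("CAST(value AS STRING) AS json")
  // ...parse json into columns such as id, ts, etc...

df.writeStream
  .format("hudi")
  .option("hoodie.table.name", "events_hudi")
  .option("hoodie.datasource.write.recordkey.field", "id")
  .option("hoodie.datasource.write.precombine.field", "ts")
  // Record-level index: stored in the metadata table
  .option("hoodie.metadata.enable", "true")
  .option("hoodie.metadata.record.index.enable", "true")
  .option("hoodie.index.type", "RECORD_INDEX")
  .option("checkpointLocation", "/tmp/ckpt/events_hudi")
  .start("/tmp/hudi/events_hudi")
```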