Hi hddong,

Thank you for your help. It looks like the brew installation of Spark was
the issue. I set up Spark on my machine using the Spark binaries, and it
runs fine now.
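
In case anyone else hits this, the working setup was roughly as follows
(the exact version and download mirror are assumptions based on this
thread; adjust as needed):

    wget https://archive.apache.org/dist/spark/spark-2.4.5/spark-2.4.5-bin-hadoop2.7.tgz
    tar -xzf spark-2.4.5-bin-hadoop2.7.tgz
    export SPARK_HOME="$PWD/spark-2.4.5-bin-hadoop2.7"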

On Mon, May 18, 2020 at 9:02 PM Pratyaksh Sharma <[email protected]>
wrote:

> Hi hddong,
>
> The test concerned in my error log (org.apache.hudi.cli.integ.
> ITTestRepairsCommand.testDeduplicateWithReal) passes when run on our
> travis CI, so there is some problem with my local setup itself.
>
> On Mon, May 18, 2020 at 3:33 PM hddong <[email protected]> wrote:
>
>> Hi,
>>
>> I have tried docker before; it usually uses `execStartCmd` to execute
>> commands directly. But for hudi-cli, we need to execute commands in
>> interactive mode, which is a bit different. If there is a way to do it,
>> running in docker is better.
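>>
>> For illustration, the difference at the shell level (the container name
>> and CLI path are assumptions based on the hudi docker demo setup):
>>
>>     # non-interactive: fine for one-shot commands
>>     docker exec adhoc-1 ls /var/hoodie
>>     # hudi-cli needs an attached TTY and stdin, i.e. interactive mode
>>     docker exec -it adhoc-1 /var/hoodie/ws/hudi-cli/hudi-cli.sh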
>>
>> @Shiyan Your command failed because the Spark task failed; I guess you
>> need a tmp folder. Run `mkdir /tmp/spark-events/` if you have not changed
>> the Spark config (see the example below).
>> You'd better have a look at the detailed error log (above the assertion
>> error).
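>>
>> For example (spark.eventLog.dir is a standard Spark property and
>> /tmp/spark-events is its default; the alternative path below is just an
>> illustration):
>>
>>     mkdir -p /tmp/spark-events
>>     # or point Spark somewhere else in $SPARK_HOME/conf/spark-defaults.conf:
>>     # spark.eventLog.dir  file:///some/other/dir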
>>
>> @Pratyaksh Yes, it looks like deduping runs but does not work. Is it
>> caused by your code adjustment?
>> Can you try running the test on the master branch and check whether the
>> exception still exists?
>>
>> On Mon, May 18, 2020 at 1:30 AM Pratyaksh Sharma <[email protected]> wrote:
>>
>> > Hi,
>> >
>> > For me the test also runs, but looking at the error, it seems no
>> > deduping work is done, which is strange. Here is the error:
>> >
>> > [ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed:
>> > 8.425 s <<< FAILURE! - in org.apache.hudi.cli.integ.ITTestRepairsCommand
>> >
>> > [ERROR] org.apache.hudi.cli.integ.ITTestRepairsCommand.testDeduplicateWithReal
>> > Time elapsed: 7.588 s  <<< FAILURE!
>> >
>> > org.opentest4j.AssertionFailedError: expected: <200> but was: <210>
>> >
>> > at org.apache.hudi.cli.integ.ITTestRepairsCommand.testDeduplicateWithReal(ITTestRepairsCommand.java:254)
>> >
>> > Initially 210 records are present as well, so effectively the test runs
>> > but without doing anything. There is no other error apart from the one
>> > above.
>> >
>> > I feel the integration tests for hudi-cli should also run in docker
>> > like the other integration tests, rather than on a local Spark
>> > installation. That would help ensure such issues do not come up in the
>> > future. Thoughts?
>> >
>> > On Sun, May 17, 2020 at 7:59 PM hddong <[email protected]> wrote:
>> >
>> > > Hi Pratyaksh,
>> > >
>> > > Does it throw the same exception? And can you check whether
>> > > sparkLauncher throws the same exception? Most of the time, ITTest
>> > > failures are due to some config of the local Spark.
>> > > I got this exception before, but it ran successfully after `mvn clean
>> > > package ...`.
>> > >
>> > > Regards
>> > > hddong
>> > >
>> > > On Sun, May 17, 2020 at 8:42 PM Pratyaksh Sharma <[email protected]> wrote:
>> > >
>> > > > Hi hddong,
>> > > >
>> > > > Strange, but nothing seems to work for me. I tried doing `mvn clean`
>> > > > and then running the travis tests. I also tried running the command
>> > > > `mvn clean package -DskipTests -DskipITs -Pspark-shade-unbundle-avro`
>> > > > first and then running the test using
>> > > > `mvn -Dtest=ITTestRepairsCommand#testDeduplicateWithReal
>> > > > -DfailIfNoTests=false test`. But neither of them worked. I have a
>> > > > Spark installation and I am setting SPARK_HOME to
>> > > > /usr/local/Cellar/apache-spark/2.4.5.
>> > > >
>> > > > On Sun, May 17, 2020 at 9:00 AM hddong <[email protected]>
>> wrote:
>> > > >
>> > > > > Hi Pratyaksh,
>> > > > >
>> > > > > run_travis_tests.sh does not run `mvn clean`; you can try running
>> > > > > `mvn clean` manually before the integration test.
>> > > > >
>> > > > > BTW, if you use IDEA, you can run
>> > > > > `mvn clean package -DskipTests -DskipITs -Pspark-shade-unbundle-avro`
>> > > > > first, then just run the integration test in IDEA like a unit test.
>> > > > >
>> > > > > But there is something to notice here: you need a runnable Spark,
>> > > > > and SPARK_HOME should be in your env; see the sketch below.
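>> > > > >
>> > > > > A minimal sketch of that sequence (the SPARK_HOME path here is a
>> > > > > placeholder, not the actual location on your machine):
>> > > > >
>> > > > >     mvn clean
>> > > > >     mvn clean package -DskipTests -DskipITs -Pspark-shade-unbundle-avro
>> > > > >     # SPARK_HOME must point at a runnable Spark installation
>> > > > >     export SPARK_HOME=/path/to/spark
>> > > > >     # then run the integration test from IDEA like a unit test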
>> > > > >
>> > > > > Regards
>> > > > > hddong
>> > > > >
>> > > >
>> > >
>> >
>>
>
