I think Balaji is working on updating the docs now.
Instructions are here:
https://github.com/apache/incubator-hudi/tree/asf-site

On Thu, May 2, 2019 at 7:29 AM Vinoth Chandar <[email protected]> wrote:

> Welcome to the club, Tristan!
>
> On Wed, May 1, 2019 at 2:03 PM Bhavani Sudha Saktheeswaran
> <[email protected]> wrote:
>
>> Hi Tristan,
>> You might want to include "--schemaprovider-class
>> com.uber.hoodie.utilities.schema.FilebasedSchemaProvider" in the
>> spark-submit command. I also faced a similar issue when I tried the Docker
>> demo. I think there is a PR pending for the docs that includes this change.
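>>
>> For reference, a rough sketch of where that option would go (just a
>> sketch: the main class and the jar placeholder below are my recollection
>> of the demo setup, not the exact quickstart command, and the last line
>> stands in for the arguments you are already passing):
>>
>>   spark-submit \
>>     --class com.uber.hoodie.utilities.deltastreamer.HoodieDeltaStreamer \
>>     <hoodie-utilities-bundle.jar> \
>>     --schemaprovider-class com.uber.hoodie.utilities.schema.FilebasedSchemaProvider \
>>     <remaining quickstart arguments, unchanged>
>>
>> Without a schema provider configured, the delta streamer ends up handing a
>> null schemaProvider to SourceFormatAdapter, which would explain the NPE at
>> the line you found.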
>>
>> Thanks,
>> Sudha
>>
>> On Wed, May 1, 2019 at 1:33 PM Baker, Tristan <[email protected]>
>> wrote:
>>
>> > Hi,
>> >
>> > I've been working through the quickstart here:
>> > https://hudi.apache.org/docker_demo.html
>> >
>> > I get an NPE when running the merge on read spark job.
>> >
>> > Here’s the spark-submit command (copied from the quickstart
>> > instructions):
>> >
>> > https://gist.github.com/tcbakes/4a11cff217fb8a98205b4cc46cd29750
>> >
>> >
>> > Here’s the NPE:
>> >
>> > https://gist.github.com/tcbakes/021258638184ddcbde2b0320ec589fde
>> >
>> >
>> > I attached my debugger to the process and discovered that the
>> > schemaProvider is null on line 65 here:
>> >
>> >
>> >
>> > https://github.com/apache/incubator-hudi/blob/3a0044216cb2f707639d48e2869f4ee6f25cfc19/hoodie-utilities/src/main/java/com/uber/hoodie/utilities/deltastreamer/SourceFormatAdapter.java#L65
>> >
>> > The Copy On Write spark job/example works fine, but this one doesn’t.
>> >
>> > Any pointers?
>> >
>> > Thanks,
>> > Tristan
>> >
>>
>
