Umesh,

This issue still persists. Could you please set num-cores = 1? You can
scale out using num-executors instead.
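As a sketch, the workaround could be applied when building the Spark session; the app name and executor count below are illustrative, not prescribed values:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative session setup: one core per executor to avoid the
// concurrent-write issue, scaling out via the executor instance count
// (the --num-executors flag maps to spark.executor.instances).
val spark = SparkSession.builder()
  .appName("hoodie-bulk-insert")           // hypothetical app name
  .config("spark.executor.cores", "1")     // num-cores = 1
  .config("spark.executor.instances", "8") // scale out here instead
  .getOrCreate()
```

Equivalently, pass `--executor-cores 1 --num-executors 8` on the spark-shell / spark-submit command line.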

-Nishith

On Fri, Mar 8, 2019 at 12:06 PM Umesh Kacha <umesh.ka...@gmail.com> wrote:

> I think the issue is this: https://github.com/uber/hudi/issues/227 I get
> the same error. I tried using multiple executor cores (4), and I am on
> Spark 2.2.0. Is this issue fixed?
>
>
>
> On Fri, Mar 8, 2019 at 6:58 PM Vinoth Chandar <vin...@apache.org> wrote:
>
> > Could you please share the entire stack trace?
> >
> > On Fri, Mar 8, 2019 at 1:56 AM Umesh Kacha <umesh.ka...@gmail.com>
> wrote:
> >
> > > Hi, I am using the Spark shell to save a Spark DataFrame as a Hoodie
> > > dataset via the bulk insert option of the Hoodie Spark datasource. It
> > > seems to work while saving, but in the end it fails with the following
> > > exception:
> > >
> > > Failed to initialize HoodieStorageWriter for path
> > > /tmp/hoodie-test/2019/blabla.parquet
> > >
> >
>
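For context, a bulk-insert write through the Hoodie Spark datasource, as described in the quoted message, would look roughly like the sketch below in the Spark shell. The table path, table name, and field names are illustrative assumptions, not taken from the original report:

```scala
// Sketch of a bulk-insert write via the Hoodie Spark datasource
// (pre-Apache "com.uber.hoodie" package, matching the Spark 2.2.x era
// discussed above). Record key / precombine fields are hypothetical.
df.write
  .format("com.uber.hoodie")
  .option("hoodie.datasource.write.operation", "bulk_insert")
  .option("hoodie.datasource.write.recordkey.field", "id")
  .option("hoodie.datasource.write.precombine.field", "ts")
  .option("hoodie.table.name", "hoodie_test")
  .mode("Append")
  .save("/tmp/hoodie-test")
```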
