Did you try in Spark local mode? i.e.
> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-local.html
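For reference, Spark local mode is typically selected through the master URL; a minimal, hedged example of the relevant setting (`local[*]` runs everything in a single JVM using all available cores):

```
# spark-defaults.conf (equivalently, pass --master 'local[*]' to spark-submit)
spark.master    local[*]
```

Reproducing the hang in local mode rules out cluster-side causes before debugging the client itself.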
>
> Mike
>
> On Tue, Mar 6, 2018 at 7:14 PM, Ravi Kanth <ravikanth@gmail.com>
> wrote:
>
>> Mike,
>>
>> Can you clarify a bit on grabb
On Tue, Mar 6, 2018 at 8:52 AM, Ravi Kanth <ravikanth@gmail.com>
> wrote:
>
>>
>> Yes, I have debugged to find the root cause. Every logger before "table
>> = client.openTable(tableName);" executes fine, and it is exactly at the
>> point of opening the table
> code is hanging
> when the connection is lost (what line)?
>
> Mike
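One generic way to keep a call like the `openTable` above from hanging indefinitely is to run it under an explicit timeout, so the application can fail fast or retry when the connection is lost. Below is a minimal stdlib sketch: the blocking call is simulated, and none of the names are the actual Kudu API (the real client also exposes its own timeout configuration, which is preferable when available):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Sketch: bound a potentially hanging call (such as an "open table" style
// RPC) with an explicit timeout instead of blocking forever.
public class TimeoutSketch {

    static <T> T callWithTimeout(Callable<T> call, long millis) throws Exception {
        ExecutorService ex = Executors.newSingleThreadExecutor();
        try {
            // Future.get throws TimeoutException if the call does not finish.
            return ex.submit(call).get(millis, TimeUnit.MILLISECONDS);
        } finally {
            ex.shutdownNow(); // interrupt the worker if it is still blocked
        }
    }

    public static void main(String[] args) throws Exception {
        try {
            // Simulated hang: sleeps far longer than the 200 ms budget.
            callWithTimeout(() -> { Thread.sleep(60_000); return "table"; }, 200);
        } catch (TimeoutException e) {
            System.out.println("open timed out; caller can retry or fail fast");
        }
    }
}
```

The wrapper only bounds how long the caller waits; the underlying connection problem still has to be handled (retry, backoff, or surfacing the error).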
>
> On Mon, Mar 5, 2018 at 9:08 PM, Ravi Kanth <ravikanth@gmail.com>
> wrote:
>
>> In addition to my previous comment, I raised a support ticket for this
>> issue with Cloudera and one of the sup
ou -- can you be more specific about
> what you expect and what you are observing?
>
> Thanks,
> Mike
>
>
>
> On Mon, Feb 26, 2018 at 8:04 PM, Ravi Kanth <ravikanth@gmail.com>
> wrote:
>
>> Thanks, Clifford. We are running Kudu version 1.4. Till date
pshots. I imagine there is
> similar with Spark. Sorry I can't be of more help!
>
>
>
> On Feb 26, 2018 9:10 PM, Ravi Kanth <ravikanth@gmail.com> wrote:
>
> Cliff,
>
> Thanks for the response. Well, I do agree that it's simple and seamless. In
> my case, I am ab
> object, which should be able to identify the affected row that the write
> failed for. Does that help?
>
> Since you are using the Kudu client directly, Spark is not involved from
> the Kudu perspective, so you will need to deal with Spark on your own in
> that case.
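The batch pattern Mike describes — apply writes through a session, flush, then inspect per-row errors to identify exactly which rows failed — can be sketched with stand-in types. Everything below (`FakeSession`, `RowError`, the method names) is illustrative only, not the real Kudu client API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of batch-write error handling: apply rows, flush, then inspect
// per-row errors so each failed write can be identified and retried or
// logged. All types here are stand-ins, not the real Kudu client API.
public class BatchErrorSketch {

    static final class RowError {
        final String rowKey;
        final String reason;
        RowError(String rowKey, String reason) {
            this.rowKey = rowKey;
            this.reason = reason;
        }
    }

    // Stand-in for a buffering session; rows with an empty key "fail".
    static final class FakeSession {
        private final List<RowError> pending = new ArrayList<>();

        void apply(String rowKey) {
            if (rowKey.isEmpty()) pending.add(new RowError(rowKey, "empty key"));
        }

        List<RowError> flushAndGetErrors() {
            List<RowError> out = new ArrayList<>(pending);
            pending.clear();
            return out;
        }
    }

    static List<RowError> writeBatch(FakeSession session, List<String> keys) {
        for (String k : keys) session.apply(k);
        // After flushing, each RowError pinpoints the affected row, so the
        // caller can log, retry, or dead-letter exactly those writes.
        return session.flushAndGetErrors();
    }
}
```

The point of the pattern is that failures are reported per row rather than per batch, so a lost connection mid-batch does not leave the application guessing which writes went through.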
>
>
aware of the underlying error."
On 5 March 2018 at 21:02, Ravi Kanth <ravikanth@gmail.com> wrote:
> Mike,
>
> Thanks for the information. But once the connection to any of the Kudu
> servers is lost, there is no way I can have control over the
> KuduSession object