RE: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

2016-07-21 Thread Ravi Aggarwal
[mailto:ianoconn...@gmail.com] On Behalf Of Ian O'Connell
Sent: Wednesday, July 20, 2016 11:05 PM
To: Ravi Aggarwal
Cc: Ted Yu; user
Subject: Re: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

Ravi, did your issue ever get solved for this? I think I've be…

Re: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

2016-07-20 Thread Ian O'Connell
=> {
  val fieldCell = b.asInstanceOf[Cell]
  a :+ new String(fieldCell.getQualifierArray)
    .substring(fieldCell.getQualifierOffset,
               fieldCell.getQualifierLength + fieldCell.getQualifierOffset)
}
}
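The quoted fold pulls each qualifier name out of the Cell's shared backing array via getQualifierArray/getQualifierOffset/getQualifierLength. One detail worth flagging: decoding the whole backing array and then calling substring with byte offsets is only safe for single-byte data, and it materializes a String for the entire buffer; decoding just the slice avoids both issues. A minimal stand-in sketch in plain Java (the array contents and offsets are hypothetical, no HBase classes involved):

```java
import java.nio.charset.StandardCharsets;

public class QualifierDecode {
    // Decode only a slice of a shared backing array, the way HBase Cell
    // accessors (getQualifierArray/getQualifierOffset/getQualifierLength)
    // expose qualifier bytes. This avoids building a String for the whole
    // backing buffer and uses byte offsets directly, which is correct even
    // when earlier bytes in the buffer are not single-byte characters.
    static String sliceToString(byte[] backing, int offset, int length) {
        return new String(backing, offset, length, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Hypothetical backing array holding a row key and qualifier side by side.
        byte[] backing = "rowkey-colname".getBytes(StandardCharsets.UTF_8);
        System.out.println(sliceToString(backing, 7, 7)); // prints "colname"
    }
}
```

With HBase on the classpath, `Bytes.toString(array, offset, length)` or `CellUtil.cloneQualifier(cell)` express the same idea without manual offset arithmetic.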

RE: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

2016-06-14 Thread Ravi Aggarwal
…t-merge join. Can we deduce anything from this?
Thanks, Ravi
From: Ravi Aggarwal
Sent: Friday, June 10, 2016 12:31 PM
To: 'Ted Yu'
Cc: user
Subject: RE: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

Hi Ted, thanks for the reply. Here is the code. Btw…
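The snippet is comparing join strategies between 1.5.2 and 2.0. One commonly suggested diagnostic for this class of problem (a sketch only, not a fix confirmed anywhere in this thread): Spark broadcasts a join side it estimates to be smaller than `spark.sql.autoBroadcastJoinThreshold`, and a custom data source that misreports its size can push a large table into a broadcast hash join, which shows up as driver or executor OOM. Disabling auto-broadcast forces the planner back to sort-merge joins:

```
# spark-defaults.conf (diagnostic only; -1 disables automatic broadcast joins)
spark.sql.autoBroadcastJoinThreshold  -1
```

If the OOM disappears with this set, the size estimate reported by the source's relation is worth inspecting.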

RE: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

2016-06-10 Thread Ravi Aggarwal
CatalystTypeConverters.convertToScala(
  Cast(Literal(value._2), colDataType).eval(), colDataType)
}).toArray
Row(recordFields: _*)
}
rowRdd
}
}

Thanks, Ravi
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Thursday, June 9, 2016 7:56 PM
To: Ravi Aggarw
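The fragment converts each raw field to the column's expected type with Catalyst's `Cast(Literal(value), colDataType).eval()` before assembling a `Row`. A rough stand-in for that per-column cast step in plain Java (the type names and helper are hypothetical illustrations; the real code path goes through Catalyst's `Cast` and `CatalystTypeConverters`):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class LiteralCast {
    // Hypothetical stand-in for Cast(Literal(v), dataType).eval(): map a raw
    // string field to the JVM value the column's data type implies.
    static final Map<String, Function<String, Object>> CASTS = new HashMap<>();
    static {
        CASTS.put("IntegerType", Integer::valueOf);
        CASTS.put("LongType", Long::valueOf);
        CASTS.put("DoubleType", Double::valueOf);
        CASTS.put("StringType", s -> s);
    }

    static Object castTo(String rawValue, String dataTypeName) {
        Function<String, Object> cast = CASTS.get(dataTypeName);
        if (cast == null)
            throw new IllegalArgumentException("unsupported type: " + dataTypeName);
        return cast.apply(rawValue);
    }

    public static void main(String[] args) {
        System.out.println(castTo("42", "IntegerType"));  // prints 42
        System.out.println(castTo("3.14", "DoubleType")); // prints 3.14
    }
}
```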

Re: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2

2016-06-09 Thread Ted Yu
bq. Read data from HBase using custom DefaultSource (implemented using TableScan)

Did you use the DefaultSource from the hbase-spark module in the HBase master branch? If you wrote your own, mind sharing the related code? Thanks

On Thu, Jun 9, 2016 at 2:53 AM, raaggarw wrote:
> Hi,
>
> I was trying to p