Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
I am now getting the following error. I cross-checked my types and
corrected three of them, i.e. r26 --> String, r27 --> Timestamp,
r28 --> Timestamp. The error still persists.

scala> sc.textFile("/home/cdhuser/Desktop/Sdp_d.csv").map(_.split(",")).map
{ r =>
 | val upto_time = sdf.parse(r(23).trim);
 | calendar.setTime(upto_time);
 | val r23 = new java.sql.Timestamp(upto_time.getTime)
 | val insert_time = sdf.parse(r(27).trim)
 | calendar.setTime(insert_time)
 | val r27 = new java.sql.Timestamp(insert_time.getTime)
 | val last_upd_time = sdf.parse(r(28).trim)
 | calendar.setTime(last_upd_time)
 | val r28 = new java.sql.Timestamp(last_upd_time.getTime)
 | new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r(26).trim, r27, r28)
 | }.registerAsTable("sdp_d")

<console>:26: error: type mismatch;
 found   : Int
 required: Option[Int]
  new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r(26).trim, r27, r28)
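
The mismatch points at the sdp_d constructor: it evidently declares some
parameters as Option[Int] (presumably so those columns can be null), so the
plain Int produced by r(0).trim.toInt has to be wrapped. A minimal sketch of
the idea -- the helper name here is made up:

def toOptInt(s: String): Option[Int] =
  if (s.trim.isEmpty) None else Some(s.trim.toInt)  // None for blank fields

// then, e.g.: new sdp_d(toOptInt(r(0)), toOptInt(r(1)), r(2).trim, ...)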



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Akhil Das
It says sdp_d was not found; since it is a class, you need to instantiate
it once, like:

sc.textFile("derby.log").map(_.split(",")).map( r => {
  val upto_time = sdf.parse(r(23).trim);
  calendar.setTime(upto_time);
  val r23 = new java.sql.Timestamp(upto_time.getTime);

  val insert_time = sdf.parse(r(26).trim);
  calendar.setTime(insert_time);
  val r26 = new java.sql.Timestamp(insert_time.getTime);

  val last_upd_time = sdf.parse(r(27).trim);
  calendar.setTime(last_upd_time);
  val r27 = new java.sql.Timestamp(last_upd_time.getTime);

  new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r26, r27, r(28).trim)
  }).registerAsTable("sdp")

Thanks
Best Regards



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
The link has proved helpful. I have been able to load data, register it as
a table and perform simple queries. Thanks, Akhil!

Though I still look forward to knowing where I was going wrong with my
previous technique of extending the Product interface to overcome the case
class limit of 22 fields.
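
For reference, that technique usually looks like the sketch below: implement
the three abstract Product members by hand on a plain class (field names here
are hypothetical, and the same pattern scales past 22 fields). Per Petar's
answer further down the thread, though, Spark 1.x schema inference may still
insist on a case class.

class Wide(val id: Int, val name: String, val score: Double)
  extends Product with Serializable {
  private val fields = Array[Any](id, name, score)   // one slot per column
  override def productArity: Int = fields.length
  override def productElement(n: Int): Any = fields(n)
  override def canEqual(that: Any): Boolean = that.isInstanceOf[Wide]
}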



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Petar Zecevic


I believe your class needs to be defined as a case class (as I answered
on SO).
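
Under Scala 2.10 a case class tops out at 22 fields, which is presumably what
prompted the Product workaround in the first place. If a case class is indeed
required, one option is to split the record into nested case classes, which
Spark 1.x schema inference maps to struct-typed columns -- a sketch with
hypothetical names:

case class SdpIds(id: Int, code: Int, name: String)
case class SdpTimes(upto: java.sql.Timestamp, lastUpd: java.sql.Timestamp)
case class SdpRecord(ids: SdpIds, times: SdpTimes)  // columns: ids, times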



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anamika gupta
Hi Akhil

I guess it skipped my attention. I would definitely give it a try.

Though I would still like to know: what is the issue with the way I have
created the schema?



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread Akhil Das
Did you happen to have a look at
https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema
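
That section amounts to building the schema by hand, which sidesteps the
22-field ceiling entirely. A minimal sketch against the Spark 1.2 shell API,
with made-up column names (in 1.3+ the types move to
org.apache.spark.sql.types and applySchema becomes createDataFrame):

import org.apache.spark.sql._

val sqlContext = new SQLContext(sc)

// Describe each column explicitly instead of relying on a case class.
val schema = StructType(Seq(
  StructField("id", IntegerType, nullable = false),
  StructField("name", StringType, nullable = true)))

val rowRDD = sc.textFile("data.csv").map(_.split(","))
  .map(r => Row(r(0).trim.toInt, r(1).trim))

sqlContext.applySchema(rowRDD, schema).registerTempTable("sdp")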

Thanks
Best Regards



Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anu
My issue is posted here on Stack Overflow. What am I doing wrong?

http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Facing-error-while-extending-scala-class-with-Product-interface-to-overcome-limit-of-22-fields-in-spl-tp21787.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.