Oh, this is an internal class of our project; I had used it without
realizing where it came from.

Anyway, the idea is to wrap the InternalRow in a class that derives from
Row. When you implement the functions of the Row trait, the type
conversions from the InternalRow representations to the Row types have to
be done for each of the types. But, as far as I can see, the primitive
types (apart from String) don't need conversions. Map and Array would need
some handling.
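
Something along these lines, just a rough sketch and not the actual
WrappedInternalRow (the class name InternalRowAsRow and passing the schema
in are assumptions of mine):

import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.types.{StringType, StructType}

// Hypothetical wrapper: exposes an InternalRow as a Row, using the schema
// to decide which fields need conversion.
class InternalRowAsRow(internal: InternalRow, schema: StructType) extends Row {

  override def length: Int = internal.numFields

  override def get(i: Int): Any = {
    if (internal.isNullAt(i)) {
      null
    } else {
      schema(i).dataType match {
        // Strings are stored as UTF8String internally and need conversion.
        case StringType => internal.getUTF8String(i).toString
        // Primitives pass through unchanged; Map and Array come back as
        // MapData/ArrayData and would need extra handling here.
        case dt => internal.get(i, dt)
      }
    }
  }

  override def copy(): Row = Row.fromSeq((0 until length).map(get))
}

The Map and Array cases would have to convert ArrayData/MapData into Scala
collections using the element types from the schema, which is the handling
I mentioned above.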

I will check with the author of this code; I think it can be contributed
to Spark.

Hemant
www.snappydata.io
linkedin.com/company/snappydata

On Wed, Oct 7, 2015 at 3:30 PM, Ophir Cohen <oph...@gmail.com> wrote:

> Which jar does WrappedInternalRow come from?
> It seems that I can't find it.
>
> BTW
> What I'm trying to do now is to create a Scala array from the fields and
> then create a Row out of that array.
> The problem is that I get type mismatches...
>
> On Wed, Oct 7, 2015 at 8:03 AM, Hemant Bhanawat <hemant9...@gmail.com>
> wrote:
>
>> One approach would be to wrap your MutableRow in WrappedInternalRow, which
>> is a child class of Row.
>>
>> Hemant
>> www.snappydata.io
>> linkedin.com/company/snappydata
>>
>>
>> On Tue, Oct 6, 2015 at 3:21 PM, Ophir Cohen <oph...@gmail.com> wrote:
>>
>>> Hi Guys,
>>> I'm upgrading to Spark 1.5.
>>>
>>> In our previous version (Spark 1.3, but it was OK on 1.4 as well) we
>>> created a GenericMutableRow
>>> (org.apache.spark.sql.catalyst.expressions.GenericMutableRow) and returned
>>> it as an org.apache.spark.sql.Row.
>>>
>>> Starting from Spark 1.5, GenericMutableRow no longer extends Row.
>>>
>>> What do you suggest to do?
>>> How can I convert GenericMutableRow to Row?
>>>
>>> A prompt answer will be highly appreciated!
>>> Thanks,
>>> Ophir
>>>
>>
>>
>
