But can't I just use HiveContext and rely on Hive's functionality, which does
support subqueries?
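
For what it's worth, here is a minimal sketch of what I have in mind (assuming
Spark 1.x, an existing SparkContext `sc`, and that `mytable` is visible through
the Hive metastore). Whether the IN (subquery) predicate actually executes this
way depends on the Spark version, so treat it as something to try rather than a
guarantee:

    import org.apache.spark.sql.hive.HiveContext

    // Assumes Spark 1.x with an existing SparkContext `sc`,
    // and that `mytable` exists in the Hive metastore.
    val hiveContext = new HiveContext(sc)

    // HiveContext uses the HiveQL parser, but whether an IN (subquery)
    // predicate in the WHERE clause executes depends on the Spark version.
    val result = hiveContext.sql(
      "select * from mytable where column1 in (select max(column1) from mytable)")
    result.show()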

On Fri, Feb 26, 2016 at 4:28 PM, Mohammad Tariq <donta...@gmail.com> wrote:

> Spark doesn't support subqueries in the WHERE clause, IIRC. It supports
> subqueries only in the FROM clause as of now. See this ticket
> <https://issues.apache.org/jira/browse/SPARK-4226> for more on this.
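
Given that restriction, a sketch of one way to rewrite the query so the subquery
lives in the FROM clause instead of the WHERE clause; the column name comes from
the original question, while `sqlContext`, the alias `max_col1`, and the assumption
that `mytable` is registered as a table are mine:

    // Assumes a SQLContext `sqlContext` and that `mytable` is registered as a table.
    // The IN (subquery) predicate is rewritten as a join against a one-row derived
    // table, which keeps the subquery in the FROM clause.
    val rewritten = sqlContext.sql(
      """select t.*
        |from mytable t
        |join (select max(column1) as max_col1 from mytable) m
        |  on t.column1 = m.max_col1""".stripMargin)

    // Equivalent two-step DataFrame approach: compute the max first, then filter.
    import org.apache.spark.sql.functions.max
    val df = sqlContext.table("mytable")
    val maxVal = df.agg(max("column1")).first().get(0)
    val filtered = df.filter(df("column1") === maxVal)
    filtered.show()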
>
>
>
>
> Tariq, Mohammad
> about.me/mti
> <http://about.me/mti>
>
>
> On Fri, Feb 26, 2016 at 7:01 AM, ayan guha <guha.a...@gmail.com> wrote:
>
>> Why is this not working for you? Are you trying this on a DataFrame? What
>> error are you getting?
>>
>> On Thu, Feb 25, 2016 at 10:23 PM, Ashok Kumar <
>> ashok34...@yahoo.com.invalid> wrote:
>>
>>> Hi,
>>>
>>> What is the equivalent of this in Spark, please?
>>>
>>> select * from mytable where column1 in (select max(column1) from mytable)
>>>
>>> Thanks
>>>
>>
>>
>>
>> --
>> Best Regards,
>> Ayan Guha
>>
>
>


-- 
Best Regards,
Ayan Guha
