If you mean limiting the number of columns, you can select the columns
first and then describe:
df.select("col1", "col2").describe().show()

On Tue, Aug 2, 2016 at 6:39 AM, pseudo oduesp <pseudo20...@gmail.com> wrote:

> Hi,
> In Spark 1.5.0 I used the describe function with more than 100 columns.
> Can someone tell me if any limit exists now?
>
> Thanks
>
>
