Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-12-08 Thread Ruslan Dautkhanov
I got a lucky jira number :-)

https://issues.apache.org/jira/browse/ZEPPELIN-1777

Thank you Jeff.



-- 
Ruslan Dautkhanov

On Thu, Dec 8, 2016 at 10:50 PM, Jeff Zhang  wrote:

> hmm, I think so, please file a ticket for it.
>
>
>
> Ruslan Dautkhanov 于2016年12月9日周五 下午1:49写道:
>
>> Hi Jeff,
>>
>> When I made pySpark the default, it works as expected,
>> except for the Settings UI. See screenshot below.
>>
>> Notice it shows %spark twice:
>> the first time as default, the second time not.
>> It should have been %pyspark (default), %spark, ..
>> since I made pyspark the default.
>>
>> Is this a new bug in 0.7?
>>
>> [image: Inline image 1]
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Wed, Nov 30, 2016 at 7:34 PM, Jeff Zhang  wrote:
>>
>> Hi Ruslan,
>>
>> I missed another thing: you also need to delete the file conf/interpreter.json,
>> which stores the original settings. Otherwise the original settings are always
>> loaded.
>>
>>
>> Ruslan Dautkhanov 于2016年12月1日周四 上午1:03写道:
>>
>> Got it. Thanks Jeff.
>>
>> I've downloaded
>> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
>> and saved it to $ZEPPELIN_HOME/interpreter/spark/.
>> Then I moved  "defaultInterpreter": true,
>> from the JSON section with
>> "className": "org.apache.zeppelin.spark.SparkInterpreter",
>> to the section with
>> "className": "org.apache.zeppelin.spark.PySparkInterpreter",
>>
>> pySpark is still not the default.
>>
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Tue, Nov 29, 2016 at 10:36 PM, Jeff Zhang  wrote:
>>
>> No, you don't need to create that directory, it should be in
>> $ZEPPELIN_HOME/interpreter/spark
>>
>>
>>
>>
>> Ruslan Dautkhanov 于2016年11月30日周三 下午12:12写道:
>>
>> Thank you Jeff.
>>
>> Do I have to create the interpreter/spark directory in $ZEPPELIN_HOME/conf
>> or in the $ZEPPELIN_HOME directory?
>> So is zeppelin.interpreters in zeppelin-site.xml deprecated in 0.7?
>>
>> Thanks!
>>
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Tue, Nov 29, 2016 at 6:54 PM, Jeff Zhang  wrote:
>>
>> The default interpreter is now defined in interpreter-setting.json.
>>
>> You can update the following file to make pyspark the default
>> interpreter and then copy it to the folder interpreter/spark:
>>
>> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
>>
>>
>>
>> Ruslan Dautkhanov 于2016年11月30日周三 上午8:49写道:
>>
>> After the 0.6.2 -> 0.7 upgrade, pySpark isn't the default Spark interpreter,
>> despite our having org.apache.zeppelin.spark.PySparkInterpreter
>> listed first in zeppelin.interpreters.
>>
>> zeppelin.interpreters in zeppelin-site.xml:
>>
>> <property>
>>   <name>zeppelin.interpreters</name>
>>   <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter,
>>     ...</value>
>> </property>
>>
>>
>>
>> Any ideas how to fix this?
>>
>>
>> Thanks,
>> Ruslan
>>
>>
>>
>>
>>
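For readers skimming the thread: the fix discussed above amounts to moving the "defaultInterpreter": true flag inside interpreter-setting.json from the SparkInterpreter entry to the PySparkInterpreter entry, then deleting conf/interpreter.json so the cached settings are regenerated. A rough sketch of the edited fragment follows; only the "className" and "defaultInterpreter" keys are confirmed by the thread, the surrounding fields are assumed:

```json
[
  {
    "group": "spark",
    "name": "spark",
    "className": "org.apache.zeppelin.spark.SparkInterpreter",
    "properties": {}
  },
  {
    "group": "spark",
    "name": "pyspark",
    "className": "org.apache.zeppelin.spark.PySparkInterpreter",
    "defaultInterpreter": true,
    "properties": {}
  }
]
```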


Re: Export note as a PDF

2016-12-08 Thread Ruslan Dautkhanov
Thank you Hyunsung.

For various reasons we can't use ZeppelinHub.
One of them is that we have to run Zeppelin on-prem and can't depend on
any external resources.

I've created
https://issues.apache.org/jira/browse/ZEPPELIN-1774
"Export notebook as a pixel-perfect printable document, i.e. export as a
PDF"
Please vote it up if you would find it useful too.

Thank you.



-- 
Ruslan Dautkhanov

On Wed, Dec 7, 2016 at 10:32 PM, Hyunsung Jo  wrote:

> Hi Ruslan,
>
> Not aware of Zeppelin's roadmap, but perhaps the tagline of the
> ZeppelinHub website (www.zeppelinhub.com) hints at its feelings towards
> PDF:
> "ANALYZE, SHARE, AND REPEAT.
> Share your graphs and reports from Apache Zeppelin with anyone.
> Never send a graph in a PDF or Powerpoint again."
>
> Regards,
> Jo
>
>
> On Thu, Dec 8, 2016 at 2:00 PM Ruslan Dautkhanov 
> wrote:
>
>> Our users are looking for functionality similar to Jupyter's save
>> notebook as a PDF..
>> Is this in Zeppelin's roadmap somewhere?
>> Could not find any related JIRAs.
>>
>>
>> Thanks,
>> Ruslan Dautkhanov
>>
>
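As an on-prem workaround sketch (not a Zeppelin feature, and not what the thread settled on): the note.json that Zeppelin stores per notebook can be flattened into printable HTML and then printed to PDF from a browser. The "name"/"paragraphs"/"text" layout below is assumed from the 0.6/0.7-era note format:

```python
import html
import json

def note_to_html(note_json: str) -> str:
    """Render a Zeppelin note's paragraph sources as a printable HTML page.

    Assumes a note.json layout with a top-level "name" and a
    "paragraphs" list whose entries carry their source in "text".
    """
    note = json.loads(note_json)
    parts = [f"<h1>{html.escape(note.get('name', 'Untitled'))}</h1>"]
    for para in note.get("paragraphs", []):
        text = para.get("text")
        if text:  # skip empty paragraphs
            parts.append(f"<pre>{html.escape(text)}</pre>")
    return "<html><body>{}</body></html>".format("".join(parts))

# Example: a minimal note with one pyspark paragraph.
sample = json.dumps({
    "name": "demo",
    "paragraphs": [{"text": "%pyspark\nprint(sc.version)"}],
})
print(note_to_html(sample))
```

The resulting HTML can be saved to a file and printed to PDF with any browser, keeping everything on-prem.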


Re: Zeppelin becomes too slow after adding artifacts

2016-12-08 Thread karuppayya
There is an indicator shown next to the interpreter name on the
"Interpreters" page that indicates the status of dependency loading:
1. Spinner: something is in progress for the specified dependencies.
2. Green dot: all dependencies were downloaded successfully.
3. Red dot: some dependencies were not successfully added (clicking the red
dot will show the reason for the failure).
In which state is the dot when the paragraph is stuck?

On Wed, Dec 7, 2016 at 9:16 AM, Nabajyoti Dash 
wrote:

> I have tried rerunning it several times,
> but it didn't help. I allowed the paragraph to run for 15 minutes, but it
> was still in running mode.
> Once I remove the hbase-client package it runs fine again.
>
> Is there anything I am missing? I just added the jar to the artifact
> section in the Spark dependencies.
> Or something else?
>
>
>
> --
> View this message in context: http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Zeppeline-becomes-too-slow-after-adding-artifacts-tp4689p4693.html
> Sent from the Apache Zeppelin Users (incubating) mailing list archive at
> Nabble.com.
>