How to write styled printout in an application?
Hi, I am a complete beginner here, but this is an amazing project, thanks. I don't understand everything yet, so I'm sorry if this is a bit of a silly question.

I saw how to write some code in a Zeppelin notebook, and that the same code can also run as an application program, like this:

[in notebook] % hogehoge.scala

But in the application case, how should I write the code so it can use the various figures and so on? I could not find the relation between an application's code and the display procedure in the docs. :-b

Any info and advice is appreciated.
-Keiji
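One way to think about the question above: Zeppelin's display system keys off a magic prefix at the start of a paragraph's output (e.g. %table or %html), so any code a paragraph runs, including library code from your own application, can produce a table or chart just by printing such output. A minimal sketch in Python follows; the helper name `make_table` is hypothetical (not a Zeppelin API), and the tab-separated/newline-separated %table layout is assumed from Zeppelin's display-system documentation.

```python
def make_table(header, rows):
    """Build a string that Zeppelin's display system renders as a
    sortable table (and offers chart views for).

    Format (per the Zeppelin display-system docs, assumed here):
    '%table' on the first line, then tab-separated columns,
    newline-separated rows.
    """
    lines = ["\t".join(header)]
    for row in rows:
        lines.append("\t".join(str(v) for v in row))
    return "%table\n" + "\n".join(lines)


# Printing this inside a Zeppelin paragraph would render a table
# instead of plain text:
out = make_table(["name", "value"], [("a", 1), ("b", 2)])
print(out)
```

The same idea applies in Scala: a function in your application jar can return or print a `%table ...` string, and the notebook paragraph that calls it gets the rendered figure.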
NPE in SparkInterpreter.java
In the code I see:

// Some case, scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call throws an NPE

We had issues where a Zeppelin note errored out immediately without any hint messages, and we are seeing this NPE in the spark interpreter log (Zeppelin 0.7.1 on an AWS EMR cluster). Can someone explain the likely cause for this error and how to prevent it from happening again?

Thanks,
Re: Service 'sparkDriver' could not bind on port with dockerized zeppelin
Solved this. It appears that 'spark.driver.bindAddress' should point to the docker container IP, while 'spark.driver.host' should point to the outer host IP. Thanks anyway.

2017-06-22 19:16 GMT+03:00 Ivan Shapovalov:
> Help needed.
>
> 1. got zeppelin running in a docker container
> 2. got remote spark standalone cluster I want to run paragraphs against
> [...]

--
Ivan Shapovalov
Kharkov, Ukraine
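The fix described above can be sketched as interpreter/spark-defaults properties. This is a hedged reconstruction from the thread, not an official recipe: the container IP, port, and placeholder host value are the examples discussed here, so substitute your own.

```properties
# Driver runs inside a Docker container; executors are outside.
spark.master               spark://$MASTER_IP:7077

# Address the driver binds to *inside* the container
# (the container's own IP, or 0.0.0.0):
spark.driver.bindAddress   172.18.0.2

# Address executors use to reach the driver from outside
# (the outer host the container's port is forwarded to):
spark.driver.host          <outer-host-IP>

# Fixed port so the docker port-forwarding rule matches:
spark.driver.port          40099
```

The design point is that bind address and advertised address differ whenever the driver sits behind NAT: binding must use an address that exists inside the container, while the advertised host must be reachable from the cluster.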
Re: Service 'sparkDriver' could not bind on port with dockerized zeppelin
Zeppelin version is 0.7.2.

2017-06-22 19:16 GMT+03:00 Ivan Shapovalov:
> Help needed.
>
> 1. got zeppelin running in a docker container
> 2. got remote spark standalone cluster I want to run paragraphs against
> [...]

--
Ivan Shapovalov
Kharkov, Ukraine
Service 'sparkDriver' could not bind on port with dockerized zeppelin
Help needed.

1. got zeppelin running in a docker container
2. got remote spark standalone cluster I want to run paragraphs against

I have created a setting with:
- master - $MASTER_IP/7077
- 'spark.driver.host' - IP of the docker container, 172.18.0.2
- 'spark.driver.port' - a free port number (I have scanned the range of forwarded ports 4-40100 for a free one), 40099
- 'spark.driver.bindAddress' - host IP address

I can see the following in the logs when trying to run a paragraph:

WARN [2017-06-22 15:10:21,827] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 40114. Attempting port 40115.
ERROR [2017-06-22 15:10:21,835] ({pool-2-thread-2} Logging.scala[logError]:91) - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 40099)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

But when I remove 'spark.driver.bindAddress', the paragraph job is submitted successfully, yet apparently the cluster cannot see the driver:

Caused by: java.io.IOException: Failed to connect to /172.18.0.2:40099

Please help, any ideas are more than appreciated.
Thanks in advance

--
Ivan Shapovalov
Kharkov, Ukraine
Re: Notebook not responding
Doubt that. I am using a test data set. After doing Clear Output, I ran all paragraphs and did not see any issue.

BTW, when I ran all paragraphs, I expected them to run in sequence. However, the later ones ran ahead before the main script finished, thereby running into errors. Is there any configuration that says to run all paragraphs in sequence?

-Thanks
Nikhil

On Thu, Jun 22, 2017 at 11:44 AM, Sachin Janani wrote:
> You might have executed a query which returned a large result set that
> caused this issue.
>
> Regards,
> Sachin Janani
> [...]
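One workaround for the sequencing question above is to drive paragraphs one at a time from outside, since Zeppelin's REST API can run a single paragraph synchronously. This is only a sketch: the endpoint path and the base URL below are assumptions based on the Zeppelin 0.7 REST API docs, so verify them against your version before relying on this.

```python
import urllib.request

BASE = "http://localhost:8080"  # assumed Zeppelin server address


def run_paragraph_url(note_id, paragraph_id):
    """URL for the (assumed) synchronous run-paragraph endpoint:
    a POST here runs one paragraph and blocks until it finishes."""
    return f"{BASE}/api/notebook/run/{note_id}/{paragraph_id}"


def run_all_in_sequence(note_id, paragraph_ids):
    """Run each paragraph synchronously, in order, so a later
    paragraph never starts before the previous one has finished."""
    for pid in paragraph_ids:
        req = urllib.request.Request(
            run_paragraph_url(note_id, pid), method="POST")
        with urllib.request.urlopen(req) as resp:
            if resp.status != 200:
                raise RuntimeError(
                    f"paragraph {pid} failed with HTTP {resp.status}")
```

Because each POST blocks until the paragraph completes, ordering is enforced by the caller rather than by the notebook's scheduler.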
Re: Notebook not responding
Phew. I was able to recover by selecting "Clear Output" on the main page. After that I was able to open the notebook.

Thanks
Nikhil

On Thu, Jun 22, 2017 at 11:26 AM, Nikhil Utane wrote:
> Hi,
>
> I am having a serious issue.
> All of a sudden my notebook has stopped responding. The page doesn't load
> in any browser. I am able to open the Tutorial notebooks, though.
> I have restarted the zeppelin daemon but it is not helping.
> In the logs, I only see New Connection and Connection Closed messages.
> The last change I made was in a paragraph where I added z.show() for two
> separate tables. I don't know if that has anything to do with it or not.
>
> Please help!!
>
> -Regards
> Nikhil