[jira] [Updated] (DLAB-321) [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found'
[ https://issues.apache.org/jira/browse/DLAB-321?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vira Vitanska updated DLAB-321:
-------------------------------
    Labels: Debian DevOps  (was: DevOps)
    Component/s: GCP

> [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster
> kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem
> not found'
> ---------------------------------------------------------------------------
>
>                 Key: DLAB-321
>                 URL: https://issues.apache.org/jira/browse/DLAB-321
>             Project: Apache DLab
>          Issue Type: Bug
>          Components: GCP
>            Reporter: Vira Vitanska
>            Assignee: Demyan Mysakovets
>            Priority: Critical
>              Labels: Debian, DevOps
>             Fix For: v.2.2
>
>         Attachments: GCP.PNG, GCP_autotest.PNG
>
>
> *Steps to reproduce:*
> # Run the autotest on GCP for Data Engine, or create a Spark cluster on
> Jupyter/Zeppelin/RStudio and run a playbook manually
>
> *Actual result:*
> Playbook running fails with the error:
> !GCP_autotest.PNG!
> [AutoTests_GCP/91/console|http://35.166.222.81/view/AutoTests/job/AutoTests_GCP/91/console]
>
> *Expected result:*
> The autotest runs successfully

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---
To unsubscribe, e-mail: dev-unsubscr...@dlab.apache.org
For additional commands, e-mail: dev-h...@dlab.apache.org
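[Editorial note: the 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found' error typically means the GCS connector jar is not on the Spark cluster's classpath. A minimal sketch of `spark-defaults.conf` entries that would register it — the jar path is an assumption for illustration, and the property names come from the public gcs-connector documentation, not from this report:]

```
# Assumed location of the GCS connector jar; it must exist on every node
spark.jars                                  /opt/spark/jars/gcs-connector-hadoop2-latest.jar

# Register the GCS filesystem implementations with Hadoop
spark.hadoop.fs.gs.impl                     com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem
spark.hadoop.fs.AbstractFileSystem.gs.impl  com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
```

[Whether DLab's provisioning playbooks install the connector this way is not stated in the report; this only illustrates the class of fix such a ClassNotFound error usually calls for.]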
[jira] [Updated] (DLAB-321) [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found'
[ https://issues.apache.org/jira/browse/DLAB-321?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vira Vitanska updated DLAB-321:
-------------------------------
    Summary: [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found'
    (was: [GCP]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found')

> [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster
> kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem
> not found'
> ---------------------------------------------------------------------------
>
>                 Key: DLAB-321
>                 URL: https://issues.apache.org/jira/browse/DLAB-321
>             Project: Apache DLab
>          Issue Type: Bug
>            Reporter: Vira Vitanska
>            Assignee: Demyan Mysakovets
>            Priority: Critical
>              Labels: DevOps
>             Fix For: v.1.1
>
>         Attachments: GCP.PNG, GCP_autotest.PNG
>
>
> *Steps to reproduce:*
> # Run the autotest on GCP for Data Engine, or create a Spark cluster on
> Jupyter/Zeppelin/RStudio and run a playbook manually
>
> *Actual result:*
> Playbook running fails with the error:
> !GCP_autotest.PNG!
> [AutoTests_GCP/91/console|http://35.166.222.81/view/AutoTests/job/AutoTests_GCP/91/console]
>
> *Expected result:*
> The autotest runs successfully