subscribe
subscribe
subscribe
subscribe
Test | mojianan2015 | mojianan2...@163.com
Fwd: (send this email to subscribe)
-- Forwarded message -- From: Madhuchaitanya Joshi Date: Wed, 19 Jan 2022, 10:51 Subject: (send this email to subscribe) To: Hello team, I am trying to build and compile the Spark source code using IntelliJ and Eclipse, but I am getting a jackson-bind.jar not found error in IntelliJ. I tried Generate Sources and Update Folders, mvn clean compile, and rebuilding the project; I also tried invalidating caches, but it is still not working. Please help me with this. I want to build and compile the Spark source code in Eclipse/IntelliJ so I can understand the flow of the code. Thanks and regards, Madhuchaitanya Joshi
Re: subscribe user@spark.apache.org
You have to sign up by sending an email - see http://spark.apache.org/community.html for what to send where. On Tue, Jan 19, 2021 at 12:25 PM Peter Podlovics < peter.d.podlov...@gmail.com> wrote: > Hello, > > I would like to subscribe to the above mailing list. I already tried > subscribing through the webpage, but I still haven't received the email yet. > > Thanks, > Peter >
subscribe user@spark.apache.org
Hello, I would like to subscribe to the above mailing list. I already tried subscribing through the webpage, but I still haven't received the email yet. Thanks, Peter
[Spark Streaming] unsubscribe and subscribe kafka topic without stopping context
Hi All, I am using Spark Streaming with Kafka, and I am working on an application which requires me to process data based on user requests. I have created 2 DStreams, i.e. one per topic, but I am unable to stop processing data on a user request or re-enable processing afterwards. I have asked the question on Stack Overflow as well: https://stackoverflow.com/questions/65285184/spark-subscribe-and-unsubscribe-topic-from-spark-streaming-without-requiring-res Any help here will be appreciated. Thanks, Sanjay
subscribe
subscribe
subscribe user@spark.apache.org
I want to subscribe to user@spark.apache.org. Thanks a lot!
subscribe
Subscribe
- To unsubscribe e-mail: user-unsubscr...@spark.apache.org
subscribe
subscribe
subscribe
| Genieliu | feixiang...@163.com | China | Signature customized by NetEase Mail Master
SUBSCRIBE
subscribe
subscribe
subscribe
subscribe
Subscribe
(send this email to subscribe)
-- André Garcia Carneiro Software Engineer (11)982907780
Subscribe
subscribe
subscribe
subscribe
-- Regards, Sushil
Re: Subscribe Multiple Topics Structured Streaming
I would like to know how to create the stream and sink operations outside the "main" method - as separate classes I can invoke from main - so that I can keep a different implementation for each subscribed topic in its own class file. Is that good practice, or should the whole implementation always go inside "main"? On Mon, Sep 17, 2018 at 11:35 PM naresh Goud wrote: > You can have below statement for multiple topics > > val dfStatus = spark.readStream. > format("kafka"). > option("subscribe", "utility-status, utility-critical"). > option("kafka.bootstrap.servers", "localhost:9092"). > option("startingOffsets", "earliest") > .load() > > -- > Thanks, > Naresh > www.linkedin.com/in/naresh-dulam > http://hadoopandspark.blogspot.com/ -- - Prakash.
Re: Subscribe Multiple Topics Structured Streaming
You can use the statement below for multiple topics:

val dfStatus = spark.readStream.
  format("kafka").
  option("subscribe", "utility-status, utility-critical").
  option("kafka.bootstrap.servers", "localhost:9092").
  option("startingOffsets", "earliest")
  .load()

On Mon, Sep 17, 2018 at 3:28 AM sivaprakash wrote: > Hi > > I have integrated Spark Streaming with Kafka in which Im listening 2 topics > > def main(args: Array[String]): Unit = { > > val schema = StructType( > List( > StructField("gatewayId", StringType, true), > StructField("userId", StringType, true) > ) > ) > > val spark = SparkSession > .builder > .master("local[4]") > .appName("DeviceAutomation") > .getOrCreate() > > val dfStatus = spark.readStream. > format("kafka"). > option("subscribe", "utility-status, utility-critical"). > option("kafka.bootstrap.servers", "localhost:9092"). > option("startingOffsets", "earliest") > .load() > > } > > Since I have few more topics to be listed and perform different operations I > would like to move each topics into separate case class for better clarity. > Is it possible? > > -- > Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/ -- Thanks, Naresh www.linkedin.com/in/naresh-dulam http://hadoopandspark.blogspot.com/
Subscribe Multiple Topics Structured Streaming
Hi, I have integrated Spark Streaming with Kafka, in which I'm listening to 2 topics:

def main(args: Array[String]): Unit = {

  val schema = StructType(
    List(
      StructField("gatewayId", StringType, true),
      StructField("userId", StringType, true)
    )
  )

  val spark = SparkSession
    .builder
    .master("local[4]")
    .appName("DeviceAutomation")
    .getOrCreate()

  val dfStatus = spark.readStream.
    format("kafka").
    option("subscribe", "utility-status, utility-critical").
    option("kafka.bootstrap.servers", "localhost:9092").
    option("startingOffsets", "earliest")
    .load()
}

Since I have a few more topics to listen to, each needing different operations, I would like to move each topic into a separate case class for better clarity. Is that possible? -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
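The design question in this thread (keeping per-topic logic out of "main") can be sketched without Spark at all: subscribe to all topics on one stream, then dispatch each record on its topic field to a handler defined elsewhere. Below is a minimal pure-Python sketch of that dispatch pattern; the handler functions and their transformations are hypothetical, and in real Spark the equivalent dispatch would filter on the Kafka source's topic column.

```python
# Sketch: route records to per-topic handlers defined outside main().
# Handlers and their transformations are illustrative, not from the thread.

def handle_status(record):
    # Hypothetical per-topic logic for the status topic.
    return ("status", record["value"].upper())

def handle_critical(record):
    # Hypothetical per-topic logic for the critical topic.
    return ("critical", "ALERT:" + record["value"])

HANDLERS = {
    "utility-status": handle_status,
    "utility-critical": handle_critical,
}

def route(record):
    # Kafka source rows carry a topic field; dispatch on it.
    handler = HANDLERS.get(record["topic"])
    if handler is None:
        raise ValueError("no handler for topic %r" % record["topic"])
    return handler(record)

records = [
    {"topic": "utility-status", "value": "ok"},
    {"topic": "utility-critical", "value": "overheat"},
]
results = [route(r) for r in records]
print(results)
```

Each handler lives in its own module or class, and "main" only builds the stream and the handler table, which is the separation the question asks about.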
Re: Spark on Mesos: Spark issuing hundreds of SUBSCRIBE requests / second and crashing Mesos
That does sound like it could be it - I checked our libmesos version and it is 1.4.1. I'll try upgrading libmesos. Thanks. On Mon, Jul 23, 2018 at 12:13 PM Susan X. Huynh wrote: > Hi Nimi, > > This sounds similar to a bug I have come across before. See: > https://jira.apache.org/jira/browse/SPARK-22342?focusedCommentId=16429950&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16429950 > > It turned out to be a bug in libmesos (the client library used to > communicate with Mesos): "using a failoverTimeout of 0 with Mesos native > scheduler client can result in infinite subscribe loop" ( > https://issues.apache.org/jira/browse/MESOS-8171). It can be fixed by > upgrading to a version of libmesos that has the fix. > > Susan > > > On Fri, Jul 13, 2018 at 3:39 PM, Nimi W wrote: > >> I've come across an issue with Mesos 1.4.1 and Spark 2.2.1. We launch >> Spark tasks using the MesosClusterDispatcher in cluster mode. On a couple >> of occasions, we have noticed that when the Spark Driver crashes (to >> various causes - human error, network error), sometimes, when the Driver is >> restarted, it issues a hundreds of SUBSCRIBE requests to mesos / per second >> up until the Mesos Master node gets overwhelmed and crashes. It does this >> again to the next master node, over and over until it takes down all the >> master nodes. Usually the only thing that will fix is manually stopping the >> driver and restarting. >> >> Here is a snippet of the log of the mesos master, which just logs the >> repeated SUBSCRIBE command: >> https://gist.github.com/nemosupremo/28ef4acfd7ec5bdcccee9789c021a97f >> >> Here is the output of the spark framework: >> https://gist.github.com/nemosupremo/d098ef4def28ebf96c14d8f87aecd133 which >> also just repeats 'Transport endpoint is not connected' over and over. >> >> Thanks for any insights >> >> >> > > > -- > Susan X. Huynh > Software engineer, Data Agility > xhu...@mesosphere.com >
Re: Spark on Mesos: Spark issuing hundreds of SUBSCRIBE requests / second and crashing Mesos
Hi Nimi, This sounds similar to a bug I have come across before. See: https://jira.apache.org/jira/browse/SPARK-22342?focusedCommentId=16429950&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16429950 It turned out to be a bug in libmesos (the client library used to communicate with Mesos): "using a failoverTimeout of 0 with Mesos native scheduler client can result in infinite subscribe loop" (https://issues.apache.org/jira/browse/MESOS-8171). It can be fixed by upgrading to a version of libmesos that has the fix. Susan On Fri, Jul 13, 2018 at 3:39 PM, Nimi W wrote: > I've come across an issue with Mesos 1.4.1 and Spark 2.2.1. We launch > Spark tasks using the MesosClusterDispatcher in cluster mode. On a couple > of occasions, we have noticed that when the Spark Driver crashes (to > various causes - human error, network error), sometimes, when the Driver is > restarted, it issues a hundreds of SUBSCRIBE requests to mesos / per second > up until the Mesos Master node gets overwhelmed and crashes. It does this > again to the next master node, over and over until it takes down all the > master nodes. Usually the only thing that will fix is manually stopping the > driver and restarting. > > Here is a snippet of the log of the mesos master, which just logs the > repeated SUBSCRIBE command: https://gist.github.com/nemosupremo/28ef4acfd7ec5bdcccee9789c021a97f > > Here is the output of the spark framework: https://gist.github.com/nemosupremo/d098ef4def28ebf96c14d8f87aecd133 which also just > repeats 'Transport endpoint is not connected' over and over. > > Thanks for any insights -- Susan X. Huynh Software engineer, Data Agility xhu...@mesosphere.com
Spark on Mesos: Spark issuing hundreds of SUBSCRIBE requests / second and crashing Mesos
I've come across an issue with Mesos 1.4.1 and Spark 2.2.1. We launch Spark tasks using the MesosClusterDispatcher in cluster mode. On a couple of occasions, we have noticed that when the Spark Driver crashes (due to various causes - human error, network error), sometimes, when the Driver is restarted, it issues hundreds of SUBSCRIBE requests per second to Mesos until the Mesos master node gets overwhelmed and crashes. It does this again to the next master node, over and over, until it takes down all the master nodes. Usually the only thing that will fix it is manually stopping the driver and restarting. Here is a snippet of the log of the Mesos master, which just logs the repeated SUBSCRIBE command: https://gist.github.com/nemosupremo/28ef4acfd7ec5bdcccee9789c021a97f Here is the output of the Spark framework: https://gist.github.com/nemosupremo/d098ef4def28ebf96c14d8f87aecd133 which also just repeats 'Transport endpoint is not connected' over and over. Thanks for any insights
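For context on why a failoverTimeout of 0 produces a request storm: without any delay between attempts, the client re-issues SUBSCRIBE as fast as each connection attempt fails. A well-behaved reconnect loop spaces retries with capped exponential backoff. A minimal sketch follows; the base delay, cap, and attempt count are illustrative values, not libmesos's actual policy.

```python
def backoff_delays(base=1.0, cap=60.0, attempts=8):
    """Return the delays (in seconds) a capped exponential backoff
    would wait between successive reconnect attempts."""
    delays = []
    delay = base
    for _ in range(attempts):
        delays.append(delay)
        delay = min(delay * 2, cap)  # double, but never exceed the cap
    return delays

print(backoff_delays())
```

With a zero initial delay (the failoverTimeout-of-0 case in MESOS-8171), every retry fires immediately, which is exactly the flood of SUBSCRIBE requests seen in the master logs above.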
subscribe
Re: (send this email to subscribe)
You can run a Spark app on Dataproc, which is Google's managed Spark and Hadoop service: https://cloud.google.com/dataproc/docs/

Basically, you:
* assemble a jar
* create a cluster
* submit a job to that cluster (with the jar)
* delete the cluster when the job is done

Before all that, one has to create a Cloud Platform project and enable billing and the Dataproc API - but all this is explained in the docs. Cheers, Dinko On 4 January 2017 at 17:34, Anahita Talebi wrote: > > To whom it might concern, > > I have a question about running a spark code on Google cloud. > > Actually, I have a spark code and would like to run it using multiple > machines on Google cloud. Unfortunately, I couldn't find a good > documentation about how to do it. > > Do you have any hints which could help me to solve my problem? > > Have a nice day, > > Anahita
Fwd: (send this email to subscribe)
To whom it may concern, I have a question about running Spark code on Google Cloud. I have a Spark application and would like to run it on multiple machines on Google Cloud, but unfortunately I couldn't find good documentation about how to do it. Do you have any hints which could help me solve my problem? Have a nice day, Anahita
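The four steps in the reply above map onto a short sequence of gcloud invocations. The sketch below only assembles the command strings; the cluster name, region, and jar path are placeholders, and the exact flags should be checked against the current Dataproc docs before use.

```python
# Assemble the gcloud command sequence for running a Spark jar on Dataproc.
# Names, region, and jar path are hypothetical placeholders.
cluster = "my-cluster"
region = "us-central1"
jar = "gs://my-bucket/app.jar"

commands = [
    # create a cluster
    f"gcloud dataproc clusters create {cluster} --region {region}",
    # submit the assembled jar as a Spark job
    f"gcloud dataproc jobs submit spark --cluster {cluster} --region {region} --jar {jar}",
    # delete the cluster when the job is done
    f"gcloud dataproc clusters delete {cluster} --region {region}",
]
print("\n".join(commands))
```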
Spark subscribe
Hi, can you please add me to the Spark subscription list? Regards, Pradeep S
Re: confirm subscribe to user@spark.apache.org
Re: Subscribe
Please subscribe via the mailing list as described here: http://beam.incubator.apache.org/use/mailing-lists/ On Mon, Sep 26, 2016, 12:11 Lakshmi Rajagopalan wrote: > >
Subscribe
subscribe
subscribe
Subscribe
Subscribe
Re: subscribe
To subscribe, please go to http://spark.apache.org/community.html to join the mailing list. On Fri, Jan 8, 2016 at 3:58 AM Jeetendra Gangele wrote: > >
subscribe
subscribe
Re: subscribe
You should email users-subscr...@kafka.apache.org if you are trying to subscribe. On 3 January 2016 at 11:52, Rajdeep Dua wrote: > >
subscribe
Re: (send this email to subscribe)
To subscribe to the list, you need to send a mail to user-subscr...@spark.apache.org (see http://spark.apache.org/community.html for details and a subscribe link). On Wed, Nov 18, 2015 at 11:23 AM, Alex Luya wrote: > >
(send this email to subscribe)
subscribe
subscribe
Re: subscribe
https://www.youtube.com/watch?v=umDr0mPuyQc On Sat, Aug 22, 2015 at 8:01 AM, Ted Yu wrote: > See http://spark.apache.org/community.html > > Cheers > > On Sat, Aug 22, 2015 at 2:51 AM, Lars Hermes < > li...@hermes-it-consulting.de> wrote: > >> subscribe >> >> - >> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org >> For additional commands, e-mail: user-h...@spark.apache.org >> >> >
Re: subscribe
See http://spark.apache.org/community.html Cheers On Sat, Aug 22, 2015 at 2:51 AM, Lars Hermes wrote: > subscribe > > - > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org > For additional commands, e-mail: user-h...@spark.apache.org > >
subscribe
subscribe - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Subscribe
Re: subscribe
https://www.youtube.com/watch?v=H07zYvkNYL8 On Mon, Aug 10, 2015 at 10:55 AM, Ted Yu wrote: > Please take a look at the first section of > https://spark.apache.org/community > > Cheers > > On Mon, Aug 10, 2015 at 10:54 AM, Phil Kallos > wrote: > >> please >> > >
Re: subscribe
Please take a look at the first section of https://spark.apache.org/community Cheers On Mon, Aug 10, 2015 at 10:54 AM, Phil Kallos wrote: > please >
subscribe
please
Re: subscribe
See http://spark.apache.org/community.html Cheers > On Aug 5, 2015, at 10:51 PM, Franc Carter > wrote: > > subscribe
Re: subscribe
Welcome aboard! Thanks Best Regards On Thu, Aug 6, 2015 at 11:21 AM, Franc Carter wrote: > subscribe >
subscribe
subscribe
Re: Subscribe
Please email user-subscr...@spark.apache.org > On Apr 8, 2015, at 6:28 AM, Idris Ali wrote: > > - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Subscribe
Re: (send this email to subscribe)
Please send email to user-subscr...@spark.apache.org On Mon, Apr 6, 2015 at 6:52 AM, 林晨 wrote: > >
(send this email to subscribe)
Re: Subscribe
Send email to user-subscr...@spark.apache.org Cheers On Fri, Jan 16, 2015 at 11:51 AM, Andrew Musselman < andrew.mussel...@gmail.com> wrote: > >
Subscribe
Re: (send this email to subscribe)
There is no need to include user@spark.apache.org in subscription request. FYI On Fri, Jan 2, 2015 at 7:36 AM, Pankaj wrote: > >
(send this email to subscribe)
Re: subscribe me to the list
Hi Ningjun Please send email to this address to get subscribed: user-subscr...@spark.apache.org On Dec 5, 2014, at 10:36 PM, Wang, Ningjun (LNG-NPV) wrote: > I would like to subscribe to the user@spark.apache.org > > > Regards, > > Ningjun Wang > Consulting Software Engineer > LexisNexis > 121 Chanlon Road > New Providence, NJ 07974-1541
subscribe me to the list
I would like to subscribe to user@spark.apache.org Regards, Ningjun Wang Consulting Software Engineer LexisNexis 121 Chanlon Road New Providence, NJ 07974-1541
subscribe
subscribe
subscribe
Re: confirm subscribe to user@spark.apache.org
On Fri, Jul 11, 2014 at 3:11 PM, wrote: > Hi! This is the ezmlm program. I'm managing the user@spark.apache.org mailing list. > > To confirm that you would like > veera...@gmail.com > added to the user mailing list, please send a short reply to this address: > user-sc.1405116686.kijenhjamnjaodhflpgc-veeran54=gmail@spark.apache.org > > Usually, this happens when you just hit the "reply" button. If this does not work, simply copy the address and paste it into the "To:" field of a new message. > > This confirmation serves two purposes. First, it verifies that I am able to get mail through to you. Second, it protects you in case someone forges a subscription request in your name. > > Please note that ALL Apache dev- and user- mailing lists are publicly archived. Do familiarize yourself with Apache's public archive policy at > > http://www.apache.org/foundation/public-archives.html > > prior to subscribing and posting messages to user@spark.apache.org. If you're not sure whether or not the policy applies to this mailing list, assume it does unless the list name contains the word "private" in it. > > Some mail programs are broken and cannot handle long addresses. If you cannot reply to this request, instead send a message to and put the entire address listed above into the "Subject:" line. > > --- Administrative commands for the user list --- > > I can handle administrative requests automatically. Please do not send them to the list address! Instead, send your message to the correct command address: > > To subscribe to the list, send a message to: > > To remove your address from the list, send a message to: > > Send mail to the following for info and FAQ for this list: > > Similar addresses exist for the digest list: > > To get messages 123 through 145 (a maximum of 100 per request), mail: > > To get an index with subject and author for messages 123-456, mail: > > They are always returned as sets of 100, max 2000 per request, so you'll actually get 100-499. > > To receive all messages with the same subject as message 12345, send a short message to: > > The messages should contain one line or word of text to avoid being treated as sp@m, but I will ignore their content. Only the ADDRESS you send to is important. > > You can start a subscription for an alternate address, for example "john@host.domain", just add a hyphen and your address (with '=' instead of '@') after the command word: > > To stop subscription for this address, mail: > > In both cases, I'll send a confirmation message to that address. When you receive it, simply reply to it to complete your subscription. > > If despite following these instructions, you do not get the desired results, please contact my owner at user-ow...@spark.apache.org. Please be patient, my owner is a lot slower than I am ;-)
Re: subscribe
send this to 'user-request', not 'user' 2014-03-10 17:32 GMT+08:00 hequn cheng : > hi >
subscribe
hi
subscribe
hi