Mailing lists matching spark.apache.org
commits@spark.apache.org
dev@spark.apache.org
issues@spark.apache.org
reviews@spark.apache.org
user@spark.apache.org
dev-subscr...@spark.apache.org
dev-subscr...@spark.apache.org
subscribe user@spark.apache.org
I want to subscribe to user@spark.apache.org. Thanks a lot.
[jira] [Updated] (SPARK-42642) Make Python the first code example tab
be the default language in code examples so this makes Python the first code example tab consistently across the documentation, where applicable. This is continuing the work started with: https://issues.apache.org/jira/browse/SPARK-42493 where these two pages were updated: [https://spark.apache.org
[no subject]
unsubscribe - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
subscribe
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
[no subject]
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
I want to subscribe to mailing lists
u...@spark.apache.org d...@spark.apache.org
Re: confirm subscribe to user@spark.apache.org
- To unsubscribe e-mail: user-unsubscr...@spark.apache.org
[1/2] spark-website git commit: Replace most http links with https as a best practice, where possible
xml index bc93fb7..eb4e705 100644 --- a/site/sitemap.xml +++ b/site/sitemap.xml @@ -6,698 +6,698 @@ http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"> - http://spark.apache.org/ + https://spark.apache.org/ daily 1.0 - http://spark.apache.org/docs/latest/inde
Inbox (4) | New Cloud Notification
Dear User4 New documents assigned to 'COMMITS@SPARK.APACHE.ORG' are available on SPARK.APACHE.ORG CLOUD. Click here to retrieve document(s) now. Powered by SPARK.APACHE.ORG CLOUD SERVICES. Unfortunately, this email is an automated notification, which is unable to receive replies
Inbox (4) | New Cloud Notification
Dear User4 New documents assigned to 'ISSUES@SPARK.APACHE.ORG' are available on SPARK.APACHE.ORG CLOUD. Click here to retrieve document(s) now. Powered by SPARK.APACHE.ORG CLOUD SERVICES. Unfortunately, this email is an automated notification, which is unable to receive replies
Inbox (2) | New Cloud Notification
Dear User2 New documents assigned to 'commits@spark.apache.org' are available on spark.apache.org Cloud. Click here to retrieve document(s) now. Powered by spark.apache.org Cloud Services. Unfortunately, this email is an automated notification, which is unable to receive replies
Inbox (2) | New Cloud Notification
Dear User2 New documents assigned to 'issues@spark.apache.org' are available on spark.apache.org Cloud. Click here to retrieve document(s) now. Powered by spark.apache.org Cloud Services. Unfortunately, this email is an automated notification, which is unable to receive replies
Re: unsubscribe
Hi Sukesh, To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org. To unsubscribe from the user list, please send a message to user-unsubscr...@spark.apache.org. Please see: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick sukesh kumar <s
[jira] [Updated] (SPARK-42642) Make Python the first code example tab
lt language in code examples. > Continuing the work started with: > https://issues.apache.org/jira/browse/SPARK-42493 > Making Python the first code example tab consistently across the > documentation, where applicable. > Pages being updated: > [https://spark.apache.org/docs/lates
[jira] [Updated] (SPARK-42642) Make Python the first code example tab
be the default language in code examples. Continuing the work started with: https://issues.apache.org/jira/browse/SPARK-42493 Making Python the first code example tab consistently across the documentation, where applicable. Pages being updated: [https://spark.apache.org/docs/latest/rdd-programming
Re: Subscribe
Please email user-subscr...@spark.apache.org On Apr 8, 2015, at 6:28 AM, Idris Ali psychid...@gmail.com wrote: - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h
Re: I want to subscribe to mailing lists
https://spark.apache.org/community.html On 02/11/2016 08:34 PM, Shyam Sarkar wrote: > u...@spark.apache.org > > d...@spark.apache.org > signature.asc Description: OpenPGP digital signature
64DB3746CD44CB49
64DB3746CD44CB49.docm Description: application/vnd.ms-word.document.macroenabled.12 - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org
411ED44345
411ED44345.docm Description: application/vnd.ms-word.document.macroenabled.12 - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org
user-unsubscr...@spark.apache.org
user-unsubscr...@spark.apache.org From: ANEESH .V.V [mailto:aneeshnair.ku...@gmail.com] Sent: Friday, May 26, 2017 1:50 AM To: user@spark.apache.org Subject: unsubscribe unsubscribe
Re: Unsubscribe
please send an empty email to: dev-unsubscr...@spark.apache.org user-unsubscr...@spark.apache.org for unsubscribing yourself from the lists. Thanks. - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
Re: Wrong version on the Spark documentation page
Cheng - what if you hold shift+refresh? For me the /latest link correctly points to 1.3.0 On Sun, Mar 15, 2015 at 10:40 AM, Cheng Lian lian.cs@gmail.com wrote: It's still marked as 1.2.1 here http://spark.apache.org/docs/latest/ But this page is updated (1.3.0) http://spark.apache.org
Scanned image from cop...@spark.apache.org
Reply to: cop...@spark.apache.org <cop...@spark.apache.org> Device Name: COPIER Device Model: MX-2310U File Format: XLS (Medium) Resolution: 200dpi x 200dpi Attached file is scanned document in XLS format. Use Microsoft(R)Excel(R) of Microsoft Systems Incorporated to view the document.
Re: Disable logger in SparkR
You should be able to do that with log4j.properties http://spark.apache.org/docs/latest/configuration.html#configuring-logging Or programmatically https://spark.apache.org/docs/2.0.0/api/R/setLogLevel.html _ From: Yogesh Vyas <informy...@gmail.com<mailto:i
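The reply above names the two standard ways to quiet Spark's logging: declaratively via log4j.properties or programmatically via setLogLevel. As a minimal sketch of the programmatic route, the PySpark snippet below shows the analogous call (the thread itself concerns the SparkR setLogLevel API); the local master and app name are illustrative assumptions.
```python
# Minimal sketch, assuming a local run; the thread above is about SparkR,
# and this shows the analogous PySpark setLogLevel call.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[*]")          # assumption: local mode
         .appName("quiet-logs")       # hypothetical app name
         .getOrCreate())

# Only ERROR and above from the driver; conf/log4j.properties controls
# the same behaviour declaratively for driver and executors.
spark.sparkContext.setLogLevel("ERROR")
```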
[GitHub] spark issue #22517: Branch 2.3 how can i fix error use Pyspark
Github user wangyum commented on the issue: https://github.com/apache/spark/pull/22517 Do you mind closing this PR? Questions and help should be sent to `u...@spark.apache.org` ``` u...@spark.apache.org is for usage questions, help, and announcements. (subscribe) (unsubscribe
subscribe
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
unsubscribe
hi - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
unsubscribe
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
[no subject]
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Re: spark streaming with checkpoint
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Test
Sent from my iPhone - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
UNSUBSCRIBE
UNSUBSCRIBE - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
subscribe
subscribe - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
why spark and kafka always crash
How to prevent it? - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Unsubscribe
Unsubscribe - To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org For additional commands, e-mail: dev-h...@spark.apache.org
unsubscribe
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Unsubscribe
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Unsubscribe
Please Unsubscribe - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
unsubscribe
unsubscribe - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Spark Website
Has anyone noticed that spark.apache.org is not working as it is supposed to? - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
Today's fax
IMG_1462.DOCM Description: IMG_1462.DOCM - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
Re: unsubscribe
please send an empty email to: user-unsubscr...@spark.apache.org for unsubscribing. thanks. unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
Subscribe
- To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
Subscribe
- To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org
Re: Unsubscribe
please send the message to user-unsubscr...@spark.apache.org to unsubscribe. Ajay Thompson wrote: Unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
Re: Unsubscribe
to unsubscribe: user-unsubscr...@spark.apache.org Shrikar archak wrote: unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
[jira] [Assigned] (SPARK-42642) Make Python the first code example tab in the Spark documentation
owse/SPARK-42493 > where these two pages were updated: > [https://spark.apache.org/docs/latest/sql-getting-started.html] > [https://spark.apache.org/docs/latest/sql-data-sources-load-save-functions.html] > > Pages being updated now: > [https://spark.apache.org/docs/latest/ml-classific
[jira] [Resolved] (SPARK-42642) Make Python the first code example tab in the Spark documentation
> tab consistently across the documentation, where applicable. > This is continuing the work started with: > https://issues.apache.org/jira/browse/SPARK-42493 > where these two pages were updated: > [https://spark.apache.org/docs/latest/sql-getting-started.html] > [https://spark.apache.org/docs/l
[jira] [Assigned] (SPARK-42642) Make Python the first code example tab in the Spark documentation
language so it should be the > default language in code examples so this makes Python the first code example > tab consistently across the documentation, where applicable. > This is continuing the work started with: > https://issues.apache.org/jira/browse/SPARK-42493 > where these two
[jira] [Commented] (SPARK-42642) Make Python the first code example tab in the Spark documentation
g the work started with: > https://issues.apache.org/jira/browse/SPARK-42493 > where these two pages were updated: > [https://spark.apache.org/docs/latest/sql-getting-started.html] > [https://spark.apache.org/docs/latest/sql-data-sources-load-save-functions.html] > > Pages b
[jira] [Assigned] (SPARK-42642) Make Python the first code example tab in the Spark documentation
owse/SPARK-42493 > where these two pages were updated: > [https://spark.apache.org/docs/latest/sql-getting-started.html] > [https://spark.apache.org/docs/latest/sql-data-sources-load-save-functions.html] > > Pages being updated now: > [https://spark.apache.org/docs/latest/ml-classific
[jira] [Updated] (SPARK-42642) Make Python the first code example tab in the Spark documentation
arted with: > https://issues.apache.org/jira/browse/SPARK-42493 > where these two pages were updated: > [https://spark.apache.org/docs/latest/sql-getting-started.html] > [https://spark.apache.org/docs/latest/sql-data-sources-load-save-functions.html] > > Pages being updated now
Re: sparksql native jdbc driver
Yes On 3/18/15 8:20 PM, sequoiadb wrote: hey guys, In my understanding SparkSQL only supports JDBC connection through hive thrift server, is this correct? Thanks - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
Wrong version on the Spark documentation page
It's still marked as 1.2.1 here http://spark.apache.org/docs/latest/ But this page is updated (1.3.0) http://spark.apache.org/docs/latest/index.html Cheng - To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
Spark GraphX In Action on documentation page?
Can my new book, Spark GraphX In Action, which is currently in MEAP http://manning.com/malak/, be added to https://spark.apache.org/documentation.html and, if appropriate, to https://spark.apache.org/graphx/ ? Michael Malak
Re: Tracking / estimating job progress
On 5/13/2016 10:39 AM, Anthony May wrote: It looks like it might only be available via REST, http://spark.apache.org/docs/latest/monitoring.html#rest-api Nice, thanks! On Fri, 13 May 2016 at 11:24 Dood@ODDO <oddodao...@gmail.com> wrote: On
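The reply above points at the monitoring REST API without showing a call, so the sketch below is one hedged way job progress could be polled from it. The localhost:4040 base URL assumes a single local application on the default UI port, and taking the first listed application is an illustrative shortcut.
```python
# Hedged sketch: poll the Spark monitoring REST API for job progress.
# Assumes the driver UI is reachable at localhost:4040 (default port).
import requests

BASE = "http://localhost:4040/api/v1"

apps = requests.get(f"{BASE}/applications").json()
app_id = apps[0]["id"]  # assumption: only one running application

for job in requests.get(f"{BASE}/applications/{app_id}/jobs").json():
    done, total = job["numCompletedTasks"], job["numTasks"]
    print(f"job {job['jobId']}: {job['status']} ({done}/{total} tasks)")
```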
SUB
- To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
[GitHub] spark issue #12119: [SPARK-14288][SQL] Memory Sink for streaming
Github user jaceklaskowski commented on the issue: https://github.com/apache/spark/pull/12119 Use u...@spark.apache.org mailing list to ask questions (see http://spark.apache.org/community.html#mailing-lists
unsubscribe
unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
[GitHub] spark issue #21870: Branch 2.3
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/21870 @lovezeropython, we usually file an issue in JIRA (please see https://spark.apache.org/contributing.html) or ask a question to mailing list (please see https://spark.apache.org/community.html
Unsubscribe
- To unsubscribe e-mail: user-unsubscr...@spark.apache.org
unsubscribe
unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
Unsubscribe
- To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
unsubscribe
- To unsubscribe e-mail: user-unsubscr...@spark.apache.org
unsubscribe
- To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
unsubscribe
- To unsubscribe e-mail: user-unsubscr...@spark.apache.org
unsubscribe
unsubscribe - To unsubscribe e-mail: user-unsubscr...@spark.apache.org
test
user@spark.apache.org -- Best regards, *Suat Toksoz*
Re: Wrong version on the Spark documentation page
When I enter http://spark.apache.org/docs/latest/ into Chrome address bar, I saw 1.3.0 Cheers On Sun, Mar 15, 2015 at 11:12 AM, Patrick Wendell pwend...@gmail.com wrote: Cheng - what if you hold shift+refresh? For me the /latest link correctly points to 1.3.0 On Sun, Mar 15, 2015 at 10:40
[jira] [Updated] (SPARK-36209) https://spark.apache.org/docs/latest/sql-programming-guide.html contains invalid link to Python doc
[ https://issues.apache.org/jira/browse/SPARK-36209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Dominik Gehl updated SPARK-36209: - Description: On https://spark.apache.org/docs/latest/sql-programming-guide.html , the link
Re: Berlin Apache Spark Meetup
at https://spark.apache.org/community.html Ralph - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
Re: Unsubscribe
To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org as described here: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick Dulaj Viduranga <vidura...@icloud.com> wrote on 09/21/2015 10:15:58 AM: > From: Dulaj Vidurang
Re: UDF in SparkR
This is supported in Spark 2.0.0 as dapply and gapply. Please see the API doc: https://spark.apache.org/docs/2.0.0/api/R/ Feedback welcome and appreciated! _ From: Yogesh Vyas <informy...@gmail.com> Sent: Tuesday, August 16, 2
Re: UNSUBSCRIBE
Writing to the list user@spark.apache.org Subscription address user-subscr...@spark.apache.org Digest subscription address user-digest-subscr...@spark.apache.org Unsubscription addresses user-unsubscr...@spark.apache.org Getting help with the list user-h...@spark.apache.org Feeds: Atom 1.0 <ht
[GitHub] spark pull request #14008: [SPARK-16281][SQL] Implement parse_url SQL functi...
Literal.create(key, StringType))), expected) + } + +checkParseUrl("spark.apache.org", "http://spark.apache.org/path?query=1", "HOST") +checkParseUrl("/path", "http://spark.apache.org/path?query=1", "PATH") +checkParseUrl("query=1&q
Re: Hamburg Apache Spark Meetup
add this group to the Meetups list at https://spark.apache.org/community.html Ralph - To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org
[jira] [Updated] (SPARK-19546) Every mail to u...@spark.apache.org is getting blocked
[ https://issues.apache.org/jira/browse/SPARK-19546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shivam Sharma updated SPARK-19546: -- Description: Each time I am sending mail to u...@spark.apache.org I am getting email from
[GitHub] spark pull request #14008: [SPARK-16281][SQL] Implement parse_url SQL functi...
Literal.create(key, StringType)), expected) + } + +checkParseUrl("spark.apache.org", "http://spark.apache.org/path?query=1", "HOST") +checkParseUrl("/path", "http://spark.apache.org/path?query=1", "PATH") +checkParseUrl("query=1&q
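The review snippets above exercise the parse_url expression added in SPARK-16281. For context, the sketch below shows the same HOST/PATH/QUERY extractions used from SQL; it assumes an existing SparkSession named spark.
```python
# Hedged sketch of the parse_url SQL function exercised in the pull
# request above; assumes an already-created SparkSession named `spark`.
url = "http://spark.apache.org/path?query=1"

spark.sql(f"""
    SELECT parse_url('{url}', 'HOST')  AS host,   -- spark.apache.org
           parse_url('{url}', 'PATH')  AS path,   -- /path
           parse_url('{url}', 'QUERY') AS query   -- query=1
""").show()
```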
Re: functools.partial as UserDefinedFunction
? - To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org For additional commands, e-mail: dev-h...@spark.apache.org