Hi Yin, I'm using the spark-hive dependency, and my app's tests work with
Spark 1.3.1.
It seems to be something with Hive and sbt. The statement below works from
spark-shell, but from the sbt console in RC3 I get the following error:
scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
15/05/29
I'm also seeing the same issue - all tests fail because of a HiveContext /
Derby lock.
|Cause: javax.jdo.JDOFatalDataStoreException: Unable to open a test
connection to the given database. JDBC url =
jdbc:derby:;databaseName=metastore_db;create=true, username = APP.
Terminating connection pool (set
Justin,
If you are creating multiple HiveContexts in tests, you need to assign a
temporary metastore location to every HiveContext (like what we do here:
https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala#L527-L543).
Otherwise, they
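Yin's suggestion above can be sketched as follows. This is a minimal sketch only: the helper name and the use of `java.nio` temp directories are mine, not taken from the linked Spark source, and the exact configuration keys your Hive version honors may differ.

```scala
import java.io.File
import java.nio.file.Files

// Hypothetical helper (the name is mine): build per-context Hive settings
// so each HiveContext in a test suite gets its own embedded Derby metastore
// and warehouse directory, instead of all contending for ./metastore_db.
def newTempHiveConf(): Map[String, String] = {
  val base = Files.createTempDirectory("hive-test").toFile
  // The metastore path must not exist yet; Derby creates it via create=true.
  val metastore = new File(base, "metastore_db")
  val warehouse = new File(base, "warehouse")
  warehouse.mkdir()
  Map(
    "javax.jdo.option.ConnectionURL" ->
      s"jdbc:derby:;databaseName=${metastore.getAbsolutePath};create=true",
    "hive.metastore.warehouse.dir" -> warehouse.getAbsolutePath
  )
}
```

Each entry would then be applied to the freshly created HiveContext with `setConf` before any query runs, so that two contexts in the same JVM never point at the same embedded Derby database.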
Hey jameszhouyi,
Since SPARK-7119 is not a regression from earlier versions, we won't
hold the release for it. However, please comment on the JIRA if it is
affecting you... it will help us prioritize the bug.
- Patrick
On Fri, May 22, 2015 at 8:41 PM, jameszhouyi yiaz...@gmail.com wrote:
We
I'm working on one of the Palantir teams using Spark, and here is our
feedback:
We have encountered three issues when upgrading to Spark 1.4.0. I'm not
sure they qualify as a -1, as they come from using non-public APIs and
multiple Spark contexts for the purposes of testing, but I do want to
Thanks for catching this. I'll check with Patrick to see why the R API docs
are not getting included.
On Fri, May 22, 2015 at 2:44 PM, Andrew Psaltis psaltis.and...@gmail.com
wrote:
All,
Should all the docs work from
http://people.apache.org/~pwendell/spark-1.4.0-rc1-docs/ ? If so the R API
docs 404.
On Tue, May 19, 2015 at 11:10 AM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark
version 1.4.0!
The tag to be
Thanks for the feedback. As you stated, UDTs are explicitly not a public
API, as we knew we were going to be making breaking changes to them. We
hope to stabilize / open them up in future releases. Regarding the Hive
issue, have you tried using TestHive instead? This is what we use for
testing
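For reference, a minimal sketch of what Michael is suggesting, assuming the spark-hive test artifact (which ships TestHive) is on the test classpath; the object name and query here are illustrative only:

```scala
import org.apache.spark.sql.hive.test.TestHive

// TestHive is the singleton HiveContext used by Spark's own Hive test
// suites; it sets up a scratch metastore and warehouse for the JVM, so
// tests share one context instead of each opening an embedded Derby
// database and hitting the lock described earlier in this thread.
object TestHiveExample {
  def main(args: Array[String]): Unit = {
    val result = TestHive.sql("SELECT 1 + 1").collect()
    println(result.mkString)
    TestHive.reset() // restore the test metastore to a clean state
  }
}
```

Because TestHive is a single shared context, `reset()` between tests replaces the per-test HiveContext pattern that triggers the Derby lock.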
Thanks Andrew, the doc issue should be fixed in RC2 (if not, please
chime in!). R was missing in the build environment.
- Patrick
On Fri, May 22, 2015 at 3:33 PM, Shivaram Venkataraman
shiva...@eecs.berkeley.edu wrote:
Thanks for catching this. I'll check with Patrick to see why the R API docs
We came across a Spark SQL issue
(https://issues.apache.org/jira/browse/SPARK-7119) that causes queries to fail.
I'm not sure whether to vote -1 on this RC1.
Signature, hashes, LICENSE/NOTICE, source tarball looks OK. I built
for Hadoop 2.6 (-Pyarn -Phive -Phadoop-2.6) on Ubuntu from source and
tests pass. The release looks OK except that I'd like to resolve the
Blockers before giving a +1.
I'm seeing some test failures, and wanted to cross-check with
-1
Discovered I accidentally removed the master/worker JSON endpoints; will
restore:
https://issues.apache.org/jira/browse/SPARK-7760
On Tue, May 19, 2015 at 11:10 AM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0!
The
Quick tests from my side - looks OK. The results are the same or very similar
to 1.3.1. Will add DataFrames et al. in future tests.
+1 (non-binding, of course)
1. Compiled on OS X 10.10 (Yosemite) OK. Total time: 17:42 min
mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
Hi all,
I've created another release repository where the release is
identified with the version 1.4.0-rc1:
https://repository.apache.org/content/repositories/orgapachespark-1093/
On Tue, May 19, 2015 at 5:36 PM, Krishna Sankar ksanka...@gmail.com wrote:
Quick tests from my side - looks OK.
Punya,
Let me see if I can publish these under rc1 as well. In the future
this will all be automated, but currently it's a somewhat manual task.
- Patrick
On Tue, May 19, 2015 at 9:32 AM, Punyashloka Biswal
punya.bis...@gmail.com wrote:
When publishing future RCs to the staging repository, would
Thanks! I realize that manipulating the published version in the pom is a
bit inconvenient but it's really useful to have clear version identifiers
when we're juggling different versions and testing them out. For example,
this will come in handy when we compare 1.4.0-rc1 and 1.4.0-rc2 in a couple
A couple of other process things:
1. Please *keep voting* (+1/-1) on this thread even if we find some
issues, until we cut RC2. This lets us pipeline the QA.
2. The SQL team owes a JIRA clean-up (forthcoming shortly)... there
are still a few Blockers that aren't.
On Tue, May 19, 2015 at 9:10
Before I vote, I wanted to point out there are still 9 Blockers for 1.4.0.
I'd like this status to really mean "must happen before the
release." Many of these may be already fixed, or aren't really blockers --
they can just be updated accordingly.
I bet at least one will require further work if