Repository: spark
Updated Branches:
refs/heads/branch-1.3 93fee7b90 - 6fda4c136
[SPARK-5652][Mllib] Use broadcasted weights in LogisticRegressionModel
`LogisticRegressionModel`'s `predictPoint` should directly use broadcasted
weights. This PR also fixes the compilation errors of two unit
Repository: spark
Updated Branches:
refs/heads/branch-1.2 f318af0fd - 09da688b0
[SPARK-4989][CORE] backport for branch-1.2 catch eventlog exception for wrong
eventlog conf
JIRA is [SPARK-4989](https://issues.apache.org/jira/browse/SPARK-4989)
Author: Zhang, Liye liye.zh...@intel.com
Closes
Repository: spark
Updated Branches:
refs/heads/master 80f3bcb58 - fb6c0cbac
[HOTFIX] Fix test build break in ExecutorAllocationManagerSuite.
This was caused because #3486 added a new field to ExecutorInfo and #4369
added new tests that created ExecutorInfos. These patches were merged in
Repository: spark
Updated Branches:
refs/heads/master d34f79c8d - 70e5b030a
[SPARK-5628] Add version option to spark-ec2
Every proper command line tool should include a `--version` option or something
similar.
This PR adds this to `spark-ec2` using the standard functionality provided by
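As a standalone illustration (not the spark-ec2 code itself; the option parser and version string below are assumptions), Python's argparse wires up a `--version` flag with a built-in action:

```python
import argparse

def make_parser(version="1.2.0"):
    # The "version" action prints the string and exits; no handler code needed.
    parser = argparse.ArgumentParser(prog="spark-ec2")
    parser.add_argument("--version", action="version",
                        version="%(prog)s " + version)
    return parser
```

Running `spark-ec2 --version` with such a parser prints the program name and version, then exits with status 0.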
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ab0ffde3e - 921121d57
[SPARK-5650][SQL] Support optional 'FROM' clause
In Hive, the 'FROM' clause is optional. This PR adds support for it.
Author: Liang-Chi Hsieh vii...@gmail.com
Closes #4426 from viirya/optional_from and squashes the
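For a standalone illustration of a FROM-less SELECT (using SQLite as an analogy, not Spark SQL), the expression list is simply evaluated against no table at all:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A SELECT with no FROM clause: just evaluate the expression.
no_from = conn.execute("SELECT 1 + 1").fetchone()[0]
```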
Repository: spark
Updated Branches:
refs/heads/master 3eccf29ce - b62c35245
[SQL][HiveConsole][DOC] HiveConsole `correct hiveconsole imports`
Sorry, PR #4330 had some mistakes.
I have corrected it so it works correctly now.
Author: OopsOutOfMemory victorshen...@126.com
Closes #4389
Repository: spark
Updated Branches:
refs/heads/branch-1.3 d82260606 - 1b148adfc
[SPARK-5278][SQL] Introduce UnresolvedGetField and complete the check of
ambiguous reference to fields
When the `GetField` chain (`a.b.c.d`) is interrupted by `GetItem`, as in
`a.b[0].c.d`, then the check
Repository: spark
Updated Branches:
refs/heads/master 4793c8402 - 3d3ecd774
[SPARK-5586][Spark Shell][SQL] Make `sqlContext` available in spark shell
Result is like this
```
15/02/05 13:41:22 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/02/05 13:41:22 INFO
Repository: spark
Updated Branches:
refs/heads/branch-1.3 52386cf44 - 9387dc1c8
[SPARK-5593][Core]Replace BlockManagerListener with ExecutorListener in
ExecutorAllocationListener
More strictly, in ExecutorAllocationListener, we need to replace
onBlockManagerAdded, onBlockManagerRemoved with
Repository: spark
Updated Branches:
refs/heads/branch-1.3 540f474cf - ab0ffde3e
[SPARK-5628] Add version option to spark-ec2
Every proper command line tool should include a `--version` option or something
similar.
This PR adds this to `spark-ec2` using the standard functionality provided by
Repository: spark
Updated Branches:
refs/heads/master 9ad56ad2a - cc6e53119
[SPARK-5653][YARN] In ApplicationMaster rename isDriver to isClusterMode
In ApplicationMaster, rename isDriver to isClusterMode, because in Client it uses
isClusterMode; ApplicationMaster should stay consistent with it
Repository: spark
Updated Branches:
refs/heads/branch-1.3 faccdcbc2 - 4ff8855e8
[SPARK-5653][YARN] In ApplicationMaster rename isDriver to isClusterMode
In ApplicationMaster, rename isDriver to isClusterMode, because in Client it uses
isClusterMode; ApplicationMaster should stay consistent with
Repository: spark
Updated Branches:
refs/heads/master c01b9852e - 9792bec59
[SPARK-4877] Allow user first classes to extend classes in the parent.
Previously, the classloader isolation was almost too good, such
that if a child class needed to load/reference a class that was
only available in
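A Python analogy of the child-first lookup with parent fallback (Spark's actual classloader is JVM code; all names below are illustrative):

```python
class ParentLoader:
    """Stand-in for the parent classloader: a fixed lookup table."""
    def __init__(self, classes):
        self.classes = classes

    def load(self, name):
        if name in self.classes:
            return self.classes[name]
        raise KeyError(name)

class ChildFirstLoader(ParentLoader):
    """User-classpath-first loader that still delegates upward."""
    def __init__(self, classes, parent):
        super().__init__(classes)
        self.parent = parent

    def load(self, name):
        # Try the child (user) classes first...
        if name in self.classes:
            return self.classes[name]
        # ...but fall back to the parent, so user classes can extend
        # classes that only the parent provides.
        return self.parent.load(name)

parent = ParentLoader({"SparkConf": "parent:SparkConf"})
child = ChildFirstLoader({"UserClass": "child:UserClass"}, parent)
```

Without the fallback, the "almost too good" isolation described above would make the parent-only class invisible to user code.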
Repository: spark
Updated Branches:
refs/heads/branch-1.3 2dc94cd8b - 52386cf44
[SPARK-4877] Allow user first classes to extend classes in the parent.
Previously, the classloader isolation was almost too good, such
that if a child class needed to load/reference a class that was
only available
Repository: spark
Updated Branches:
refs/heads/master 4cdb26c17 - 32e964c41
SPARK-2450 Adds executor log links to Web UI
Adds links to stderr/stdout in the executor tab of the webUI for:
1) Standalone
2) Yarn client
3) Yarn cluster
This tries to add the log url support in a general way so as
Repository: spark
Updated Branches:
refs/heads/branch-1.3 921121d57 - 779e28b6d
[SPARK-5640] Synchronize ScalaReflection where necessary
Author: Tobias Schlatter tob...@meisch.ch
Closes #4431 from gzm0/sync-scala-refl and squashes the following commits:
c5da21e [Tobias Schlatter]
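The fix adds synchronization around reflection calls; a generic Python sketch of the same idea, guarding a non-thread-safe cache with a lock (not the ScalaReflection code itself; the cached computation is a stand-in):

```python
import threading

class ReflectionCache:
    """Serialize access to a shared, non-thread-safe lookup table."""
    def __init__(self):
        self._lock = threading.Lock()
        self._cache = {}

    def schema_for(self, name):
        with self._lock:  # mirrors a `synchronized` block
            if name not in self._cache:
                self._cache[name] = name.lower()  # stand-in for real work
            return self._cache[name]

cache = ReflectionCache()
results = []
threads = [threading.Thread(target=lambda: results.append(cache.schema_for("IntType")))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```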
Repository: spark
Updated Branches:
refs/heads/master fb6c0cbac - af2a2a263
[SPARK-4361][Doc] Add more docs for Hadoop Configuration
I'm trying to point out that reusing a Configuration in these APIs is dangerous.
Any better idea?
Author: zsxwing zsxw...@gmail.com
Closes #3225 from
Repository: spark
Updated Branches:
refs/heads/branch-1.3 0fc35dafe - 3c34d62c4
[SPARK-5595][SPARK-5603][SQL] Add a rule to do PreInsert type casting and field
renaming, and invalidate the in-memory cache after INSERT
This PR adds a rule to Analyzer that will add preinsert data type casting and
Repository: spark
Updated Branches:
refs/heads/master 1a88f20de - fe3740c4c
[SPARK-5636] Ramp up faster in dynamic allocation
A recent patch #4051 made the initial number default to 0. With this change,
any Spark application using dynamic allocation's default settings will ramp up
very
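A rough sketch of an exponential ramp-up policy in Python (the real logic lives in ExecutorAllocationManager and may differ in detail; this only illustrates the shape of the growth):

```python
def ramp_up_targets(initial, maximum, rounds):
    """Return the executor target after each scheduling round,
    doubling the request each time, capped at the cluster maximum."""
    target = max(initial, 1)
    out = []
    for _ in range(rounds):
        target = min(target * 2, maximum)
        out.append(target)
    return out
```

Starting from 1 executor with a cap of 16, five rounds give targets 2, 4, 8, 16, 16, rather than crawling up one executor at a time.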
Repository: spark
Updated Branches:
refs/heads/master 32e964c41 - 0d74bd7fd
[SPARK-] Enable UISeleniumSuite tests
This patch enables UISeleniumSuite, a set of tests for the Spark application
web UI. These tests were previously disabled because they were slow, but I
think we now have
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e74dd0478 - 93fee7b90
[SPARK-] Enable UISeleniumSuite tests
This patch enables UISeleniumSuite, a set of tests for the Spark application
web UI. These tests were previously disabled because they were slow, but I
think we now
Repository: spark
Updated Branches:
refs/heads/branch-1.3 9e828f429 - 528dd34fe
[SPARK-4361][Doc] Add more docs for Hadoop Configuration
I'm trying to point out that reusing a Configuration in these APIs is dangerous.
Any better idea?
Author: zsxwing zsxw...@gmail.com
Closes #3225 from
Repository: spark
Updated Branches:
refs/heads/master a958d6097 - 0b7eb3f3b
[SPARK-5324][SQL] Results of describe can't be queried
Make the code below work:
```
sql("DESCRIBE test").registerTempTable("describeTest")
sql("SELECT * FROM describeTest").collect()
```
Author: OopsOutOfMemory
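As a standalone analogy (SQLite, not Spark SQL): table metadata can itself be queried as a relation, which is the effect this patch enables for DESCRIBE output. This assumes a SQLite build new enough to support table-valued pragmas:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (id INTEGER, name TEXT)")
# The "describe" output is exposed as a relation we can SELECT from.
cols = [row[0] for row in
        conn.execute("SELECT name FROM pragma_table_info('test')")]
```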
Repository: spark
Updated Branches:
refs/heads/branch-1.3 cc66a3cb7 - 0fc35dafe
[SPARK-5324][SQL] Results of describe can't be queried
Make the code below work:
```
sql("DESCRIBE test").registerTempTable("describeTest")
sql("SELECT * FROM describeTest").collect()
```
Author: OopsOutOfMemory
Repository: spark
Updated Branches:
refs/heads/master cc6e53119 - 1a88f20de
SPARK-4337. [YARN] Add ability to cancel pending requests
Author: Sandy Ryza sa...@cloudera.com
Closes #4141 from sryza/sandy-spark-4337 and squashes the following commits:
a98bd20 [Sandy Ryza] Andrew's comments
Repository: spark
Updated Branches:
refs/heads/branch-1.3 3c34d62c4 - 2abaa6e97
[SQL][HiveConsole][DOC] HiveConsole `correct hiveconsole imports`
Sorry, PR #4330 had some mistakes.
I have corrected it so it works correctly now.
Author: OopsOutOfMemory victorshen...@126.com
Closes
Repository: spark
Updated Branches:
refs/heads/branch-1.3 2abaa6e97 - d82260606
[SQL][Minor] Remove cache keyword in SqlParser
Since the cache keyword is already defined in `SparkSQLParser`, and `SqlParser` of
catalyst is a more general parser that should not cover keywords related to the
underlying
Repository: spark
Updated Branches:
refs/heads/master b62c35245 - bc3635608
[SQL][Minor] Remove cache keyword in SqlParser
Since the cache keyword is already defined in `SparkSQLParser`, and `SqlParser` of
catalyst is a more general parser that should not cover keywords related to the
underlying
Repository: spark
Updated Branches:
refs/heads/master bc3635608 - 4793c8402
[SPARK-5278][SQL] Introduce UnresolvedGetField and complete the check of
ambiguous reference to fields
When the `GetField` chain (`a.b.c.d`) is interrupted by `GetItem`, as in
`a.b[0].c.d`, then the check of
Repository: spark
Updated Branches:
refs/heads/branch-1.3 4ff8855e8 - 8007a4f20
[SPARK-5470][Core]use defaultClassLoader to load classes in KryoSerializer
Now KryoSerializer loads the classes in classesToRegister at the time of its
initialization. When we set spark.kryo.classesToRegister=class1,
Repository: spark
Updated Branches:
refs/heads/branch-1.3 8007a4f20 - f6613fc3f
Update ec2-scripts.md
Change spark-version from 1.1.0 to 1.2.0 in the example for spark-ec2/Launch
Cluster.
Author: Miguel Peralvo miguel.pera...@gmail.com
Closes #4300 from MiguelPeralvo/patch-1 and squashes
Repository: spark
Updated Branches:
refs/heads/master fe3740c4c - c01b9852e
[SPARK-5396] Syntax error in spark scripts on windows.
Fixed a syntax error in spark-submit2.cmd. Command Prompt doesn't have a
`defined` operator.
Author: Masayoshi TSUZUKI tsudu...@oss.nttdata.co.jp
Closes #4428
Repository: spark
Updated Branches:
refs/heads/branch-1.3 0a903059c - 2dc94cd8b
[SPARK-5396] Syntax error in spark scripts on windows.
Fixed a syntax error in spark-submit2.cmd. Command Prompt doesn't have a
`defined` operator.
Author: Masayoshi TSUZUKI tsudu...@oss.nttdata.co.jp
Closes #4428
Repository: spark
Updated Branches:
refs/heads/branch-1.3 528dd34fe - 540f474cf
[SPARK-2945][YARN][Doc]add doc for spark.executor.instances
https://issues.apache.org/jira/browse/SPARK-2945
spark.executor.instances works. As this JIRA recommended, we should add docs
for this common config.
Repository: spark
Updated Branches:
refs/heads/branch-1.3 779e28b6d - cc66a3cb7
[SPARK-5619][SQL] Support 'show roles' in HiveContext
Author: q00251598 qiyad...@huawei.com
Closes #4397 from watermen/SPARK-5619 and squashes the following commits:
f819b6c [q00251598] Support show roles in
Repository: spark
Updated Branches:
refs/heads/branch-1.2 36f70de83 - d89964f86
SPARK-5613: Catch the ApplicationNotFoundException to keep the thread
from getting killed on YARN restart.
[SPARK-5613] Added a catch block to catch the ApplicationNotFoundException.
Without this catch
Repository: spark
Updated Branches:
refs/heads/branch-1.3 11dbf7137 - 9fa29a629
[SPARK-4874] [CORE] Collect record count metrics
Collects record counts for both Input/Output and Shuffle Metrics. For the
input/output metrics, it simply updates the counter every time the iterators are
accessed.
Repository: spark
Updated Branches:
refs/heads/master 2bda1c1d3 - 61073f832
[SPARK-4994][network]Cleanup removed executors' ShuffleInfo in yarn shuffle
service
When the application is completed, YARN's NodeManager can remove the
application's local dirs. But all executors' metadata of completed
Repository: spark
Updated Branches:
refs/heads/branch-1.3 caca15a4c - af6ddf8b6
[SPARK-4994][network]Cleanup removed executors' ShuffleInfo in yarn shuffle
service
When the application is completed, YARN's NodeManager can remove the
application's local dirs. But all executors' metadata of
Repository: spark
Updated Branches:
refs/heads/master 76c4bf59f - c4021401e
[SQL] [Minor] HiveParquetSuite was disabled by mistake, re-enable them
Repository: spark
Updated Branches:
refs/heads/branch-1.3 09feecc7c - 11dbf7137
[HOTFIX] Fix the maven build after adding sqlContext to spark-shell
Follow up to #4387 to fix the build break.
Author: Michael Armbrust mich...@databricks.com
Closes #4443 from marmbrus/fixMaven and squashes the
Repository: spark
Updated Branches:
refs/heads/master 3d3ecd774 - 0f3a36071
[SPARK-4983] Insert waiting time before tagging EC2 instances
The boto API doesn't support tagging EC2 instances in the same call that
launches them.
We add a five-second wait so EC2 has enough time to propagate the
Repository: spark
Updated Branches:
refs/heads/branch-1.2 09da688b0 - 36f70de83
[SPARK-4983] Insert waiting time before tagging EC2 instances
The boto API doesn't support tagging EC2 instances in the same call that
launches them.
We add a five-second wait so EC2 has enough time to propagate the
Repository: spark
Updated Branches:
refs/heads/branch-1.3 2ef9853e7 - 2872d8344
[SPARK-4983] Insert waiting time before tagging EC2 instances
The boto API doesn't support tagging EC2 instances in the same call that
launches them.
We add a five-second wait so EC2 has enough time to propagate the
Repository: spark
Updated Branches:
refs/heads/branch-1.3 c950058e9 - 400580228
[SQL] [Minor] HiveParquetSuite was disabled by mistake, re-enable them
Repository: spark
Updated Branches:
refs/heads/branch-1.3 400580228 - 11b28b9b4
[SPARK-5601][MLLIB] make streaming linear algorithms Java-friendly
Overload `trainOn`, `predictOn`, and `predictOnValues`.
CC freeman-lab
Author: Xiangrui Meng m...@databricks.com
Closes #4432 from
Repository: spark
Updated Branches:
refs/heads/master c4021401e - 0e23ca9f8
[SPARK-5601][MLLIB] make streaming linear algorithms Java-friendly
Overload `trainOn`, `predictOn`, and `predictOnValues`.
CC freeman-lab
Author: Xiangrui Meng m...@databricks.com
Closes #4432 from
Repository: spark
Updated Branches:
refs/heads/master 0e23ca9f8 - e772b4e4e
SPARK-5403: Ignore UserKnownHostsFile in SSH calls
See https://issues.apache.org/jira/browse/SPARK-5403
Author: Grzegorz Dubicki grzegorz.dubi...@gmail.com
Closes #4196 from grzegorz-dubicki/SPARK-5403 and squashes
Repository: spark
Updated Branches:
refs/heads/branch-1.3 11b28b9b4 - 3d99741b2
SPARK-5403: Ignore UserKnownHostsFile in SSH calls
See https://issues.apache.org/jira/browse/SPARK-5403
Author: Grzegorz Dubicki grzegorz.dubi...@gmail.com
Closes #4196 from grzegorz-dubicki/SPARK-5403 and
Repository: spark
Updated Branches:
refs/heads/branch-1.3 1b148adfc - 2ef9853e7
[SPARK-5586][Spark Shell][SQL] Make `sqlContext` available in spark shell
Result is like this
```
15/02/05 13:41:22 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/02/05 13:41:22 INFO
Repository: spark
Updated Branches:
refs/heads/branch-1.3 1d3234165 - 09feecc7c
[SPARK-5600] [core] Clean up FsHistoryProvider test, fix app sort order.
Clean up some test setup code to remove duplicate instantiation of the
provider. Also make sure unfinished apps are sorted correctly.
Repository: spark
Updated Branches:
refs/heads/master 57961567e - dcd1e42d6
[SPARK-4874] [CORE] Collect record count metrics
Collects record counts for both Input/Output and Shuffle Metrics. For the
input/output metrics, it simply updates the counter every time the iterators are
accessed.
Repository: spark
Updated Branches:
refs/heads/branch-1.3 9fa29a629 - caca15a4c
[SPARK-5444][Network]Add a retry to deal with the conflict port in netty server.
If `spark.blockManager.port` conflicts with a port already in use, Spark will
throw an exception and exit.
So add a retry to
Repository: spark
Updated Branches:
refs/heads/master dcd1e42d6 - 2bda1c1d3
[SPARK-5444][Network]Add a retry to deal with the conflict port in netty server.
If `spark.blockManager.port` conflicts with a port already in use, Spark will
throw an exception and exit.
So add a retry to avoid
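A generic sketch of bind-with-retry over successive ports, using Python's stdlib socket module (illustrative only; Spark's Netty server retry differs in detail, and the retry count here is an assumption):

```python
import socket

def bind_with_retry(start_port, max_retries=16):
    """Try start_port, then successive ports, until one binds."""
    for offset in range(max_retries):
        port = start_port + offset
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            sock.bind(("127.0.0.1", port))
            return sock, port  # success: hand back the bound socket
        except OSError:
            sock.close()       # port in use; try the next one
    raise OSError("no free port within %d retries" % max_retries)
```

Instead of throwing on the first EADDRINUSE, the server walks forward until a free port is found or the retry budget is exhausted.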
Repository: spark
Updated Branches:
refs/heads/branch-1.3 af6ddf8b6 - c950058e9
[SQL] Use TestSQLContext in Java tests
Sometimes tests were failing due to the creation of multiple `SparkContext`s in
a single JVM.
Author: Michael Armbrust mich...@databricks.com
Closes #4441 from
Repository: spark
Updated Branches:
refs/heads/master 61073f832 - 76c4bf59f
[SQL] Use TestSQLContext in Java tests
Sometimes tests were failing due to the creation of multiple `SparkContext`s in
a single JVM.
Author: Michael Armbrust mich...@databricks.com
Closes #4441 from
[SPARK-5388] Provide a stable application submission gateway for standalone
cluster mode
The goal is to provide a stable, REST-based application submission gateway that
is not inherently based on Akka, which is unstable across versions. This PR
targets standalone cluster mode, but is
[SPARK-5388] Provide a stable application submission gateway for standalone
cluster mode
The goal is to provide a stable, REST-based application submission gateway that
is not inherently based on Akka, which is unstable across versions. This PR
targets standalone cluster mode, but is
Repository: spark
Updated Branches:
refs/heads/branch-1.3 3d99741b2 - 6ec0cdc14
http://git-wip-us.apache.org/repos/asf/spark/blob/6ec0cdc1/core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolResponse.scala
--
Repository: spark
Updated Branches:
refs/heads/master e772b4e4e - 1390e56fa
http://git-wip-us.apache.org/repos/asf/spark/blob/1390e56f/core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolResponse.scala
--
diff
Repository: spark
Updated Tags: refs/tags/v1.2.0-snapshot1 [deleted] 38c1fbd96
-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
Repository: spark
Updated Tags: refs/tags/v1.2.1-rc2 [deleted] b77f87673
Repository: spark
Updated Tags: refs/tags/v1.2.0-snapshot0 [deleted] bc0987579
Repository: spark
Updated Tags: refs/tags/v1.2.0-rc1 [deleted] 1056e9ec1
Repository: spark
Updated Tags: refs/tags/v1.2.1-rc1 [deleted] 3e2d7d310
Repository: spark
Updated Branches:
refs/heads/branch-1.3 87e0f0dc6 - 1d3234165
SPARK-5633 pyspark saveAsTextFile support for compression codec
See https://issues.apache.org/jira/browse/SPARK-5633 for details
Author: Vladimir Vladimirov vladimir.vladimi...@magnetic.com
Closes #4403 from
Repository: spark
Updated Branches:
refs/heads/master 65181b751 - b3872e00d
SPARK-5633 pyspark saveAsTextFile support for compression codec
See https://issues.apache.org/jira/browse/SPARK-5633 for details
Author: Vladimir Vladimirov vladimir.vladimi...@magnetic.com
Closes #4403 from
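The effect, sketched with Python's stdlib gzip instead of a Hadoop codec (the real API takes a Hadoop codec class name as an argument, which is not shown here; the file name is illustrative):

```python
import gzip
import os
import tempfile

lines = ["hello", "world"]
path = os.path.join(tempfile.mkdtemp(), "part-00000.gz")

# Write each record as one line, compressed on the way out.
with gzip.open(path, "wt") as f:
    for line in lines:
        f.write(line + "\n")

# Reading back through the codec restores the original records.
with gzip.open(path, "rt") as f:
    restored = f.read().splitlines()
```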
Repository: spark
Updated Tags: refs/tags/v1.2.0-rc2 [deleted] a428c446e
Repository: spark
Updated Tags: refs/tags/v1.2.1 [created] b6eaf77d4
Repository: spark
Updated Branches:
refs/heads/branch-1.3 156839181 - 0a903059c
[SPARK-5636] Ramp up faster in dynamic allocation
A recent patch #4051 made the initial number default to 0. With this change,
any Spark application using dynamic allocation's default settings will ramp up
very
Repository: spark
Updated Branches:
refs/heads/branch-1.3 3feb798cb - e74dd0478
SPARK-2450 Adds executor log links to Web UI
Adds links to stderr/stdout in the executor tab of the webUI for:
1) Standalone
2) Yarn client
3) Yarn cluster
This tries to add the log url support in a general way
Repository: spark
Updated Branches:
refs/heads/branch-1.3 9387dc1c8 - 3feb798cb
[SPARK-5618][Spark Core][Minor] Optimise utility code.
Author: Makoto Fukuhara fuku...@gmail.com
Closes #4396 from fukuo33/fix-unnecessary-regex and squashes the following
commits:
cd07fd6 [Makoto Fukuhara] fix
Repository: spark
Updated Branches:
refs/heads/master 0d74bd7fd - 80f3bcb58
[SPARK-5652][Mllib] Use broadcasted weights in LogisticRegressionModel
`LogisticRegressionModel`'s `predictPoint` should directly use broadcasted
weights. This PR also fixes the compilation errors of two unit test
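For reference, the prediction math itself, as a plain-Python sketch (not the MLlib implementation; the default threshold and intercept are assumptions):

```python
import math

def predict_point(features, weights, intercept=0.0, threshold=0.5):
    """Logistic prediction: sigmoid of the dot product, then threshold."""
    margin = sum(w * x for w, x in zip(weights, features)) + intercept
    score = 1.0 / (1.0 + math.exp(-margin))
    return 1 if score > threshold else 0
```

The point of the patch is that, inside a task, the `weights` argument should come from the broadcast value rather than from the model object captured in the closure, so the weights are shipped to each executor once instead of with every task.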
Repository: spark
Updated Branches:
refs/heads/master 0b7eb3f3b - 3eccf29ce
[SPARK-5595][SPARK-5603][SQL] Add a rule to do PreInsert type casting and field
renaming, and invalidate the in-memory cache after INSERT
This PR adds a rule to Analyzer that will add preinsert data type casting and
Repository: spark
Updated Branches:
refs/heads/master 24dbc50b9 - 856928979
[SPARK-5582] [history] Ignore empty log directories.
Empty log directories are not useful at the moment, but if one ends
up showing in the log root, it breaks the code that checks for log
directories.
Author: Marcelo
Repository: spark
Updated Branches:
refs/heads/master 856928979 - ed3aac791
[SPARK-5470][Core]use defaultClassLoader to load classes in KryoSerializer
Now KryoSerializer loads the classes in classesToRegister at the time of its
initialization. When we set spark.kryo.classesToRegister=class1, it
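A Python analogy of the fix's idea, deferring class resolution from configuration time to use time (illustrative only; Kryo registration is JVM code, and the example class name is just a stdlib stand-in):

```python
import importlib

def resolve_class(dotted_name):
    """Resolve 'module.ClassName' only when needed, not at config time."""
    module_name, _, cls_name = dotted_name.rpartition(".")
    return getattr(importlib.import_module(module_name), cls_name)

# Stored as plain strings up front; nothing is loaded yet.
classes_to_register = ["json.JSONEncoder"]

# Resolution happens later, once the right loader/context is in place.
resolved = [resolve_class(n) for n in classes_to_register]
```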
Repository: spark
Updated Branches:
refs/heads/master f6ba813af - 24dbc50b9
[SPARK-5157][YARN] Configure more JVM options properly when we use
ConcMarkSweepGC for AM.
When we set `SPARK_USE_CONC_INCR_GC`, ConcurrentMarkSweepGC works on the AM.
Actually, if ConcurrentMarkSweepGC is set for
Repository: spark
Updated Branches:
refs/heads/master f827ef4d7 - cf6778e8d
[Build] Set all Debian package permissions to 755
755 means the owner can read, write, and execute, and everyone else can just
read and execute. I think that's what we want here since without execute
permissions
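A quick check of what mode 755 grants, using Python's os.chmod on a temp file (assumes a POSIX filesystem):

```python
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# 0o755 = owner rwx, group r-x, others r-x.
os.chmod(path, 0o755)
mode = stat.S_IMODE(os.stat(path).st_mode)

owner_can_write = bool(mode & stat.S_IWUSR)
others_can_write = bool(mode & stat.S_IWOTH)
others_can_exec = bool(mode & stat.S_IXOTH)
```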
Repository: spark
Updated Branches:
refs/heads/master 575d2df35 - f6ba813af
[Minor] Remove permission for execution from spark-shell.cmd
The .cmd files in bin are not set executable, except for spark-shell.cmd.
Let's unify that.
Author: Kousuke Saruta saru...@oss.nttdata.co.jp
Repository: spark
Updated Branches:
refs/heads/master cf6778e8d - 37d35ab53
[SPARK-5416] init Executor.threadPool before ExecutorSource
Some ExecutorSource metrics can NPE by attempting to reference the
threadpool otherwise.
Author: Ryan Williams ryan.blake.willi...@gmail.com
Closes #4212
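A minimal Python sketch of the same initialization-order hazard (names are illustrative, not Spark's): the metrics source reads the thread pool when it is constructed, so the pool must exist first.

```python
class ThreadPoolSource:
    """A metrics source that reads the executor's thread pool on creation."""
    def __init__(self, executor):
        # If the pool were still unset here, this lookup would fail --
        # the analogue of the NPE fixed by creating the pool first.
        self.active = executor.pool["active"]

class Executor:
    def __init__(self):
        self.pool = {"active": 0}             # init the thread pool first...
        self.source = ThreadPoolSource(self)  # ...then register metrics
```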
Repository: spark
Updated Branches:
refs/heads/branch-1.3 45b95e7d2 - f408db6a3
[SPARK-5013] [MLlib] Added documentation and sample data file for
GaussianMixture
Simple description and code samples (and sample data) for GaussianMixture
Author: Travis Galoppo tjg2...@columbia.edu
Closes
Repository: spark
Updated Branches:
refs/heads/branch-1.3 f408db6a3 - ffdb2e9b5
[SPARK-5380][GraphX] Solve an ArrayIndexOutOfBoundsException when building a
graph from a file with a format error
When I build a graph from a file with a format error, an
ArrayIndexOutOfBoundsException is thrown
Author:
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ffdb2e9b5 - 7c5468164
[Minor] Remove permission for execution from spark-shell.cmd
The .cmd files in bin are not set executable, except for spark-shell.cmd.
Let's unify that.
Author: Kousuke Saruta saru...@oss.nttdata.co.jp
Repository: spark
Updated Branches:
refs/heads/branch-1.3 7c5468164 - 25d80444c
[SPARK-5157][YARN] Configure more JVM options properly when we use
ConcMarkSweepGC for AM.
When we set `SPARK_USE_CONC_INCR_GC`, ConcurrentMarkSweepGC works on the AM.
Actually, if ConcurrentMarkSweepGC is set
Repository: spark
Updated Branches:
refs/heads/branch-1.3 25d80444c - faccdcbc2
[SPARK-5582] [history] Ignore empty log directories.
Empty log directories are not useful at the moment, but if one ends
up showing in the log root, it breaks the code that checks for log
directories.
Author: