Can anyone help? Thanks a lot!
2015-03-16 11:45 GMT+08:00 lonely Feb :
> yes
>
> 2015-03-16 11:43 GMT+08:00 Mridul Muralidharan :
>
>> Cross region as in different data centers ?
>>
>> - Mridul
>>
>> On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
> > Hi all, i meet up with a problem that torrent broadcast hang out in my …
… snapshot is pushed. If you verify, I'll publish the new artifacts.
On Sun, Mar 15, 2015 at 1:14 AM, Yu Ishikawa
wrote:
> David Hall, the creator of breeze, told me that it's a bug, so I filed a
> JIRA ticket about this issue. We need to upgrade breeze from 0.11.1 to
> 0.11.2 or later in order to fix it …
yes
2015-03-16 11:43 GMT+08:00 Mridul Muralidharan :
> Cross region as in different data centers ?
>
> - Mridul
>
> On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
> > Hi all, i meet up with a problem that torrent broadcast hang out in my
> > spark cluster (1.2, standalone), particularly serious when driver and
> > executors are cross-region …
Cross region as in different data centers ?
- Mridul
On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
> Hi all, i meet up with a problem that torrent broadcast hang out in my
> spark cluster (1.2, standalone), particularly serious when driver and
> executors are cross-region. when i read the code of broadcast i found
> a sync block read here …
Thanks. But this method is in Spark's BlockTransferService.scala, which I
can't replace unless I rewrite the core code. I wonder if it is already
handled somewhere.
2015-03-16 11:27 GMT+08:00 Chester Chen :
> Can you just replace "Duration.Inf" with a shorter duration? How about
>
> import scala.concurrent.duration._ …
Can you just replace "Duration.Inf" with a shorter duration? How about

import scala.concurrent.Await
import scala.concurrent.duration._
import akka.util.Timeout

val timeout = new Timeout(10.seconds)
Await.result(result.future, timeout.duration)

or

import java.util.concurrent.TimeUnit
import scala.concurrent.duration.FiniteDuration

val timeout = new FiniteDuration(10, TimeUnit.SECONDS)
Await.result(result.future, timeout)
Hi all, I've met a problem: torrent broadcast hangs in my Spark cluster
(1.2, standalone), and it is particularly serious when the driver and
executors are cross-region. When I read the broadcast code I found a
synchronous block read here:

def fetchBlockSync(host: String, port: Int, execId: String, …
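The unbounded wait being discussed is the `Await.result(..., Duration.Inf)` call inside `fetchBlockSync`. The bounded alternative proposed later in the thread can be sketched with a plain Scala `Future` (a minimal sketch; `fetchWithTimeout` is a hypothetical helper, not a Spark API):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

// Hypothetical helper (not a Spark API): wait for a future, but give up
// after `timeout` instead of blocking forever as Duration.Inf does.
def fetchWithTimeout[T](f: Future[T], timeout: FiniteDuration): T =
  Await.result(f, timeout) // throws java.util.concurrent.TimeoutException on expiry
```

A stalled cross-region fetch would then fail fast with a `TimeoutException` the caller can retry, instead of hanging the task indefinitely.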
Thanks!
On Sat, Mar 14, 2015 at 3:31 AM, Michael Armbrust
wrote:
> Here is the JIRA: https://issues.apache.org/jira/browse/SPARK-6315
>
> On Thu, Mar 12, 2015 at 11:00 PM, Michael Armbrust
> wrote:
>
> > We are looking at the issue and will likely fix it for Spark 1.3.1.
> >
> > On Thu, Mar 1…
When I enter http://spark.apache.org/docs/latest/ into Chrome's address
bar, I see 1.3.0.
Cheers
On Sun, Mar 15, 2015 at 11:12 AM, Patrick Wendell
wrote:
> Cheng - what if you hold shift+refresh? For me the /latest link
> correctly points to 1.3.0
>
> On Sun, Mar 15, 2015 at 10:40 AM, Cheng Lian
Cheng - what if you hold shift+refresh? For me the /latest link
correctly points to 1.3.0
On Sun, Mar 15, 2015 at 10:40 AM, Cheng Lian wrote:
> It's still marked as 1.2.1 here http://spark.apache.org/docs/latest/
>
> But this page is updated (1.3.0)
> http://spark.apache.org/docs/latest/index.html
It's still marked as 1.2.1 here http://spark.apache.org/docs/latest/
But this page is updated (1.3.0)
http://spark.apache.org/docs/latest/index.html
Cheng
Hey Andrew,
Would you please create a JIRA ticket for this? To preserve
compatibility with existing Hive JDBC/ODBC drivers, Spark SQL's
HiveThriftServer intercepts some HiveServer2 components and injects
Spark stuff into them. This makes the implementation somewhat
hacky (e.g. a bunch of …
David Hall, the creator of breeze, told me that it's a bug, so I filed a JIRA
ticket about this issue. We need to upgrade breeze from 0.11.1 to 0.11.2 or
later in order to fix the bug, once the new version of breeze is
released.
[SPARK-6341] Upgrade breeze from 0.11.1 to 0.11.2 or later - ASF
It's a bug on breeze's side. Once David fixes it and publishes it to
Maven, we can upgrade to breeze 0.11.2. Please file a JIRA ticket for
this issue. Thanks.
Sincerely,
DB Tsai
---
Blog: https://www.dbtsai.com
On Sun, Mar 15, 2015 at 12:45 AM
Hi all,
Are there any bugs in dividing a Breeze sparse vector in Spark v1.3.0-rc3?
When I tried to divide a sparse vector on Spark v1.3.0-rc3, I got a wrong
result if the target vector has any zero values.
Spark v1.3.0-rc3 depends on Breeze v0.11.1, and Breeze v0.11.1 seems to
have a bug in dividing sparse vectors …
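The reported behavior can be reproduced with a few lines of breeze (a sketch assuming breeze 0.11.1 on the classpath; the vector values are illustrative, and with 0.11.2 or later the sparse and dense results should agree):

```scala
import breeze.linalg.SparseVector

// Length-4 sparse vector with implicit zeros at indices 1 and 3.
val sv = SparseVector(4)((0, 2.0), (2, 6.0))
val dv = sv.toDenseVector

// Element-wise division; with breeze 0.11.1 the sparse result can
// disagree with the dense one when the vector holds zero entries.
println(sv / 2.0)
println(dv / 2.0)
```

Comparing against the dense copy of the same vector is a quick way to check whether the sparse code path is affected.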