Can you use this command?

patch -p1 -i 1893.patch
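For context on why `-p1` matters: git-format patches (like the one GitHub serves for pull request 1893) prefix every path with `a/` and `b/`, and `-p1` strips that leading component so `patch` can find the files. A minimal self-contained demo of the same mechanism (the `demo/` directory, `pom.xml` contents, and `fix.patch` here are hypothetical, not the Spark tree):

```shell
# Create a tiny tree and a git-style patch whose paths carry a/ b/ prefixes.
mkdir -p demo
printf 'hello\n' > demo/pom.xml
cat > demo/fix.patch <<'EOF'
--- a/pom.xml
+++ b/pom.xml
@@ -1 +1 @@
-hello
+world
EOF
# Without -p1 this fails ("can't find file to patch"); with -p1 the a/
# prefix is stripped and pom.xml is found relative to the current directory.
(cd demo && patch -p1 -i fix.patch)
```

After this runs, `demo/pom.xml` contains `world` instead of `hello`.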

Cheers


On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com <
arthur.hk.c...@gmail.com> wrote:

> Hi Ted,
>
> I tried the following steps to apply patch 1893 but got Hunk FAILED.
> Can you please advise how to get through this error? Or is my spark-1.0.2
> source not the correct one?
>
> Regards
> Arthur
>
> wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
> tar -vxf spark-1.0.2.tgz
> cd spark-1.0.2
> wget https://github.com/apache/spark/pull/1893.patch
> patch  < 1893.patch
> patching file pom.xml
> Hunk #1 FAILED at 45.
> Hunk #2 FAILED at 110.
> 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
> patching file pom.xml
> Hunk #1 FAILED at 54.
> Hunk #2 FAILED at 72.
> Hunk #3 FAILED at 171.
> 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
> can't find file to patch at input line 267
> Perhaps you should have used the -p or --strip option?
> The text leading up to this was:
> --------------------------
> |
> |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
> |From: tedyu <yuzhih...@gmail.com>
> |Date: Mon, 11 Aug 2014 15:57:46 -0700
> |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
> | description to building-with-maven.md
> |
> |---
> | docs/building-with-maven.md | 3 +++
> | 1 file changed, 3 insertions(+)
> |
> |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
> |index 672d0ef..f8bcd2b 100644
> |--- a/docs/building-with-maven.md
> |+++ b/docs/building-with-maven.md
> --------------------------
> File to patch:
>
>
>
> On 28 Aug, 2014, at 10:24 am, Ted Yu <yuzhih...@gmail.com> wrote:
>
> You can get the patch from this URL:
> https://github.com/apache/spark/pull/1893.patch
>
> BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the
> pom.xml
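For reference, that change would amount to something like the following property (a sketch, assuming the spark-1.0.2 pom keeps the HBase version in an `<hbase.version>` property, as the build steps quoted below suggest):

```xml
<properties>
  <!-- Hypothetical sketch: point at the Hadoop-2 build of HBase 0.98.5 -->
  <hbase.version>0.98.5-hadoop2</hbase.version>
</properties>
```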
>
> Cheers
>
>
> On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com <
> arthur.hk.c...@gmail.com> wrote:
>
>> Hi Ted,
>>
>> Thank you so much!!
>>
>> As I am new to Spark, could you please advise on the steps to apply
>> this patch to my spark-1.0.2 source folder?
>>
>> Regards
>> Arthur
>>
>>
>> On 28 Aug, 2014, at 10:13 am, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> See SPARK-1297
>>
>>  The pull request is here:
>> https://github.com/apache/spark/pull/1893
>>
>>
>> On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com <
>> arthur.hk.c...@gmail.com> wrote:
>>
>>> (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98"; please
>>> ignore if duplicated)
>>>
>>>
>>> Hi,
>>>
>>> I need to use Spark with HBase 0.98, so I tried to compile Spark 1.0.2
>>> against HBase 0.98.
>>>
>>> My steps:
>>> wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
>>> tar -vxf spark-1.0.2.tgz
>>> cd spark-1.0.2
>>>
>>> edit project/SparkBuild.scala, set HBASE_VERSION
>>>   // HBase version; set as appropriate.
>>>   val HBASE_VERSION = "0.98.2"
>>>
>>>
>>> edit pom.xml with following values
>>>     <hadoop.version>2.4.1</hadoop.version>
>>>     <protobuf.version>2.5.0</protobuf.version>
>>>     <yarn.version>${hadoop.version}</yarn.version>
>>>     <hbase.version>0.98.5</hbase.version>
>>>     <zookeeper.version>3.4.6</zookeeper.version>
>>>     <hive.version>0.13.1</hive.version>
>>>
>>>
>>> SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
>>> but it fails because of UNRESOLVED DEPENDENCIES "hbase;0.98.2"
>>>
>>> Can you please advise how to compile Spark 1.0.2 with HBase 0.98? Or
>>> should I set HBASE_VERSION back to "0.94.6"?
>>>
>>> Regards
>>> Arthur
>>>
>>>
>>>
>>>
>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>> [warn]  ::          UNRESOLVED DEPENDENCIES         ::
>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>> [warn]  :: org.apache.hbase#hbase;0.98.2: not found
>>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>>>
>>> sbt.ResolveException: unresolved dependency: org.apache.hbase#hbase;0.98.2: not found
>>>         at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
>>>         at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
>>>         at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
>>>         at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
>>>         at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
>>>         at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
>>>         at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
>>>         at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
>>>         at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
>>>         at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
>>>         at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
>>>         at xsbt.boot.Using$.withResource(Using.scala:11)
>>>         at xsbt.boot.Using$.apply(Using.scala:10)
>>>         at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
>>>         at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
>>>         at xsbt.boot.Locks$.apply0(Locks.scala:31)
>>>         at xsbt.boot.Locks$.apply(Locks.scala:28)
>>>         at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
>>>         at sbt.IvySbt.withIvy(Ivy.scala:101)
>>>         at sbt.IvySbt.withIvy(Ivy.scala:97)
>>>         at sbt.IvySbt$Module.withModule(Ivy.scala:116)
>>>         at sbt.IvyActions$.update(IvyActions.scala:125)
>>>         at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1170)
>>>         at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1168)
>>>         at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1191)
>>>         at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1189)
>>>         at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
>>>         at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1193)
>>>         at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1188)
>>>         at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
>>>         at sbt.Classpaths$.cachedUpdate(Defaults.scala:1196)
>>>         at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1161)
>>>         at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1139)
>>>         at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
>>>         at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
>>>         at sbt.std.Transform$$anon$4.work(System.scala:64)
>>>         at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
>>>         at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
>>>         at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
>>>         at sbt.Execute.work(Execute.scala:244)
>>>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
>>>         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
>>>         at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
>>>         at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
>>>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>         at java.lang.Thread.run(Thread.java:662)
>>> [error] (examples/*:update) sbt.ResolveException: unresolved dependency: org.apache.hbase#hbase;0.98.2: not found
>>> [error] Total time: 270 s, completed Aug 28, 2014 9:42:05 AM
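The unresolved-dependency error above is consistent with two changes in HBase itself: since 0.96 the single `hbase` artifact was split into per-module jars (hbase-client, hbase-common, hbase-server, ...), and the 0.98 binaries published to Maven Central carry a Hadoop classifier in the version string (e.g. 0.98.5-hadoop2), so `org.apache.hbase#hbase;0.98.2` does not exist. A hedged sbt-style sketch of what the dependency would need to look like (module names as in HBase 0.98; this is an illustration, not the exact SPARK-1297 change):

```scala
// Sketch only: HBase 0.98 has no monolithic "hbase" artifact, so the
// examples build would need the modular jars with a -hadoop2 version.
val hbaseVersion = "0.98.5-hadoop2"
libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion
)
```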
>>>
>>>
>>
>>
>
>
