Could you try putting that file in HDFS and loading it like:
LOAD DATA INPATH 'hdfs://sigmoid/test/kv1.txt' INTO TABLE src_spark
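For reference, a minimal end-to-end sketch of that suggestion (a sketch only, assuming the Spark 1.3 Scala API, that hdfs://sigmoid is your namenode, and the usual (key INT, value STRING) schema for kv1.txt):

  // First copy the file into HDFS, e.g.:
  //   hadoop fs -mkdir -p /test
  //   hadoop fs -put kv1.txt /test/kv1.txt
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext

  val sc = new SparkContext(new SparkConf().setAppName("kv1-load"))
  val hiveContext = new HiveContext(sc)
  hiveContext.sql("CREATE TABLE IF NOT EXISTS src_spark (key INT, value STRING)")
  // A non-LOCAL INPATH is resolved against HDFS, so every node sees the same file
  hiveContext.sql("LOAD DATA INPATH 'hdfs://sigmoid/test/kv1.txt' INTO TABLE src_spark")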
Thanks
Best Regards
On Thu, Mar 26, 2015 at 2:07 PM, Akhil Das wrote:
That's only when you run it in local mode ^^
Thanks
Best Regards
On Thu, Mar 26, 2015 at 2:06 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
I don't think that's correct. LOAD DATA LOCAL should pick the input from the
local directory.
On Thu, Mar 26, 2015 at 1:59 PM, Akhil Das wrote:
Not sure, but you can create that path on all the workers and put that file in
it.
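(For example, mkdir -p the full .../examples/src/main/resources path on each node and scp kv1.txt into it, so the local path resolves no matter which node the driver container lands on. The HDFS approach suggested at the top of the thread avoids this manual copying.)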
Thanks
Best Regards
On Thu, Mar 26, 2015 at 1:56 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
The Hive command
LOAD DATA LOCAL INPATH
'/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/examples/src/main/resources/kv1.txt'
INTO TABLE src_spark
1. It says LOCAL INPATH. If I push the file to HDFS, then how will it work?
2. I can't use sc.addFile, because I want to run Hive (Spark SQL) queries.
On Thu, Mar 26, 2015, Akhil Das wrote:
Now it's clear that the workers don't have the file kv1.txt on their local
filesystem. You can try putting it in HDFS and using the URI to that file, or
try adding the file with sc.addFile.
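For the sc.addFile route, a minimal sketch (assuming an existing SparkContext sc and the Scala API; SparkFiles.get returns the path of the node-local copy):

  import org.apache.spark.SparkFiles

  // Ship the file to every node in the application
  sc.addFile("/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/examples/src/main/resources/kv1.txt")
  // Resolve the local copy (works on the driver and inside tasks)
  val localPath = SparkFiles.get("kv1.txt")

(As the reply further up the thread notes, this distributes the file for RDD code but does not help a Hive LOAD DATA query.)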
Thanks
Best Regards
On Thu, Mar 26, 2015 at 1:38 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
Does not work
15/03/26 01:07:05 INFO HiveMetaStore.audit: ugi=dvasthimal ip=unknown-ip-addr cmd=get_table : db=default tbl=src_spark
15/03/26 01:07:06 ERROR ql.Driver: FAILED: SemanticException Line 1:23 Invalid path ''/home/dvasthimal/spark1.3/spark-1.3.0-bin-hadoop2.4/examples/src/main/resources
Try to give the complete path to the file kv1.txt.
On 26 Mar 2015 11:48, "ÐΞ€ρ@Ҝ (๏̯͡๏)" wrote:
I am now seeing this error.
15/03/25 19:44:03 ERROR yarn.ApplicationMaster: User class threw exception: FAILED: SemanticException Line 1:23 Invalid path ''examples/src/main/resources/kv1.txt'': No files matching path file:/hadoop/10/scratch/local/usercache/dvasthimal/appcache/application_14267
You can do it in $SPARK_HOME/conf/spark-defaults.conf:
spark.driver.extraJavaOptions -XX:MaxPermSize=512m
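For a single run you can also pass the same option at submit time, e.g. spark-submit --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=512m" (assuming you launch through spark-submit).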
Thanks.
Zhan Zhang
On Mar 25, 2015, at 7:25 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
Where and how do I pass this or other JVM arguments?
-XX:MaxPermSize=512m
On Wed, Mar 25, 2015 at 11:36 PM, Zhan Zhang wrote:
I solved this by increasing the PermGen memory size in the driver.
-XX:MaxPermSize=512m
Thanks.
Zhan Zhang
On Mar 25, 2015, at 10:54 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
I am facing the same issue and have posted a new thread. Please respond.
On Wed, Jan 14, 2015 at 4:38 AM, Zhan Zhang wrote:
Hi Folks,
I am trying to run a HiveContext in yarn-cluster mode, but I hit an error.
Does anybody know what causes the issue?
I used the following command to build the distribution:
./make-distribution.sh -Phive -Phive-thriftserver -Pyarn -Phadoop-2.4
15/01/13 17:59:42 INFO cluster.YarnClusterSchedu