Re: why does hadoop create a /tmp/hadoop-user/hadoop-unjar-xxxx/ dir and unjar my fat jar?

2014-10-25 Thread Yang
I thought this might be because Hadoop wants to pack everything (including the -files distributed-cache files) into one single jar, so I removed the -files options I had, but it still extracts the jar. This is rather confusing. On Fri, Oct 24, 2014 at 11:51 AM, Yang tedd...@gmail.com wrote:

Load csv files into drill tables

2014-10-25 Thread lapro1
Hi, I'd like to ask the following: I'm still a student, and I'm writing my thesis. My problem is that I'd like to import a CSV file into a new Drill table, but I don't know how it's possible. I issue the following command, but I get an exception: CREATE TABLE nameposTable3 AS SELECT first_name,
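For reference, a minimal sketch of a Drill CTAS statement over a CSV file (the file path and column positions are hypothetical; by default Drill exposes plain CSV fields through the `columns` array, and the target workspace, e.g. `dfs.tmp`, must be configured as writable):

```sql
-- Hypothetical path and columns; adjust to the actual file and schema.
-- Plain CSV fields surface as the positional `columns` array in Drill.
CREATE TABLE dfs.tmp.`nameposTable3` AS
SELECT columns[0] AS first_name,
       columns[1] AS last_name
FROM dfs.`/path/to/names.csv`;
```

A common cause of the exception described above is issuing CTAS against a non-writable workspace; CTAS only works in a workspace whose storage-plugin configuration sets `"writable": true`.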

Re: Load csv files into drill tables

2014-10-25 Thread Harsh J
Your question would be better answered on the Drill community lists instead of here: http://incubator.apache.org/drill/community.html#mailinglists On Sat, Oct 25, 2014 at 8:26 PM, lapro1 lap...@gmail.com wrote: Hi, I'd like to ask the following: I'm still a student, and I'm writing my

Re: why does hadoop create a /tmp/hadoop-user/hadoop-unjar-xxxx/ dir and unjar my fat jar?

2014-10-25 Thread Harsh J
If you use 'hadoop jar' to invoke your application, this is the default behaviour. The reason is that the utility supports a jars-within-jar feature, which lets one pack additional dependency jars into an application as a lib/ subdirectory under the root of the main jar. It is not
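A sketch of what this answer describes (paths, jar name, and main class are hypothetical): `hadoop jar` delegates to the RunJar utility, which extracts the jar into a `hadoop-unjar-*` directory under `java.io.tmpdir` so any bundled `lib/*.jar` entries can be added to the classpath. If filling /tmp is the concern, pointing `java.io.tmpdir` elsewhere via `HADOOP_CLIENT_OPTS` moves those extraction directories:

```shell
# Hypothetical fat-jar layout understood by 'hadoop jar':
#   myapp.jar
#   |-- com/example/MyJob.class   # application classes
#   `-- lib/                      # optional jars-within-jar
#       |-- dep-a.jar
#       `-- dep-b.jar

# Redirect the client-side extraction directory away from /tmp:
export HADOOP_CLIENT_OPTS="-Djava.io.tmpdir=/var/hadoop-tmp $HADOOP_CLIENT_OPTS"
hadoop jar myapp.jar com.example.MyJob input/ output/
```

This is purely client-side behaviour of the launching JVM; it does not change where task containers unpack anything on the cluster.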