Let's say I have log data that I want to load into a Hive table, and the log
file itself looks something like this:
Event_time, event_type, event_data_blob
And the blob data looks like this:
key1=value1;key2=value2;key3=value3 ... keyn=valuen
It looks like maybe I'd start with something like this:
Create table
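One hedged sketch of such a table, using Hive's collection delimiters so the blob column parses directly into a MAP; the table and column names here are made up from the sample row above:

```sql
-- Sketch only: assumes the fields are comma-separated as in the sample line
CREATE TABLE event_log (
  event_time STRING,
  event_type STRING,
  event_data MAP<STRING, STRING>
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  COLLECTION ITEMS TERMINATED BY ';'  -- splits key1=value1;key2=value2 into pairs
  MAP KEYS TERMINATED BY '='          -- splits each pair into key and value
STORED AS TEXTFILE;

-- Individual keys can then be read with map access:
-- SELECT event_time, event_data['key1'] FROM event_log;
```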
We've gotten this error a couple of times too; it is very misleading, not
correct at all. IIRC, I determined the root cause was selecting too many
input files (even though those do NOT get passed as arguments to the
transform script). For example, this happened once we had a lot of dynamic
Usually this is caused by not having the MySQL JDBC driver on the
classpath (it is not included in Hive by default).
Just put the MySQL JDBC driver in the Hive folder under lib/.
On 03/02/2011 03:15 PM, Ajo Fod wrote:
I've checked the MySQL connection with a separate Java program using the
same connection string.
On Wed, Mar 2, 2011 at 9:27 AM, Sunderlin, Mark
mark.sunder...@teamaol.com wrote:
Let's say I have log data that I want to load into a Hive table, and
the log file itself looks something like this:
Event_time, event_type, event_data_blob
And the blob data looks like
Refer to this: http://dev.bizo.com/2011/02/columns-in-hive.html

HTH,
- Youngwoo
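(For comparison, the same blob can also be parsed at query time rather than at table-definition time, using Hive's str_to_map(); the table and column names below are hypothetical, assuming the blob was loaded as a plain STRING column:)

```sql
-- event_data_blob is assumed to be a STRING like 'key1=value1;key2=value2'
-- str_to_map(text, pair_delimiter, key_value_delimiter) returns MAP<STRING,STRING>
SELECT event_time,
       event_type,
       str_to_map(event_data_blob, ';', '=')['key1'] AS key1_value
FROM event_log_raw;
```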
2011/3/2 Sunderlin, Mark mark.sunder...@teamaol.com
Hi Bennie,
Thanks for the response !
I had CLASSPATH set to include
/usr/share/java/mysql.jar
... in addition, I just copied the mysql.jar to the lib directory of hive.
I still get the same error.
Any other ideas?
Thanks,
-Ajo
On Wed, Mar 2, 2011 at 7:01 AM, Bennie Schut bsc...@ebuddy.com wrote:
This definitely looks like a CLASSPATH error.
Where did you get the mysql.jar from? Can you open it up and make sure that
it includes the com.mysql.jdbc.Driver class?
I am guessing the mysql.jar is not the one that you need. You can download a
new one from the MySQL website.
To be clear,
I'm wondering if my configuration/stack is wrong, or if I'm trying to do
something that is not supported in Hive.
My goal is to choose a compression scheme for Hadoop/Hive, and while
comparing configurations I'm finding that I can't get BZip2 or Gzip to work
with the RCFile format.
Is that
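(For reference, a minimal sketch of how one would typically request a codec for RCFile output; the property names assume a Hive 0.x / Hadoop 0.20-era setup, and the table names are made up for illustration:)

```sql
-- Ask Hive to compress query output with a specific codec
SET hive.exec.compress.output=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
-- (org.apache.hadoop.io.compress.BZip2Codec would be the BZip2 equivalent)

-- Write the data out as RCFile so the codec applies to the column blocks
CREATE TABLE logs_rc STORED AS RCFILE
AS SELECT * FROM logs_text;
```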