Solved...
select * from tabName where deleteDate is NULL;
My bad..
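For context on why the original query returned nothing: in HiveQL, as in standard SQL, any comparison with NULL (including `deleteDate <> NULL`) evaluates to NULL rather than true, so the WHERE clause filters out every row. The `IS NULL` / `IS NOT NULL` predicates are the correct way to test for NULL. A quick sketch using the table and column names from this thread:

```sql
-- Three-valued logic: comparisons with NULL yield NULL (unknown), never true.
SELECT * FROM tabName WHERE deleteDate <> NULL;  -- returns no rows
SELECT * FROM tabName WHERE deleteDate = NULL;   -- also returns no rows

-- Use the IS NULL / IS NOT NULL predicates instead:
SELECT * FROM tabName WHERE deleteDate IS NULL;      -- rows where deleteDate is NULL
SELECT * FROM tabName WHERE deleteDate IS NOT NULL;  -- rows with a value
```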
On Thursday, 27 March 2014 11:58 AM, Rishabh Bhardwaj wrote:
Hi all,
I have a table 'tabname' which has a field 'deleteDate' of type BIGINT. There
are records with a NULL value in this field. I want to retrieve all those records.
But when I execute the query
select * from tabName where deleteDate<>NULL;
I don't get any records in the result set.
What is wrong in my query?
Well, you can use a JSON SerDe for this.
Sent from my iPhone
> On Mar 26, 2014, at 8:40 PM, "Swagatika Tripathy" wrote:
Hi,
The use case is: we have some unstructured data fetched from MongoDB and
stored in a particular location. Our task is to load that data into our
staging and core Hive tables in the form of rows and columns, e.g. if the data is
in key-value pairs like:
{
Id: bigint(12346),
Name: string(ABC),
Subjects:
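Documents like the one above can be mapped onto a Hive table with a JSON SerDe, as the replies in this thread suggest. A minimal sketch using the HCatalog JsonSerDe that ships with Hive (the table name, column types, and HDFS path here are assumptions based on the truncated example; the `Subjects` field is guessed to be an array):

```sql
-- Each line of the files under LOCATION is expected to be one JSON document.
CREATE EXTERNAL TABLE staging_students (
  id       BIGINT,
  name     STRING,
  subjects ARRAY<STRING>
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/data/mongo_export/students';
```

From the staging table, an `INSERT ... SELECT` can then populate the core tables as plain rows and columns.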
I have a join query where I am joining huge tables, and I am trying to
optimize this Hive query.
INSERT OVERWRITE TABLE result
SELECT /*+ STREAMTABLE(product) */
i.IMAGE_ID,
p.PRODUCT_NO,
p.STORE_NO,
p.PRODUCT_CAT_NO,
p.CAPTION,
p.PRODUCT_DESC,
p.IMAGE1_ID,
p
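For context on the hint: by default Hive buffers the rows of all but the last table in a join and streams the last one, so putting the largest table last (or naming it in a STREAMTABLE hint) avoids buffering it in memory. A hedged sketch of the pattern with the query completed using hypothetical table and join-key names (the original query above is truncated):

```sql
-- STREAMTABLE(p): stream the large product table through the join
-- instead of buffering it; the smaller image table is buffered.
INSERT OVERWRITE TABLE result
SELECT /*+ STREAMTABLE(p) */
       i.IMAGE_ID,
       p.PRODUCT_NO,
       p.STORE_NO
FROM image i
JOIN product p ON (p.IMAGE1_ID = i.IMAGE_ID);  -- join key is an assumption
```

If the tables are bucketed and sorted on the join key, a sort-merge bucket map join is usually a bigger win than hints alone.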
That helped. Thanks
On Wed, Mar 26, 2014 at 10:52 AM, Stephen Sprague wrote:
Hi Siddharth,
We need to store the unstructured data in internal Hive tables. Have you
tried something similar?
On Wed, Mar 26, 2014 at 10:33 PM, Shrikanth Shankar wrote:
https://github.com/mongodb/mongo-hadoop is from the mongo folks themselves
Shrikanth
On Wed, Mar 26, 2014 at 10:01 AM, Nitin Pawar wrote:
Hi Swagatika,
You can create external tables over MongoDB and process them using Hive. Newer
Mongo connectors have added support for Hive. Did you try that?
Sent from my iPhone
> On Mar 26, 2014, at 9:59 AM, "Swagatika Tripathy" wrote:
take a look at https://github.com/yc-huang/Hive-mongo
On Wed, Mar 26, 2014 at 10:29 PM, Swagatika Tripathy <swagatikat...@gmail.com> wrote:
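For reference, the Hive-mongo project linked above exposes a Mongo collection as a Hive external table through a storage handler, so the collection can be queried (and copied into managed tables) with plain HiveQL. A rough sketch; the handler class, property names, and schema below are assumptions drawn from the project's README style, so check the repository for the exact syntax:

```sql
CREATE EXTERNAL TABLE mongo_users (
  id   STRING,
  name STRING
)
STORED BY 'org.yong3.hive.mongo.MongoStorageHandler'
WITH SERDEPROPERTIES ("mongo.column.mapping" = "_id,name")
TBLPROPERTIES ("mongo.host" = "localhost", "mongo.port" = "27017",
               "mongo.db" = "test", "mongo.collection" = "users");
```

An `INSERT OVERWRITE TABLE core_users SELECT ... FROM mongo_users` would then land the data in an internal table.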
MongoDB stores data in JSON format. Use a JSON SerDe in Hive while loading.
-Ashok
From: Swagatika Tripathy [mailto:swagatikat...@gmail.com]
Sent: Thursday, March 27, 2014 12:59 AM
To: user@hive.apache.org
Subject: READING FILE FROM MONGO DB
Hi,
We have some files stored in MongoDB, mostly in key-value format. We need
to parse those files and store them in Hive tables.
Any inputs on this will be appreciated.
Thanks,
Swagatika
The error message is correct. Remember, the partition columns are not
stored with the data, and by doing a "select *" that's what you're doing. And this
has nothing to do with ORC either; it's a Hive thing. :)
So your second approach was close. Just omit the partition columns yr, mo,
and day.
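The advice above can be sketched as follows. With a static partition spec, the partition values come from the PARTITION clause, not from the selected rows, so the SELECT list must contain only the data columns (the column names here are hypothetical):

```sql
-- Both tables share the data columns col1..col3 and are partitioned by (yr, mo, day).
INSERT OVERWRITE TABLE b PARTITION (yr=2014, mo='01', day='15')
SELECT col1, col2, col3              -- data columns only; omit yr, mo, day
FROM a
WHERE yr=2014 AND mo='01' AND day='15';
```

A `SELECT *` fails here because it also emits the source table's partition columns, producing more columns than the target expects.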
I am running 4 Hive queries in sequence through an Oozie workflow.
The 2nd, 3rd, and 4th queries depend upon the output of the previous
queries.
I am looking for ways to add some validations in the workflow, something like:
if the count(*) output of the previous query is (say) not zero, then proceed.
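One common pattern for this kind of validation is to run the count in a shell action with `<capture-output/>` and branch on the captured value with a decision node. A minimal sketch, not a complete workflow; the action names, the `count.sh` script, and the `count` property key are assumptions:

```xml
<action name="count-check">
  <shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <!-- count.sh runs: hive -e "SELECT COUNT(*) FROM t" and echoes count=<n> -->
    <exec>count.sh</exec>
    <capture-output/>
  </shell>
  <ok to="check"/>
  <error to="fail"/>
</action>
<decision name="check">
  <switch>
    <case to="next-query">${wf:actionData('count-check')['count'] gt 0}</case>
    <default to="end"/>
  </switch>
</decision>
```

The `wf:actionData` EL function reads the key=value pairs the shell action echoed, and the decision node routes the workflow to the next Hive action or straight to the end.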
Hello,
I'm trying to convert a managed partitioned text table into a compressed ORC
partitioned table.
I created a new table with the same schema, but when I try inserting
data, the error says there are different numbers of columns.
I tried doing
From table a insert into table b(yr=2014, mo=01,
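If copying all partitions at once is the goal, dynamic partitioning avoids one INSERT per partition. In this variant the partition columns ARE selected, but last, and they map onto the PARTITION clause (a sketch with hypothetical data column names):

```sql
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Partition columns go last in the SELECT list and are consumed by
-- the PARTITION clause rather than stored as data columns.
INSERT OVERWRITE TABLE b PARTITION (yr, mo, day)
SELECT col1, col2, col3, yr, mo, day
FROM a;
```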