I exported an SQL table into a .sql file and would like to import it into Hive.
Best, Patcharee
On 23. nov. 2016 10:40, Markovitz, Dudu wrote:
Hi Patcharee
The question is not clear.
Dudu
-Original Message-
From: patcharee [mailto:patcharee.thong...@uni.no]
Sent: Wednesday, November 23
Hi,
How can I import .sql file into hive?
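Assuming the .sql file contains plain HiveQL statements (a dump from another RDBMS would need its DDL translated to Hive syntax first), one common approach is to run it with the CLI's -f option or source it from an interactive session; the paths below are placeholders:

```sql
-- From the shell:
--   hive -f /path/to/export.sql
--   beeline -u jdbc:hive2://localhost:10000 -f /path/to/export.sql

-- Or from inside the Hive CLI:
SOURCE /path/to/export.sql;
```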
Best, Patcharee
It works on Hive cli
Patcharee
On 10/24/2016 11:51 AM, Mich Talebzadeh wrote:
Does this work OK through the Hive CLI?
Dr Mich Talebzadeh
I wonder why I got this error, because I query just ONE line. Any ideas?
Thanks,
Patcharee
from org.apache.hadoop.hive.ql.exec.DDLTask. GC overhead limit
exceeded (state=08S01,code=1)
How can I solve this? And how can I tell whether this error comes from the
client (beeline) or from hiveserver2?
Thanks,
Patcharee
Hi,
It works after I added the partitions with ALTER TABLE. Thanks!
My partitioned ORC directory is created by Spark, so Hive
is not aware of the partitions automatically.
Best,
Patcharee
On 13. nov. 2015 13:08, Elliot West wrote:
Have you added the partitions to the meta store?
ALTER TABLE
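When the partition directories are written by an external tool such as Spark, they must be registered in the metastore explicitly. A minimal sketch (table and partition names are illustrative):

```sql
-- Register one partition explicitly:
ALTER TABLE orc_table ADD IF NOT EXISTS PARTITION (year=2015, month=11);

-- Or let Hive scan the table location and add all missing partitions:
MSCK REPAIR TABLE orc_table;
```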
Hi,
It works with a non-partitioned ORC file, but does not work with a
(2-column) partitioned one.
Thanks,
Patcharee
On 09. nov. 2015 10:55, Elliot West wrote:
Hi,
You can create a table and point the location property to the folder
containing your ORC file:
CREATE EXTERNAL TABLE orc_table
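A fuller sketch of this suggestion (column names and the HDFS path are placeholders; the schema must match what the writing application produced):

```sql
CREATE EXTERNAL TABLE orc_table (
  id   INT,
  name STRING
)
STORED AS ORC
LOCATION '/user/patcharee/orc_output';
```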
Hi,
How can I query an ORC file (*.orc) with Hive? This ORC file is created by
other apps, like Spark or MR.
Thanks,
Patcharee
Hi,
For the ORC format, in which scenario is a bloom filter better than the
min-max index?
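A min-max index can skip a stripe only when the sought value falls outside that stripe's [min, max] range, so it works best when data is sorted or clustered on the column. For point lookups on an unsorted, high-cardinality column (e.g. a user ID), per-stripe ranges tend to cover everything and skip nothing, while a bloom filter can still rule stripes out. A hedged sketch of enabling one (table and column names are illustrative):

```sql
CREATE TABLE events (
  user_id STRING,
  ts      TIMESTAMP
)
STORED AS ORC
TBLPROPERTIES (
  'orc.bloom.filter.columns' = 'user_id',
  'orc.bloom.filter.fpp'     = '0.05'
);
```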
Best,
Patcharee
it is supposed to be
- Type: struct
Any ideas how this happened and how I can fix it? Please advise.
BR,
Patcharee
whole table, not partition by partition?
Thanks,
Patcharee
DDL page, it seems
only bucketed tables can be sorted.
Any suggestions, please?
BR,
Patcharee
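Indeed, the SORTED BY clause in Hive DDL applies only to bucketed tables. A minimal sketch (table name, columns, and bucket count are illustrative):

```sql
CREATE TABLE sorted_tbl (
  id    INT,
  value STRING
)
CLUSTERED BY (id) SORTED BY (id ASC) INTO 8 BUCKETS
STORED AS ORC;
```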
data, like select count(*) from Table, any more; I just get the error "line 1:1
character '' not supported here", no matter whether the engine is Tez or MR.
How did you solve the problem in your case?
BR,
Patcharee
On 18. juli 2015 21:26, Nitin Pawar wrote:
Can you tell me exactly what steps you did?
This "select * from table limit 5;" works, but other queries do not. So?
Patcharee
On 18. juli 2015 12:08, Nitin Pawar wrote:
can you do select * from table limit 5;
On Sat, Jul 18, 2015 at 3:35 PM, patcharee <patcharee.thong...@uni.no> wrote:
Hi,
I am using hive 0.14 with T
line 1:136 character '' not supported here
line 1:137 character '' not supported here
line 1:138 character '' not supported here
line 1:139 character '' not supported here
line 1:140 character '' not supported here
line 1:141 character '' not supported here
line 1:142 character '' not supported here
line 1:143 character '' not supported here
line 1:144 character '' not supported here
line 1:145 character '' not supported here
line 1:146 character '' not supported here
BR,
Patcharee
Actually it works on MR, so the problem is with Tez. Thanks!
BR,
Patcharee
On 30. juni 2015 10:23, Nitin Pawar wrote:
Can you try doing the same by changing the query engine from Tez to MR?
Not sure if it's a Hive bug or a Tez bug.
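Switching engines for the current session is a one-line setting (assuming the cluster has both engines available):

```sql
-- Fall back from Tez to MapReduce for this session only:
SET hive.execution.engine=mr;
-- and back again:
SET hive.execution.engine=tez;
```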
On Tue, Jun 30, 2015 at 1:46 PM, patcharee <mailto:patcharee.th
5)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
]
DAG failed due to vertex failure. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.DDLTask
BR,
Patcharee
at com.sun.proxy.$Proxy37.alter_partition(Unknown Source)
at
org.apache.hadoop.hive.ql.metadata.Hive.alterPartition(Hive.java:469)
... 26 more
BR,
Patcharee
1.59 s
OK
7.157847455.192524
7.157847455.192524
7.157847455.192524
7.157847455.192524
7.157847455.192524
Patcharee
On 27. mai 2015 18:12, Bhagwan S. Soni wrote:
could you also provide some sample dataset for these two columns?
On Wed, May 27, 2
records matched the condition. What could be wrong? I am using Hive 0.14.
BR,
Patcharee
org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
What could be the cause of this exception? Any ideas?
BR,
Patcharee
On 24. april 2015 10:27, Prasanth Jayachandran wrote:
You can download the branch-0.14 source code from
https://github.com/apache/hive/tree/branch-0.14, apply
HIVE-9529-branch
Hi,
The sandbox 2.2 comes with Hive 0.14. Does it also have the bug? If so,
how can I patch Hive on the sandbox?
BR,
Patcharee
On 24. april 2015 09:42, Prasanth Jayachandran wrote:
Hi
This has been fixed recently https://issues.apache.org/jira/browse/HIVE-9529.
Merging is triggered in two
suggestions.
BR,
Patcharee
which could be the cause of the
problem. Please let me know how to fix it.
BR,
Patcharee
On 21. april 2015 13:10, Gopal Vijayaraghavan wrote:
alter table concatenate does not work? I have a dynamically
partitioned table (stored as ORC). I tried to alter concatenate, but it
did not work. See my
(st=0.8)
Moved:
'hdfs://service-test-1-0.testlocal:8020/apps/hive/warehouse/orc_merge5a/st=0.8/00_0'
to trash at:
hdfs://service-test-1-0.testlocal:8020/user/patcharee/.Trash/Current
Moved:
'hdfs://service-test-1-0.testlocal:8020/apps/hive/warehouse/orc_merge5a/st=0.8/02_0
15:23
/apps/hive/warehouse/coordinate/zone=2/part-r-0
-rwxr-xr-x 1 root hdfs 29049 2015-04-20 15:23
/apps/hive/warehouse/coordinate/zone=2/part-r-1
-rwxr-xr-x 1 root hdfs 29075 2015-04-20 15:23
/apps/hive/warehouse/coordinate/zone=2/part-r-2
Any ideas?
BR,
Patcharee
Hi,
Is there any example of reading/querying an ORC file using the ORC file input
format from a Map-Reduce job or a Spark job?
BR,
Patcharee
Hi,
I have a Hive table with a column whose name was changed. Pig is not
able to load data from this column; it is all empty.
Any ideas how to fix it?
BR,
Patcharee
After I changed org.apache.hcatalog.pig.HCatStorer() to
org.apache.hive.hcatalog.pig.HCatStorer(), it worked.
Patcharee
On 01/14/2015 02:57 PM, Patcharee Thongtra wrote:
Hi,
I am having a weird problem. I created a table in orc format:
Create table
checked the table 'cossin', zone is NULL instead of 2.
Any ideas?
BR,
Patcharee
It works. Thanks!
Patcharee
On 01/13/2015 10:15 AM, Devopam Mittra wrote:
please try the following and report observation:
WHERE long = CAST(-41.338276 AS FLOAT)
regards
Devopam
On Tue, Jan 13, 2015 at 2:25 PM, Patcharee Thongtra
<patcharee.thong...@uni.no> wrote:
Hi,
fully
OK
Time taken: 14.262 seconds
hive> select long from test_float;
select long from test_float
Status: Finished successfully
OK
-41.338276
Time taken: 6.843 seconds, Fetched: 1 row(s)
Any ideas? I am using hive version 0.13.
BR,
Patcharee
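The CAST matters here because a floating-point literal in HiveQL is treated as DOUBLE, so comparing it directly against a FLOAT column compares the widened (imprecise) values and matches nothing. A sketch of the working predicate, using the table and column from the thread:

```sql
-- The literal -41.338276 is a DOUBLE by default;
-- cast it to FLOAT so it compares against the column's actual type:
SELECT long FROM test_float
WHERE long = CAST(-41.338276 AS FLOAT);
```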
;
the column 'z' is integer, but I got an error
ERROR grunt.Grunt: ERROR 1066: Unable to open iterator for alias
z_order. Backend error : java.lang.String cannot be cast to
java.lang.Integer
Any ideas?
Patcharee
Hi,
Can I add a partition column?
Patcharee
On 06/02/2014 10:25 PM, Mohammad Tariq wrote:
Hi Patcharee,
You can definitely add new columns. This is how it is done :
*ALTER TABLE table_name ADD|REPLACE COLUMNS (col_name data_type
[COMMENT col_comment], ...)*
For more info on Hive DDL you
ph float
Is it possible to alter this table later by adding more columns?
Patcharee
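Following the ALTER TABLE syntax above, a minimal sketch of adding a column later (table and column names are illustrative; note that the new column reads as NULL for rows written before the change):

```sql
ALTER TABLE my_table ADD COLUMNS (wind_speed FLOAT COMMENT 'added later');
```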
.0andLaterReleases.1
mentions that Hive 0.13 supports timestamp, and that in Pig we should provide
DateTime to store into a timestamp column. I did as mentioned, but I got the
exception.
Any suggestion is appreciated.
Patcharee