Re: New to hive.

2013-07-17 Thread Bharati Adkar
Hi Tariq,

No problem.
It was the hive.jar.path property that was not being set. Figured it out and 
fixed it. 
Got the plan.xml and jobconf.xml; now I will debug Hadoop to get the rest of the info.
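In case it helps anyone else hitting the same "jar null" symptom, here is a rough sketch 
of one way to set the property programmatically before the embedded HiveServer starts; 
the jar location shown is only an illustrative assumption, and setting the same property 
in a hive-site.xml on the classpath (or in the Eclipse debug configuration) works as well.

    // Hedged sketch: point Hive at its exec jar so the child MapReduce task is not
    // launched as "hadoop jar null ...". The jar path below is an assumption.
    import org.apache.hadoop.hive.conf.HiveConf;

    public class SetHiveJarPath {
        public static void main(String[] args) {
            HiveConf conf = new HiveConf();
            conf.set("hive.jar.path",
                    "/Users/bharati/hive-0.11.0/build/dist/lib/hive-exec-0.11.0.jar");
            System.out.println("hive.jar.path = " + conf.get("hive.jar.path"));
        }
    }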

Thanks,
Warm regards,
Bharati
On Jul 17, 2013, at 12:08 PM, Mohammad Tariq donta...@gmail.com wrote:

 Hello ma'am,
 
 Apologies, first of all, for responding so late. I was stuck with some urgent 
 deliverables and was out of touch for a while.
  
 java.io.IOException: Cannot run program 
 /Users/bharati/hive-0.11.0/src/testutils/hadoop (in directory 
 /Users/bharati/eclipse/tutorial/src): error=13, Permission denied
   at java.lang.ProcessBuilder.processException(ProcessBuilder.java:478)
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:457)
   at java.lang.Runtime.exec(Runtime.java:593)
   at java.lang.Runtime.exec(Runtime.java:431)
   at 
 org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:269)
   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:144)
   at 
 org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1355)
   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1139)
   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:945)
   at 
 org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
   at 
 org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
   at 
 org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:1)
   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
   at 
 org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
   at 
 java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
   at 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
   at java.lang.Thread.run(Thread.java:680)
 Caused by: java.io.IOException: error=13, Permission denied
   at java.lang.UNIXProcess.forkAndExec(Native Method)
   at java.lang.UNIXProcess.&lt;init&gt;(UNIXProcess.java:53)
   at java.lang.ProcessImpl.start(ProcessImpl.java:91)
   at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
   ... 17 more
 
 Please make sure you have proper permissions set for this path.
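 For what it's worth, error=13 usually means the execute bit is missing on that script; 
 a chmod +x from a shell should fix it. A rough Java sketch of the same check, using the 
 path from the trace above:
 
    // Hedged sketch: verify that the wrapper script Hive tries to fork is executable.
    import java.io.File;

    public class CheckExecBit {
        public static void main(String[] args) {
            File script = new File("/Users/bharati/hive-0.11.0/src/testutils/hadoop");
            if (!script.canExecute()) {
                // Sets the owner-executable bit, roughly equivalent to chmod u+x.
                boolean changed = script.setExecutable(true);
                System.out.println("executable bit set: " + changed);
            }
        }
    }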
 
 Warm Regards,
 Tariq
 cloudfront.blogspot.com
 
 
 On Wed, Jul 17, 2013 at 8:03 PM, Puneet Khatod puneet.kha...@tavant.com 
 wrote:
 Hi,
  
 There are many online tutorials and blogs that provide quick, get-set-go 
 information. To start with, you can learn Hadoop. For detailed knowledge you 
 will have to go through the e-books mentioned by Lefty; these books are bulky 
 but cover every bit of Hadoop.
 
 I recently came across an Android app called 'Big data Xpert', which has tips 
 and tricks about big data technologies. I think it can be a quick and good 
 reference for beginners as well as experienced developers.
 For reference:
 https://play.google.com/store/apps/details?id=com.mobiknights.xpert.bigdata
 
  
 
 Thanks,
 
 Puneet
 
  
 
 From: Lefty Leverenz [mailto:le...@hortonworks.com] 
 Sent: Thursday, June 20, 2013 11:05 AM
 To: user@hive.apache.org
 Subject: Re: New to hive.
 
  
 
 Programming Hive and Hadoop: The Definitive Guide are available at the 
 O'Reilly website (http://oreilly.com/) and on Amazon. 
 
  
 
 But don't forget the Hive wiki:
 
 Hive Home -- https://cwiki.apache.org/confluence/display/Hive/Home 
 Getting Started -- 
 https://cwiki.apache.org/confluence/display/Hive/GettingStarted
 Hive Tutorial -- https://cwiki.apache.org/confluence/display/Hive/Tutorial
 – Lefty
 
  
 
  
 
 On Wed, Jun 19, 2013 at 7:02 PM, Mohammad Tariq donta...@gmail.com wrote:
 
 Hello ma'am,
 
  
 
   Hive queries are parsed using ANTLR and converted into corresponding MR 
 jobs (actually a lot of things happen under the hood). I had answered a similar 
 question a few days ago on SO; you might find it helpful. But I would suggest 
 you go through the original paper, which explains all these things in proper 
 detail. I would also recommend the book Programming Hive. It's really nice.
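 If you want to see what Hive generates for a particular query, EXPLAIN prints the 
 map/reduce stages. A rough sketch over JDBC, assuming the HiveServer1 driver is on 
 the classpath, a server is listening on the default localhost:10000, and using a 
 placeholder table name:
 
    // Hedged sketch: run EXPLAIN over JDBC and print the map/reduce plan Hive builds.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExplainQuery {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver"); // HiveServer1 driver
            Connection con =
                    DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = con.createStatement();
            // my_table is a placeholder; substitute any existing table.
            ResultSet rs = stmt.executeQuery("EXPLAIN SELECT count(*) FROM my_table");
            while (rs.next()) {
                System.out.println(rs.getString(1)); // stage plan: operator trees, reducers
            }
            con.close();
        }
    }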
 
  
 
 HTH
 
 
 
 Warm Regards,
 
 Tariq
 
 cloudfront.blogspot.com
 
  
 
 On Thu, Jun 20, 2013 at 4:24 AM, Bharati bharati.ad...@mparallelo.com wrote:
 
 Hi Folks,
 
 I am new to Hive and need information, tutorials, etc. that you can point me to. I 
 have installed Hive to work with MySQL.
 
 I can run queries. Now I would like to understand how the map and reduce 
 classes are created, and how I can look at the data for the map job and the map 
 class that the Hive query generates. Also, is there a way to create custom map 
 classes?
 I would appreciate it if anyone could help me get started.
 
 Thanks,
 Bharati
 
 Sent from my iPad
 

Exception Running Hive in eclipse

2013-07-16 Thread Bharati Adkar
Hi Folks,


I am running Hive in Eclipse. I think I am missing a Hive conf property, either 
in the XML file or in the debug configuration. The jar file name is set to null.
13/07/16 09:55:23 INFO exec.ExecDriver: Executing: 
/Users/bharati/hadoop/bin/hadoop jar null 
org.apache.hadoop.hive.ql.exec.ExecDriver  -plan 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10003/plan.xml
   -jobconffile 
file:/tmp/bharati/hive_2013-07-16_09-53-30_441_6800320206008341895/-local-10002/jobconf.xml
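
As a sanity check, a rough sketch that prints what hive.jar.path resolves to at runtime; 
a null or non-existent path here would explain the "jar null" command above. The plain 
HiveConf construction is an assumption about the embedded setup.

    // Hedged sketch: confirm hive.jar.path is set and points to an existing jar.
    import java.io.File;
    import org.apache.hadoop.hive.conf.HiveConf;

    public class CheckHiveJarPath {
        public static void main(String[] args) {
            HiveConf conf = new HiveConf();
            String jar = conf.get("hive.jar.path");
            System.out.println("hive.jar.path = " + jar);
            if (jar != null) {
                System.out.println("exists on disk: " + new File(jar).exists());
            }
        }
    }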

Thanks,

Warm Regards,
Bharati


Here is the console output.


13/07/16 09:51:21 WARN common.LogUtils: DEPRECATED: Ignoring hive-default.xml 
found on the CLASSPATH at 
/Users/bharati/eclipse/tutorial/src/conf/hive-default.xml
13/07/16 09:51:21 INFO service.HiveServer: Starting hive server on port 10001 
with 100 min worker threads and 2147483647 max worker threads
13/07/16 09:51:21 INFO service.HiveServer: TCP keepalive = true
13/07/16 09:52:26 INFO metastore.HiveMetaStore: 0: Opening raw store with 
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/07/16 09:52:26 INFO metastore.ObjectStore: ObjectStore, initialize called
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core 
requires org.eclipse.core.resources but it cannot be resolved.
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core 
requires org.eclipse.core.runtime but it cannot be resolved.
13/07/16 09:52:26 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core 
requires org.eclipse.text but it cannot be resolved.
13/07/16 09:52:27 INFO metastore.ObjectStore: Setting MetaStore object pin 
classes with 
hive.metastore.cache.pinobjtypes=Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order
13/07/16 09:52:27 INFO metastore.ObjectStore: Initialized ObjectStore
Hive history 
file=/tmp/bharati/hive_job_log_bharati_6000@Bharati-Adkars-iMac.local_201307160952_1763040067.txt
13/07/16 09:52:28 INFO exec.HiveHistory: Hive history 
file=/tmp/bharati/hive_job_log_bharati_6000@Bharati-Adkars-iMac.local_201307160952_1763040067.txt
13/07/16 09:53:30 INFO service.HiveServer: Putting temp output to file 
/tmp/bharati/bharati_6000@Bharati-Adkars-iMac.local_201307160952376261539654619795.pipeout
13/07/16 09:53:30 INFO service.HiveServer: Running the query: set 
hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
13/07/16 09:53:30 INFO service.HiveServer: Putting temp output to file 
/tmp/bharati/bharati_6000@Bharati-Adkars-iMac.local_201307160952376261539654619795.pipeout
13/07/16 09:53:30 INFO service.HiveServer: Running the query: select count(*) 
from fact_mobiledata
13/07/16 09:53:30 INFO ql.Driver: PERFLOG method=Driver.run
13/07/16 09:53:30 INFO ql.Driver: PERFLOG method=TimeToSubmit
13/07/16 09:53:30 INFO ql.Driver: PERFLOG method=compile
13/07/16 09:53:30 INFO parse.ParseDriver: Parsing command: select count(*) from 
fact_mobiledata
13/07/16 09:53:30 INFO parse.ParseDriver: Parse Completed
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Completed phase 1 of Semantic 
Analysis
13/07/16 09:53:30 INFO parse.SemanticAnalyzer: Get metadata for source tables
13/07/16 09:53:31 INFO metastore.HiveMetaStore: 0: get_table : db=default 
tbl=fact_mobiledata
13/07/16 09:53:31 INFO HiveMetaStore.audit: ugi=bharati ip=unknown-ip-addr  
cmd=get_table : db=default tbl=fact_mobiledata  
13/07/16 09:53:31 INFO metastore.HiveMetaStore: 0: Opening raw store with 
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/07/16 09:53:31 INFO metastore.ObjectStore: ObjectStore, initialize called
13/07/16 09:53:31 INFO metastore.ObjectStore: Initialized ObjectStore
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Get metadata for subqueries
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Get metadata for destination 
tables
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Completed getting MetaData in 
Semantic Analysis
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for FS(6)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for SEL(5)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for GBY(4)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for RS(3)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for GBY(2)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for SEL(1)
13/07/16 09:53:31 INFO ppd.OpProcFactory: Processing for TS(0)
13/07/16 09:53:31 INFO physical.MetadataOnlyOptimizer: Looking for table scans 
where optimization is applicable
13/07/16 09:53:31 INFO physical.MetadataOnlyOptimizer: Found 0 metadata only 
table scans
13/07/16 09:53:31 INFO parse.SemanticAnalyzer: Completed plan generation
13/07/16 09:53:31 INFO ql.Driver: Semantic Analysis Completed
13/07/16 09:53:31 INFO exec.ListSinkOperator: Initializing Self 7 OP
13/07/16 09:53:31 INFO exec.ListSinkOperator: Operator 7 OP initialized
13/07/16 09:53:31 INFO exec.ListSinkOperator: Initialization Done 7 OP

Build hadoop in eclipse.

2013-06-28 Thread Bharati Adkar
Hi Folks,
I did not have success using http://wiki.apache.org/hadoop/EclipseEnvironment .
I followed the tutorial
http://blog.shiftehfar.org/?p=647 to build Hadoop in Eclipse.

 

I get these errors when I build hadoop-1.0.4 in Eclipse Juno.


All of the following are Java Problem markers reported against SecurityUtil.java 
(resource path /Hadoop-1.0.4/hadoop-1.0.4/src/core/org/apache/hadoop/security), each an 
access restriction on the required library 
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar:

line 437: Access restriction: The method open() from the type ResolverConfiguration is not accessible
line 437: Access restriction: The type ResolverConfiguration is not accessible
line 48:  Access restriction: The type IPAddressUtil is not accessible
line 47:  Access restriction: The type ResolverConfiguration is not accessible
line 460: Access restriction: The method textToNumericFormatV6(String) from the type IPAddressUtil is not accessible
line 456: Access restriction: The method textToNumericFormatV4(String) from the type IPAddressUtil is not accessible
line 458: Access restriction: The type IPAddressUtil is not accessible
line 458: Access restriction: The method isIPv6LiteralAddress(String) from the type IPAddressUtil is not accessible
line 460: Access restriction: The type IPAddressUtil is not accessible
line 437: Access restriction: The method searchlist() from the type ResolverConfiguration is not accessible
line 454: Access restriction: The type IPAddressUtil is not accessible
line 454: Access restriction: The method isIPv4LiteralAddress(String) from the type IPAddressUtil is not accessible
line 456: Access restriction: The type IPAddressUtil is not accessible

Any ideas of what I might be missing?

I have added all the required jars. The permissions on 
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar 
are default.



Thanks,
Bharati

New to hive.

2013-06-19 Thread Bharati
Hi Folks,

I am new to Hive and need information, tutorials, etc. that you can point me to. I 
have installed Hive to work with MySQL.

I can run queries. Now I would like to understand how the map and reduce 
classes are created, and how I can look at the data for the map job and the map 
class that the Hive query generates. Also, is there a way to create custom map 
classes?
I would appreciate it if anyone could help me get started.
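
A note on the custom map classes question: Hive builds its Mapper and Reducer classes 
internally, but custom per-row logic can be plugged into the map tasks it generates, most 
commonly as a UDF (SELECT TRANSFORM with an external script is another route). A rough 
sketch follows; the class name and the jar path in the comments are only illustrative.

    // Hedged sketch: a trivial UDF that Hive calls for each row inside the map task
    // it generates for the query. Build it into a jar, then from the Hive CLI:
    //   ADD JAR /path/to/my-udfs.jar;                        (path is an assumption)
    //   CREATE TEMPORARY FUNCTION my_upper AS 'MyUpperUDF';
    //   SELECT my_upper(some_column) FROM some_table;
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class MyUpperUDF extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().toUpperCase());
        }
    }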

Thanks,
Bharati

Sent from my iPad