Re:Re: Is there a way to run one test from a specific unit test file?

2015-01-18 Thread lulynn_2008
at 6:55 PM, lulynn_2008 lulynn_2...@163.com wrote: Hi All, There are multiple tests in one Test* file. Is there a way to run only one particular test? Thanks

Is there a way to run one test from a specific unit test file?

2015-01-16 Thread lulynn_2008
Hi All, There are multiple tests in one Test* file. Is there a way to run only one particular test? Thanks
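For reference, a hedged sketch of how this is usually done with Pig's ant-based build: the testcase property selects a single Test* class (the class name below is illustrative; selecting one method inside a class is not something the ant target itself supports).

```sh
# Run only the tests in one Test* class via Pig's ant build;
# -Dtestcase names the class without the package prefix (illustrative name):
ant clean test -Dtestcase=TestEvalPipeline
```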

Re:Re: Is it possible to add cmx as a supported pig.tmpfilecompression.codec?

2014-11-19 Thread lulynn_2008
/NOAA_Weather_csv/2011/99-53019-2011.csv.gz' using PigStorage(',') as (projectname:chararray); STORE data INTO '/comCodecGzip'; Cheers, Krishna On Tue, Nov 18, 2014 at 2:04 PM, lulynn_2008 lulynn_2...@163.com wrote: BTW, cmx is com.ibm.biginsights.compress.CmxCodec, the related jar is ibm

Re:Is it possible to add cmx as a supported pig.tmpfilecompression.codec?

2014-11-18 Thread lulynn_2008
BTW, cmx is com.ibm.biginsights.compress.CmxCodec, the related jar is ibm-compression.jar. At 2014-11-18 15:49:53, lulynn_2008 lulynn_2...@163.com wrote: Hi All, I am trying to use CMX as temp file compression codec, i.e SET pig.tmpfilecompression true; SET pig.tmpfilecompression.codec cmx

Is it possible to add cmx as a supported pig.tmpfilecompression.codec?

2014-11-18 Thread lulynn_2008
Hi All, I am trying to use CMX as the temp file compression codec, i.e. SET pig.tmpfilecompression true; SET pig.tmpfilecompression.codec cmx; but the following error happened: Caused by: java.io.IOException: Invalid temporary file compression codec []. Expected compression codecs are gz and lzo from
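Per the error quoted above, only gz and lzo are accepted values, so a custom codec name such as cmx is rejected before any codec class is even loaded. A minimal working configuration for comparison:

```pig
-- Temp-file compression with one of the two accepted codecs:
SET pig.tmpfilecompression true;
SET pig.tmpfilecompression.codec gz;  -- 'lzo' is the only other accepted value
```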

Is it necessary to export PIG_OPTS=-Dhive.metastore.uris=thrift://hostname:port when using -useHCatalog?

2014-11-10 Thread lulynn_2008
Hi All, From trunk, the pig script has added ${HIVE_HOME}/conf to the pig classpath. Then when using -useHCatalog, is it necessary to export PIG_OPTS=-Dhive.metastore.uris=thrift://hostname:port? Besides, my hive metastore is using local mode, and I am planning to configure HiveMetaStore to the remote
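A hedged sketch of the trade-off: if the hive-site.xml picked up from ${HIVE_HOME}/conf already defines hive.metastore.uris, exporting it again through PIG_OPTS should be redundant; the export matters only when the classpath copy is missing the property or points at the wrong metastore. Hostname and port below are placeholders.

```sh
# Only needed when hive-site.xml on the classpath does not already
# define the remote metastore; host and port are placeholders:
export PIG_OPTS="-Dhive.metastore.uris=thrift://metastore-host:9083"
pig -useHCatalog script.pig
```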

ERROR 2088: Fetch failed. Couldn't retrieve result during HCatLoader() then DUMP

2014-10-14 Thread lulynn_2008
Hi All, I was running HCatStorer and HCatLoader in the pig grunt shell but encountered ERROR 2088: Fetch failed. Couldn't retrieve result. Please take a look and give your suggestions. Thanks. Test case: 1. Create table in hive: create table junit_unparted_basic(a int, b string) stored as RCFILE
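For context, a sketch of the load-and-dump step being described, assuming pig was started with -useHCatalog. The HCatLoader package name varies by HCatalog version; org.apache.hcatalog.pig.HCatLoader is the older form current around the time of this thread.

```pig
-- Load the Hive table from step 1 through HCatalog, then dump it:
A = LOAD 'junit_unparted_basic' USING org.apache.hcatalog.pig.HCatLoader();
DUMP A;
```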

Encounter 401 error during mvn-deploy

2014-08-15 Thread lulynn_2008
Hi All, I was uploading pig artifacts to my private repository. I assume I just need to generate ~/.m2/settings.xml and run the mvn-deploy target. Here is one server setting in settings.xml: <server> <id>private-open-source-components-snapshots</id> <username>deployment</username>
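A 401 from mvn-deploy usually means the <server> id in settings.xml does not match the repository id the build deploys to, or the credentials are rejected. A sketch of the intended settings.xml entry, with the id and username reconstructed from the flattened quote above and the password a placeholder:

```xml
<settings>
  <servers>
    <server>
      <!-- must match the repository id the mvn-deploy target publishes to -->
      <id>private-open-source-components-snapshots</id>
      <username>deployment</username>
      <password>********</password>
    </server>
  </servers>
</settings>
```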

Which ant targets create artifacts and POM files, and release a component?

2014-08-01 Thread lulynn_2008
Hi. Could you tell me which ant targets create artifacts and POM files, and release a component? Thanks
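A hedged answer based on Pig's build.xml of that era (target names may differ between branches; mvn-deploy is the one confirmed elsewhere in this archive):

```sh
ant mvn-jar      # build the artifact jars and generate the POM files
ant mvn-install  # install the artifacts and POMs into the local ~/.m2 repository
ant mvn-deploy   # upload (release) the artifacts to the configured remote repository
```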

Please help review patch for PIG-4047: Replace pig-withouthadoop jar with pig-core jar and pig core dependencies

2014-07-06 Thread lulynn_2008
:34 PM, lulynn_2008 lulynn_2...@163.com wrote: Hi Daniel, with the new structured pig package, the scripts ran successfully with pig-0.12.0. I just did the following: 1. divided the withouthadoop jar into pig core and pig core dependencies. 2. saved the jars from step 1 in the lib directory 3. in the pig script, always add

Re:Re: Re: Questions about dependencies included in the withouthadoop jar file and lib directory.

2014-07-01 Thread lulynn_2008
it depending on your script. Try the following script: a = load 'studenttab10k' as (name:chararray, age:int, gpa:double); b = filter a by name matches '.*or.*'; dump b; Thanks, Daniel On Tue, Jun 24, 2014 at 7:23 PM, lulynn_2008 lulynn_2...@163.com wrote: Hi Daniel, Thanks for the details. For your

Encountered a DNS error when loading an HBase table via pig grunt

2014-06-30 Thread lulynn_2008
Hi All, Following are the test case and error. Do you have any suggestions or comments? Thanks. Test case: create an hbase table in the hbase shell: create 'employees', 'SN', 'department', 'address' put 'employees', 'Hong', 'address:country', 'China' load and dump the table in pig grunt: A = load
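The quoted script is cut off above; a sketch of the usual HBaseStorage form for this load, with illustrative field names (the truncated line itself is left as-is):

```pig
-- Load the 'employees' table created above; '-loadKey true' returns the
-- row key as the first field, and 'address:*' collects that column family
-- into a map.
A = LOAD 'hbase://employees'
    USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('address:*', '-loadKey true')
    AS (id:chararray, address:map[]);
DUMP A;
```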

Re:Re: Encountered a DNS error when loading an HBase table via pig grunt

2014-06-30 Thread lulynn_2008
;; global options: +cmd ;; connection timed out; no servers could be reached At 2014-06-30 04:29:16, Gordon Wang gw...@gopivotal.com wrote: Make sure you can resolve 9.181.64.230 in cmd. use dig 9.181.64.230 to check. On Mon, Jun 30, 2014 at 4:14 PM, lulynn_2008 lulynn_2...@163.com
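One note on the suggested check: dig with a bare IP argument queries that string as a domain name, which is consistent with the "no servers could be reached" output quoted above telling us little; a reverse (PTR) lookup needs the -x flag.

```sh
# Reverse (PTR) lookup of an IP address -- "dig 9.181.64.230" alone
# would query the literal string as a domain name instead:
dig -x 9.181.64.230
```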

Re:Re: Questions about dependencies included in the withouthadoop jar file and lib directory.

2014-06-24 Thread lulynn_2008
give your suggestions. Thanks At 2014-06-25 02:34:03, Daniel Dai da...@hortonworks.com wrote: On Mon, Jun 23, 2014 at 8:13 PM, lulynn_2008 lulynn_2...@163.com wrote: Hi All, In build.xml in branch-0.13, 1. the following jars are included in the lib directory during packaging: copy

Questions about dependencies included in the withouthadoop jar file and lib directory.

2014-06-23 Thread lulynn_2008
Hi All, In build.xml in branch-0.13, 1. the following jars are included in the lib directory during packaging: <copy todir="${tar.dist.dir}/lib"> <fileset dir="${ivy.lib.dir}" includes="jython-*.jar"/> <fileset dir="${ivy.lib.dir}" includes="jruby-*.jar"/> <fileset

Re:Re: What version will the next Pig release be?

2014-06-22 Thread lulynn_2008
. On Thu, Jun 19, 2014 at 12:25 AM, lulynn_2008 lulynn_2...@163.com wrote: Hi All, Do you know what version the next Pig release will be? And when will this version be released? Thanks

Re:Re: Re: What version will the next Pig release be?

2014-06-22 Thread lulynn_2008
development. It should work with both jdk 6 and jdk 7. On Sun, Jun 22, 2014 at 7:56 PM, Cheolsoo Park piaozhe...@gmail.com wrote: Yes. I am running 0.13 on jdk 7 in beta test at work. On Sun, Jun 22, 2014 at 7:51 PM, lulynn_2008 lulynn_2...@163.com wrote: Thanks Cheolsoo. In git, I can find

Re:Re: Congratulations to Cheolsoo Park the new Apache Pig project chair

2014-03-21 Thread lulynn_2008
Add my congratulations. At 2014-03-21 06:45:18, Aniket Mokashi aniket...@gmail.com wrote: Woo!! Congrats Cheolsoo... On Thu, Mar 20, 2014 at 4:25 AM, Rohini Palaniswamy rohini.adi...@gmail.com wrote: Thanks Julien. Great job last year. Congratulations, Cheolsoo!!! Well deserved.