Sorry for the slow response.

On Sun, Oct 4, 2015 at 11:44 AM, Gunith Devasurendra <[email protected]> wrote:
> Hello Aaron,
>
> Thank you for the kind welcome and advice on starting up.
>
> Happy to say that I completed most of the setup steps without a problem.
> The only problem I got is when I was defining and querying. Please find the
> lines of the console as follows,
>
> gunith@gunith-X555LA:~/apps/apache-blur-h.2.6.0.b.0.3.0.incubating-bin/bin$
> ./blur shell
> blur (default)> create -t testtable -c 11 -l hdfs://namenode/data/testtable
> java.net.UnknownHostException: namenode
> blur (default)> create -t testtable -c 11 -l hdfs://localhost/data/testtable
> java.net.ConnectException: Call From gunith-X555LA/127.0.1.1 to
> localhost:8020 failed on connection exception: java.net.ConnectException:
> Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> blur (default)> create -t testtable -c 11 -l file:///data/testtable
> Table [testtable] has already exists.

Looks like you need to delete the table and recreate it with just
-l file:///data/testtable (there is a quick sketch of the shell commands
further down). The "create -t testtable -c 11 -l
hdfs://localhost/data/testtable" likely created the table in ZK but not on
disk, so when you ran "create -t testtable -c 11 -l file:///data/testtable"
the table already existed in ZK.

> blur (default)> mutate testtable rowid1 recordid1 fam0 col1:value1
> Shard [shard-00000004] in table [testtable] is not being served by this
> server.
> blur (default)>
> blur (default)> query testtable fam0.col1:value1

The docs are out of date; try "query testtable -query fam0.col1:value1"
instead.

> usage: query <tablename> [<options>]
>  -disableRowQuery                Disables row query. (Enabled by default)
>  -facet <facet>                  Specify facet to be executed with this
>                                  query.
>  -fetch <fetch>                  Specify the number of elements to fetch in
>                                  a single page.
>  -h                              Displays help for this command.
>  -max <max>                      Specify the maximum amount of time to
>                                  allow query to execute.
>  -min <min>                      Specify the minimum number of results
>                                  required before returning from query.
>  -query <arg>                    * Query string.
>  -recordFilter <recordFilter>    Specify record filter.
>  -rowFilter <rowFilter>          Specify row filter.
>  -rowId <rowId>                  Specify the rowId to execute the query
>                                  against (this reduces the spray to other
>                                  shards).
>  -scoreType <scoreType>          Specify the scoring type.
>  -sort <sort>                    Specify a sort to be applied to this query
>                                  <family> <column> [<reverse>].
>  -start <start>                  Specify the starting position (paging).
>  -width <width>                  Specify max column width for display.
> blur (default)>
>
> Any idea as to what I am doing wrong?
>
> Also, another thing: I tried to open the project in Eclipse (as a Maven
> project with profile 'binary'), but some of the Hadoop imports show as
> missing classes. What IDE do you guys use? If it's Eclipse, am I missing a
> step in creating the project?

Some of the projects do not contain Java classes; a few of them simply
assemble the other projects into tars or parcels. The distribution-src,
distribution-bin, and cdh-parcel projects fall into this category.
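For the Eclipse side, it usually helps to run a full command line build once
before importing, so the Maven import can resolve the Hadoop dependencies.
Something along these lines (directory name assumed from the repo URL):

git clone https://git-wip-us.apache.org/repos/asf/incubator-blur.git
cd incubator-blur
mvn install -DskipTests -Pbinary

After that, importing as Maven projects should work; just ignore the
assembly-only modules mentioned above.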
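And for the table trouble earlier in your console paste, the shell session
should end up looking roughly like this (I'm going from memory on the
disable/remove command names, so run "help" in the shell to confirm them for
your build):

blur (default)> disable testtable
blur (default)> remove testtable
blur (default)> create -t testtable -c 11 -l file:///data/testtable
blur (default)> query testtable -query fam0.col1:value1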
Thanks,
Aaron

> Thanks and Best Regards,
> Gunith
>
> > Date: Sat, 3 Oct 2015 20:32:55 -0400
> > Subject: Re: Newbie: How can I contribute?
> > From: [email protected]
> > To: [email protected]
> >
> > Welcome!
> >
> > First I would checkout master from the git repo and build the project.
> >
> > git clone https://git-wip-us.apache.org/repos/asf/incubator-blur.git
> >
> > mvn install
> >
> > add -DskipTests
> >
> > add -Pbinary to get binary artifacts.
> >
> > After that checkout
> > http://incubator.apache.org/blur/docs/0.2.3/getting-started.html for a
> > getting started. We recently have moved to java8 so that is a
> > requirement. Also the getting started is close but could be slightly
> > different now. Let us know how it goes. Thanks!
> >
> > Aaron
> >
> >
> > On Sat, Oct 3, 2015 at 12:48 AM, Gunith Devasurendra <[email protected]>
> > wrote:
> >
> > > Hi,
> > >
> > > I'm a Java developer of around 6 years from Sri Lanka and I found your
> > > project interesting, despite not having much domain experience.
> > > Am keen to contribute. I would be grateful if someone can help me
> > > start?
> > >
> > > Thanks and Best Regards,
> > > Gunith
