Dawid,

Perhaps a dumb question, but did you execute a CREATE TABLE statement in sqlline for the tables you're importing into? Phoenix needs to be told the schema of the table (i.e. it's not enough to just create the table in HBase).

Thanks,
James
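As a minimal sketch of the DDL James describes — all identifiers here ("my_table", the "cf" column family, and the column names) are placeholders, not taken from the thread, and would need to match the existing HBase table:

```sql
-- Hypothetical schema; adjust names and types to the actual HBase table.
-- Double-quoted identifiers are case-sensitive in Phoenix, so they must
-- match the HBase table and column family names byte-for-byte.
CREATE TABLE "my_table" (
  "pk" VARCHAR PRIMARY KEY,
  "cf"."col1" VARCHAR
);
```

One caveat worth hedging: Phoenix uses its own serialization for non-string types and, as far as I recall, expects an empty marker KeyValue in each row, so HFiles written with raw HBase encodings may still not show up correctly for typed columns even after the table is declared.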
On Mon, Jun 8, 2015 at 10:02 AM, Dawid <[email protected]> wrote:
> Any suggestions? Some clues what to check?
>
> On 05.06.2015 23:21, Dawid wrote:
>
> Yes, I can see it in hbase-shell.
>
> Sorry for the bad links; I hadn't used private repositories on GitHub
> before, so I moved the files to a gist:
> https://gist.github.com/dawidwys/3aba8ba618140756da7c
> Hope this time they will work.
>
> On 05.06.2015 23:09, Ravi Kiran wrote:
>
> Hi Dawid,
> Do you see the data when you run a simple scan or count of the table in
> the HBase shell?
>
> FYI, the links lead me to a 404: File not found.
>
> Regards,
> Ravi
>
> On Fri, Jun 5, 2015 at 1:17 PM, Dawid <[email protected]> wrote:
>>
>> Hi,
>> I was trying to write some utilities to bulk load data through HFiles
>> from Spark RDDs, following the pattern of CSVBulkLoadTool. I managed to
>> generate some HFiles and load them into HBase, but I can't see the rows
>> using sqlline. I would be more than grateful for any suggestions.
>>
>> The classes can be accessed at:
>>
>> https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/BulkPhoenixLoader.scala
>>
>> https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/ExtendedProductRDDFunctions.scala
>>
>> Thanks in advance
>>
>> Dawid Wysakowicz
>
> --
> Regards,
> Dawid
