can launch Mappers on the same node as the RegionServer hosting your
Region and avoid reading any data over the network.

This is just an optimization.
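
In case it's useful, here is a minimal driver in the style of the Phoenix
MapReduce docs (the PERSON table, its columns, and the writable class are
placeholders I made up). The input splits, one per region, are computed once
at submission time; as far as I understand, a region that splits mid-job is
still fully covered, because the daughter regions serve the same key range
the split was built from:

    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.db.DBWritable;
    import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;
    import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;

    public class PersonScanJob {

        // Hypothetical row type for a PERSON(ID, NAME) table.
        public static class PersonWritable implements DBWritable {
            long id;
            String name;

            @Override
            public void readFields(ResultSet rs) throws SQLException {
                id = rs.getLong("ID");
                name = rs.getString("NAME");
            }

            @Override
            public void write(PreparedStatement ps) throws SQLException {
                ps.setLong(1, id);
                ps.setString(2, name);
            }
        }

        public static class PersonMapper
                extends Mapper<NullWritable, PersonWritable, NullWritable, NullWritable> {
            @Override
            protected void map(NullWritable key, PersonWritable row, Context ctx) {
                // Each Mapper scans the key range of the region its split was
                // derived from; together the Mappers cover the whole table.
                ctx.getCounter("person", "rows").increment(1);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = Job.getInstance(conf, "phoenix-person-scan");
            job.setJarByClass(PersonScanJob.class);

            // Splits (one per region) are computed here, at submission time.
            PhoenixMapReduceUtil.setInput(job, PersonWritable.class, "PERSON",
                    "SELECT ID, NAME FROM PERSON");

            job.setMapperClass(PersonMapper.class);
            job.setNumReduceTasks(0);
            job.setOutputFormatClass(NullOutputFormat.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
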
On 4/30/19 10:12 AM, Shawn Li wrote:
> Hi,
>
> The number of Maps in a Phoenix MapReduce job is determined by the table's
> region count. My question is: if a region is split by another ingestion
> process while the Phoenix MapReduce job is running, do we miss reading some
> data due to this split? As now we have more regions than Maps, and the Maps
> only
Hey Anil,
Check out the MultiHfileOutputFormat class.
You can see how AbstractBulkLoadTool invokes it inside the `submitJob`
method.
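
If you just want the shape of it: the single-table analogue of that wiring
is HBase's standard incremental-load setup, which MultiHfileOutputFormat
generalizes to several tables at once. A rough sketch (table name and output
path are placeholders, and this uses plain HFileOutputFormat2 rather than
the Phoenix class):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.RegionLocator;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class BulkLoadWiring {
        public static Job configure(Configuration conf) throws Exception {
            Job job = Job.getInstance(HBaseConfiguration.create(conf),
                    "hfile-bulk-load");
            TableName name = TableName.valueOf("PERSON");
            try (Connection conn =
                         ConnectionFactory.createConnection(job.getConfiguration());
                 Table table = conn.getTable(name);
                 RegionLocator locator = conn.getRegionLocator(name)) {
                // Sets the reducer, total-order partitioner, and HFile
                // settings so the generated files line up with the table's
                // current region boundaries.
                HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
            }
            FileOutputFormat.setOutputPath(job, new Path("/tmp/person-hfiles"));
            return job;
        }
    }

After the job finishes, the completebulkload tool moves the HFiles into the
table directly, so nothing goes through the per-row write path at all.
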
On 12/28/17 5:33 AM, Anil wrote:
> Hi Team,
>
> I was looking at PhoenixOutputFormat and PhoenixRecordWriter.java and
> could not see connection autocommit being set to false. Did I miss
> something here?
>
> Is there any way to read from a Phoenix table and create HFiles for bulk
> import instead of committing every record (batch)?
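
For what it's worth, the write path being discussed is essentially the
usual Phoenix JDBC batching pattern below (URL, table, and batch size are
made up). If I remember right, Phoenix connections default to
autoCommit=false, which would be why the writer never sets it explicitly;
each commit() flushes the buffered mutations as one batch:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class BatchedUpsert {
        public static void main(String[] args) throws SQLException {
            // Placeholder JDBC URL; point it at your own quorum.
            try (Connection conn =
                         DriverManager.getConnection("jdbc:phoenix:localhost")) {
                conn.setAutoCommit(false); // buffer mutations client-side
                long pending = 0;
                try (PreparedStatement ps = conn.prepareStatement(
                        "UPSERT INTO PERSON (ID, NAME) VALUES (?, ?)")) {
                    for (long id = 0; id < 10_000; id++) {
                        ps.setLong(1, id);
                        ps.setString(2, "name-" + id);
                        ps.executeUpdate();
                        // Flush a batch every 1000 rows instead of a
                        // round trip per row.
                        if (++pending % 1000 == 0) {
                            conn.commit();
                        }
                    }
                }
                conn.commit(); // flush the tail
            }
        }
    }
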
I have been using https://phoenix.apache.org/pig_integration.html for years
with much success.
Hope this helps,
Steve
On Fri, Mar 24, 2017 at 7:40 AM, Anil wrote:
> Hi,
>
> I have two tables called PERSON and PERSON_DETAIL. I need to populate the
> person Detail info into the Person record.
>
> Does Phoenix MapReduce support multiple Mappers from multiple tables
> through MultipleInputs?
>
> Currently I am populating the consolidated details information into a
> temporary table.
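
For reference, this is the vanilla MultipleInputs pattern the question is
about, shown with plain text inputs (paths and parsing are made up; whether
PhoenixInputFormat can be plugged in the same way is exactly the open
question here):

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class TwoTableJoinJob {

        // Tags each PERSON row with "P", keyed by person id (first CSV field).
        public static class PersonMapper
                extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable k, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] f = line.toString().split(",", 2);
                if (f.length < 2) return; // expects "id,rest" lines
                ctx.write(new Text(f[0]), new Text("P," + f[1]));
            }
        }

        // Tags each PERSON_DETAIL row with "D", keyed by person id.
        public static class PersonDetailMapper
                extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable k, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] f = line.toString().split(",", 2);
                if (f.length < 2) return;
                ctx.write(new Text(f[0]), new Text("D," + f[1]));
            }
        }

        // Sees all tagged records for one person id and merges them.
        public static class JoinReducer extends Reducer<Text, Text, Text, Text> {
            @Override
            protected void reduce(Text id, Iterable<Text> tagged, Context ctx)
                    throws IOException, InterruptedException {
                StringBuilder merged = new StringBuilder();
                for (Text t : tagged) {
                    merged.append(t).append('|');
                }
                ctx.write(id, new Text(merged.toString()));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "person-detail-join");
            job.setJarByClass(TwoTableJoinJob.class);

            // One Mapper class per input; both emit the same key/value types
            // so the single reducer can perform the join.
            MultipleInputs.addInputPath(job, new Path("/data/person"),
                    TextInputFormat.class, PersonMapper.class);
            MultipleInputs.addInputPath(job, new Path("/data/person_detail"),
                    TextInputFormat.class, PersonDetailMapper.class);

            job.setReducerClass(JoinReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileOutputFormat.setOutputPath(job, new Path("/out/person_joined"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
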
Hello,

I have a Phoenix table which has both child and parent records.

Now I have created a Phoenix MapReduce job to populate a few columns of the
parent record into the child record.

Two ways of populating parent columns into the child record are:

1.
   a. Get the parent columns information by a Phoenix query (sketched below)
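
A hedged sketch of option 1a, since that's the interesting part: each Mapper
keeps one Phoenix JDBC connection and does a point query per child record.
All names here are placeholders, and the text input stands in for whatever
the real job reads:

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Reads "childId,parentId" lines, fetches the parent's columns with a
    // point query, and emits the enriched child row.
    public class ParentLookupMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private Connection conn;
        private PreparedStatement parentQuery;

        @Override
        protected void setup(Context ctx) throws IOException {
            try {
                conn = DriverManager.getConnection("jdbc:phoenix:localhost");
                parentQuery = conn.prepareStatement(
                        "SELECT COL1, COL2 FROM MY_TABLE WHERE ID = ?");
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }

        @Override
        protected void map(LongWritable key, Text line, Context ctx)
                throws IOException, InterruptedException {
            String[] f = line.toString().split(",");
            if (f.length < 2) return;
            try {
                parentQuery.setLong(1, Long.parseLong(f[1]));
                try (ResultSet rs = parentQuery.executeQuery()) {
                    if (rs.next()) {
                        ctx.write(NullWritable.get(),
                                new Text(f[0] + "," + rs.getString("COL1")
                                        + "," + rs.getString("COL2")));
                    }
                }
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }

        @Override
        protected void cleanup(Context ctx) throws IOException {
            try {
                if (parentQuery != null) parentQuery.close();
                if (conn != null) conn.close();
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }
    }

One point query per child record is a lot of round trips; if the parent set
is small enough, loading it into an in-memory map in setup() is usually the
better trade.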