Hi List;
I have a table with 9,961,914 rows in it (see the describe of
bigtab_stats_fact_tmp14 below)
I also have a table with 7,785 rows in it (see the describe of
xsegment_dim below)
I'm running the join shown below and it takes 10 hours and
eventually runs out of disk space on a 1.4TB file system.
Also, I'm running PostgreSQL version 8.3 on a CentOS box with two
dual-core CPUs and 32GB of RAM.
On May 16, 2008, at 12:58 AM, kevin kempter wrote:
Sorry, I goofed on the query text. Here's the correct query:
select
f14.xpublisher_dim_id,
f14.xtime_dim_id,
f14.xlocation_dim_id,
f14.xreferrer_dim_id,
f14.xsite_dim_id,
f14.xsystem_cfg_dim_id,
f14.xaffiliate_dim_id,
f14.customer_id,
f14.pf_dts_id,
f14.episode_id,
f14.sessionid,
On Fri, 2008-05-16 at 00:31 -0600, kevin kempter wrote:
I'm running the join shown below and it takes 10 hours and
eventually runs out of disk space on a 1.4TB file system
Well, running in 10 hours doesn't mean there's a software problem, nor
does running out of disk space.
Please crunch
kevin kempter wrote:
I'm expecting 9,961,914 rows returned. Each row in the big table
should have a corresponding key in the smaller table; I want to
basically expand the big table's column list by one, by adding the
appropriate key from the smaller table for each row in the big table.
It's not a cartesian product.
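The intended operation, decorating each fact row with exactly one key from the dimension table, amounts to a one-to-one lookup join. A minimal Python sketch of that semantics (table and column names here are illustrative stand-ins, not the actual schema):

```python
# Sketch of the intended 1:1 enrichment join. Names are hypothetical.
xsegment_dim = [
    {"segment_key": "A", "xsegment_dim_id": 1},
    {"segment_key": "B", "xsegment_dim_id": 2},
]
bigtab = [
    {"customer_id": 10, "segment_key": "A"},
    {"customer_id": 11, "segment_key": "B"},
    {"customer_id": 12, "segment_key": "A"},
]

# Build a lookup from the small table: exactly one dimension id per key.
dim_lookup = {row["segment_key"]: row["xsegment_dim_id"] for row in xsegment_dim}

# Expand each big-table row by exactly one column.
result = [dict(row, xsegment_dim_id=dim_lookup[row["segment_key"]]) for row in bigtab]

assert len(result) == len(bigtab)  # one output row per input row, no fan-out
```

If the lookup key is unique in the dimension table, the output row count equals the fact-table row count, which is what the poster expects.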
Something else is puzzling me with this - you're joining over four fields.
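With a four-field join, the effective key is the 4-tuple of those columns; if any combination of values repeats in the dimension table, the join stops being 1:1 and every matching fact row fans out. A small sketch (field names are hypothetical):

```python
# Hypothetical 4-field composite join key, as in the query under discussion.
dim = [
    {"f1": "x", "f2": "y", "f3": "z", "f4": "w", "id": 1},
    {"f1": "x", "f2": "y", "f3": "z", "f4": "w", "id": 2},  # duplicate key!
]
fact = [{"f1": "x", "f2": "y", "f3": "z", "f4": "w", "v": 42}]

def key(r):
    return (r["f1"], r["f2"], r["f3"], r["f4"])

# Nested-loop equi-join over the composite key.
matches = [(f, d) for f in fact for d in dim if key(f) == key(d)]

assert len(matches) == 2  # one fact row produced two output rows: no longer 1:1
```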
On further investigation it turns out that I/we have a serious data
issue: my small table is full of 'UNKNOWN' tags, so my query cannot
associate the data correctly - thus I will end up with 2+ billion
rows.
Thanks, everyone, for your help.
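The blow-up mechanism is simple arithmetic: if the join key matches k dimension rows instead of one, each fact row fans out into k output rows. A sketch with an illustrative fan-out (the 250 is a made-up number consistent with the observed result, not a figure from the thread):

```python
# Duplicate keys in the dimension table multiply the join output.
fact_rows = 9_961_914      # rows in bigtab_stats_fact_tmp14 (from the post)

# Hypothetical: suppose ~250 'UNKNOWN' dimension rows all match the same
# join key, so every affected fact row fans out 250 ways.
unknown_matches = 250
joined = fact_rows * unknown_matches

assert joined > 2_000_000_000  # consistent with the "2+ billion rows" observed
```

This is why the query chewed through a 1.4TB filesystem: the output was roughly 250x the expected 9.9M rows.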
On May 16, 2008, at 1:38 AM, Simon Riggs