We're having a problem with a query during our evaluation of Postgres as an Oracle replacement. In Postgres, this query takes 20-40 seconds (over multiple runs). Tom Lane recommended I post it here, with an EXPLAIN ANALYZE.
Here's the query:
EXPLAIN ANALYZE SELECT company_name, address_1, address_2, address_3, city,
address_list.state_province_id, state_province_short_desc, country_desc, zip_code, address_list.country_id,
contact_info.email, commercial_entity.user_account_id, phone_num_1, phone_num_fax, website, boats_website
FROM commercial_entity, country, user_account,
address_list LEFT JOIN state_province ON address_list.state_province_id = state_province.state_province_id
LEFT JOIN contact_info ON address_list.contact_info_id = contact_info.contact_info_id
WHERE address_list.address_type_id = 101
AND commercial_entity.commercial_entity_id=225528
AND commercial_entity.commercial_entity_id = address_list.commercial_entity_id
AND address_list.country_id = country.country_id
AND commercial_entity.user_account_id = user_account.user_account_id
AND user_account.user_role_id IN (101, 101);
Here's the EXPLAIN ANALYZE output:
 Nested Loop  (cost=0.00..64570.33 rows=1 width=385) (actual time=42141.08..42152.06 rows=1 loops=1)
   ->  Nested Loop  (cost=0.00..64567.30 rows=1 width=361) (actual time=42140.80..42151.77 rows=1 loops=1)
         ->  Nested Loop  (cost=0.00..64563.97 rows=1 width=349) (actual time=42140.31..42151.27 rows=1 loops=1)
               Join Filter: ("outer".commercial_entity_id = "inner".commercial_entity_id)
               ->  Index Scan using commercial_entity_pkey on commercial_entity  (cost=0.00..5.05 rows=1 width=94) (actual time=0.57..0.58 rows=1 loops=1)
                     Index Cond: (commercial_entity_id = 225528::numeric)
               ->  Materialize  (cost=63343.66..63343.66 rows=97221 width=255) (actual time=41741.96..41901.17 rows=90527 loops=1)
                     ->  Merge Join  (cost=0.00..63343.66 rows=97221 width=255) (actual time=1.44..41387.68 rows=90527 loops=1)
                           Merge Cond: ("outer".contact_info_id = "inner".contact_info_id)
                           ->  Nested Loop  (cost=0.00..830457.52 rows=97221 width=222) (actual time=0.95..39178.32 rows=90527 loops=1)
                                 Join Filter: ("outer".state_province_id = "inner".state_province_id)
                                 ->  Index Scan using addr_list_ci_id_i on address_list  (cost=0.00..586676.65 rows=97221 width=205) (actual time=0.49..2159.90 rows=90527 loops=1)
                                       Filter: (address_type_id = 101::numeric)
                                 ->  Seq Scan on state_province  (cost=0.00..1.67 rows=67 width=17) (actual time=0.00..0.21 rows=67 loops=90527)
                           ->  Index Scan using contact_info_pkey on contact_info  (cost=0.00..3366.76 rows=56435 width=33) (actual time=0.44..395.75 rows=55916 loops=1)
         ->  Index Scan using user_account_pkey on user_account  (cost=0.00..3.32 rows=1 width=12) (actual time=0.46..0.46 rows=1 loops=1)
               Index Cond: ("outer".user_account_id = user_account.user_account_id)
               Filter: (user_role_id = 101::numeric)
   ->  Index Scan using country_pkey on country  (cost=0.00..3.01 rows=1 width=24) (actual time=0.25..0.25 rows=1 loops=1)
         Index Cond: ("outer".country_id = country.country_id)
 Total runtime: 42165.44 msec
(21 rows)
I will post the schema in a separate email - the list has rejected one big email 3 times now.