Hi Alex!
Fortunately, I'm thinking about the same thing! I mean Postgres-DAL 
synchronization during development. First of all, I believe you can find 
the script *extract_pgsql_models.py* in your web2py/scripts directory, but 
it just prints models to stdout; you could improve it, but why bother?
So I wrote some universal VIEWs that grab table and view definitions (only 
schemas from the *search_path* are processed; see the *current_schemas()* 
pg function) and present them in a form the DAL can digest.
Here they are, below. Keep in mind that *column_vw* depends on the others, 
so it must be created last. After that, you can select any or all table 
definitions from *column_vw*: schema, table name, column name, column 
pg_type, column web2py_type (including 'reference table_name'), defaults, 
and nullability. Now you can easily generate DAL models on the fly! It 
costs some performance, but it's always possible to generate a "hard copy" 
for production. 

*model_gen* is a separate schema used to filter the VIEWs-for-DAL's own 
definitions out of the results, so you should create this schema before 
anything else.
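
To give an idea of the end result before the views themselves: the "hard 
copy" generation can be sketched roughly like this in Python. This is a 
minimal sketch of my own, not web2py code; the dict keys match the column 
aliases of *column_vw* below, and fetching the rows (e.g. with psycopg2's 
RealDictCursor) is not shown.

```python
# Minimal sketch: turn rows selected from model_gen.column_vw into
# web2py model source (the "hard copy" for production). rows_to_model()
# is pure, so you can feed it any iterable of dicts shaped like
# column_vw rows.
from collections import OrderedDict

def rows_to_model(rows):
    """Group column_vw rows by table and emit define_table() source."""
    tables = OrderedDict()
    for r in rows:
        tables.setdefault(r['table_name'], []).append(r)
    out = []
    for tname, cols in tables.items():
        out.append("db.define_table('%s'," % tname)
        for c in sorted(cols, key=lambda c: c['col_position']):
            if c['w2p_type'] == 'id':
                continue  # the DAL adds its own id field
            f = "    Field('%s', '%s'" % (c['col_name'], c['w2p_type'])
            if c['w2p_type'] == 'string' and c['col_length']:
                f += ", length=%d" % c['col_length']
            if c['is_null'] == 'NO':
                f += ", notnull=True"
            out.append(f + "),")
        out.append("    migrate=False)")  # tables already exist in Postgres
    return '\n'.join(out)
```

migrate=False because the tables already live in Postgres; drop it if you 
want web2py to manage the schema instead.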


Tables
CREATE OR REPLACE VIEW model_gen.table_vw AS  
 SELECT tables.table_catalog,
    tables.table_schema,
    tables.table_name,
    tables.table_type,
    tables.self_referencing_column_name,
    tables.reference_generation,
    tables.user_defined_type_catalog,
    tables.user_defined_type_schema,
    tables.user_defined_type_name,
    tables.is_insertable_into,
    tables.is_typed,
    tables.commit_action
   FROM information_schema.tables
  WHERE (tables.table_schema::text IN ( SELECT ss.s
           FROM unnest(current_schemas(false)) ss(s)));


Primary keys
CREATE OR REPLACE VIEW model_gen.cons_pku_vw AS 
 SELECT table_constraints.table_schema,
    table_constraints.table_name,
    constraint_column_usage.column_name AS col_name,
    table_constraints.constraint_type AS cons_type,
    table_constraints.constraint_name AS cons_pku_name,
    count(*) OVER (PARTITION BY table_constraints.constraint_name) AS 
col_in_cons,
    key_column_usage.ordinal_position AS cons_position
   FROM information_schema.table_constraints
     JOIN information_schema.constraint_column_usage USING 
(constraint_catalog, constraint_schema, constraint_name, table_catalog, 
table_schema, table_name)
     JOIN information_schema.key_column_usage USING (constraint_catalog, 
constraint_schema, constraint_name, table_catalog, table_schema, 
table_name, column_name)
  WHERE (table_constraints.table_schema::name IN ( SELECT ss.s
           FROM unnest(current_schemas(false)) ss(s))) AND 
table_constraints.table_schema::text <> 'model_gen'::text
  ORDER BY table_constraints.table_schema, table_constraints.table_name, 
table_constraints.constraint_name;

Foreign keys
CREATE OR REPLACE VIEW model_gen.cons_fk_vw AS 
 SELECT key_col_use.constraint_name AS cons_fk_name,
    key_col_use.table_schema,
    key_col_use.table_name,
    key_col_use.column_name AS col_name,
    cons_col_use.table_schema AS ref_schema,
    cons_col_use.table_name AS ref_table,
    cons_col_use.column_name AS ref_col,
    ref_cons.update_rule,
    ref_cons.delete_rule,
    key_col_use.ordinal_position
   FROM information_schema.key_column_usage key_col_use
     JOIN information_schema.referential_constraints ref_cons USING 
(constraint_catalog, constraint_schema, constraint_name)
     JOIN information_schema.table_constraints t_cons USING 
(constraint_catalog, constraint_schema, constraint_name, table_catalog, 
table_schema, table_name)
     JOIN information_schema.constraint_column_usage cons_col_use ON 
cons_col_use.constraint_name::text = ref_cons.unique_constraint_name::text
  WHERE (key_col_use.table_schema::name IN ( SELECT ss.s
           FROM unnest(current_schemas(false)) ss(s))) AND 
key_col_use.table_schema::text <> 'model_gen'::text AND 
t_cons.constraint_type::text = 'FOREIGN KEY'::text
  ORDER BY key_col_use.table_schema, key_col_use.table_name, 
key_col_use.column_name;



All together
CREATE OR REPLACE VIEW model_gen.column_vw AS 
 SELECT columns.table_schema,
    columns.table_name,
    columns.column_name AS col_name,
    columns.data_type AS col_type,
        CASE
            WHEN "left"(columns.column_default::text, 7) = 'nextval'::text 
OR cons_pku_vw.cons_type::text = 'PRIMARY KEY'::text THEN 'id'::character 
varying
            WHEN cons_fk_vw.col_name IS NOT NULL THEN concat('reference ', 
cons_fk_vw.ref_table)::character varying
            WHEN "left"(columns.data_type::text, 9) = 'character'::text 
THEN 'string'::character varying
            WHEN columns.data_type::text = ANY 
(ARRAY['timestamp'::character varying::text, 'timestamp without time 
zone'::character varying::text]) THEN 'datetime'::character varying
            WHEN columns.data_type::text = ANY (ARRAY['time'::character 
varying::text, 'time without time zone'::character varying::text]) THEN 
'time'::character varying
            WHEN columns.data_type::text = ANY (ARRAY['integer'::character 
varying::text, 'smallint'::character varying::text, 'bigint'::character 
varying::text]) THEN 'integer'::character varying
            WHEN columns.data_type::text = ANY (ARRAY['boolean'::character 
varying::text, 'bit'::character varying::text]) THEN 'boolean'::character 
varying
            WHEN columns.data_type::text = ANY (ARRAY['double 
precision'::character varying::text, 'real'::character varying::text]) THEN 
'double'::character varying
            WHEN columns.data_type::text = ANY (ARRAY['numeric'::character 
varying::text, 'currency'::character varying::text]) THEN 
concat('decimal(', columns.numeric_precision, ',', columns.numeric_scale, 
')')::character varying
            ELSE columns.data_type::character varying
        END AS w2p_type,
    columns.character_maximum_length AS col_length,
    columns.column_default AS col_default,
    columns.is_nullable AS is_null,
    columns.ordinal_position AS col_position,
    columns.numeric_precision AS num_precision,
    columns.numeric_scale AS num_scale
   FROM information_schema.columns
     LEFT JOIN model_gen.cons_fk_vw ON cons_fk_vw.table_schema::text = 
columns.table_schema::text AND cons_fk_vw.table_name::text = 
columns.table_name::text AND cons_fk_vw.col_name::text = 
columns.column_name::text
     LEFT JOIN model_gen.cons_pku_vw ON cons_pku_vw.table_schema::text = 
columns.table_schema::text AND cons_pku_vw.table_name::text = 
columns.table_name::text AND cons_pku_vw.col_name::text = 
columns.column_name::text
  WHERE (columns.table_schema::name IN ( SELECT ss.s
           FROM unnest(current_schemas(false)) ss(s))) AND 
columns.table_schema::text <> 'model_gen'::text
  ORDER BY columns.table_schema, columns.table_name;
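
For reference, the big CASE above boils down to a small type map. Here is a 
rough client-side Python equivalent, my own sketch, useful if you'd rather 
do the pg-type-to-web2py-type translation in the application instead of in 
the view:

```python
# Rough client-side equivalent of the CASE in model_gen.column_vw:
# map a Postgres data_type (plus key/FK info) to a web2py field type.
def w2p_type(data_type, column_default=None, is_pk=False,
             ref_table=None, num_precision=None, num_scale=None):
    if (column_default or '').startswith('nextval') or is_pk:
        return 'id'
    if ref_table:
        return 'reference %s' % ref_table
    if data_type.startswith('character'):
        return 'string'
    if data_type in ('timestamp', 'timestamp without time zone'):
        return 'datetime'
    if data_type in ('time', 'time without time zone'):
        return 'time'
    if data_type in ('integer', 'smallint', 'bigint'):
        return 'integer'
    if data_type in ('boolean', 'bit'):
        return 'boolean'
    if data_type in ('double precision', 'real'):
        return 'double'
    if data_type == 'numeric':
        return 'decimal(%s,%s)' % (num_precision, num_scale)
    return data_type  # 'date', 'text', ... pass through unchanged
```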





On Sunday, February 7, 2016 at 12:14:46 AM UTC+3, Alex Glaros wrote:
>
> I'd like to set up a collaborative data modeling environment where 
> multiple governmental licensing agencies jointly participate in designing 
> an open source data model. (They license doctors, dentists, pharmacies, 
> etc.)
>
> The default web2py bundle is almost perfect for this except that SQLite is 
> not multi-user and I'd like users to have the option of something 
> physically functional to try with their in-house Java/PL-SQL programmers, 
> and become familiar with alternatives to proprietary databases.
>
> Because it is an opportunity to showcase web2py, I hope it's okay to ask 
> our group for ideas regarding designing the environment. 
>
> Would it work where users create the data model in Postgres first, and 
> then I write code in web2py controller using their data to express their 
> ideas? This way, they can use Postgres GUI data model designer or just type 
> in the table commands. In this case, I would like to find a way to solve 
> complexities of DAL being out of sync with data definitions. I'm not yet 
> skillful in resolving constraint/out-of-sync issues with Postgres/w2p. 
> (Maybe prototype would have no constraints. Is there any utility that 
> converts Postgres schema to web2py DAL?)
>
> I'm thinking users would create the Postgres database somewhere, maybe on 
> Hostmonster.com, and I would upload a copy to PythonAnywhere each week to 
> write the views and controller functions. Alternatively they could just 
> email their ideas and I post them to the DAL which creates the Postgres 
> tables but then if I got too busy I'd become the bottleneck. If I separated 
> the project out so Postgres was independent of web2py, then users could 
> carry on without me if I fell behind on creating views and functions.
>
> Any thoughts appreciated,
>
> Alex Glaros
>

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.