Hi Adam,

Back in 2014 I had the same driving motives as you, and I ended up writing 
my own generator for Postgres (handling tables, views, and functions, 
for the basic types):

https://github.com/silviucm/pgtogogen
https://www.cmscomputing.com/articles/programming/generate-go-entities-from-postgres-tables-views

It has served me well over the past 3-4 years, particularly because I tend to 
rely heavily on materialized views, which I can refresh via (generated) Go 
code, for complex queries.

If you start your own project, I can tell you it's a large undertaking that 
will probably remain a work in progress over months and years. 
When mapping database objects you will find that some of the code may not 
feel like idiomatic Go, so you will need to use your judgment as to whether 
it's appropriate for you individually or for a larger team, where the lowest 
common denominator is more important than clever generated patterns.

Cheers,
Silviu


On Friday, 2 June 2017 08:55:12 UTC-4, brylant wrote:
>
>
> I've been trying hard (well.. as much as I can considering my lack of 
> in-depth Go knowledge or - to be perfectly honest - lack of in-depth 
> knowledge of anything) to find a suitable Go+SQL technique that would not 
> require a lot of code repetition, not use reflection, and not use ORMs of 
> any sort... Could somebody please tell me if there's anything particularly 
> wrong with the following:
>
>
> type ScannerFunc func() []interface{}
>
> func (db *DB) ScanSome(stmt string, sf ScannerFunc, params ...interface{}) error {
>     rows, err := db.Query(stmt, params...)
>     if err != nil {
>         return err
>     }
>     defer rows.Close()
>     for rows.Next() {
>         err = rows.Scan(sf()...)
>         if err != nil {
>             return err
>         }
>     }
>     if err = rows.Err(); err != nil {
>         return err
>     }
>     return nil
> }
>
> Having the above, I could then implement the following for each of my 
> 'models' (User being an example below). This could easily be 'go 
> generate'-d for each model.
>
>
> type User struct {
>     UserID  int64
>     Name    string
>     Role    int
>     // (...)
> }
>
> func ScanUsersFunc(users *[]*User) ScannerFunc {
>     return ScannerFunc(func() []interface{} {
>         u := User{}
>         *users = append(*users, &u)
>         r := []interface{}{&u.UserID, &u.Name, &u.Role, (more properties)}
>         return r
>     })
> }
>
>
> and finally use it like this: 
>
>
> const (
>     sqlUsersByRole = "SELECT user_id,name,role, (more if needed) FROM user WHERE role=?"
>     sqlAllUsers    = "SELECT user_id,name,role FROM user"
> )
>
> func (db *DB) UsersByRole(role int) ([]*User, error) {
>     users := make([]*User, 0)
>     err := db.ScanSome(sqlUsersByRole, ScanUsersFunc(&users), role)
>     if err != nil {
>         return nil, err
>     }
>     return users, nil
> }
>
> func (db *DB) AllUsers() ([]*User, error) {
>     users := make([]*User, 0)
>     err := db.ScanSome(sqlAllUsers, ScanUsersFunc(&users))
>     if err != nil {
>         return nil, err
>     }
>     return users, nil
> }
>
>
> Alternatively (to avoid scanning/returning all results) a callback could 
> be provided to ScanSome and called after each scan.
>
> Obviously I could also implement ScanOne for situations where I only 
> expect one row of results...
>
>
> So - any obvious issues with the above 'technique'...?
>
>
> Thanks,
>
> adam
>
>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to golang-nuts+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
