Re: [sqlalchemy] Date range query problem

2016-02-29 Thread Mike Bayer



On 02/29/2016 06:01 PM, Nana Okyere wrote:

I have a model and one of its attributes is a column
called last_updated_timestamp. It is a date column set
to datetime.datetime.now(). That's all good.

I'm trying to query for some roles based on a date range. So I do:

results = WorkForceTable.query.filter(
    WorkForceTable.last_updated_timestamp.between(
        form.from_dt.data, form.to_dt.data)).all()

where form.to_dt.data and form.from_dt.data are datetime.date values.

The table currently has rows all with last_updated_timestamp set to
today's date. Yet, when I run the above query and I set both dates to
today's date, I get nothing returned. When I set the 'from date' to
today and 'to date' to tomorrow's date, I get all the results I need.
Why is that? Am I misunderstanding something? Looks like it includes
results for the 'from date' but not results for the 'to date'.

I was under the impression that using 'between' would give me results
inclusive of the endpoints. I even switched and tried >= and <=, but
the result is the same. What could be wrong? Basically, when I pass a
starting date and an ending date to a between construct, why do I not
get the right results? Especially since I have both 'to' and 'from' set
to the same date, and all my rows are date-stamped with today's date too.



Oracle's DATE type includes a time portion.  So if you have a date 
"2016-01-15 15:45:00", it will only be matched by a BETWEEN if you 
specify a timestamp greater than or equal to that as the end of the range. 
That's why setting the end of the range to "2016-01-16 00:00:00", i.e. 
tomorrow's date, works, and sending "2016-01-15 00:00:00" does not.


That's at least what it sounds like is happening, as this is a very 
common issue.  Turn on echo=True on your engine to see the exact values 
being sent.
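A common fix, sketched below with plain datetime objects (the helper name is my own, not from the thread), is to expand the two date endpoints into day boundaries and compare against an exclusive upper bound rather than relying on between():

```python
import datetime

def day_range(from_date, to_date):
    """Turn two dates into an inclusive-start / exclusive-end
    datetime range that covers every timestamp on both days."""
    start = datetime.datetime.combine(from_date, datetime.time.min)
    end_exclusive = datetime.datetime.combine(
        to_date + datetime.timedelta(days=1), datetime.time.min)
    return start, end_exclusive

start, end = day_range(datetime.date(2016, 1, 15), datetime.date(2016, 1, 15))
# a row stamped with a time portion now falls inside the range
print(start <= datetime.datetime(2016, 1, 15, 15, 45) < end)  # True
```

In the query itself you would then write filter(WorkForceTable.last_updated_timestamp >= start, WorkForceTable.last_updated_timestamp < end) in place of the between() call.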









--
You received this message because you are subscribed to the Google
Groups "sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send
an email to sqlalchemy+unsubscr...@googlegroups.com
.
To post to this group, send email to sqlalchemy@googlegroups.com
.
Visit this group at https://groups.google.com/group/sqlalchemy.
For more options, visit https://groups.google.com/d/optout.




[sqlalchemy] Re: Bulk Insert Broken for Polymorphism?

2016-02-29 Thread Alex Hewson
Hi Mike,

Thanks for the quick response.  If that's the intended behaviour I'll go 
back to non-bulk inserts for my inherited types.  Doubtless I could work 
around it by inserting N new Entity rows, fetching their autoincrement IDs 
and then using them to build the Child1 and Child2 objects, but I don't 
trust myself with the added complexity.


Cheers,
Alex.



On Monday, February 29, 2016 at 10:38:22 PM UTC, Alex Hewson wrote:
>
> Hello All,
>
> I'm trying to use the new bulk_save_objects() to improve performance on 
> bulk inserts, and have run into a problem.  If bulk_save_objects() is used 
> to save objects of a polymorphic class..
>
>1. They are created correctly in the DB, with polymorphic type column 
>populated correctly
>2. BUT queries for the new objects will return one of incorrect type.  
>In my case I'm getting instances of Child1 back when I would expect to get 
>a Child2.
>
> The following code demonstrates the problem:
>
> #!/usr/bin/env python3
> # -*- coding: utf-8 -*-
>
> from sqlalchemy import create_engine
> from sqlalchemy import Column, Integer, SmallInteger, String, ForeignKey
> from sqlalchemy.orm import sessionmaker
> from sqlalchemy.ext.declarative import declarative_base
>
> Base = declarative_base()
>
> class Entity(Base):
>   __tablename__ = 'Entity'
>   Id  = Column(Integer, primary_key=True, nullable=False)
>   Content = Column(String)
>   _polytype   = Column(SmallInteger, nullable=False)
>
>   __mapper_args__ = {
> 'polymorphic_identity':1,
> 'polymorphic_on':_polytype
>   }
>
> class Child1(Entity):
>   __tablename__   = 'Child1'
>   MyId= Column(ForeignKey("Entity.Id"), primary_key=True)
>   __mapper_args__ = {'polymorphic_identity':11}
>
> class Child2(Entity):
>   __tablename__   = 'Child2'
>   MyId= Column(ForeignKey("Entity.Id"), primary_key=True)
>   __mapper_args__ = {'polymorphic_identity':12}
>
>
> if __name__ == '__main__':
>   # engine = create_engine('sqlite:///:memory:', echo=False)
>   engine = create_engine('sqlite:///test.db', echo=False)
>   Session = sessionmaker(bind=engine)
>   sess = Session()
>   Base.metadata.create_all(engine)
>   c1_many = [Child1(Content="c1inst_%d"%i) for i in range(0,1000)]
>   c2_many = [Child2(Content="c2inst_%d"%i) for i in range(0,1000)]
>   sess.bulk_save_objects(c1_many)
>   sess.bulk_save_objects(c2_many)
>   # sess.add_all(c1_many)
>   # sess.add_all(c2_many)
>   sess.flush()
>   sess.commit()
>   for c in sess.query(Child1):
> assert isinstance(c, Child1)
>   for c in sess.query(Child2):
> assert isinstance(c, Child2)
>
>
> All the calls to assert isinstance(c, Child1) complete successfully.  But 
> once we start checking for Child2 - boom, we are still getting back Child1 
> instances.
>
> At first I wondered if I was misunderstanding SA's implementation of 
> polymorphism, so tried inserting rows the traditional way with 
> sess.add_all().  But that works fine so I think I've exposed a bug in the 
> new bulk_save_objects() code.
>
> My environment is Python 3.5.1, SQLAlchemy==1.0.12, SQLite 3.8.10.2 on OSX.
>



Re: [sqlalchemy] Bulk Insert Broken for Polymorphism?

2016-02-29 Thread Mike Bayer



On 02/29/2016 05:38 PM, Alex Hewson wrote:

Hello All,

I'm trying to use the new bulk_save_objects() to improve performance on
bulk inserts, and have run into a problem.  If bulk_save_objects() is
used to save objects of a polymorphic class..

 1. They are created correctly in the DB, with polymorphic type column
populated correctly
 2. BUT queries for the new objects will return ones of the incorrect type.
In my case I'm getting instances of Child1 back when I would expect
to get a Child2.



turn on echo=True, and you'll see this:


INSERT INTO "Child1" DEFAULT VALUES
2016-02-29 17:48:11,349 INFO sqlalchemy.engine.base.Engine ((), (), (), 
(), (), (), (), ()  ... displaying 10 of 1000 total bound parameter sets 
...  (), ())


what you will notice here is that the Child1 table is receiving 
entirely empty rows; the primary key values from Entity are nowhere to 
be found.  SQLite does not enforce foreign keys by default, so it is simply 
auto-generating identifiers here, something that wouldn't happen on most 
other databases, where this column wouldn't work as an autoincrement by 
default.   If you run it on PostgreSQL you get:


sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) null value in 
column "MyId" violates not-null constraint

DETAIL:  Failing row contains (null).
 [SQL: 'INSERT INTO "Child1" DEFAULT VALUES'] [parameters: ({}, {}, {}, 
{}, {}, {}, {}, {}  ... displaying 10 of 1000 total bound parameter sets 
...  {}, {})]


What's happening here is documented; however, the bulk documentation is 
a little long, and the part referring to this use case might benefit from 
a little more boldface and probably should be listed more clearly as 
"will not work", instead of being introduced with a somewhat casual 
"however".  At 
http://docs.sqlalchemy.org/en/rel_1_0/orm/persistence_techniques.html#orm-compatibility 
(emphasis added):


Multi-table mappings, such as joined-inheritance - **however**, an 
object to be inserted across multiple tables either **needs to have 
primary key identifiers fully populated ahead of time**, else the 
Session.bulk_save_objects.return_defaults flag must be used, which will 
greatly reduce the performance benefits


what we mean here is this:

c1_many = [Child1(Id=i+1, MyId=i+1, Content="c1inst_%d" % i)
           for i in range(0, 1000)]
c2_many = [Child2(Id=i+1001, MyId=i+1001, Content="c2inst_%d" % i)
           for i in range(0, 1000)]
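As a self-contained sketch (with smaller row counts and an in-memory SQLite database; the models are copied from the post above), the pre-populated-primary-key version then round-trips correctly:

```python
from sqlalchemy import (create_engine, Column, Integer, SmallInteger,
                        String, ForeignKey)
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Entity(Base):
    __tablename__ = 'Entity'
    Id = Column(Integer, primary_key=True, nullable=False)
    Content = Column(String)
    _polytype = Column(SmallInteger, nullable=False)
    __mapper_args__ = {'polymorphic_identity': 1, 'polymorphic_on': _polytype}

class Child1(Entity):
    __tablename__ = 'Child1'
    MyId = Column(ForeignKey("Entity.Id"), primary_key=True)
    __mapper_args__ = {'polymorphic_identity': 11}

class Child2(Entity):
    __tablename__ = 'Child2'
    MyId = Column(ForeignKey("Entity.Id"), primary_key=True)
    __mapper_args__ = {'polymorphic_identity': 12}

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
sess = sessionmaker(bind=engine)()

# primary keys populated ahead of time, as the documentation requires
c1_many = [Child1(Id=i + 1, MyId=i + 1, Content="c1inst_%d" % i)
           for i in range(10)]
c2_many = [Child2(Id=i + 11, MyId=i + 11, Content="c2inst_%d" % i)
           for i in range(10)]
sess.bulk_save_objects(c1_many)
sess.bulk_save_objects(c2_many)
sess.commit()

# both subclasses now come back with the correct types
assert all(isinstance(c, Child1) for c in sess.query(Child1))
assert all(isinstance(c, Child2) for c in sess.query(Child2))
```

The return_defaults flag mentioned in the quoted documentation passage is the alternative when the keys can't be assigned up front, at the cost of most of the speed benefit.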



In SQLAlchemy 1.1, things are much easier to spot, even if you're using 
a non-FK/non-autoincrement enforcing database like SQLite; running this 
program immediately catches the problem on the Python side:


sqlalchemy.exc.CompileError: Column 'Child1.MyId' is marked as a 
member of the primary key for table 'Child1', but has no Python-side or 
server-side default generator indicated, nor does it indicate 
'autoincrement=True' or 'nullable=True', and no explicit value is 
passed.  Primary key columns typically may not store NULL.


This is because 1.1 has changed the logic of the "autoincrement" flag 
and adds deeper checks for NULL primary key values as described at 
http://docs.sqlalchemy.org/en/latest/changelog/migration_11.html#the-autoincrement-directive-is-no-longer-implicitly-enabled-for-a-composite-primary-key-column.



So the mitigations for the fact that your specific test case silently 
fails include:


1. this whole issue passes silently only on SQLite, not on any of the 
higher-volume databases where you'd want to use bulk operations in the 
first place


2. the documentation here should be spruced up to list this practice as 
a **warning**, including a boldface note in the earlier paragraph about 
the fetching of inserted primary keys being disabled (this is the slowest 
part of the INSERT, so it has no place within bulk inserts; hence you must 
populate columns dependent on a PK up front, which means the PK itself 
needs to be populated up front in those cases where you need it)


3. SQLAlchemy 1.1 won't let these INSERTs proceed without a primary key 
value when the column is not configured as an "autoincrement"


Thanks for the clear test case here.








[sqlalchemy] Re: Date range query problem

2016-02-29 Thread Nana Okyere
Update: Using the current version of SQLAlchemy, with Oracle 12c.



[sqlalchemy] Date range query problem

2016-02-29 Thread Nana Okyere
I have a model and one of its attributes is a column 
called last_updated_timestamp. It is a date column set 
to datetime.datetime.now(). That's all good.

I'm trying to query for some roles based on a date range. So I do:

results = WorkForceTable.query.filter(
    WorkForceTable.last_updated_timestamp.between(
        form.from_dt.data, form.to_dt.data)).all()

where form.to_dt.data and form.from_dt.data are datetime.date values.

The table currently has rows all with last_updated_timestamp set to today's 
date. Yet, when I run the above query and I set both dates to today's date, 
I get nothing returned. When I set the 'from date' to today and 'to date' 
to tomorrow's date, I get all the results I need. Why is that? Am I 
misunderstanding something? Looks like it includes results for the 'from 
date' but not results for the 'to date'.

I was under the impression that using 'between' would give me results 
inclusive of the endpoints. I even switched and tried >= and <=, but the 
result is the same. What could be wrong? Basically, when I pass a starting 
date and an ending date to a between construct, why do I not get the right 
results? Especially since I have both 'to' and 'from' set to the same date, 
and all my rows are date-stamped with today's date too.



[sqlalchemy] Bulk Insert Broken for Polymorphism?

2016-02-29 Thread Alex Hewson
Hello All,

I'm trying to use the new bulk_save_objects() to improve performance on 
bulk inserts, and have run into a problem.  If bulk_save_objects() is used 
to save objects of a polymorphic class..

   1. They are created correctly in the DB, with polymorphic type column 
   populated correctly
   2. BUT queries for the new objects will return ones of the incorrect type.  
   In my case I'm getting instances of Child1 back when I would expect to get 
   a Child2.
   
The following code demonstrates the problem:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from sqlalchemy import create_engine
from sqlalchemy import Column, Integer, SmallInteger, String, ForeignKey
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Entity(Base):
  __tablename__ = 'Entity'
  Id  = Column(Integer, primary_key=True, nullable=False)
  Content = Column(String)
  _polytype   = Column(SmallInteger, nullable=False)

  __mapper_args__ = {
'polymorphic_identity':1,
'polymorphic_on':_polytype
  }

class Child1(Entity):
  __tablename__   = 'Child1'
  MyId= Column(ForeignKey("Entity.Id"), primary_key=True)
  __mapper_args__ = {'polymorphic_identity':11}

class Child2(Entity):
  __tablename__   = 'Child2'
  MyId= Column(ForeignKey("Entity.Id"), primary_key=True)
  __mapper_args__ = {'polymorphic_identity':12}


if __name__ == '__main__':
  # engine = create_engine('sqlite:///:memory:', echo=False)
  engine = create_engine('sqlite:///test.db', echo=False)
  Session = sessionmaker(bind=engine)
  sess = Session()
  Base.metadata.create_all(engine)
  c1_many = [Child1(Content="c1inst_%d"%i) for i in range(0,1000)]
  c2_many = [Child2(Content="c2inst_%d"%i) for i in range(0,1000)]
  sess.bulk_save_objects(c1_many)
  sess.bulk_save_objects(c2_many)
  # sess.add_all(c1_many)
  # sess.add_all(c2_many)
  sess.flush()
  sess.commit()
  for c in sess.query(Child1):
assert isinstance(c, Child1)
  for c in sess.query(Child2):
assert isinstance(c, Child2)


All the calls to assert isinstance(c, Child1) complete successfully.  But 
once we start checking for Child2 - boom, we are still getting back Child1 
instances.

At first I wondered if I was misunderstanding SA's implementation of 
polymorphism, so I tried inserting rows the traditional way with 
sess.add_all().  But that works fine, so I think I've exposed a bug in the 
new bulk_save_objects() code.

My environment is Python 3.5.1, SQLAlchemy==1.0.12, SQLite 3.8.10.2 on OSX.



Re: [sqlalchemy] How to test functions that use sqlalchemy ORM?

2016-02-29 Thread Mike Bayer

Sure, see that at http://www.sqlalchemy.org/library.html#buildingtheapp.



On 02/29/2016 04:01 AM, Abhijeet Rastogi wrote:

Hi Mike,

Thanks for the reply. Do you have any slides to share from that talk?

Cheers!

On Monday, February 22, 2016 at 9:54:38 PM UTC+5:30, Mike Bayer wrote:

I created a pretty comprehensive example of two ways to do this as part
of a talk I did at Pycon some years ago.   The example case is
https://bitbucket.org/zzzeek/pycon2014_atmcraft
 and you can see a
contrast of the "mock" approach and the "run tests in a transaction"
approach at:


https://bitbucket.org/zzzeek/pycon2014_atmcraft/src/f50cbe745a197ea7db83569283b703c418481222/atmcraft/tests/test_views_mockdb.py?at=master=file-view-default





https://bitbucket.org/zzzeek/pycon2014_atmcraft/src/f50cbe745a197ea7db83569283b703c418481222/atmcraft/tests/test_views_transactional.py?at=master=file-view-default




The transactional version is based on the techniques at

http://docs.sqlalchemy.org/en/rel_1_0/orm/session_transaction.html#joining-a-session-into-an-external-transaction-such-as-for-test-suites

.



There are pluses and minuses to both approaches.   Real world I think
both are usually used at the same time, which has been the case for me.
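The "run tests in a transaction" technique the linked docs describe can be sketched roughly like this (an illustrative fixture, not the exact code from the talk; StaticPool gives all connections the same in-memory SQLite database so the final check sees the rolled-back state):

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.pool import StaticPool

Base = declarative_base()

class Thing(Base):
    __tablename__ = 'thing'
    id = Column(Integer, primary_key=True)
    name = Column(String)

# one shared in-memory database across all connections
engine = create_engine('sqlite://', poolclass=StaticPool)
Base.metadata.create_all(engine)

# begin an external transaction and join a Session into it
connection = engine.connect()
trans = connection.begin()
session = sessionmaker(bind=connection)()

session.add(Thing(name="temporary"))
session.flush()
assert session.query(Thing).count() == 1  # visible inside the transaction

session.close()
trans.rollback()      # everything the "test" did is discarded
connection.close()

# a fresh session sees a clean database again
clean = sessionmaker(bind=engine)()
assert clean.query(Thing).count() == 0
```

Because the test session never commits the outer transaction, each test starts from the same database state, which is the main advantage over mock-based tests.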





On 02/22/2016 09:46 AM, Abhijeet Rastogi wrote:
 > Hi Folks,
 >
 > I've got functions that perform a subset of tasks on a SQLAlchemy
 > object on behalf of a bigger function. For example,
 >
 > def delete_certificate_store(CS):
 >     CS.is_archived = True
 >     # Also delete the associated CARequest object
 >     for DR in CS.CA_request:
 >         db.session.delete(DR)
 >     # Delete the Certificate object only if this is the only order
 >     # for that Certificate
 >     if (len(CS.order.certificate.get_all_unarchived_orders()) == 1 and
 >             CS in CS.order.certificate.get_all_unarchived_certificate_stores()):
 >         CS.order.certificate.is_archived = True
 >
 > Now, this function modifies a database. How do we test functions
like
 > these? If I use mocks for DB related stuff, I'm not really
testing it.
 >
 > As a human, I test these functions by executing them and
verifying if
 > the intended changes happened in the DB. How's that done by testing
 > frameworks like pytest? Sorry, if this is a stupid question, I'm
new to
 > testing.
 >
 > Thanks

--
You received this message because you are subscribed to the Google
Groups "sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send
an email to sqlalchemy+unsubscr...@googlegroups.com
.
To post to this group, send email to sqlalchemy@googlegroups.com
.
Visit this group at https://groups.google.com/group/sqlalchemy.
For more options, visit https://groups.google.com/d/optout.




[sqlalchemy] @property mapping via hybrid property?

2016-02-29 Thread TomS.

Hi!

I've got an answer on Stack Overflow ( 
http://stackoverflow.com/questions/35653889/sqlalchemy-property-mapping/35654405 
); the solution uses a relationship and an association proxy. But I also 
found out that it can be achieved via a hybrid property. How do I do that?


I have the following models:

class Details(db.Model):

    details_id = db.Column(db.Integer, primary_key=True)
    details_main = db.Column(db.String(50))
    details_desc = db.Column(db.String(50))

class Data(db.Model):

    data_id = db.Column(db.Integer, primary_key=True)
    data_date = db.Column(db.Date)
    details_main = db.Column(db.String(50))

    @property
    def details_desc(self):
        result = object_session(self).scalar(
            select([Details.details_desc]).
            where(Details.details_main == self.details_main)
        )
        return result


Now, I would like to run a query using a filter that depends on the defined 
property. I get empty results (of course the proper data is in the DB). It 
doesn't work because, presumably, I have to map this property. The 
question is how to do that? (One limitation: FKs are not allowed in this 
DB's design.)


Data.query\
    .filter(Data.details_desc == unicode('test'))\
    .all()
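For reference, a hybrid-property version might look roughly like the sketch below. It is written self-contained with plain declarative models and the newer select() calling style rather than Flask-SQLAlchemy; the key piece is the class-level expression, a correlated scalar subquery, which is what lets the property work inside filter() without a foreign key:

```python
from sqlalchemy import create_engine, Column, Integer, String, select
from sqlalchemy.orm import declarative_base, sessionmaker, object_session
from sqlalchemy.ext.hybrid import hybrid_property

Base = declarative_base()

class Details(Base):
    __tablename__ = 'details'
    details_id = Column(Integer, primary_key=True)
    details_main = Column(String(50))
    details_desc = Column(String(50))

class Data(Base):
    __tablename__ = 'data'
    data_id = Column(Integer, primary_key=True)
    details_main = Column(String(50))

    @hybrid_property
    def details_desc(self):
        # instance level: look the description up through the session
        return object_session(self).scalar(
            select(Details.details_desc)
            .where(Details.details_main == self.details_main))

    @details_desc.expression
    def details_desc(cls):
        # class level: correlated scalar subquery, usable inside filter()
        return (select(Details.details_desc)
                .where(Details.details_main == cls.details_main)
                .scalar_subquery())

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Details(details_main="m1", details_desc="test"),
                 Data(details_main="m1")])
session.commit()

rows = session.query(Data).filter(Data.details_desc == "test").all()
assert len(rows) == 1 and rows[0].details_desc == "test"
```

The filter() call renders as a WHERE clause comparing the correlated subquery against the value, so no mapping of the property itself is needed.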



Cheers,
TomS

