The easiest approach to this is to run each test within a transaction  
(begun during setUp()), and issue a rollback() during the tearDown()  
process.  Our own tests don't actually do this, since we test hundreds  
of different table configurations, but I've used it with success in  
other applications that work with a fixed schema.
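For reference, a minimal sketch of that pattern as a unittest.TestCase, assuming a SQLAlchemy engine and a made-up "users" table (the schema and fixture data here are purely illustrative):

```python
import unittest
from sqlalchemy import create_engine, Table, Column, Integer, String, MetaData

# illustrative fixture schema -- table and column names are made up
engine = create_engine("sqlite://")  # in-memory SQLite
metadata = MetaData()
users = Table(
    "users", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)
metadata.create_all(engine)


class TransactionalTest(unittest.TestCase):
    def setUp(self):
        # begin an enclosing transaction on a dedicated connection
        self.connection = engine.connect()
        self.trans = self.connection.begin()
        # insert fixture data *inside* the transaction
        self.connection.execute(users.insert(), [{"name": "fixture"}])

    def tearDown(self):
        # roll back everything the test (and setUp) did; the database
        # returns to its pristine state without any DELETEs
        self.trans.rollback()
        self.connection.close()

    def test_sees_fixture(self):
        rows = self.connection.execute(users.select()).fetchall()
        self.assertEqual(len(rows), 1)
```

Since the rollback undoes the fixture INSERTs as well, there's no per-test table emptying at all.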


On Jan 18, 2009, at 2:00 PM, Adam Dziendziel wrote:

>
> Hello,
>
> I'm working on a larger project which is using SQLAlchemy. We started
> with writing automated tests using an in-memory SQLite database and
> inserting test data in setUp() then emptying tables in tearDown().
> Even though the in-memory SQLite is pretty fast, the process of
> inserting test data (sometimes a hundred records) takes a significant
> amount of time. Eventually, a test which should take 30ms takes
> 300ms.
>
> One solution is probably to create a DAO layer with find_by..()
> methods on top of the SQLAlchemy layer and then write a DAO mock which
> stores records in an array. Another would be to create a fake
> session which stores records in an array. The latter would probably be
> better, because we are not adding another unnecessary layer, but it is
> harder to implement because of the complex query API.
>
> How do you cope with this situation?
>
> Best regards,
> Adam
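As a side note on the "DAO mock" idea in the quoted message: a sketch of what that in-memory layer might look like (all names here -- User, InMemoryUserDAO, find_by_name -- are hypothetical, just to show the shape of it):

```python
# Hypothetical sketch of the DAO-mock approach described above: a mock
# with the same find_by..-style interface as a real DAO, but storing
# records in a plain list instead of hitting the database.

class User:
    def __init__(self, name):
        self.name = name


class InMemoryUserDAO:
    """Mock DAO: keeps records in a list, so tests run with no I/O."""

    def __init__(self):
        self._rows = []

    def save(self, user):
        self._rows.append(user)

    def find_by_name(self, name):
        return [u for u in self._rows if u.name == name]
```

The trade-off is as the poster says: this stays fast, but any query logic beyond simple lookups has to be reimplemented in the mock.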


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"sqlalchemy" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/sqlalchemy?hl=en
-~----------~----~----~----~------~----~------~--~---