Hi all,

A while ago, we discussed avoiding LINQ for querying SQL Server.
The preferred methods were either direct SQL calls or stored procs
to perform data manipulation.


This was because the overhead of starting up Entity Framework is
significant and the underlying queries produced by LINQ can be quite
convoluted and inefficient. Lazy loading is also something to be avoided
(at the very least you should be using eager loading, which forces you
to be explicit about what related data is being included/loaded). As an
aside, I’ve also seen a massive performance drop when using mappers such
as AutoMapper to convert database objects in EF to POCO objects.


Add to this that putting some business logic in stored procs is about the
most efficient way to perform data manipulation in a SQL Server database.
It is unbelievably fast and efficient compared to passing all the data over
the wire to your middle tier, performing the updates there, and then passing
it all back to commit the data to the database.
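To illustrate what I mean, here is a minimal sketch of a set-based update
done inside a stored proc (the table and column names are hypothetical).
One statement touches every affected row server-side, where the middle-tier
equivalent would fetch each row, modify it in memory, and send it back:

```sql
-- Hypothetical example: apply a price adjustment to one category.
-- A single set-based UPDATE, executed where the data lives, instead of
-- round-tripping every row through the application tier.
CREATE PROCEDURE dbo.ApplyPriceIncrease
    @CategoryId INT,
    @Factor     DECIMAL(5, 2)
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE dbo.Products
    SET    UnitPrice = UnitPrice * @Factor
    WHERE  CategoryId = @CategoryId;
END;
```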


In fact, I would argue that the very fact that current “best practice” is
to inefficiently pass all your data to the middle tier to be modified, only
to return it to the database for the update, is a failure of modern
development. Of course, there is not really an alternative if your intent
is to perform proper unit testing. It is a very sad thing that modern
enterprise development has not worked out how to utilise the full power of
SQL Server, other than to say "only use stored procs in special cases."


So the question I have is: if it were decided to put business logic in
stored procedures (and some of you have, I know, even though a few of you
with the purist hat on would deny it!), how do people currently unit test
their stored procs?
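For context, the only dedicated framework I'm aware of is tSQLt, which runs
each test as a stored proc inside a transaction and can swap real tables for
empty fakes. A rough sketch of what a test looks like (the proc under test,
dbo.ApplyDiscount, and the dbo.Orders table are hypothetical names I've made
up for illustration):

```sql
-- Sketch of a tSQLt test; assumes tSQLt is installed in the database.
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test ApplyDiscount halves the total]
AS
BEGIN
    -- Arrange: replace the real table with an empty, constraint-free fake
    EXEC tSQLt.FakeTable 'dbo.Orders';
    INSERT INTO dbo.Orders (OrderId, Total) VALUES (1, 100.00);

    -- Act: run the proc under test
    EXEC dbo.ApplyDiscount @OrderId = 1, @Percent = 50;

    -- Assert
    DECLARE @Total DECIMAL(10, 2);
    SELECT @Total = Total FROM dbo.Orders WHERE OrderId = 1;
    EXEC tSQLt.AssertEquals @Expected = 50.00, @Actual = @Total;
END;
GO
-- Run everything in the class:
EXEC tSQLt.Run 'OrderTests';
```

Does anyone here actually use it (or anything like it) in anger, or do you
rely on integration tests against a real database?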


Kind regards,

Tony
