Over the last year or so, I have been developing a line-of-business (LOB) application. The application itself is quite dull to discuss, because unless you know the specific business problems it solves you would probably fall asleep reading about it – so I shan’t compose a disquisition here.
A core requirement of a typical LOB application is that it is multi-user: at any given moment, many people may be adding, accessing or modifying data. This presents a number of challenges, for example:
- You need to be able to control who has access to certain modules of the application, and who can add, modify or delete data.
- You need to ensure that if the same record is modified by two users at the same time, the conflict is detected and handled.
- You need to be able to make certain modules of the application available via the internet or a web service.
- You need to shape your data, for example by joining related tables or computing totals for display.
Linq to SQL/Linq to Entities
This is an inexhaustive list, but it is sufficient to make the objective of this post clear. When I started the aforementioned application, I chose Linq to SQL for my data access, as Linq made it very easy, and very fast, to develop the application.
I did, however, start to run into limitations. The DataContext object – central to Linq to SQL and the new Entity Framework – needed to stay in the same form for all your Create, Read, Update and Delete (CRUD) operations to work correctly. This resulted in all my business logic, presentation logic and data access logic living in the same file. Whilst this worked, the codebase quickly became unmaintainable and very ugly. It was possible to separate the code, but only through workarounds, and the object count in the application inflated.
The chief problem with Linq to SQL and the Entity Framework is that you must remain connected to the database: once the connection is broken, change tracking for your application is lost. After a few months of trying various workarounds, I decided to go back to the tried and tested ADO.NET datasets. I have a tutorial here which shows you how to create a WCF service using datasets.

The biggest strength of datasets is that they are designed for a disconnected environment. When you request data from the database, the relevant tables and columns are copied locally into the dataset, where your users can modify them, and the changes are persisted back to the underlying database at a later stage. This also improves database performance: the dataset fetches the data and releases the connection immediately, unlike the always-connected Linq to SQL. In multi-user applications, this is a serious concern.
Solution – SQL and Stored Procedures
- Stored procedures allow the database administrator to grant users permission to execute a procedure that reads, writes or updates data, without granting direct access to the underlying tables. This provides an extra layer of security for your data and solves the first problem above.
- Stored procedures make it easy to handle concurrency violations.
- Once your stored procedures are defined, you can point an ASP.NET site, a Silverlight client or a web service at them, avoiding rewriting the same logic for each.
- You can provide joined and computed columns with aggregates. You typically need to display related tables in the same datagrid, or provide computed columns with totals, for example. Stored procedures make this easy; otherwise you would have to handle datagrid events and perform the calculations on the client, which bloats the code and hurts performance.
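To make the security and concurrency points concrete, here is a minimal T-SQL sketch of an update procedure. The table, column and role names (Customers, RowVer, DataEntryRole) are hypothetical; it assumes a rowversion column is used for optimistic concurrency.

```sql
-- Hypothetical update procedure with an optimistic concurrency check.
-- Assumes a Customers table with a rowversion column named RowVer.
CREATE PROCEDURE dbo.UpdateCustomerName
    @CustomerID     int,
    @Name           nvarchar(100),
    @OriginalRowVer varbinary(8)   -- the row version the client originally read
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE dbo.Customers
    SET    Name = @Name
    WHERE  CustomerID = @CustomerID
      AND  RowVer = @OriginalRowVer;  -- matches zero rows if another user changed the record

    IF @@ROWCOUNT = 0
        RAISERROR('Concurrency violation: the record was modified by another user.', 16, 1);
END
GO

-- Users receive execute permission on the procedure only,
-- never direct write access to the table itself.
GRANT EXECUTE ON dbo.UpdateCustomerName TO DataEntryRole;
```

Because clients only ever hold execute permission, the administrator controls exactly which operations each role can perform, and the concurrency check lives in one place instead of being repeated in every client.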
At the heart of datasets is SQL, and you have the option either to embed the SQL statements in the dataset, or to create stored procedures, where the statements are saved on SQL Server itself and called from your dataset.
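The joined and computed columns mentioned above can be sketched the same way. The Orders and OrderItems tables here are hypothetical:

```sql
-- Hypothetical reporting procedure: joins related tables and
-- computes aggregates so a client datagrid can bind to the result directly.
CREATE PROCEDURE dbo.GetOrderTotals
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderID,
           o.CustomerName,
           COUNT(i.OrderItemID)          AS ItemCount,   -- computed on the server
           SUM(i.Quantity * i.UnitPrice) AS OrderTotal   -- no client-side event handling needed
    FROM   dbo.Orders      o
    JOIN   dbo.OrderItems  i ON i.OrderID = o.OrderID
    GROUP BY o.OrderID, o.CustomerName;
END
```

A dataset's table adapter can then use this procedure as its select command, and the grid binds to the joined, pre-totalled result with no calculations on the client.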
I must admit that at first I found stored procedures extremely daunting and avoided them as best I could, as they seemed overly complex. I was then forced into learning them because my application architecture necessitated them, and have since found – pleasantly, I might add – that they are not as difficult to grasp as I thought. SQL is a declarative, set-based language: not “pure” in the sense of a functional language like Haskell, but composed of functions that you combine in your procedures. It is this functional syntax that most C# or Visual Basic.NET developers loathe, as it is completely different from a general-purpose programming language.
In truth, when you use Linq to SQL or Linq to Entities, your queries are converted to SQL anyway, so in my book knowing SQL well makes you a better developer, because:
- You understand what the C#/VB abstraction layer is generating, why and how.
- You understand functional programming better, compared to the functional constructs added to C# or VB, as SQL is declarative and built from functions.
I was in a second-hand bookstore (I like collecting books, new and old) and picked up a ten-year-old book on Transact-SQL. It was one of those “Teach Yourself Transact-SQL in 21 Days” type books I usually detest, but I am really pleased with it. The book is wonderfully written, but what is most striking is that most SQL has stayed the same over the last decade or so – if not since the IBM SEQUEL days in the mid-seventies.
Over the last decade Microsoft have introduced many flavours of ADO.NET, and most have had their drawbacks. The reason I like SQL so much is that it is not going away any time soon: knowing it well positions you advantageously over the flavour-of-the-month data access solutions, and lets you use tried and tested technologies and methodologies, which are essential in multi-user distributed applications.
Domain Driven Design and MVC
Another discovery I have made with using stored procedures is that you inadvertently design the application domain first, then implement the logic in your application. This runs somewhat counter to the style of programming your typical developer prefers, but the advantages of this type of design far outweigh the drawbacks. If you look at the MVC Framework Microsoft is pushing at present, you soon discover that stored procedures are your Model (in Model-View-Controller), so all you have left to implement are your Views and Controllers – again a default advantage of separating your data access logic from your application.
Application Changes and Updates
A key aspect of enterprise applications is reporting. Reporting is a moving target: companies generally require frequent changes to their reports to obtain the latest data and keep their competitive edge. With your data access logic tied to Linq to SQL/Entities, you would need to recompile the application and then force every terminal in the enterprise to reinstall it before the latest reports were available. Using stored procedures, you can simply create a PowerShell script and run it on the server; the new reports are then available to every terminal without a reinstall, which matters when reinstalling across hundreds of terminals is very expensive.
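The deployment step above amounts to running a script like the following against the server (from PowerShell, sqlcmd or Management Studio). The procedure and table names are hypothetical; the drop-and-recreate pattern replaces the report in place:

```sql
-- Replace an existing report procedure on the server; every terminal
-- picks up the new version on its next call, with no reinstall.
IF OBJECT_ID('dbo.GetMonthlySalesReport', 'P') IS NOT NULL
    DROP PROCEDURE dbo.GetMonthlySalesReport;
GO

CREATE PROCEDURE dbo.GetMonthlySalesReport
    @Year  int,
    @Month int
AS
BEGIN
    SET NOCOUNT ON;

    -- updated report logic, assuming a hypothetical Orders table
    SELECT o.OrderID,
           o.OrderDate,
           o.Total
    FROM   dbo.Orders o
    WHERE  YEAR(o.OrderDate)  = @Year
      AND  MONTH(o.OrderDate) = @Month;
END
GO
```

Because the clients only call the procedure by name, the change is invisible to them until they next request the report – the contract stays the same while the logic behind it evolves.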
I hope the above has provided a basic explanation of why you need to be judicious about which data access technology you choose, and why the newer technologies are not always the best fit in some situations.
2 thoughts on “Choosing the correct database access technology for your application”
Look at this http://6opuc.wordpress.com/2008/12/29/data-access-in-csharp/
“After a few months of trying various workarounds, I decided to go back to the tried and tested ADO.NET datasets”
Thanks for sharing this, it could save others heaps of time