SQL Compact and The Entity Framework


Linq to SQL Vs Entity Framework

Linq to SQL is not supported with SQL Compact, although it is possible to make it work by generating the mapping with SQLMetal.exe; I have a post here showing how this can be done. The Entity Framework, however, is fully supported on SQL Server Compact Edition.

Personally speaking, if I were to create a large application that is data-access heavy (especially if it is multi-user), I would still opt for ADO.NET datasets and stored procedures. There are, however, quite a few applications that don’t require the fuss of creating and maintaining stored procedures, and in those cases I definitely prefer using an Object Relational Mapping (ORM) tool.

Linq to SQL Development Terminated

Microsoft announced that they would stop any further development of Linq to SQL. This (as you might guess) was not well received in the developer community – just read the comments on the announcement – because this little ORM tool gained phenomenal popularity very quickly, mostly because it was lightweight yet still very powerful.

The Entity Framework has not been received as favourably, with many developers dissatisfied by the complexity of its object model. For me, however, the Entity Framework is just an ORM tool like Linq to SQL, and contains pretty much all the same functionality, so I see no point in not using it, especially since resources are continuing to be poured into it, and the Linq queries you wrote in Linq to SQL are essentially the same ones you write in Linq to Entities.
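To illustrate that last point, here is a rough sketch of the same query written first against a Linq to SQL DataContext and then against an Entity Framework ObjectContext. The context class names are illustrative: NorthwindDataContext is whatever SQLMetal or the designer generates for you, and NorthwindEntities is the model built in the demo below.

// Linq to SQL – assumes a SQLMetal/designer generated NorthwindDataContext
using (var db = new NorthwindDataContext())
{
    var londonCustomers = from c in db.Customers
                          where c.City == "London"
                          select c;
}

// Linq to Entities – assumes the NorthwindEntities model created in the demo below
using (var context = new NorthwindEntities())
{
    var londonCustomers = from c in context.Customers
                          where c.City == "London"
                          select c;
}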

Master Details Demo

For this demo (note: the demo is in both C# and Visual Basic), I am going to create a simple master/details form that retrieves data from the Northwind database, make some changes and save those changes. This is very typical of any type of application that would use a compact database.

Create a new .NET 3.5 Windows Forms project and call it NorthwindTraders.

Add a new item to the project

[Screenshot: AddNew]

Choose the ADO.NET Entity Data Model, name it NorthwindModel and click Add.

[Screenshot: NorthwindModel]

You should now see the Entity Data Model Wizard; choose “Generate from database” and click “Next”.

[Screenshot: Wizard]

Here you can choose the database type, which can be SQL Server 2008, SQL Server Express or SQL Compact. Since this demo is about SQL Compact, change the data source by clicking the “Change” button.

[Screenshot: DataBaseType]

And choose SQL Compact and click OK

[Screenshot: SQLCompact]

When you click “Browse” to locate a database, the dialog should open automatically at the sample Northwind.sdf located at C:\Program Files\Microsoft SQL Server Compact Edition\v3.5\Samples.

[Screenshot: BrowseForDB]

I generally never run as administrator when developing (on Vista/Windows 7), so you will need to copy this sample database to a folder you can access without elevation. I have copied the database to a folder on my C drive; select the database there and click OK.

You should now see the generated connection string; name the entity connection settings NorthwindEntities and click “Next”.

[Screenshot: ConnectionString]
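As an aside, the wizard stores this as an Entity Framework connection string (named NorthwindEntities) in app.config. If you ever need to build it in code, the sketch below shows the rough equivalent using EntityConnectionStringBuilder; the metadata resource names and database path are illustrative and will depend on your own model and file location.

using System.Data.EntityClient;

// Rough equivalent of the wizard-generated "NorthwindEntities" connection string.
var builder = new EntityConnectionStringBuilder
{
    Provider = "System.Data.SqlServerCe.3.5",
    ProviderConnectionString = @"Data Source=C:\Data\Northwind.sdf",   // illustrative path
    Metadata = "res://*/NorthwindModel.csdl|res://*/NorthwindModel.ssdl|res://*/NorthwindModel.msl"
};

// The generated context also has a constructor that accepts a connection string.
var context = new NorthwindEntities(builder.ToString());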

Choose the “Customers” and “Orders” tables and click “Finish”

[Screenshot: Model]

If you double-click NorthwindModel.edmx in Solution Explorer you should see the entity model in the designer.

[Screenshot: ModelInDesigner]

Note: the designer has a slight bug in that if the scroll bar is right at the top (see the red arrow above) you cannot see the mappings; to correct this, just move the scroll bar down a little.

From the Data menu, add a data source

[Screenshot: AddDataSource]

Select “Object”, click “Next” and then select the Customers entity.

[Screenshot: Customers]

Click “Finish” and you should now see the data source showing in the Data Sources window.

Add a SplitContainer to the form and set its orientation to “Horizontal”. Drag a DataGridView from the Customers node into the first split panel, and do the same with the related Orders node into the second panel.

[Screenshot: MasterDetails]

You should now have the two DataGridViews, the two binding sources and a binding navigator. Set the save button in the navigator to “Enabled”. Double-click the form to create a Form_Load event handler, and double-click the save button to create its Click event handler.

[Screenshot: Enabled]

In the code-behind of the form (the Visual Basic version follows the C#):

C#

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace NorthwindTraders
{
    public partial class Form1 : Form
    {
        NorthwindEntities context;

        public Form1()
        {
            InitializeComponent();
        }

        /// <summary>
        /// When the form loads we instantiate the NorthwindEntities context and perform
        /// a simple Linq query that returns all the customers in London
        /// </summary>
        /// <param name="sender"></param>
        /// <param name="e"></param>
        private void Form1_Load(object sender, EventArgs e)
        {
            context = new NorthwindEntities();

            var customers = from c in context.Customers.Include("Orders")
                            where c.City == "London"
                            select c;

            this.customersBindingSource.DataSource = customers;
        }

        /// <summary>
        /// Save the changes made in both DataGridViews
        /// </summary>
        /// <param name="sender"></param>
        /// <param name="e"></param>
        private void customersBindingNavigatorSaveItem_Click(object sender, EventArgs e)
        {
            this.context.SaveChanges();
        }
    }
}

Visual Basic

Public Class Form1

    Private context As NorthwindEntities

    ''' <summary>
    ''' When the form loads we instantiate the NorthwindEntities context and perform
    ''' a simple Linq query that returns all the customers in London
    ''' </summary>
    ''' <param name="sender"></param>
    ''' <param name="e"></param>
    Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
        context = New NorthwindEntities()

        Dim customers = From c In context.Customers.Include("Orders") _
                        Where c.City = "London" _
                        Select c

        Me.CustomersBindingSource.DataSource = customers
    End Sub

    ''' <summary>
    ''' Save the changes made in both DataGridViews
    ''' </summary>
    ''' <param name="sender"></param>
    ''' <param name="e"></param>
    Private Sub CustomersBindingNavigatorSaveItem_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CustomersBindingNavigatorSaveItem.Click
        Me.context.SaveChanges()
    End Sub

End Class

As you can see, it really is quite easy to use the Entity Framework, and pretty much the same functionality you had in Linq to SQL is available here.

Choosing the correct database access technology for your application


Over the last year or so, I have been developing a line-of-business (LOB) application. The application itself is quite dull to discuss, because unless you know the business problems it specifically tries to solve you will probably fall asleep reading about it – so I shan’t compose a disquisition here.

Problem

A core requirement of a typical LOB application is that it is multi-user: at any given moment you usually have many people adding, accessing or modifying data. This presents a number of challenges, e.g.:

  1. You need to be able to control who has access to certain modules of the application, and who can modify, delete or add data.
  2. You need to ensure that if the same record is modified by two users at the same time, the concurrency violation is handled properly.
  3. You need to be able to make certain modules of the application available via the Internet or a web service.
  4. You need to be able to shape your data, e.g. with joins, aggregates and computed columns.

Linq to SQL/Linq to Entities

This list is not exhaustive, but it is sufficient to make the objective of this post clear. When I started developing the aforementioned application, I chose Linq to SQL for my data access, as Linq made development very easy and very fast.

I did, however, start to run into some limitations, e.g. the DataContext object – central to Linq to SQL, with an equivalent ObjectContext in the new Entity Framework – needed to live in the same form for all your Create, Read, Update and Delete (CRUD) operations to work correctly. This resulted in all my business logic, presentation logic and data access logic being in the same file. Whilst this worked, the application codebase quickly became unmaintainable and very ugly. It was possible to separate this code, but it involved workarounds, and the object count in the application inflated.
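To make the limitation concrete, here is a rough sketch (with a hypothetical NorthwindDataContext and Customer entity) of what happens when an entity outlives its DataContext: the second context knows nothing about the change until you explicitly re-attach the object, which in turn requires a timestamp column or the original values for concurrency checking.

Customer customer;

// Query in one context...
using (var db = new NorthwindDataContext())
{
    customer = db.Customers.First(c => c.CustomerID == "ALFKI");
}   // ...and change tracking for 'customer' is lost as soon as the context is disposed.

customer.ContactName = "Changed while disconnected";

// A new context will not persist the change unless the entity is re-attached.
using (var db = new NorthwindDataContext())
{
    // Attach as modified – in practice this requires a rowversion/timestamp column
    // (or suitable UpdateCheck settings) on the Customers table.
    db.Customers.Attach(customer, true);
    db.SubmitChanges();
}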

DataSets

The chief problem with Linq to SQL and the Entity Framework is that you must always remain connected to the database; once the connection is broken, change tracking for your application is lost. After a few months of trying various workarounds, I decided to go back to the tried and tested ADO.NET datasets. I have a tutorial here which shows how to create a WCF service using datasets. The biggest strength of datasets is that they are designed for a disconnected environment: when you request data from a database, the tables and columns are copied locally into the dataset, your users modify that local copy, and the changes are persisted back to the underlying database at a later stage. This also improves database performance, as the dataset just gets the data and closes the connection to the database immediately, unlike the always-connected Linq to SQL. In multi-user applications this is a serious concern.
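A minimal sketch of that fill/modify/persist cycle is shown below, using an untyped DataSet against a SQL Server Northwind database; in a real application you would typically use the typed datasets and TableAdapters generated by the designer. The connection string is illustrative only.

using System.Data;
using System.Data.SqlClient;

string connectionString = @"Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=True";

var adapter = new SqlDataAdapter("SELECT CustomerID, CompanyName, City FROM Customers", connectionString);
var commandBuilder = new SqlCommandBuilder(adapter);   // generates the INSERT/UPDATE/DELETE commands

var dataSet = new DataSet();
adapter.Fill(dataSet, "Customers");    // opens the connection, copies the data locally, closes the connection

// Work with the local copy while disconnected.
DataRow firstCustomer = dataSet.Tables["Customers"].Rows[0];
firstCustomer["City"] = "London";

// Reconnect briefly and persist only the changed rows.
adapter.Update(dataSet, "Customers");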

Solution – SQL and Stored Procedures

  1. Stored procedures allow the database administrator to give certain users read, write or update permissions on a stored procedure. This allows for an extra layer of security for your data and solves the first problem above.
  2. Stored procedures make it easy to handle concurrency violations.
  3. Once your stored procedures are defined, you can point an ASP.NET application, Silverlight application or web service at them, avoiding rewriting the same logic.
  4. You can provide joined and computed columns with aggregates. Typically you need to display related tables in the same datagrid, or provide computed columns with totals for example. Stored procedures make this easy, where you would otherwise have to handle datagrid events and perform calculations on the client. This increases code bloat and decreases performance.

At the heart of datasets is SQL: you have the option either to embed the SQL statements in the dataset, or to create stored procedures, where the statements are saved in SQL Server itself and your dataset simply calls them.
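As an illustration of the stored procedure route, the sketch below fills a dataset from a hypothetical CustomerOrderTotals procedure; the procedure name, parameter and connection string are placeholders rather than objects that ship with Northwind.

using System.Data;
using System.Data.SqlClient;

string connectionString = @"Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=True";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("CustomerOrderTotals", connection))
{
    command.CommandType = CommandType.StoredProcedure;    // call the stored procedure by name
    command.Parameters.AddWithValue("@City", "London");   // hypothetical parameter

    var adapter = new SqlDataAdapter(command);
    var results = new DataSet();
    adapter.Fill(results, "OrderTotals");                 // the joins and aggregates run on the server
}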

I must admit that at first I found stored procedures extremely daunting and avoided them as best I could, as they seemed overly complex. I was subsequently forced into learning them because my application architecture necessitated them, and I have since found (pleasantly, I might add) that they are not as difficult to grasp as I thought. SQL is a pure functional language. Not “pure” in the sense of a functional language like Haskell, but it is composed purely of functions that you use in your procedures. It is this functional syntax that most C# or Visual Basic.NET developers loathe, as it is completely different from a general-purpose programming language.

In truth, when you use Linq to SQL or Linq to Entities your queries are converted to SQL anyway, so in my book knowing SQL well makes you a better developer, because:

  1. You understand what the C#/VB abstraction layer is generating, why and how.
  2. You understand functional programming better, compared to the functional constructs added to C# or VB, as SQL is a pure functional language.
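If you want to see this translation for yourself, a Linq to Entities query can be cast to ObjectQuery and asked for the SQL it will send to the server. A rough sketch, reusing the NorthwindEntities model from the demo above:

using System;
using System.Data.Objects;
using System.Linq;

using (var context = new NorthwindEntities())
{
    IQueryable<Customer> londonCustomers = from c in context.Customers
                                           where c.City == "London"
                                           select c;

    // Every Linq to Entities query is ultimately translated to SQL;
    // ToTraceString() shows the statement that will be executed.
    var objectQuery = (ObjectQuery<Customer>)londonCustomers;
    Console.WriteLine(objectQuery.ToTraceString());
}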

Digression

I was in a second-hand bookstore (I like collecting books, new and old) and picked up a ten-year-old book on Transact-SQL. It was one of those “Teach Yourself Transact-SQL in 21 Days” type books I usually detest, but I am really pleased with it. The book is wonderfully written, but what is most striking is the fact that most SQL has stayed the same over the last decade or so – if not all the way back to the IBM SEQUEL days in the mid-seventies.

Over the last decade Microsoft have introduced many flavours of ADO.NET, and most have had their drawbacks. The reason I like SQL so much is that it is not going away anytime soon; knowing it well situates you advantageously over the flavour-of-the-month data access solutions, and one can use tried and tested technologies and methodologies, which are essential in multi-user distributed applications.

Domain Driven Design and MVC

Another discovery I have made with using stored procedures is that you inadvertently design the application domain first, then implement the logic in your application. This is somewhat counter to the style of programming your typical developer likes, but the advantages of this type of design far outweigh any other considerations. If you look at the MVC Framework Microsoft is pushing at present, you soon discover that stored procedures effectively become your Model (in Model View Controller), so all you have left to implement are your Views and Controllers – again a default advantage of separating your data access logic from your application.

Application Changes and Updates

A key aspect of enterprise applications is reporting. In general, reporting is a moving target, and companies generally require that their reports are modified frequently to obtain the latest data and give them a competitive edge. With your data access logic tied to Linq to SQL/Entities, one would need to recompile the application and then force every terminal in the enterprise to re-install it before they can access the latest reports. Using stored procedures, you can simply create a PowerShell script and run it on the server; the new reports are then available to every terminal without a re-install, which is very expensive if you have hundreds of terminals.

I hope the above has provided a basic explanation of why you need to be judicious about the data access technology you choose, and shown that the newer technologies are not always the best fit in every situation.

Report Builder 2.0 Released


Microsoft SQL Server 2008 Reporting Services Report Builder 2.0 (bit of a mouthful isn’t it?) has now been released and is available for download here.

Overview

Microsoft SQL Server 2008 Reporting Services Report Builder 2.0 delivers an intuitive, Office-like report authoring environment enabling business and power users to leverage their experience with Microsoft Office 2007 products. Microsoft SQL Server 2008 Reporting Services Report Builder 2.0 supports the full capabilities of SQL Server 2008 Reporting Services including:

  • Flexible report layout capabilities of SQL Server 2008 Report Definition Language
  • Data Visualizations including charts and gauges
  • Richly formatted textboxes
  • Export to Microsoft Office Word format

Features specific to Report Builder 2.0 are focused on simplifying the process of creating and editing reports and queries and include the following:

  • Easy to use wizards for creating table, matrix and chart data regions
  • Support for directly opening and editing reports stored on the report server
  • Support for using server resources such as shared data sources
  • Query designers for multiple data sources including a Microsoft SQL Server-specific query designer

[Screenshot: ReportBuilder]

Note: the title says SQL 2008, which is a little misleading, as you can use this to target SQL Server 2000, 2005 and a whole host of other databases such as Oracle and Microsoft Access.

The ADO.NET Entity Framework, REST and SOAP


As an n-tier developer specialising in enterprise architecture, I find web services central in determining how data is transferred between the application and the data store. It came as a painful shock to many who tried to implement Linq in n-tier scenarios just how difficult it was. The principal problem in Linq to SQL is that the DataContext object must always stay connected to the database; once the connection is broken, change tracking (or state) is lost. For people coming from an ADO.NET disconnected dataset background, it seemed mysterious and a step backwards. I was one of those people, and ended up returning to disconnected datasets, principally because the infrastructure is so mature, even though there are pain points. On the other hand, the ADO.NET Entity Framework has a distinct advantage over datasets in that it is usable across data providers, i.e. Oracle, MySQL etc., whereas datasets are strictly a .NET shop. Datasets are still my data access layer object of choice for Smart Client focused applications, with the ADO.NET Entity Framework better suited to the web.

The ADO.NET Entity Framework is a new layer in ADO.NET, recently released with Visual Studio 2008 and .NET 3.5 Service Pack 1. It overlaps with Linq to SQL in some ways, because it is again an Object Relational Mapping (ORM) tool, but it is really heavyweight – it does more, and then some. Central to its architecture (via ADO.NET Data Services) is Representational State Transfer (REST). Aside from the cool acronym, what is it that REST offers that SOAP does not?

Within a software architect’s remit is the task of never being technologically obsequious, and of choosing the correct technology to solve a specific problem. So which is better?

REST is a lightweight, HTTP-based stack; SOAP, on the other hand, is XML-based and rather heavyweight. Neither is inherently better than the other. Bob Familiar and Ron Jacobs have an excellent video that disambiguates the two.

endpoint.tv – SOAP and REST a Perspective

New SQL Server 2008 Data Types


SQL Server 2008 includes seven new data types. Presently I’m embroiled in a Business Intelligence application that makes heavy use of date and time values. SQL Server 2005 is quite inflexible in this regard, because it was released after .NET 2.0 yet failed to include the newer date and time features. The new data types also resolve issues pertaining to the global nature of applications over the Internet. In layman’s terms, Microsoft in particular are investing heavily in Software as a Service (SaaS); if your application will be exposed over the Internet with Rich Internet Application (RIA) frameworks like Microsoft Silverlight, then features like these become necessities, especially across time zones.

Luckily this has now been addressed, and we have types that offer far greater precision – essential for BI – and a separation of date and time concerns. I have trawled the Internet trying to find information, but most articles hitherto appear to be derived from this comprehensive TechNet article.
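As a small illustration of why this matters across time zones, .NET 3.5 SP1 adds matching SqlDbType values for the new types, so a DateTimeOffset can be stored without losing its offset. The table and column names below are hypothetical.

using System;
using System.Data;
using System.Data.SqlClient;

string connectionString = @"Data Source=.\SQLEXPRESS;Initial Catalog=BIDemo;Integrated Security=True";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("INSERT INTO AuditTrail (OccurredAt) VALUES (@OccurredAt)", connection))
{
    // datetimeoffset preserves the UTC offset, unlike the old datetime type.
    command.Parameters.Add("@OccurredAt", SqlDbType.DateTimeOffset).Value = DateTimeOffset.Now;

    connection.Open();
    command.ExecuteNonQuery();
}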

SQL Server 2008 and occasionally connected client support


For those of you who don’t already know, SQL Server 2008 was released last week. It has been a long wait, and as soon as the .NET Framework 3.5 SP1 is released for Visual Studio 2008 (later today), developers can start to program against this new SQL Server version. I will be adopting SQL Server 2008 immediately, as there is a plethora of goodness which I will blog about in the forthcoming weeks and months, including the new Reporting Services.

One key feature introduced (especially useful for n-tier applications) is the ability to have a SQL Server database – “local” or in “the cloud” – populate a local SQL Server Compact Edition (CE) database with information that scarcely changes, or does so only once in a while. Think of a Suppliers or Employees table in a database: these are commonly used as lookup tables in applications, so the ability to cache them locally and then synchronise them when something changes, or when the application reconnects to the cloud, really does improve performance and reduces the number of hits on the database server.
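In Visual Studio 2008 SP1 this scenario is wired up through the “Local Database Cache” item template, which generates a sync agent on top of Sync Services for ADO.NET. The sketch below is illustrative only: NorthwindCacheSyncAgent stands in for whatever class name the designer generates for your cache.

using Microsoft.Synchronization.Data;

// NorthwindCacheSyncAgent is the (hypothetical) agent generated by the
// Local Database Cache designer; it knows about the server connection,
// the local .sdf file and which tables to cache.
var syncAgent = new NorthwindCacheSyncAgent();

// Pull down only the rows that have changed since the last synchronisation,
// using SQL Server 2008's change tracking on the server side.
SyncStatistics stats = syncAgent.Synchronize();
// 'stats' reports how many changes were uploaded and downloaded during this pass.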

This should all happen in the data access layer (if your application is well architected), so it does not matter whether you use Windows Forms or WPF. This underlines for me why, for business applications, one certainly wants to elect disconnected datasets as opposed to the “new kids on the block”, the Linq to SQL and ADO.NET Entity Framework ORMs. It will be some time before this type of support is available there, as Linq to SQL against SQL Compact is currently unsupported.

To see how this works, a short and concise screencast is available here that is well worth a look. The SQL Compact team have almost made it too easy; the Visual Studio integration is fabulous, making the new change tracking feature in SQL Server 2008 a breeze to use.