Friday, December 22, 2006

Touchy connection strings

My ASP.NET 2.0 / SQL Server 2005 application was working perfectly in my development environment but I experienced the following error message when testing deployment in a Windows 2003 virtual machine:

"A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)".

My connection string was:

Server=localhost;Database=myDB;Integrated Security=SSPI;

Simply changing it to the following solved the issue:

Data Source=.;Initial Catalog=myDB;Integrated Security=True;

Considering that both shared memory and TCP/IP were enabled in both environments, I can explain neither why the issue occurred nor why the change above solved it. If someone can, please leave a comment.
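For reference, rather than hard-coding the string, it can live in web.config and be read with ConfigurationManager.ConnectionStrings (the name attribute below is illustrative):

```xml
<!-- web.config: the connection string that worked, kept out of code -->
<connectionStrings>
  <add name="myDBConnection"
       connectionString="Data Source=.;Initial Catalog=myDB;Integrated Security=True;"
       providerName="System.Data.SqlClient"/>
</connectionStrings>
```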

Thursday, December 21, 2006

Diagnosing "aspnet_merge.exe exited with code 1" error in Web Deployment Projects

For those who run into: Error 24 "aspnet_merge.exe" exited with code 1.

The error is caused by a duplicate class name in your web project, i.e. two files with the same name, generally in different directories.

These files can be hard to find, so here is a quick way to isolate the problem and locate the duplicate names.

In Visual Studio 2005:

  • Select Tools --> Options.
  • Then in the Projects and Solutions branch, select Build and Run.
  • You'll see a dropdown box for MSBuild Project Build Output Verbosity. Change this to Diagnostic, and OK.
  • Show your build output by selecting the View --> Output option.
  • Make sure that in the 'Show output from' box, you've selected "Build".
  • Then just build your project and wait for the inevitable failure. The build output will show you where the duplicates are when the build fails.

This tip was originally posted by Simon Morgan.

Tuesday, December 19, 2006

New incompatibility between ASP.NET Ajax 1.0 RC and Google AdSense

Run the following ASP.NET page in IE7 and you will get a Javascript error:

<%@ Page Language="C#" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title>Untitled Page</title>
</head>
<body>
    <form id="form1" runat="server">
        <asp:ScriptManager ID="ScriptManager" runat="server">
            <Scripts>
                <asp:ScriptReference Name="PreviewScript.js" Assembly="Microsoft.Web.Preview" />
            </Scripts>
        </asp:ScriptManager>
        <script type="text/javascript"><!--
        google_ad_client = "pub-6623312146541354";
        google_ad_width = 250;
        google_ad_height = 250;
        google_ad_format = "250x250_as";
        google_ad_type = "text";
        google_ad_channel = "";
        google_color_border = "000000";
        google_color_bg = "F0F0F0";
        google_color_link = "0000FF";
        google_color_text = "000000";
        google_color_url = "008000";
        //--></script>
        <script type="text/javascript"
            src="http://pagead2.googlesyndication.com/pagead/show_ads.js"></script>
    </form>
</body>
</html>

ASP.NET Ajax page method on login page requires setting location in web.config

I have spent quite a while on this one, so I deliver my findings here hoping that it will help.

If you have an ASP.NET web site with forms authentication and your login page has an ASP.NET Ajax extensions page method, this method will not work unless you set:

<location path="login.aspx">
  <system.web>
    <authorization>
      <allow users="*"/>
    </authorization>
  </system.web>
</location>

Otherwise, the call to ~/login.aspx/yourScriptMethod fails because it is not authenticated.
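For context, a minimal sketch of such a page method follows; the method name and signature are illustrative, and the sketch assumes page methods are enabled on the ScriptManager:

```aspx
<%-- login.aspx sketch: the page method called at ~/login.aspx/YourScriptMethod --%>
<asp:ScriptManager ID="ScriptManager1" runat="server" EnablePageMethods="true" />
<script runat="server">
    [System.Web.Services.WebMethod]
    public static bool YourScriptMethod(string userName)
    {
        // Illustrative check invoked from client script on the login page
        return !String.IsNullOrEmpty(userName);
    }
</script>
```

Without the <location> override, the client-side call to this method is redirected to the login page itself and fails.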

Thursday, December 14, 2006

“debugger”, the magic JavaScript statement

This post is dedicated to all the Javascript programmers who still use alerts to debug their code.

Not sure about you, but I had always struggled to set and enable Visual Studio breakpoints to debug my Javascript code until I found out about a magic statement named debugger.

To use it:
  1. Make sure “Disable Script Debugging” is unchecked in Internet Explorer, Tools (menu) -> Internet Options… (submenu) -> Advanced (tab) -> Browsing (node).
  2. Put the debugger; statement anywhere in your code where you want the breakpoint.
  3. Execute your web application and when the browser script engine interprets the statement, it launches the debugger and breaks on the statement line.

Monday, November 27, 2006

Migration from Atlas July CTP to ASP.NET Ajax Beta 2

I am developing an application which does not really make use of Atlas controls but has a significant amount of Atlas scripting code to migrate.

I am not fond of declarative scripting, but maybe I am old-fashioned: I like to step through my code to debug it. My opinion is that declarative scripting is only good as the output of code-generation tools; it is not something that developers should write in a text editor.

So, I have got script to migrate and I have read the documentation available at:

Downloading and installing ASP.NET Ajax Beta 2 is well explained in the documentation but I had to do a bit of guessing as well as exploring Microsoft’s code to migrate my own code. I deliver my findings below:

Migrating client script

  1. To use client scripting of html components, you need to add a reference to PreviewScript.js in your ScriptManager as shown below:

<asp:ScriptManager ID="ScriptManager" runat="server">
    <Scripts>
        <asp:ScriptReference Name="PreviewScript.js" Assembly="Microsoft.Web.Preview" />
    </Scripts>
</asp:ScriptManager>

  2. $(id) is now $get(id)

  3. The Sys.UI namespace is now Sys.Preview.UI

  4. Sys.UI.Select is now Sys.Preview.UI.Selector

  5. CheckBox click handlers are now wired with Sys.Preview.UI.CheckBox.add_click(handler)

  6. Sys.UI.Select.selectionChanged.add(handler) is now Sys.Preview.UI.Selector.add_selectionChanged(handler)

  7. $addHandler/$removeHandler is the new way to add event handlers

  8. Sys.UI.Control.get_enabled() and set_enabled() no longer work

Migrating web service calls

  1. You need to decorate your web services and page methods with new attributes as explained in

  2. The prototype of asynchronous calls to web services and page methods has been simplified, but this is well explained in the documentation listed above.

Friday, November 24, 2006

How to configure the IIS SMTP Service to send emails from ASP.NET code?

I have probably configured the IIS SMTP service a dozen times over the past five years but I still struggle with authentication and relay, so I have decided to write this note once and for all.

First, the definitive source for your System.Net.Mail issues, including some sample code, is:

Because we are all urged to deliver rapidly, I have provided a test application which you can download. Contrary to most samples that you will find on the Internet, this one uses a configuration file including a system.net/mailSettings section, which is probably the way you want to send emails from your ASP.NET code.

Obviously you need to install the SMTP Service, which is well explained at

Let’s say you need to send your emails from website@yourdomain.tld. In this case, it is recommended (but not required) that you rename the default domain to yourdomain.tld.

Display the default SMTP Virtual Server properties and on the Access tab:

  1. Configure authentication by removing Anonymous access and allowing Basic authentication and/or integrated Windows authentication;
  2. Grant relay to and allow all computers which successfully authenticate to relay.

You can certainly be more restrictive depending on your requirements, once you have got the above configuration to work properly.

Finally, you can create a dedicated account and properly configure your settings:

<system.net>
  <mailSettings>
    <smtp deliveryMethod="Network" from="website@yourdomain.tld">
      <network host="" port="25" userName="DOMAIN\account" password="password"/>
    </smtp>
  </mailSettings>
</system.net>
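With mailSettings configured, the sending code needs no explicit credentials; a minimal sketch (the recipient address is illustrative):

```csharp
using System.Net.Mail;

// SmtpClient and MailMessage pick up host, port, credentials and the
// from address configured in system.net/mailSettings.
SmtpClient client = new SmtpClient();
MailMessage message = new MailMessage();
message.To.Add("recipient@theirdomain.tld"); // illustrative recipient
message.Subject = "Test";
message.Body = "SMTP test";
client.Send(message);
```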

What’s next after designing and hosting your web site?

You have now designed a nice web site which you have uploaded to a hosted server, and you think you are ready to go. Not quite. There are a few more steps required that will improve the visibility of your web site.

I am not a search engine optimization (SEO) professional, and qualified people are writing books about SEO, which can be a very complex subject. But I have been confronted with the issue, I have searched for answers on the Internet, and I publish my findings here. If you apply the 80-20 Pareto rule, you can get decent results in three steps that you may want to reiterate periodically:

  1. Make sure your web site is right for search engine crawling robots;
  2. Submit your web site to search engines;
  3. Get referral links.

I may have done something right, since I have managed to get our web site on the first page of MSN and Yahoo. I know from experience that the only way to get a high ranking on Google is to obtain referral links.

1) Make your web site right

First, you need to make sure that search engine robots can read your web site. You also need to assist search engine robots by telling them which pages to index and how to categorize your web site.

Make sure your HTML is compliant

A web design environment like Dreamweaver will warn you of any non-compliance issue. You can also use a text browser like Lynx to check that your web site displays properly to search engines. Note that search engine robots read sites in a similar way. You can also check your pages at

It is also important to set the page encoding and content language. I personally recommend using utf-8 encoding even on English pages. Contrary to what people often think, utf-8 is not double-byte, and the size of an English page is the same in utf-8 and iso-8859-1.
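For an English page, this boils down to two tags in the page's <head> (the same pattern as used for the translated pages further down this blog):

```html
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta http-equiv="Content-Language" content="en" />
```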

Describe your web site properly

Title, meta description, meta keywords and image alternate text all help search engines categorize your web site properly.

Getting your keywords right is a difficult exercise:

  1. you should limit yourself to 10 to 20 keywords;
  2. the more targeted your keywords, the more efficient they are in helping people find you, but you do not want to be too narrow;
  3. as far as Google, Yahoo and MSN are concerned, you do not need both the singular and the plural of a keyword, and the order of words in a keyword seems to make no difference.

The following tools will help you build your keywords meta tag:

Add an address and a privacy policy

An address and a privacy policy will not give you a high ranking, but their absence may prevent you from getting one, because you won’t look like a serious company. To add a privacy policy for your site, follow the steps at

Add content rating

To add content rating for your site, follow the steps at

Create sitemaps

Sitemaps will tell search engines which pages to look for. To create sitemaps, follow the steps at You can get more information at

You are encouraged to submit your sitemaps to search engines, in particular:
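A minimal sitemap file looks like the following (the URL and change frequency are illustrative; check the sitemap protocol documentation for the exact schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.tld/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```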

Create a robots.txt file

You create a robots.txt file essentially to tell crawlers which parts of your site not to index. More information about robots.txt files is available at
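A minimal robots.txt placed at the site root (the disallowed paths are illustrative):

```text
User-agent: *
Disallow: /admin/
Disallow: /images/
```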

Add web site monitoring and statistics

If your web site is too often inaccessible, you will be downgraded by search engines. Accordingly, you need to monitor your web site and keep your ISP honest. The following monitoring services range from free to high-end:

Most hosting packages come with AWStats and Webalizer which give poor statistics. There is obviously but I can only recommend the free

2) Submit your web site to search engines

Most ISPs offer search engine submission standard with hosting packages. Unless you are very lazy, do not pay for a submission service and do not buy submission software. Submitting your URLs to Google, MSN and Yahoo only takes 10 minutes and covers more than 80% of the search engine traffic.

This is especially true if you already own a web site that is referenced. In this case, you just need a hyperlink from the referenced web site to the new web site and the search engine crawlers will automatically index your new site.

3) Get referral links

Getting referral links is the only thing that guarantees high rankings. Basically, the more a page is referenced on the web, the greater the value of this page to the Internet community, and so the higher it ranks in searches. Additionally, a reference from a popular web site is worth more than a reference from an obscure one.

So you need to work on getting other web sites to reference your own web site. This is an everyday job that only your organization can successfully perform. You should get a decent ranking on Google (top 3 pages) with about 100 references but obviously some topics are more crowded than others. The earlier you start the better. Good luck!

Thursday, November 23, 2006

Troubleshooting SMTP using Telnet

Today, I have been confronted with configuring Virtuozzo + Plesk on our new VPS web server to send emails from ASP.NET code.

On our old Plesk 7 server, we used to send emails from an email account which had no mailbox but the same configuration did not seem to work in the more recent Plesk 7.6. I am not a Plesk aficionado, so I went the old way to find out what was going wrong.
  1. Open a command window by typing cmd in the Start -> Run… dialog box and clicking OK.
  2. At the prompt, type telnet mail.yourdomain.tld 25 + Enter
  3. The mail server should display version information.
  4. Type HELO + Enter.
  5. Type AUTH LOGIN + Enter.
  6. Type your mailbox username or username@yourdomain.tld encoded in base64 + Enter. ESMTP requires a login in the form username@yourdomain.tld. A helpful site to get a base64 encoded value is
  7. Type your mailbox password encoded in base64 + Enter.
  8. If you have successfully authenticated, the server will reply "235 authenticated".
  9. Type MAIL FROM: username@yourdomain.tld + Enter.
  10. Type RCPT TO: anothermailbox@yourdomain.tld + Enter.
  11. Type DATA + Enter.
  12. Type Subject: Test + Enter + Enter (Send a blank line to separate the headers from the message body).
  13. Type SMTP test + Enter.
  14. Type . + Enter (a dot, then Enter).
  15. The server should reply "250 Message queued"
  16. Type QUIT + Enter.
  17. Close the command window and check your mailbox.
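For reference, the authentication exchange (steps 5 to 8 above) typically looks like this, where C: is what you type and S: is the server reply; the base64 values decode to "Username:", "Password:" and illustrative credentials:

```text
C: AUTH LOGIN
S: 334 VXNlcm5hbWU6
C: dXNlcm5hbWU=
S: 334 UGFzc3dvcmQ6
C: cGFzc3dvcmQ=
S: 235 authenticated
```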
In my case, the message that I was getting in .NET was “no email account can send to anothermailbox@yourdomain.tld”, which is not very helpful, but the procedure above showed that step 8 was actually going wrong. After a few changes in the Plesk configuration, I discovered that the email account required a mailbox to send emails.

Translating a web site into Chinese and Japanese

We are a small company selling software as a service. Most of our leads come from advertising on Google, MSN and Yahoo and although we are European, 70% of our customer base is located in the US. This is not surprising because our web site is only available in English and French, and our advertising too. Considering that most of our competitors are located in the US too, we can assume that the cost of a lead in the US is much higher. So we have decided to tap into the great reservoir of non English-speaking countries and have our web site translated into 8 languages, among which Chinese and Japanese.

Word documents

We have put our web content into Word 2003 to get it translated by professional translators. Obviously, the Chinese and Japanese documents that we received were not properly displayed in Word. We found on the web that we had to install the Proofing Tools for Microsoft Office to display the content, which we did, and it works. We later realized, on a computer that had downloaded the Asian language packs for Internet Explorer, that this is another way to get Japanese and Chinese content properly displayed in Office, and contrary to the Proofing Tools, it is free. In fact, Word 2003 is Unicode and you just need the proper fonts.

Html pages

The next step is to get the Word document into HTML. Apparently, copying and pasting from Word to Dreamweaver works quite well, but I have had so many issues in the past with the way Word handles HTML that I have preferred another way. You need to choose whether you will have a UTF-8 encoded page or whether you will use a code page (GB 2312 for simplified Chinese). A code page will produce a more compact file, but this is the old way, and we have decided to use UTF-8 for our entire site. In Word, save your document as “filtered html” and in the Save As dialog select “Web options”. Select the UTF-8 encoding and a simplified Chinese font. Do the same for Japanese. Then you can open your new html document in Dreamweaver and copy and paste reliably within Dreamweaver.

You will need to add the following meta tags in your translated html pages:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta http-equiv="Content-Language" content="zh" />


Images

We use Adobe ImageReady and Macromedia Fireworks for our images. Fireworks is less powerful, but its use of the PNG file format makes it much easier to maintain large quantities of image files because Windows Explorer displays the thumbnails. When you copy Chinese characters from Word or Html into ImageReady, only half of them display properly and the others are replaced by question marks. In fact, ImageReady selects the MS Gothic font by default when it should paste SimHei or SimSun. These fonts are not even displayed in the font drop-down list, but you can key in the font name and this works.

Finally, no matter how well you prepare your work with the translators, you will realize that some strings are too long to fit your buttons, or that some last-minute changes have not been taken into account, and you will find it much easier to get an approximate automated translation from

You can check the result at and To download the Asian Pack, simply select View -> Encoding -> More -> Chinese Simplified in Internet Explorer.

Wednesday, November 22, 2006

Resumable file downloads in ASP.NET

The HTTP protocol defines the Accept-Ranges and ETag response headers, which signal the ability to download resources in chunks. There is an interesting article published in MSDN Magazine which has some sample code implemented in VB.NET. For the record, similar implementations can be found at:

I have made a C# implementation of this code which you can download from

To use this code, you need to:

  1. Create a C# class library named “Memba.FileDownload” or whatever name you deem more appropriate and add the two class files from the archive.
  2. Create a web site and add the following to the system.web/httpHandlers section of the web.config:
    <add type="Memba.FileDownload.DownloadHandler, Memba.FileDownload" validate="false" path="*.zip" verb="*"/>
  3. You also need to make sure the .zip extension is mapped to aspnet_isapi.dll in the IIS management snap-in, otherwise your handler will not be called.

To test this code, download a zip file hosted on the web site you have just created. You should be able to stop and resume the download in Internet Explorer or any download manager which supports resumable downloads.
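Under the hood, a resumed download is a Range request answered with a 206 status; the exchange looks roughly like this (sizes and ETag value are illustrative):

```text
GET /archive.zip HTTP/1.1
Range: bytes=500000-

HTTP/1.1 206 Partial Content
Accept-Ranges: bytes
Content-Range: bytes 500000-999999/1000000
ETag: "abc123"
```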

Where do you get your icons from?

I have never been able to find a decent royalty-free library of icons, but you only get what you pay for, don’t you?

IconExperience is the best library I have found and I can only recommend it. It is comprehensive and affordable. Incors have just released version 2.0, which has more than 2000 icons and now includes the 128x128 format.

If you know any other valuable source of icons, please comment.

Wednesday, October 25, 2006

Designing gadgets and widgets for uploading files to Memba Velodoc

The requirements were the following:
  • A sender, a receiver, a subject, a message, a file to upload and a checkbox to accept terms;
  • A progress bar to monitor upload progress;
  • A professional look & feel;
  • To be developed in 3 to 5 days max.
Reference documentation
Findings and impact on technologies
A Windows Sidebar gadget is a piece of DHTML hosted in the sidebar. Generally, what you can do with DHTML can be done in a gadget.

There are limitations though, most of which are related to the browser security sandbox. In effect, a gadget is hosted locally, which has an impact on how it can communicate with remote web sites.

In particular, developing a file upload gadget with a progress bar in DHTML requires a cross-domain iFrame, and scripting the iFrame in this case is forbidden. I have tried two nested iFrames, which gives you scripting, but the deepest iFrame opens in a new browser window instead of within the gadget. The only workaround I found in half a day was to display the progress bar in a new browser window, but this does not look good and I was not prepared to lose more time.

So, I have considered XBAP WPF browser applications and although there is no File Upload control, I seem to have found the foundations for a file upload gadget here:

But WPF development is at a really early stage: the development environment is primitive, documentation is lacking, and I got worried about some warnings related to the sandbox that are mentioned in the literature quoted above.

In these circumstances, the only alternative was Macromedia Flash. I have even found a couple of examples running on the web. I have hosted them within an iFrame of a bespoke gadget and there I was: at the end of day one, I had my design and a fully functional (but ugly) prototype. Further documentation is available at:

Sunday, October 22, 2006

Choosing a GUI library for ASP.NET

You can get an exhaustive list of .NET components at but if you want a GUI library for ASP.NET, your choice is really between:
I would not recommend the other vendors, for any of the following reasons:
  • The framework is not sufficiently exhaustive to cover 99% of the requirements;
  • The vendor has not been around for a sufficiently long time;
  • Updates are not sufficiently frequent to follow the pace of new developments like Ajax;
  • The developer license is not royalty-free and/or the price is unaffordable.
Infragistics, Telerik and ComponentArt have very similar features, pricing and support when you do not dig into the details of each framework. See the Infoworld comparative article.

I generally find Infragistics to be a richer framework than Telerik and ComponentArt, with two drawbacks: (1) more complex to use and (2) fatter Javascript library to load on the client.

In the mid-term, ComponentArt may have gained a competitive edge by rewriting its components on top of the Microsoft Ajax framework (code-named Atlas).

In the long term, it is difficult to predict the impact of WPF on web interfaces. On one hand, there is a need for richer interfaces that are easier to develop, and Microsoft is committed to delivering the technology (WPF/E) and tools to achieve just that. On the other hand, WPF is really nothing more than Flash the Microsoft way, and very few web sites are developed with Flash yet, although the Flash plug-in is available on 90% of Internet PCs.

I have personally used Sheridan and now Infragistics controls for years, so I stick with them but if I had to make a new choice now, I would definitely opt for the rewritten ComponentArt controls.

Paypal IPN with UTF8

Today, I got stuck for a couple of hours implementing Paypal IPN with UTF-8 encoding. You get loads of examples on the Web and in the documentation with the windows-1252 default charset, but none with UTF-8 encoding.

The documentation led me down the wrong track, trying to set the charset and form-charset post fields to UTF-8, but this does not work: you always get a windows-1252 encoded notification.

When you search for the solution on Google, you mostly get complaints from developers who struggle with it and claim there are bugs.

Finally, the solution is obvious (as always):
  1. Log in to the Paypal web site and go to your profile.
  2. Click the Language Encoding link.
  3. Click the More Options button.
  4. Select UTF-8 in the Encoding drop-down list.
That’s all!

Friday, October 13, 2006

SourceSafe 2005 issue fixed

More than 6 months ago, I experienced an awkward problem with SourceSafe.

I periodically run the following script to launch the Analyze tool on SourceSafe databases:

Dim shell
Set shell = CreateObject("WScript.Shell")

Dim sVssAnalyzeCmd
Dim sVssAnalyzeExe
sVssAnalyzeExe = chr(34) & "C:\Program Files\Microsoft Visual SourceSafe\analyze.exe" & chr(34) & " -F -V3 -D "

'----------> First Project
sVssAnalyzeCmd = sVssAnalyzeExe & chr(34) & "d:\vss2005\project1\data" & chr(34)
shell.Run sVssAnalyzeCmd, 7, true

One day, Analyze was reporting the following error:

The file 0\DATA\\ is not a valid SourceSafe physical database file. It must be renamed to a file with an extension or moved to another directory outside the database.

I found a curious fix on the web: to work around this problem, rename the physical file for the database root. Use all uppercase letters in the new name.

I renamed d:\vss2005\project1\data\a\aaaaaaaa to d:\vss2005\project1\data\a\AAAAAAAA and it worked.

I am glad to realize that a fix has recently been made available by Microsoft at:

Thursday, October 12, 2006

Useful networking tools

Today, I was working on integrating a web application with Paypal. My IPN handler in my development environment could not be reached by Paypal, although my test web server was published to the Internet.

Having searched for simple ways to test a Url from the Internet, I have found the following which I recommend:

[Updated on 22 Oct 2006]

Endeavouring to compare web hosting packages, I have used:

[Updated on 25 Oct 2006]

An excellent and still free web site monitoring service is available at:

[Updated on 6 Nov 2006]

A comprehensive DNS toolset can be found at:

Preventing page caching and displaying “Page has expired” in ASP.NET 2.0

I have found two answers to this question. The first one is the following code:

Response.Buffer = True
Response.ExpiresAbsolute = Now().Subtract(New TimeSpan(1, 0, 0, 0))
Response.Expires = 0
Response.CacheControl = "no-cache"


Note that Response.Expires is deprecated in ASP.NET and Response.Cache.SetExpires should be used instead.
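For completeness, the non-deprecated ASP.NET 2.0 equivalent of the code above is a sketch along these lines (e.g. in Page_Load):

```csharp
// Equivalent of the Response.Expires / CacheControl code, using the
// HttpCachePolicy API recommended in ASP.NET 2.0.
Response.Cache.SetCacheability(HttpCacheability.NoCache);
Response.Cache.SetExpires(DateTime.Now.AddDays(-1));
Response.Cache.SetNoStore();
```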

The second answer is the following page directive:

<%@ OutputCache location="none" %>


More details at

Sunday, September 10, 2006

Atlas script manager control on login page raises a Javascript error in IE, not in Firefox

The error reported by IE is the following:

If you execute the code carefully in the debugger or use a request analyzer like Fiddler, you will realize that the script manager control requests http://localhost:3261/WebApp/atlasglob.axd, which it cannot access until the user is logged in.

The solution is to add the following to web.config so that atlasglob.axd can be retrieved when the user is not logged in:

<location path="atlasglob.axd">
  <system.web>
    <authorization>
      <allow users="*"/>
    </authorization>
  </system.web>
</location>

Sunday, July 09, 2006

Choosing a wiki and a blog

I recently had to choose both a wiki and a blog, and I expected to find both functionalities within the same product or hosted service, as they are not that different.

My requirements for the wiki were the following:
  • Topic articles with versioning, attachments and discussions;

  • Authoring of articles by invitation only, discussions opened to everyone;

  • Hierarchical organisation of topics, ideally presented like online help: table of contents, index, search;

  • Customisable UI (branding);

  • Full-text search;

  • Authoring using Word;

  • Preferably a hosted service with a friendly URL;

  • Ad-free.
My requirements for a blog were very similar, except:
  • Authoring of articles by the owner of the blog only;

  • Organisation by category and by date.
My first observation is that you apparently cannot have both within the same solution. is an invaluable resource to compare wiki software, and two hosted solutions rise above the average: jotspot and stikipad, although neither of them fulfils 100% of my requirements. Both products are basic, but they do the job, although authoring in Word, my top requirement, is not possible with these wikis.

I have not been able to find a comprehensive resource that compares blog software properly, although there is some information available. The well-known products I have come across are Community Server (formerly .Text), Movable Type and WordPress. The top services are Blogger, MySpace and MSN Spaces. There is now also a Yahoo offering based on Movable Type. My preference is Blogger because there is a Word plug-in which works reasonably well and it is free. The main objection is that Blogger lacks proper categories.

My conclusion is that there is still a lot to improve on wikis and blogs especially to streamline the authoring process and to make navigation more fluid.

Thursday, July 06, 2006

Why I rarely use ADO.NET Datasets

Have you ever seen these great presentations which teach you how to build a master-details web page in a few minutes without writing a single line of code using a combination of DataSets, TableAdapters, ObjectDataSources, GridViews and FormViews? There is a great one at

There are sound technical reasons not to use DataSets in ASP.NET applications, due to their stateful nature. These reasons are explained by Frans Bouma in his blog. I have experienced other reasons not to use DataSets as a business layer.

The entity-relationship model (the way data is organised in the database) rarely corresponds as perfectly (see the video mentioned above) to the way you are going to present it. You will need to go through various transformations, for example:

  • You may have an invoice which is constituted of items and you need to present the invoice total which is actually the total of all item amounts. This total may also need to be presented in several currencies.

  • You may have an N-N relationship between messages and contacts in your database, but you need to display a message with To, Cc and Bcc fields which are delimited lists of email addresses.
Data is rarely presented the way it is stored in the database, and a business layer gives you the objects that you need between data and presentation, for example:

  • You need to store UTC dates which will have to be converted using the time zone defined in your user’s profile;

  • You need to store country codes which will have to be mapped to localized country names;

  • You need to store a document status as a byte like 0, 1 and 2, which will have to be mapped to an enumerated value like “draft”, “approved” and “rejected”.
You also need to validate data before storing it in database. You can use validation controls in your presentation layer, but their features are limited and it is good practice to implement business rules in a business layer.

So what should we do?

  1. Implement your data access layer (DAL) as stored procedures handling create, read, update and delete (CRUD) operations;

  2. Implement a business logic layer (BLL) in C# or VB.NET calling stored procedures using data readers and commands, which will achieve much better performance.
Then when do you use datasets?
  1. You only use datasets in rapid application development (RAD) scenarios, or when

  2. You have no other choice, for example when a component that you need (otherwise you would spend hours reinventing the wheel) requires datasets.

Thursday, June 29, 2006

Using Guids or Integers as database identifiers

Globally unique identifiers (Guids) are those strings that you find in COM clsid’s or in the windows registry and that look like: {ED256ABD-5BE7-4E46-BCDA-1E26B0364EBB}. You need 128 bits to store a Guid.

Integers are simply numbers that identify database rows in a table in an ascending order. They generally start at a seed of 1 and are automatically incremented by the database itself with an incremental step of 1, so you get 1, 2, 3, … Note that the database lets you specify the seed and the increment as required. You need 32 bits to store an integer.

The detractors of Guids claim that Guids require a lot more storage space, so they dramatically affect performance. If you want to store contact details with name, address, telephone and fax number in a table, you will easily get a dozen fields requiring at least 1000 bytes of storage with a double-byte character set (required for working in any country). In this example, the difference in storage between using a Guid and an integer as the identity is below 1%.

There are several advantages to using Guids:
  • Guids are well-typed in Java and in the .Net framework and I personally like to have a specific type for identifiers in the business layer;

  • They can be generated in the application, so you do not have to query the database after an insert to know which identifier has been generated for you;

  • They are guaranteed to be globally unique by using a combination of the network card MAC address and the time at the instant of generation, which is a reasonable guarantee. This is required in replication scenarios, and SQL Server replication relies on Guids anyway, so using Guids is more future-proof.
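The second advantage above is worth a concrete sketch: the identifier can be created in the application before any database round trip, with a single framework call:

```csharp
using System;

class Demo
{
    static void Main()
    {
        // Generate the identifier in the application, before the insert,
        // so there is no need to query the database for a generated key.
        Guid contactId = Guid.NewGuid();

        // Prints something like "ed256abd-5be7-4e46-bcda-1e26b0364ebb".
        Console.WriteLine(contactId.ToString());
    }
}
```

The value can then be passed as a parameter to the insert statement or stored procedure, and reused immediately for related rows.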

Wednesday, June 28, 2006

Stored procedures or inline SQL

Using stored procedures versus inline SQL has long been a debate in the software development community, and there are good articles on the subject.

Generally the debate is about best practice, security, performance and maintainability, with two camps:

  • Rapid Application Developers prefer to rely on tools (RAD, ORM) that generate code, which all use inline SQL statements. This camp will claim that their tools enforce best practices, prevent SQL injection attacks and ensure maintainability while SQL execution plan caching makes performance equivalent to stored procedures.
  • Developers who write their own business layer and data access layer code without support from such tools should opt for stored procedures, which will generally make their application more secure, more maintainable and better performing. I am on this side. Also note that having all your SQL code in stored procedures makes it easier to have it audited and reviewed by a database expert.

Tuesday, June 27, 2006

Enterprise Library

One of the most exciting Microsoft initiatives and sections of their web site is the patterns & practices section. As part of it, there is the Enterprise Library for .NET Framework (EntLib).

EntLib is not really a framework. Microsoft calls it a collection of reusable and extensible application blocks for enterprise development. I would define it a set of helper classes which greatly simplify development in the following areas:
  1. Database access;
  2. Caching;
  3. Logging;
  4. Exception Handling;
  5. Security and Cryptography.
Generally these blocks are extremely easy to use and I definitely recommend that any .NET architect of serious business applications consider building on top of EntLib, for two main benefits:
  • Rock-solid foundations (helper classes);
  • Instrumentation.
Database access
The Data Access Application Block (DAAB) is in my opinion the weakest block, and fans of O/R mapping frameworks like DataObjects, Genome or NHibernate generally laugh at it. Microsoft made an attempt to complement the DAAB with the Data Mapping Application Block, but that effort does not seem to be continued.

I am not a big fan of O/R mapping tools. They make custom development very productive, but you have to tweak the code to get what you really need and to optimize performance, which makes maintenance more complex in the long term, especially when upgrading to new releases of the frameworks. So I write my own data access layer and I get the productivity gains and best practices from a template-based code generator like CodeSmith. In this scenario, the DAAB is very neat.

My only objection is that it covers database access only. In my applications, I also have data in files, and I would love an instrumented FAAB for file IO covering XML/CSV/text files and binary files. And with the new packaging APIs in WinFX for Office Open XML formats, we could even imagine an OAAB. Then we would have a complete set of data access application blocks.
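For reference, a typical DAAB call (EntLib 2.0 era) looks roughly like this; the stored procedure name is an assumption, and the default database is resolved from configuration:

```csharp
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;

class ContactQuery
{
    static void ListContacts()
    {
        // Resolve the default database defined in the application configuration.
        Database db = DatabaseFactory.CreateDatabase();

        // Wrap a call to a hypothetical stored procedure.
        DbCommand cmd = db.GetStoredProcCommand("dbo.GetContacts");

        using (IDataReader reader = db.ExecuteReader(cmd))
        {
            while (reader.Read())
            {
                // Process each row here.
            }
        }
    }
}
```

Connection strings, provider selection and instrumentation all come from configuration, which is the main convenience over raw ADO.NET.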

Considering the HttpContext.Cache in ASP.NET 2.0, the Caching Application Block is only an incremental improvement in that context, but it delivers its full value in Windows Forms applications.

The Logging Application Block is an alternative to log4net, but you will need to find or build replacements for the default formatters and trace listeners. The default format of log entries does not make large files easy to read, when you could associate XML output with XSL stylesheets that let you drill down into the data effectively. And you will want to replace the default flat file trace listener with a rolling file trace listener that starts a new file every day or when the size reaches a threshold.
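Whatever listeners you plug in, the calling code stays the same. A minimal Logging Application Block call, assuming a "General" category is defined in configuration, looks like this sketch:

```csharp
using System.Diagnostics;
using Microsoft.Practices.EnterpriseLibrary.Logging;

class OrderProcessor
{
    static void LogOrderProcessed(int orderId)
    {
        LogEntry entry = new LogEntry();
        entry.Message = "Order " + orderId + " processed"; // hypothetical message
        entry.Categories.Add("General"); // category must match the configuration
        entry.Severity = TraceEventType.Information;

        // Routing to formatters and trace listeners is driven by configuration.
        Logger.Write(entry);
    }
}
```

Swapping the flat file listener for a rolling one is then purely a configuration change, with no impact on this code.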

Exception handling
Exception handling is the block I personally prefer. I find writing good error handling and reporting code extremely difficult, and the Exception Handling Application Block provides much more than a few helper classes: you get best practices in an extensive framework, with the full benefits of configurable policies. This is definitely the block that justifies adopting EntLib.
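In practice the pattern boils down to a few lines; the policy name is whatever you define in configuration ("Global Policy" below is an assumption):

```csharp
using System;
using Microsoft.Practices.EnterpriseLibrary.ExceptionHandling;

class BusinessOperation
{
    static void Run()
    {
        try
        {
            // Business operation goes here.
        }
        catch (Exception ex)
        {
            // The configured policy decides how to log, wrap or replace
            // the exception; the return value says whether to rethrow.
            bool rethrow = ExceptionPolicy.HandleException(ex, "Global Policy");
            if (rethrow)
            {
                throw;
            }
        }
    }
}
```

The point is that logging, wrapping and replacement rules live in configuration, so operations can change the error handling policy without a recompile.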

Security and cryptography
The Security Application Block handles caching of security-related credentials and authorization. Caching credentials works well in Windows Forms applications, but it does not fit the membership provider model implemented in ASP.NET 2.0. For the same reason, the authorization part of the block, which makes neat use of rules, does not integrate well with ASP.NET and especially with the navigation server controls. In my opinion, the block lacks proper server controls, including menus and command buttons, to constitute a real application block rather than simply a collection of helper classes.

What’s next?
The great new feature of .NET Framework 3.0 (formerly WinFX) is workflow, and we will need best practices and frameworks to get the full potential of this exciting technology. Microsoft makes extensive use of RSS in the next releases of Vista, Internet Explorer and Office, and the same requirement applies there. Finally, we will also want to reap the benefits of the full-text indexing and search functionality of Vista, which is worth a new block.

Monday, June 26, 2006

Ajax at last

I have always felt that HTML and more recently XML/XSL were a step back from object oriented programming and rich user interfaces.

In the mid-90s, we had very neat C++ user interface (UI) frameworks implementing the model-view-controller (MVC) pattern, like the Microsoft Foundation Classes (MFC). With these frameworks we could build 100% object oriented applications that were:
  • Easy to architect, design, develop, test and maintain especially because the tools were very mature;

  • Rich in UI features like complex controls, dynamic data exchange, object linking and embedding, drag and drop, or notifications that update the interface when the underlying data changes.

It all went away with the advent of HTML, a rudimentary technology designed to present and link documentation pages, which was diverted from its original purpose to develop business applications, just because IT departments missed the days of mainframes, when an application had to be deployed only once on a server to be available everywhere.

Suddenly our applications became very slow. Everything had to be done with a bunch of simplistic UI controls. Data had to be round-tripped to the server to be validated or for the user interface to be updated accordingly. We had to page through long lists to find our items, with limited sorting capabilities. Merging data into Word was nearly impossible except with the old copy-paste trick. Without proper tools, architecting, designing, developing, testing and maintaining these applications was significantly more difficult.

Then came Java with Swing and the promise of an MVC framework including rich UI controls. I really thought that Java would become the Holy Grail of web interfaces. Java applets offered the best of both worlds with the promise of "write once, run anywhere". But again, we were let down; the promise was not kept, and information technology (IT) departments did not want to deploy the plug-in, with some help from Microsoft, which ditched Java after embracing it.

Not long after releasing the .NET Framework, Microsoft praised the benefits of 'smart clients', which would offer the best of web architectures and rich user interfaces, but nobody really bought into it.

Now the market reiterates its promise of a rich UI for the web with Asynchronous JavaScript and XML (Ajax). People will tell you that Ajax has been around for quite some time and that Outlook Web Access 2000 was the first application implementing Ajax, in the late 90s. Ajax is based on the XMLHttpRequest object, which has been available since Internet Explorer 5. XMLHttpRequest allows code to pass a request to the server without reposting the entire page, but Ajax is much more than that, otherwise we would have heard about it before.

To get a rich UI, you need much more than the XMLHttpRequest object; you need:
  1. Support from the industry, which includes browser manufacturers; the support for XMLHttpRequest in Mozilla and other browsers is quite recent;

  2. A choice of UI component frameworks that use XMLHttpRequest in the background to provide rich UI features, because, like me, you do not want to reinvent the wheel and develop a library of UI components yourself. Infragistics, Telerik, ComponentArt, ComponentOne and the others have only recently released their Ajax UI frameworks.

To be honest, Ajax is a jumble of technologies clumsily put together: some JavaScript code within an HTML page scripts a binary client object to post data over HTTP to a web server, where some Java or .NET code is interpreted by a virtual machine in order to retrieve or save data in a SQL database. I know Ajax is not the whole chain, but what a mess! Still, this mess has a good chance of winning the rich UI (also known as Web 2.0) battle, because it does not compromise the ease of deployment of applications. After 10 years of web architecture hegemony, it is going to be good to get back the user experience of rich user interfaces.

Sunday, June 25, 2006

Attaching databases to SQL Server 2005

Often, sample projects contain a database file (MDF) but no log file (LDF).

In this case you need to attach the database by running the following query against master in SQL Server Management Studio:

EXEC sp_attach_db @dbname = N'DBName',
@filename1 = N'D:\Working Folders\Project\App_Data\DBName.mdf'

This will not only attach the database but also create a log file.

Note: replace DBName and path as required.
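Note also that sp_attach_db is flagged for deprecation in SQL Server 2005; the replacement syntax, which is explicit about rebuilding the missing log file, is:

```sql
CREATE DATABASE DBName
ON (FILENAME = N'D:\Working Folders\Project\App_Data\DBName.mdf')
FOR ATTACH_REBUILD_LOG;
```

Both forms achieve the same result here, but the CREATE DATABASE form is the one to prefer going forward.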

Saturday, May 27, 2006

Grand opening

You blog, he/she blogs, they blog but I still don't blog.

It is now done: I have opened my blog on blogger.

I need to keep the pace so please add your comments to motivate me.