Friday, November 30, 2007

Velodoc XP Edition released as open-source

Customers have been asking us about building Velodoc functionality into their own .NET applications. This is now possible with Velodoc XP Edition, which has been released as an open-source project and published on:

Velodoc XP Edition includes:

  • ASP.NET Ajax server controls, HTTP modules and HTTP handlers which provide file upload and download functionality;
  • A sample application for sending and receiving large files;
  • A comprehensive documentation kit.

By releasing the core of the Velodoc platform as an open-source project, we expect to gather more customer intelligence and improve the software, so please provide feedback.

Tuesday, October 23, 2007

The Ajax Control Toolkit NoBot control and session state

You need to prevent hackers using robots to run dictionaries of user names and passwords against your login pages. The most common way to achieve that is the use of Captchas ("Completely Automated Public Turing test to tell Computers and Humans Apart"), which display an image of a code that you need to type into a text box. Captchas are often difficult to read and push away genuine users, especially those with disabilities. More elaborate Captchas include sound, but they are not mainstream.

The NoBot control in the Microsoft Ajax Control Toolkit can be used for any type of request and in particular to protect the sign in function of a login page:

  1. It makes sure too many requests (to sign in) are not issued from the same IP address;
  2. It provides an automated challenge response mechanism to ensure the request (to sign in) is issued by the (login) page;
  3. It enforces a delay between the time the (login) page is displayed and the request (to sign in) is issued.

The benefit of the NoBot control is that it is transparent to the user, unlike Captchas.

You implement the NoBot control on your ASP.NET page as follows:


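A minimal markup sketch (the control ID, event wiring and threshold values below are assumptions, to be adapted to your page):

```aspx
<%-- Sketch only: IDs and threshold values are assumptions --%>
<ajaxToolkit:NoBot ID="PageNoBot" runat="server"
    OnGenerateChallengeAndResponse="PageNoBot_GenerateChallengeAndResponse"
    ResponseMinimumDelaySeconds="2"
    CutoffMaximumInstances="5"
    CutoffWindowSeconds="60" />
```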
You implement the OnGenerateChallengeAndResponse event handler in code as follows:

void PageNoBot_GenerateChallengeAndResponse(object sender, AjaxControlToolkit.NoBotEventArgs e)
{
    //Generate a simple addition challenge for the browser to evaluate
    Random r = new Random();
    int iFirst = r.Next(100);
    int iSecond = r.Next(100);
    e.ChallengeScript = String.Format("eval('{0}+{1}')", iFirst, iSecond);
    e.RequiredResponse = Convert.ToString(iFirst + iSecond);
}


Then the documentation tells you to implement the click event handler of the sign in button as follows:

if (!PageNoBot.IsValid())
{
    //Display a message that a robot has been detected and the request cannot be processed
}
else
{
    //Process the postback event
}


On most sites users are redirected to the login page when sessions time out. Because the NoBot control stores the calculation in session state and session state is reset in this case, the challenge response would fail the next time the user logs in unless he/she refreshes the page.

The following solves the problem from the user perspective, but it also opens the door to hackers:

if ((!Page.Session.IsNewSession) && (!PageNoBot.IsValid()))

Apart from redesigning the NoBot control not to use session state but the application cache instead, I have not found a really good solution to this issue.
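As a sketch of that cache-based idea (the key scheme, the stored value and the expiry below are my assumptions, not the toolkit's actual implementation), the expected response could be kept in the application cache instead of session state:

```csharp
//Sketch only: store the expected NoBot response in the application cache
//instead of session state, so that a session timeout does not invalidate it.
//The key scheme and the 5-minute expiry are assumptions.
string key = "NoBot_" + Request.UserHostAddress;
HttpRuntime.Cache.Insert(
    key,
    Convert.ToString(iFirst + iSecond), //expected challenge response
    null,                               //no cache dependency
    DateTime.Now.AddMinutes(5),         //absolute expiry
    System.Web.Caching.Cache.NoSlidingExpiration);
```

The corresponding check would then compare the posted response against HttpRuntime.Cache[key] instead of the session entry.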

Wednesday, September 12, 2007

Nesting the Ajax Control Toolkit Accordion control and the ASP.NET Repeater control


Install ASP.NET Ajax extensions 1.0.

The ASP.NET page

Create an ASP.NET Ajax web site (see ASP.NET Ajax extensions) and reference the Ajax Control Toolkit in your project.

Add the following statement at the top of your web page:

<%@ Import Namespace="System.Collections.Generic" %> 

Then insert the following code between the form tags of your web page.

<asp:ScriptManager ID="ScriptManager" runat="server"></asp:ScriptManager>
<ajaxToolkit:Accordion ID="Accordion1" runat="server" SelectedIndex="0"
    FadeTransitions="true" FramesPerSecond="40" TransitionDuration="250"
    AutoSize="None">
    <HeaderTemplate>
        <div style="color:white;background-color:blue;cursor:pointer;">
            <%# ((KeyValuePair<String, List<File>>)(Container.DataItem)).Key %>
        </div>
    </HeaderTemplate>
    <ContentTemplate>
        <asp:Repeater ID="child" DataSource='<%# (List<File>)(((KeyValuePair<String, List<File>>)(Container.DataItem)).Value) %>' runat="server">
            <HeaderTemplate>
                <table border="0" cellpadding="0" cellspacing="5" style="width:100%">
            </HeaderTemplate>
            <ItemTemplate>
                <tr>
                    <td><%# ((File)(Container.DataItem)).id.ToString() %></td>
                    <td><%# ((File)(Container.DataItem)).Name %></td>
                    <td><%# ((File)(Container.DataItem)).Description %></td>
                    <td><%# ((File)(Container.DataItem)).Date.ToString() %></td>
                </tr>
            </ItemTemplate>
            <FooterTemplate>
                </table>
            </FooterTemplate>
        </asp:Repeater>
    </ContentTemplate>
</ajaxToolkit:Accordion>

The code-behind file

Add the following to the code-behind file of your web page:

protected class File
{
    public Guid id;
    public string Name;
    public string Description;
    public DateTime Date;
    public File()
    {
        id = Guid.NewGuid();
        Date = DateTime.Now;
    }
}

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        const string C = "Category {0}";
        const string N = "File {0}";
        const string D = "Description {0}";
        Dictionary<String, List<File>> dicCategories = new Dictionary<String, List<File>>();
        for (int i = 0; i < 5; i++)
        {
            List<File> objList = new List<File>();
            for (int j = 0; j < 10; j++)
            {
                File objFile = new File();
                objFile.Name = String.Format(N, j);
                objFile.Description = String.Format(D, j);
                objList.Add(objFile); //add the file to the category list
            }
            dicCategories.Add(String.Format(C, i), objList);
        }
        Accordion1.DataSource = dicCategories;
        Accordion1.DataBind(); //required for the data-bound templates to render
    }
}

Friday, August 31, 2007

Getting Velodoc notifications out of junk email folders

Some Velodoc users have complained that they did not receive our email notifications.

We have looked into our application logs and these notifications have definitely been sent. Looking more closely at our SMTP server logs, we have found some entries similar to the following:

2007-08-28 09:07:16 OutboundConnectionResponse SMTPSVC1 SV1 - 25 - - 550+Your+e-mail+was+rejected+for+policy+reasons+on+this+gateway.++Reasons+for+rejection+may+be+related+to+content+with+spam-like+characteristics+or+IP/domain+reputation+problems.++If+you+are+not+an+e-mail/network+admin+please+contact+your+E-mail/Internet+Service+Provider+for+help.++For+e-mail+delivery+information,+please+go+to+ 0 0 355 0 578 SMTP - - - -

I have confirmed myself that notifications sent to Live Hotmail addresses were not received, even in the junk e-mail folder.

Although I was pretty sure about the result, I have checked that it was not recorded in any spam database, using

Going to as suggested, I have read about SPF and Sender ID. I already knew about them, but our DNS servers are hosted by Network Solutions, which does not support SPF records. They know their stuff, don't they? I have even contacted their support team and have been told that they get little demand for it.

Enquiring further, we have realized that more and more e-mail servers implement SPF/Sender ID. Some are now considering DKIM/DomainKeys to sign emails. So we have taken the decision to move our DNS to DNS Made Easy and to add SPF records to our domains. Note that we could have used UltraDNS or EasyDNS, which offer similar services.
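For reference, an SPF policy is published as a DNS TXT record on the domain; a minimal example (the domain, address and mechanisms below are placeholders, not our actual record) looks like:

```
yourdomain.tld.  IN  TXT  "v=spf1 a mx ip4:192.0.2.10 -all"
```

This authorizes the domain's A and MX hosts plus one listed IP address to send mail, and asks receivers to reject everything else.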

We have used to identify other DNS issues. We were only missing a PTR record pointing to Because each of our web servers has its own SMTP server, we have changed the MX record to which already has a PTR record and changed the HELO greeting of the SMTP servers to display the correct fully qualified domain name. Live Hotmail users now receive our email notifications.

In the near future we will be looking at DomainKeys, but contrary to Sender ID, DomainKeys have an impact on the infrastructure because they require that we change our SMTP Servers. If you have some experience with DKIM/DomainKeys and can recommend SMTP servers that implement email authentication, please leave a comment.

Wednesday, August 29, 2007

Turning the Velodoc Flash applet into a Yahoo widget

Yahoo widget documentation is available at

Yahoo widgets cannot host the browser to display Flash applets. All Yahoo widgets based on Flash require a third-party component called WebBrowser4Widgets available at Installing a Yahoo widget which uses this component is a very poor user experience: several security warnings are displayed, which gives users too many reasons not to install the widget. Check samples at

For this reason, we have postponed packaging the Velodoc applet as a Yahoo widget until Yahoo widgets implement a native mechanism to host Flash applets. Considering the success of video sites like YouTube, I cannot think of any reason why they should not do it very soon.

Tuesday, August 21, 2007

Adobe Captivate or Techsmith Camtasia?

Although most of my friends seem to prefer Camtasia, I have been a long-time fan of RoboDemo, now Captivate.

The main difference between the two products is:

  • Camtasia records a full-motion video;
  • Captivate records still images, keyboard strokes and mouse movements which it assembles into a video;

The two approaches have their own strengths and weaknesses:

  • Camtasia is better at recording and displaying scrolling panes, progress bars and drag-and-drop operations;
  • Captivate gives a lot more control on the output. In fact, Captivate produces not only demonstrations but also true e-learning content.

The new Captivate 3.0 closes the gap and introduces full-motion recording, not as a replacement for still images but in addition to them. Captivate is clever enough to trigger full-motion recording when it detects a scrolling pane or a drag-and-drop operation. In any case, you can trigger it manually at any time.

In my opinion, Captivate 3.0 is now far superior to Camtasia on all grounds except price. The Captivate product remains significantly more expensive, but the price is now justified.

Friday, August 17, 2007

Turning the Velodoc Flash applet into a Vista Sidebar gadget


Step-by-step instructions on how to develop a sidebar gadget are available at:

Reference documentation is available at


I always try to start this type of RAD development from an existing example. The sample that you build in is a good starting point.

In our final gadget, the HTML is simply a div which is filled by JavaScript code when the gadget loads.

The recommended approach to detect a 64-bit platform is to use the System.Machine.processorArchitecture property, but this always returns [object Error] on my 64-bit DELL Precision M65. So I have used System.Environment.getEnvironmentVariable("PROCESSOR_ARCHITECTURE") instead, which seems to work consistently on the few Vista computers that I have tested.

There is apparently no function in the framework to check if the network connection is online, so I have used the same approach used in the Google Desktop Gadget.

I have only experienced one hitch in the implementation: the Flash applet catches mouse events before the Sidebar. As a consequence, the gadget toolbar at the top right of the gadget is never displayed, which prevents the user from closing or moving the gadget. The workaround I have found is to add an image at the top of the Flash applet, so that when the mouse moves over this image, the gadget toolbar is displayed.


You can package your sidebar gadgets as .zip files or .cab files renamed into .gadget files. I have used the following batch file to produce a proper package from sources located in "D:\Documents\Velodoc Sidebar Gadget\Sources". The batch file should be located in the parent directory:

rem ** remove/create a test gadget folder
rd "%LOCALAPPDATA%\Microsoft\Windows Sidebar\Gadgets\Velodoc.gadget\" /s /q
md "%LOCALAPPDATA%\Microsoft\Windows Sidebar\Gadgets\Velodoc.gadget\"
rem ** copy all of the files into test area
xcopy .\Sources "%LOCALAPPDATA%\Microsoft\Windows Sidebar\Gadgets\Velodoc.gadget\" /y /s /q /EXCLUDE:exclude.txt
cd "%LOCALAPPDATA%\Microsoft\Windows Sidebar\Gadgets\Velodoc.gadget\"
"%VS80COMNTOOLS%Bin\cabarc.exe" -r -p n "D:\Documents\Velodoc Sidebar Gadget\Velodoc.gadget" *

Test and debugging

Testing and debugging are relatively easy, considering the logic is in the Flash applet. Simply double-click the gadget file to install and test. There is no need to uninstall before a new install, provided the gadget is removed from the Sidebar.


Once your gadget is ready, follow the instructions at to publish it on the Live web site.

Thursday, August 16, 2007

Turning the Velodoc Flash applet into a Google Desktop Gadget


The Google Desktop SDK can be found at The documentation is rudimentary, but there are several samples worth exploring. Besides, gadgets are packaged in .gg files which can be renamed to .zip files, so that the content can be explored.

A lot of information can also be found in the forums at


My initial track was to run the Flash applet in the Sidebar as explained in My gadget was very simple: a manifest, a localized string.xml file, the main.xml file represented below and a couple of images.

<view height="240" width="240">
  <div id="flashcontainer" style="text-align: center;"></div>
  <object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="240" height="240">
    <param name="movie" value="" />
    <param name="wmode" value="transparent" />
    <param name="allowScriptAccess" value="sameDomain" />
    <param name="quality" value="high" />
    <param name="bgcolor" value="#000000" />
  </object>
</view>

In particular, there was no Javascript file in this implementation.

This track is a dead end because there is too much interactivity in the Flash applet and Google Desktop seems to get the events before the applet code gets an opportunity to handle them, which results in an unresponsive applet. Accordingly, the only way is to run the applet in a details view.

The best example to start from is HtmlDetailsView, which is part of the SDK. Do not forget to change the guid in the manifest, otherwise you won't be able to submit your project to Google at the end. The code is pretty straightforward, so please download the gadget at, rename it to a .zip file and explore.

The only issue I have faced is the use of the property to detect the online/offline status. After simply disconnecting my network cable, the value returned was still true. So I have created an isOnline() function which uses a synchronous XMLHttpRequest of type HEAD to confirm access to the Flash applet. I did not feel that an asynchronous request was justified in this case. For more information see and

Test and debugging

Testing is easy and does not require packaging the gadget. Double-click the gadget.manifest file and the gadget is installed in the sidebar.

Packaging and Distribution

To package your applet, simply archive all the files in a zip file, keeping the directory structure so that the manifest is at the root of the archive. Then rename the .zip extension to .gg and test. You can install your packaged gadget by double-clicking the .gg file.

If you upload your gadget to an IIS 6 server, you will get a 404 error when downloading it, because by default IIS blocks files which have no known MIME type. See: To be able to download your gadget from an IIS server, you need to create a MIME type for the .gg extension; you can map it to "application/octet-stream".

Then the last step is to submit your gadget at to get some visibility.

Finally, please note that I have also opened a discussion thread regarding this gadget at

Wednesday, August 15, 2007

Developing a file upload applet for Velodoc with Flash

This article follows a recent article entitled Designing gadgets and widgets for uploading files to Memba Velodoc.

ActionScript 2 and the Flash 8 IDE

This project was my first trial at Flash and in my opinion, Flash 8 presents two main difficulties for .NET developers:

  1. The IDE offers rudimentary development and debugging features;
  2. Scope and addressing of objects (relative or absolute, using _parent or this) is not always clear. Sometimes relative addressing won't work, and this is fixed with absolute addressing. The rule seems to be to use relative addressing unless it does not work.

Apart from that, Flash is not a great development tool and I would not consider large projects in Flash, but it does a nice job for small applets.

Flash security

As always, sandboxed security is not intuitive and error messages are unfriendly when you get them, so be prepared to spend a significant amount of development time on security. Most issues are explained at:

You can also get information about the IE content activation issue at:

The user interface logic

I have opted for a form application with the following forms:

  • The application form contains the background elements and the action script.
  • The outbox form contains a To, Subject and Message text inputs and a "Select file" button. It also contains a label to display the file name and a Send button.
  • The progress form contains a progress bar and a Cancel button
  • The settings form allows you to define your settings.
  • The success form confirms when a use case completed ok.
  • The error form displays an error message when needed.

Except for a couple of behaviours, all our script is written in Frame 1 of the application form. Navigation between forms is easy to build and does not raise any major issues.

The file upload process

The file upload process using FileReference is simple and fairly well documented in the following documentation:

Nevertheless, two tricks need to be pointed out:

First, FileReference posts a 0 byte request to the designated URL before posting the file. You need to make sure that your server code does not store a 0 byte file in this case.
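As a hedged sketch of such a guard in an ASP.NET HTTP handler (the handler shape and the "Filedata" field name are assumptions about your server code):

```csharp
//Sketch: ignore the 0-byte probe request that FileReference sends
//before posting the actual file ("Filedata" is an assumed field name)
public void ProcessRequest(HttpContext context)
{
    HttpPostedFile file = context.Request.Files["Filedata"];
    if (file == null || file.ContentLength == 0)
    {
        return; //do not store an empty file for the probe request
    }
    //...store the file as usual...
}
```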

Second, FileReference emits an invalid multipart request, and our server logic got caught out. At the end of the request, you will find something like (the boundary shown here is a placeholder):

------------boundary
Content-Disposition: form-data; name="Upload"


Submit Query
------------boundary

which should have been:

------------boundary
Content-Disposition: form-data; name="Upload"

Submit Query
------------boundary--

The differences are an additional \r\n between the form-data header and the value, and a missing double hyphen at the end of the request. Note that this happens when the applet runs in the Flash environment; it does not happen when the applet runs as a gadget within the Vista Sidebar, but in that case there is an additional random character at the end. We have updated our server code to handle these specificities.

Calling web services

There are three ways to call web services in Flash:

  1. Remoting is the most flexible, most complicated way. It is the low level stuff.
  2. The web service component is a data binding control which does not require any programming. You get it to work just by setting properties in dialogs.
  3. Finally, the* classes are high-level classes that make calling web services from ActionScript really easy.

Our requirements are not terribly complex, so the* classes have been the way to go. There is only one trick, which I have not been able to find either on the web or in the Flash documentation: how to pass complex objects as parameters.

Styling our Flash gadget

Styling the Flash applet certainly represents the largest amount of code and time spent. We needed gradients and bevel effects that would resize. To achieve that, I have used the Flash drawing APIs described at and the skinning techniques described at

I have decided to postpone changing colours and localization to a future version.

Debugging and testing

For debugging and testing, you will find the following tools helpful:

The applet works fine in Internet Explorer. There is a bug which we have experienced in Firefox: you get a US keyboard layout if you define wmode="transparent". This is documented at but seems to have been corrected. Other issues that you may experience are documented at


The result is available at

If you want to participate and improve this applet, please contact me. I'll be very happy to share the code with active contributors.

Next, I'll explain how to turn this applet into gadgets and widgets for all major platforms.

Monday, August 13, 2007

The future of Internet TV

Following aerial TV and cable TV, the next revolution is Internet TV or IPTV, which makes the cost of distributing TV programs even lower, allowing for much more targeted and specialized content. There are currently five types of actors that have a chance to play a major role in this exciting revolution:

  1. The major TV broadcasters
  2. The major telecom providers
  3. The online merchants including iTunes
  4. The video web sites including YouTube
  5. The peer-to-peer newcomers including Joost and Babelgum

The following criteria will make the difference between the platforms and eventually specialize them:

  1. Capacity to attract advertising dollars, which is a mix of size and segmentation of the audience
  2. Capacity to attract paid subscriptions, which is related to the quality and exclusivity of the content and ultimately to the remuneration and protection of content producers
  3. Capacity to display hi-res content, which is mainly a technical issue
  4. Ease of use, especially to search for or subscribe to archived content

Let’s now review the various platforms and trends:

(The original table also rated each platform on Ad Dollars and Paid Subs.; those ratings are not shown.)

Platform                | Weak point             | Strong point                       | Best fit
Major TV Broadcasters   | Transition to Internet | Quality content and revenue stream | News and other short-life content
Major telecom providers | -                      | -                                  | Mobile phones
Online Merchants        | Purchasing process     | Hi-Res content                     | -
Video web sites         | Streaming bandwidth    | Navigate and search                | Short low-res content
P2P newcomers           | Works on PC + Content  | Hi-Res content                     | -

As more platforms become available, the split between content producers and broadcasters will become more obvious. In this respect, I do not think that the major telecom providers, online merchants, video web sites and P2P newcomers will ever produce content; rather, they will enter into agreements to obtain content. Meanwhile, the major TV broadcasters, which both produce and broadcast content, will enter into "co-opetition" agreements, for example to broadcast their programs on mobile phones.

Telecom providers are probably the biggest threat to TV broadcasters as many content producers will see in them a new channel for their content. This is probably the main reason for Sky to offer broadband and telephone or for Microsoft to target MediaRoom at service providers.

Considering the purchasing process, online merchants will probably focus on music and films for quite some time and do not represent a significant threat for TV broadcasters and telecom providers.

I cannot imagine video web sites getting enough money from advertising to sustain their activity: they have to remain free for users, and they will not be able to compete on high-quality content with TV broadcasters and telecom providers. In my opinion, their only option to survive is to get subscriptions from companies wanting their own channel, for example:

  • A BMW channel where BMW would present educative content regarding its range of cars;
  • A L’OREAL channel where L’OREAL would give advice to women how to make the most of their makeup using its products;
  • A DANONE channel where a chef would give recipes using DANONE products;
  • A NIKE channel which would give training advice for running the marathon.

The video web site could even provide links to purchase the products demonstrated, competing with video shopping channels.

P2P newcomers will have to review their business model. They are currently in a vicious circle:

  • they do not remunerate content producers because they are free and they do not have enough advertising;
  • they do not have enough advertising because they do not have enough quality content attracting users;
  • finally, they do not have enough quality content because they do not remunerate content producers.

P2P newcomers definitely need some free content to let users evaluate their technology and build audience numbers, but their vocation is to be a video encyclopedia capable of making available not only the blockbusters but, more importantly, confidential content like documentaries and TV archives. They won't be able to achieve that without user subscriptions and payments to content providers.

Monday, July 30, 2007

Developing a quick and dirty bulk email infrastructure to clean your lists

If you want to send an email newsletter, I can only recommend you use:

In our research for the ideal tool, we have found that services are weak on editing and layout and strong on reporting, whereas software is strong on editing and weak on opt-in/opt-out use cases and cleaning contact lists. There is no perfect tool. In my opinion, email marketing services have an advantage because there is a workaround to their weaknesses, which is to compose the newsletter in an HTML editing tool like Dreamweaver.

You may choose not to use such software and services, either to spare your marketing dollars or because you need specific features that these tools do not provide. In this case you will need to assess the following requirements, and you will find that developing a bulk email infrastructure is no easy task:

  • Editing and layout
  • Contact list management
  • Mail merge
  • Bulk send performances
  • Opt-in/opt-out use cases
  • Cleaning contact list (email bounces)
  • Reporting

In our case, we had to find a way to clean our contact list before considering a subscription to an email marketing service because their pricing is per contact and some contacts in our list were fairly old. To achieve that, we have spent a couple of days building a quick and dirty bulk email infrastructure comprising:

  1. An SMTP server with two mailboxes, newsletter@domain.tld and dsn@domain.tld, where the first one has an autoresponder;
  2. A bulk email tool including management of bouncing emails, which is only a GUI around devMail;
  3. An opt-in/opt-out web page;
  4. Integration with Google Analytics for reporting (not in the sample).

The process to build, send and analyse a newsletter is the following:

  1. Design the newsletter in Dreamweaver (use %%dbfield%% for mail merge)
  2. Check the newsletter against the SPAM checker of iContact using a trial account
  3. Open the newsletter in IE and save as web archive (*.MHT) with embedded images
  4. Open the MHT file in the bulk email tool (configured to use a specific database and SMTP server)
  5. Build the text version of the HTML newsletter
  6. Click send
  7. A few days later, open the tool and click Analyze to interpret delivery status notifications and tag bouncing emails
  8. Open the mailbox in Outlook to handle manually the notifications which could not be interpreted

You can download the source code here. This tool does not measure up to the software and services mentioned above, but it offers a convenient way to purge a contact list before subscribing to an email marketing service.

Friday, July 06, 2007

Schedule your backup transfers

This is a follow-up of my recent article entitled "Automated SQL Server backups".

After automating SQL Server backups on a remote server, you will probably want to transfer them to an FTP server, and you will want this done automatically every day.

Using FTP

Windows command-line FTP.exe has the ability to use scripts, as in FTP -s:"C:\backup.ftp", where backup.ftp is a text file containing a series of FTP commands:

open ftp.yourdomain.tld
cd /backup
put "C:\Program Files\Microsoft SQL Server\MSSQL\BACKUP\DBName Mondays.bak"
bye

Accordingly, you can schedule the following VB script to execute daily using Windows Task Scheduler:

Dim ftp, dir1, dir2, s, d, bak, fso, sf, shell, cmd
ftp = "ftp.exe" '-- Windows command-line FTP client
dir1 = "C:\Program Files\Microsoft SQL Server\MSSQL\BACKUP\" '-- Local dir
dir2 = "/backup" '-- FTP dir (do not include / at the end)
s = "backup.ftp"
d = Weekday(Date) '-- 1 = Sunday, 2 = Monday, ... 7 = Saturday
select case d
case 1
bak="DBName Sundays.bak"
case 2
bak="DBName Mondays.bak"
case 3
bak="DBName Tuesdays.bak"
case 4
bak="DBName Wednesdays.bak"
case 5
bak="DBName Thursdays.bak"
case 6
bak="DBName Fridays.bak"
case else
bak="DBName Saturdays.bak"
end select

Set fso = CreateObject("Scripting.FileSystemObject")
Set sf = fso.CreateTextFile(dir1 + s, True)
sf.WriteLine("open ftp.yourdomain.tld")
sf.WriteLine("cd " + dir2)
sf.WriteLine("put " + chr(34) + dir1 + bak + chr(34))
sf.WriteLine("bye")
sf.Close '-- flush the FTP script to disk before running it

Set shell = CreateObject("WScript.Shell")
cmd = chr(34) + ftp + chr(34) + " -s:" + chr(34) + dir1 + s + chr(34)
shell.Run cmd, 8, true

Workaround when Passive FTP is required

When running the command above, you may get the following FTP error "425 – Could not open data connection to port 2512: connection refused" depending on the infrastructure.

There is a good chance that you need passive FTP. Contrary to what some people say, Windows command-line FTP is capable of passive mode, but the "literal PASV" command returns a port number which has to be opened using "literal PORT", and this cannot be easily scripted.

The workaround is to use ncftp software and the following VB script instead of the script above:

Dim ftp, dir1, dir2, d, bak, shell, cmd
ftp = "C:\Program Files\NcFTP\ncftpput.exe"
dir1 = "C:\Program Files\Microsoft SQL Server\MSSQL\BACKUP\" '-- Local dir
dir2 = "/backup" '-- FTP dir (do not include / at the end)
d = Weekday(Date) '-- 1 = Sunday, 2 = Monday, ... 7 = Saturday
select case d
case 1
bak="DBName Sundays.bak"
case 2
bak="DBName Mondays.bak"
case 3
bak="DBName Tuesdays.bak"
case 4
bak="DBName Wednesdays.bak"
case 5
bak="DBName Thursdays.bak"
case 6
bak="DBName Fridays.bak"
case else
bak="DBName Saturdays.bak"
end select

Set shell = CreateObject("WScript.Shell")
cmd = chr(34) + ftp + chr(34) + " -u username -p password ftp.yourdomain.tld " + dir2 + " " + chr(34) + dir1 + bak + chr(34)
shell.Run cmd, 8, true

Downloading backups from your FTP server to your workstation or a local server

A similar approach can be used to schedule backup downloads from the FTP server to your workstation or a local server which has tape backup.

Dim ftp, dir1, dir2, shell, cmd
ftp = "C:\Program Files\NcFTP\ncftpget.exe"
dir1 = "/backup/" '-- FTP dir
dir2 = "C:\BACKUP" '-- Local dir (do not include \ at the end)
Dim bakArray(6)
'-- One backup file per day of the week, starting with Sundays
bakArray(0) = "DBName Sundays.bak"
bakArray(1) = "DBName Mondays.bak"
bakArray(2) = "DBName Tuesdays.bak"
bakArray(3) = "DBName Wednesdays.bak"
bakArray(4) = "DBName Thursdays.bak"
bakArray(5) = "DBName Fridays.bak"
bakArray(6) = "DBName Saturdays.bak"

Set shell = CreateObject("WScript.Shell")
cmd = chr(34) + ftp + chr(34) + " -u username -p password ftp.yourdomain.tld " + chr(34) + dir2 + chr(34) + " "

For Each bak In bakArray
    'ncftpget is sufficiently intelligent to only download newer files
    cmd = cmd + chr(34) + dir1 + bak + chr(34) + " "
Next
cmd = Trim(cmd)
shell.Run cmd, 8, true

What is next?

The above can be improved one step further by launching the script directly from SQL Server jobs, using an Execute T-SQL Statement Task which would generate and call the script with master.dbo.xp_cmdshell. This would secure the FTP credentials in SQL Server and launch the transfer immediately at the end of a backup.
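As a sketch of that idea (the script path is hypothetical, and xp_cmdshell must be enabled on the server):

```sql
-- Hypothetical job step: launch the FTP transfer script at the end of a backup
EXEC master.dbo.xp_cmdshell 'cscript //nologo "C:\Scripts\TransferBackup.vbs"';
```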

Automated SQL Server Backups

SQL Server 2000

  1. Open SQL Server Enterprise Manager
  2. Register your server
  3. Right-click the node Management > Backup, then select menu Add backup devices... We create devices "DBName Mondays.bak" to "DBName Sundays.bak" corresponding to each day of the week but you can use your own naming convention. We have assumed that DBName is the name of the database to backup.
  4. Right-click the node Databases > DBName, then select menu option All Tasks > Backup database...
  5. The SQL Server Backup – DBName dialog box is displayed.
  6. On the General tab, rename the backup "DBName Mondays", click the Remove button if an item appears in the backup to list, then click Add...
  7. In the Select Backup Destination dialog box, select Backup Device and "DBName Mondays", then click OK.
  8. Make sure Database - complete and Overwrite existing media are selected.
  9. Check schedule and click ... to edit. Rename the schedule "DBName Mondays" and schedule to occur weekly on Mondays. Click OK.
  10. On the Options tab, check Verify backup upon completion and Remove inactive entries from transaction log. Keep other options unchecked.
  11. Go through steps 4 to 10 with "DBName Tuesdays" to "DBName Sundays".
  12. Go to the Management > SQL Server Agent > Jobs node, refresh the list and check the 7 backup jobs which you have just created.

SQL Server 2005

This is slightly more complicated in SQL Server 2005 because you cannot use the backup dialog which has no scheduling option. You need to use a maintenance plan. A maintenance plan is an SSIS package, which means you either need SSIS or SP2 installed, because SP2 includes a limited version of SSIS for maintenance tasks including backups.

  1. Open SQL Server Management Studio (Run as Administrator on Windows Vista).
  2. Right-click the node Server Objects > Backup Devices and select menu "Add Backup Device...".
  3. Name your backup device "DBName Mondays" and make it a file named "DBName Mondays.bak".
  4. Repeat steps 2 and 3 for "DBName Tuesdays" to "DBName Sundays".
  5. Start and configure SQL Server Agent which is a requirement for maintenance plans.
  6. Right click the node Management > Maintenance Plans and select menu "New Maintenance Plan..."
  7. Rename your plan "DBName Plan".
  8. Double-click "Subplan_1" to edit its properties.
  9. Name it "DBName Mondays" and create a schedule of the same name recurring weekly every Monday.
  10. Drop a "Back Up Database Task" from the toolbox onto the light yellow design surface.
  11. Right-click the task on the design surface and select menu "Edit...".
  12. Select the database(s) you want to back up, check option "Back up databases across one or more files", then click button "Add..." and select the backup device called DBName Mondays.
  13. Select "Overwrite" if backup file exists, check "Verify backup integrity" and click OK.
  14. At the top of the list of subplans of "DBName Plan", click "Add Subplan".
  15. Go through steps 9 to 14 with "DBName Tuesdays" to "DBName Sundays".
  16. Go to the SQL Server Agent > Jobs node, refresh the list and check the 7 backup jobs which you have just created.

For both SQL Server 2000 and SQL Server 2005, you can consider adding tasks to shrink databases, rebuild indexes and check database integrity in the scheduled jobs, but this goes beyond the scope of this article. In a following article, we will show you how to schedule FTP transfers to push and/or pull the backup files.
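For reference, each scheduled job created above ultimately boils down to T-SQL similar to the following. The database and device names are taken from the steps above, and the extra maintenance tasks just mentioned have straightforward T-SQL equivalents too:

```sql
-- Equivalent of the "DBName Mondays" job created through the GUI.
BACKUP DATABASE [DBName]
TO [DBName Mondays]
WITH INIT;                                 -- overwrite existing media
RESTORE VERIFYONLY FROM [DBName Mondays];  -- verify backup upon completion

-- Examples of the additional maintenance tasks mentioned above:
DBCC CHECKDB ([DBName]);                   -- check database integrity
DBCC SHRINKDATABASE ([DBName]);            -- shrink database
```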

Wednesday, June 27, 2007

Getting metrics from .NET projects

There are loads of free tools available to get metrics from Java projects. I have found it much more difficult to find similar tools for .NET projects.

When googling for ".NET project metrics" you get many references to devMetrics from a company called Anticipating Minds. I will spare you the time: Anticipating Minds is no longer in business and devMetrics does not work with .NET Framework 2.0 and above.

I have used two complementary tools which give simple but effective metrics:

  1. A Reflector add-in called CodeMetrics, and
  2. Source Monitor from Campwood Software.

Friday, June 22, 2007

The ad war from a user perspective

The ad war between Google, Microsoft and Yahoo is on. This is a summary of what you have probably read in the press:

  • Yahoo Search Marketing (formerly Overture) has been losing market share for a couple of years and struggles to deploy its new advertising platform, code-named Panama.
  • Microsoft’s advertising strategy has always been confused until recently. They have now decided to go into the advertising space and they want to make it big. See how much they have spent to recently acquire aQuantive. See the pace of upgrades made to AdCenter. See also Microsoft’s track record of turning a late arrival in a competitive market into a great success: Internet Explorer, Windows Mobile, Xbox.
  • Google is the leader with a consistent strategy and a great advertising platform, which not only includes AdWords and its important counterpart AdSense, but also free services like Blogger or Gmail where advertising is leveraged. Finally they have two critical complementary tools which give them a competitive edge, Google Analytics and Google Checkout.

It is difficult to measure how much the advertising platform weighs in the success or failure of its owner. I like to believe that it is a large part of it. I am a user of Yahoo Search Marketing, Microsoft AdCenter and Google’s suite including AdWords, AdSense, Analytics, and Blogger and I report here my experience:

  • Yahoo Search Marketing’s platform is simply a pain to use. Registering is overly complicated with issues regarding restrictions on billing address and currency in relation to the target market. Note that only Yahoo proposes a service fee to help you get started. Vocabulary is confusing but it has been corrected in the new platform. And the worst design issue is certainly the concept of binding an account to a national market, which remains in the new platform. In other words, if you advertise in the US and in 5 European countries, you need 6 different accounts and there is no way you can get a single view of your advertising spend with Yahoo. In my opinion, there is urgency for Yahoo to correct this if they want to survive in the advertising space.
  • Microsoft AdCenter is fairly new and gets improved regularly. It is certainly more rigid than Google: for example, an ad has a culture, which is a combination of language and country. Accordingly, if you want the same ad to be displayed in the US and in the UK, you need to duplicate it. Google is better in this respect, but I think AdCenter will do a reasonable job after a few revisions. The challenge for Microsoft is to build synergies with other tools in a reasonable time: they definitely need the equivalent of Google AdSense and Google Analytics, and they also need to offer users who contribute content on their platform, including Live Spaces, the ability to generate revenue using their AdSense equivalent.
  • Google is, I think, two years ahead of the competition, and their recent acquisition of DoubleClick has given them more comfort against Microsoft. I have very few complaints about their platform apart from the inability to change from credit card to bank account, the inability to get a bank account automatically debited, and the incompatibility of AdSense with SSL. Google Analytics is absolutely a must-have and I can’t wait for Google Checkout to be available in continental Europe.

Thursday, June 21, 2007

Getting scheduled scans to work with Norton Antivirus 10.2 running in unmanaged mode

We use Norton Antivirus (NAV) Corporate Edition which is installed in managed mode on our LAN.

We have been using an old version of NAV on dedicated hosted web servers. Recently we have realized that NAV was triggering ThreadAbortException on long running ASP.NET pages. Excluding files did not solve the problem. We have had to upgrade NAV.

We have installed NAV in unmanaged mode on these servers as per the documentation, but after a couple of days, we have realized that scheduled scans were not working. The reason is explained here.

The workaround which is not given in the Symantec knowledge base is simply to copy (export + import) the following registry keys (and all subkeys) from a computer which has NAV in managed mode to the computer which has NAV in unmanaged mode:
  • HKEY_LOCAL_MACHINE\SOFTWARE\INTEL\LANDesk\VirusProtect6\CurrentVersion\LocalScans\ClientServerScheduledScan_1
  • ...
  • HKEY_LOCAL_MACHINE\SOFTWARE\INTEL\LANDesk\VirusProtect6\CurrentVersion\LocalScans\ClientServerScheduledScan_n

Thursday, March 29, 2007

Ajax extensions services always report “There was an error processing the request.”

I have been puzzled by the following for quite a while.

I use Microsoft Ajax extensions 1.0 to query web services which may raise exceptions. In my development environment, I would always get a nice localized error message but on the production server, the same code would produce a generic “There was an error processing the request”.

After having spent enough time stepping through my code with a debugger, I have decided to look at the Ajax extensions code and I have found the following piece of code in WriteExceptionJsonString of RestHandler.cs.

if (context.IsCustomErrorEnabled) {
    writer.Write(JavaScriptSerializer.SerializeInternal(new WebServiceError(
        AtlasWeb.WebService_Error, String.Empty, String.Empty)));
}
else {
    writer.Write(JavaScriptSerializer.SerializeInternal(new WebServiceError(
        ex.Message, ex.StackTrace, ex.GetType().FullName)));
}

where WebService_Error is a resource whose value is “There was an error processing the request.”

This means that if you have enabled custom errors in your web.config on your production environment like I did, your Ajax calls will always report a generic error.
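In other words, a typical production configuration like the following is enough to trigger the generic message for every Ajax error (the redirect page is just a placeholder):

```xml
<!-- web.config: with customErrors on (or RemoteOnly, as seen from a remote
     client), context.IsCustomErrorEnabled is true and Ajax web service
     calls only ever report the generic error message. -->
<configuration>
  <system.web>
    <customErrors mode="On" defaultRedirect="~/Error.aspx" />
  </system.web>
</configuration>
```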

Why Microsoft introduced such a restriction is a mystery to me.

Friday, March 16, 2007

Extender controls may not be registered before PreRender

Today, I added new AjaxControlToolkit controls to Velodoc pages and got the exception “Extender controls may not be registered before PreRender” when running a page.

All our pages derive from our own WebPage class, which derives from the standard Page class. WebPage provides features like custom error handling and QueryString parsing into page properties.

The page where we had the new extender control had the following method:

protected override void OnPreRender(EventArgs e)
{
    //The following line is required, otherwise you get "Extender controls
    //may not be registered before PreRender."

    //Some code that displays errors on postbacks
}

The solution to the problem above is to add base.OnPreRender(e); at the beginning of the method.
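The corrected override therefore looks like this (a minimal sketch; the error-display code stands for whatever your page does on postback):

```csharp
protected override void OnPreRender(EventArgs e)
{
    // Calling the base implementation first lets ASP.NET register the
    // extender controls; without it you get "Extender controls may not
    // be registered before PreRender."
    base.OnPreRender(e);

    // Some code that displays errors on postbacks
}
```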

Wednesday, January 31, 2007

Functional and load testing of ASP.NET Ajax applications – part I


Velodoc is developed in ASP.NET C# 2.0 with Visual Studio .NET 2005.

VSTS comes with unit testing, web testing and load testing, where load testing is actually an execution environment for unit tests and web tests.

Unfortunately Visual Studio web tests work at the protocol level, recording HTTP traffic in order to replay it. The same applies to Redgate’s ANTSLoad and many other load testing tools. Velodoc is an Ajax application, so this approach does not work.

We have started our search for a solution from the following lists of testing tools:

Obviously there are tools like Mercury Winrunner/Loadrunner and the IBM Rational equivalents which certainly cope but they are too expensive.

Our requirements are the following:

  • Test ASP.NET applications
  • Works on Windows XP with IE
  • Compatible with NetAdvantage controls, iFrames, file uploads and Ajax
  • Scripts IE including IE dialogs to drive functional UI tests
  • Several instances can be executed concurrently to create load/stress tests
  • Uses a familiar technology (low learning curve)
  • Open source is a plus and a requirement if the supplier has not been around for a long time.

There are three types of solutions which cope more or less with these requirements:

  • Macro recorders/players
  • Test environments
  • Web application scripting frameworks

Macro recorders/players and test environments

Searching on Tucows and other shareware web sites reveals loads of macro recorders. Most of them hook the message pump and replay the windows messages. Obviously, the result is not good. We have tried a dozen of them but only iOpus iMacros 5.2 could do a decent job at automating a file upload from the Velodoc home page. The code is reproduced below:

SIZE X=876 Y=627
TAG POS=1 TYPE=TEXTAREA FORM=NAME:aspnetForm ATTR=ID:MessageTextBox CONTENT=Test<SP>with<SP>iMacro
WINCLICK X=100 Y=429 CONTENT=C:\test.bin

I see at least three drawbacks to using iOpus iMacros as a functional test tool:

  • The lack of assertions and reporting capabilities makes it insufficient for functional testing;
  • The lack of infrastructure makes it insufficient for load testing;
  • The proprietary language reduces the possibilities despite the fact that it can call external scripts or be called via a COM component.

I have also tried several test environments mentioned in the above lists. Generally, recording the file upload test on the Velodoc home page has always proved difficult, but I cannot tell whether it was feasible or not for some of them. I have limited myself to 2 hours per evaluation, and the inherent complexity and learning curve was not worth digging past first impressions.

Web application scripting frameworks

I have looked at two open-source frameworks, which both script IE to execute functional tests on the web application:

  • IEUnit 2.3, a framework based on JavaScript
  • WatiN 0.9.5, a framework based on C# .NET

Both frameworks are very easy to start with due to a familiar language and an intuitive API. Additionally both frameworks provide the necessary assertions to report on the success or failure of a test fixture.

Scripting a file upload on the Velodoc home page with IEUnit entails the following JavaScript code:

_.setTextArea("MessageTextBox", "Test with IEUnit");
var fname = "C:\\test.bin";
var cmdShell = new ActiveXObject("WScript.Shell");
cmdShell.Run("C:\\EnterFileName.sbk " + fname, 0, false);

Where EnterFileName.sbk contains:

var fpath = " " + WScript.Arguments(0); //the arguments collection is zero-based
var popupWin = _.waitForWindow("Choose file", 30000);
_.findWindow(popupWin, "Edit").sendText(fpath);
_.findWinButton(popupWin, "&Open").click();

Scripting a file upload on the Velodoc home page with WatiN entails the following C# code:

using (IE ie = new IE(/* Velodoc home page URL */))
{
    ie.TextField(Find.ById("MessageTextBox")).TypeText("Test with WatiN");
    Frame f = ie.Frame(Find.ById("FileUploadFrame"));
    FileUpload fUp = f.FileUpload(Find.ById("FileInput"));
    fUp.Set(@"C:\test.bin"); //sets the file to upload
    ie.CheckBox(Find.ById("SendTermsCheckBox")).Checked = true;
    //...click the send button, then poll until the download link appears...
    SimpleTimer t = new SimpleTimer(60 * 60);
    Span s;
    do
    {
        s = ie.Span(Find.ById("DownloadLinkLabel"));
        if (!String.IsNullOrEmpty(s.Text))
            goto EXIT;
    } while (!t.Elapsed);
    throw new WatiN.Core.Exceptions.TimeoutException("...", 60 * 60);
EXIT:
    Assert.AreEqual(true, s.Text.Contains(/* expected download link text */));
}


The beauty of WatiN is that it builds on top of C# and the .NET Framework, so you get not only a rich language but also a rich environment. Test code built with WatiN can be executed within NUnit and Visual Studio unit testing projects, which means it can also be executed for load tests. I am going to try that next, although we already know that launching several instances of IE has its own limitations.
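As a sketch of that last point, wrapping a WatiN scenario in an NUnit fixture is as simple as the following. The URL, control id and fixture name are assumptions for illustration:

```csharp
using System;
using NUnit.Framework;
using WatiN.Core;

[TestFixture]
public class UploadTests
{
    [Test]
    public void HomePageUploadProducesDownloadLink()
    {
        // Hypothetical URL and control id, for illustration only.
        using (IE ie = new IE("http://localhost/Velodoc/Default.aspx"))
        {
            ie.TextField(Find.ById("MessageTextBox")).TypeText("Test with WatiN");
            // ...drive the upload as in the script above...
            Assert.IsFalse(String.IsNullOrEmpty(
                ie.Span(Find.ById("DownloadLinkLabel")).Text));
        }
    }
}
```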

Wednesday, January 24, 2007

Remapping the universe

Following our recent post regarding Bumptop, watch Jeff Han and Phil Davidson demonstrate in the following video how a multi-touch computer screen will change the way we work and play.

Source: YouTube

Read the full article.

Tuesday, January 16, 2007

SGen XmlSerializers

My VS2005 C# web project was perfectly working in my development environment but I got the following error after deploying in a production environment:

Configuration Error
Parser Error Message: Cannot deserialize [C:\data.xml].

Using the fusion log viewer (FUSLOGVW.EXE) reveals a binding exception to Assembly.XmlSerializers.dll, where Assembly is the name of the assembly which contains the classes serialized in data.xml.

My ASP.NET application runs as a dedicated application pool under the credentials of a windows user which has very limited rights. The problem is related to the user rights which prevent the ASP.NET process from automatically generating XmlSerializers.

Visual Studio cannot generate XmlSerializers

On the build tab of your VS project properties, there is an option called “Generate Serialization Assemblies” which you can set to “On”.

If you look at MSBuild commands in the output window, doing so actually adds a command line which looks like the following:

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\bin\sgen.exe /assembly:"Assembly.dll" /proxytypes /reference:"reference1.dll" … /reference:"referenceN.dll" /compiler:/keycontainer:VS_KEY_5EFB7881D71082EDCF85DBBFCD748B9A /compiler:/delaysign-

Note the /proxytypes option which actually prevents SGen from generating your XmlSerializers as specified in the documentation for the XML Serializer Generator Tool (Sgen.exe).

Use SGen as a post-build event

As a consequence you need to add the SGen command as a custom post-build event on the Build Events tab of your VS project properties:

"C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sgen.exe" /force /assembly:"$(TargetPath)" /compiler:/keycontainer:VS_KEY_5EFB7881D71082EDCF85DBBFCD748B9A /compiler:/delaysign-


Thursday, January 04, 2007

Unable to validate data error in relation to machine key

I have experienced the following error on a production web site:

Unable to validate data

at System.Web.Configuration.MachineKey.GetDecodedData (Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Int32& dataLength)

Searching on the web, many developers seem to solve the problem at least partially by generating a static key as described in Microsoft's knowledge base.

This is a workaround but not an actual solution to the problem, at least in my scenario where my application runs in a dedicated application pool under limited privileges as described in

The fix is to run aspnet_regiis.exe -ga DOMAIN\USER where USER is the identity of the application pool. Also make sure the user is part of the IIS_WPG group.

This command not only gives access to the IIS metabase but also creates the registry keys required for the application pool to generate machine keys.