January 21, 2012
Web Application Performance Testing using Visual Studio 2010

Visual Studio’s Web Performance Test

The primary vehicle for web application performance testing in Visual Studio is the Web Performance Test.

Visual Studio provides a nice GUI for designing Web Performance Tests, and it integrates beautifully with Internet Explorer, allowing you to record a browser session and generate a Web Performance Test from it.

Additionally, Fiddler has an export-to-Visual Studio Web Test (aka Web Performance Test) feature which is pretty good at generating .webtest files that can be brought directly into Visual Studio.

(NOTE: Fiddler’s export to Web Test does not capture every aspect of the request verbatim - i.e., you may not get all Cookies, Headers or ViewState content, but instead a reference to these items from a previous request in the test - so tread carefully here.)

While the GUI is good for designing the basic steps of your test, some advanced features can only be fully realized by switching over to a Coded Web Performance Test. Fortunately, the GUI provides a feature for exporting a GUI-based web test into a Coded Web Performance Test.

Since I am no stranger to writing code, it’s no surprise that I actually prefer to write my Web Performance Tests in code instead of tinkering around in the UI anyway.

Here’s an example of a Coded Web Performance Test generated from a GUI designed Web Performance Test: 

//------------------------------------------------------------------------------
// <auto-generated>
//     This code was generated by a tool.
//     Runtime Version:4.0.30319.235
//
//     Changes to this file may cause incorrect behavior and will be lost if
//     the code is regenerated.
// </auto-generated>
//------------------------------------------------------------------------------
 
namespace Performance.Testing.Utilities.SampleTargetScripts
{
    using System;
    using System.Collections.Generic;
    using System.Text;
    using Microsoft.VisualStudio.TestTools.WebTesting;
 
 
    public class WebTest1Coded : WebTest
    {
 
        public WebTest1Coded()
        {
            this.PreAuthenticate = true;
        }
 
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            WebTestRequest request1 = new WebTestRequest("http://www.google.com/");
            request1.Encoding = System.Text.Encoding.GetEncoding("utf-8");
            yield return request1;
            request1 = null;
        }
    }
}

I am sure we can do better than this. Here is the same code (with non-required statements eliminated) refactored:

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Performance.Testing.Fluent.WebTesting.Framework;
 
namespace Performance.Testing.Utilities.SampleTargetScripts
{
    public class Google_Home_PageLoad : BaseWebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            var request = FluentRequest.Create("http://www.google.com");
            yield return request;
            request = null;
        }
    }
}

So that’s a bit better. We can now use a common base class, BaseWebTest, along with a fluent API to create a WebTestRequest in a single line of code using sensible defaults (i.e. - setting the encoding to UTF-8, etc.). The BaseWebTest class simply sets PreAuthenticate = true in its constructor.
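For the curious, here is a minimal sketch of what BaseWebTest and FluentRequest.Create might look like. The real implementations live in the Performance Testing Utilities project on github; the bodies below are assumptions based on the description above (PreAuthenticate in the base constructor, UTF-8 encoding by default):

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace Performance.Testing.Fluent.WebTesting.Framework
{
    // Common base class: every derived test gets PreAuthenticate = true
    // without repeating the constructor boilerplate in each test.
    public abstract class BaseWebTest : WebTest
    {
        protected BaseWebTest()
        {
            this.PreAuthenticate = true;
        }
    }

    // Factory applying sensible defaults to every new request.
    public static class FluentRequest
    {
        public static WebTestRequest Create(string url)
        {
            return new WebTestRequest(url)
            {
                Encoding = System.Text.Encoding.UTF8
            };
        }
    }
}
```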

Need to add form post parameters? Query string parameters? JSON data? The FluentRequest API will support it.
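As a rough illustration, a POST with form and query string parameters might read something like the following. The method names here (WithMethod, WithFormPostParameter, WithQueryStringParameter) are illustrative assumptions - check the actual FluentRequest API for the real ones:

```csharp
// Hypothetical fluent chain: each With* call would set the
// corresponding property/collection on the underlying WebTestRequest
// and return it, so the whole request reads as one statement.
var request = FluentRequest
    .Create("http://www.example.com/account/login")
    .WithMethod("POST")
    .WithFormPostParameter("username", "sally")
    .WithFormPostParameter("password", "secret")
    .WithQueryStringParameter("returnUrl", "/home");
yield return request;
```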

Have a look at the Performance Testing Utilities as well as the FluentRequest API on github.

Stay tuned for more on Web Application Performance Testing.

November 4, 2011
DevTeach Ottawa - DSLs for automated builds

After attending James Kovacs’ session on psake, I’m seriously considering upgrading my old-school NAnt scripts to psake. While the NAnt scripts I have in place do everything I currently need, I am always nervous about implementing new pieces of the process or changing existing ones, as it’s just so hard to test. The great thing about psake is that it’s just plain old PowerShell, and it’s a whole lot easier to test a piece of PowerShell than it is to test a NAnt target (or even worse, a custom NAnt task).

October 13, 2011
DevTeach Ottawa from CodeBetter.com

Brendan Tompkins over at Codebetter.com has announced that I am the winner of CodeBetter Fantasy Football week 4 and I will be receiving attendance for three to DevTeach Ottawa from November 2-4th.

This is awesome and I am super psyched to get the opportunity to head back to DevTeach. Thanks to Brendan Tompkins and the whole CodeBetter/Devlicious crew for hosting this fantastic contest.

Now the only problem… the e-mail address I used to enter the contest is completely filled with spam. I made the stupid mistake of posting my e-mail address on my blog a few years back, and ever since, I have not been able to use this account to send/receive any real e-mail. However, for some reason (gravatar, website address, etc.), I used this e-mail address to enter the contest. I sure hope Brendan and the CodeBetter folks still honor the win.

If all goes well, I will be in Ottawa on Nov 2 - 4 getting my DevTeach on!

October 6, 2011
Developer tools & frameworks which are worth a look

There are a few developer tools and frameworks which I am going to be looking at in more detail in the coming weeks.

Here is a brief list:

September 27, 2011
Agile Day 2011

I’m here at Agile Day in NYC. There have been a lot of great presentations (even one by my boss, Bob Viscovich on “A Year In The Life Of Agile at Publishers Clearing House”).

I’m looking forward to the open spaces discussions still yet to come!

September 6, 2011
If something hurts, do it more often

September 6, 2011
Upgrading my build system to use Web Deploy

The last time I took a look at Microsoft’s Web Deploy, it was still called the “IIS Web Deployment Tool” and at the time, it was little more than a dream of how automated web application deployments for IIS should be done.

The tool was still in its conceptual/beta phase and it lacked a significant set of features which I needed at the time (as in - yesterday), so in order to accomplish my goal of reducing deployment friction between developers/testers/ops, I went on to create my own sort of Web Deploy toolkit.

The system was built on two core applications, a command line tool and a web application.

The command line tool (Deployment Console) would take a variety of parameters such as the source “package” (zip file) to deploy from, the target server and target directory for the deployment, FTP info for transferring files, IIS info, etc. Simply enough, Deployment Console really had only two jobs. The first, to FTP the source package to the target server for deployment. The second, to take the incoming deployment parameters and dispatch a web request to the target deployment server.

Deployment Agent (the web application responsible for the actual deployment process) would be hosted on the target server and would listen for incoming deployment requests. Once received, deployment agent would spin into action, pause the target website, clean out existing files, unpack the package, etc.

I set up Deployment Agents on my INTEGRATION, QA and STAGING environments and hooked Deployment Console into my Continuous Integration build/deployment pipeline, allowing for

  • Automated deployments to the INTEGRATION environment upon successful developer/tester check-ins (successful = compiles + passes all non-functional requirements/FxCop warnings/code coverage metrics/etc + passes unit tests)
  • Push-Button automated deployments to the QA environment, triggered whenever the QA team felt comfortable consuming a new build
  • Push-Button automated deployments to the STAGING environment, triggered whenever the QA/Performance/Operations team felt comfortable consuming a new build

Now, as I review the latest version of Web Deploy (2.1), I see that Microsoft has really put a lot of effort into making deployment a first-class member of the software development process. Where the “IIS Web Deployment Tool” was a step in the right direction, Web Deploy is a few miles down the road to deployment bliss!

Architecturally similar in many aspects to my own deployment toolkit, Web Deploy consists of a command line tool (MSDeploy.exe) and a Windows Service (msdepsvc - an “Agent” accessible by HTTP).

In addition, Microsoft has integrated Web Deploy with Visual Studio, IIS and SQL Server, allowing you to package (zip) not only your application code, but also the database changes and server configuration settings which power it! With a simple command line execution of MSDeploy, you can easily create a deployment package with your application code, IIS server settings and even your database migration scripts (support is still spotty here, but you can always dream!).
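In a build pipeline like the one described above, a deployment step might shell out to MSDeploy along these lines. The install path, package name and server name are hypothetical placeholders, and the exact msdeploy.exe arguments for your scenario should be checked against the Web Deploy documentation - treat this as a sketch, not a recipe:

```csharp
using System.Diagnostics;

// Sketch of a pipeline step: sync a package to a remote server
// running the Web Deploy agent service (msdepsvc).
var psi = new ProcessStartInfo
{
    FileName = @"C:\Program Files\IIS\Microsoft Web Deploy V2\msdeploy.exe",
    Arguments = "-verb:sync " +
                "-source:package=\"MyApp.zip\" " +
                "-dest:auto,computerName=STAGING01",
    UseShellExecute = false
};
using (var msdeploy = Process.Start(psi))
{
    msdeploy.WaitForExit();
}
```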

From within the IIS 7 management console, you can specify one of these packages for deployment. IIS understands the concept of a package, allowing you to view its contents, customize deployment parameters and execute the entire deployment process.

Web Deploy is not perfect, and while Microsoft has made some significant strides forward in the areas of automated application/configuration/database deployments, there are still a few kinks to be worked out. That said, it’s got integration with the primary tools of the trade (IIS, Visual Studio, etc.) and will only get better over time.

On that note, I think it’s time I stop committing to Deployment Console and Deployment Agent and focus on migrating the deployment steps of my build/deploy pipeline over to Web Deploy.

July 31, 2011
The Scrum Guide has been updated

A much welcomed update to the Scrum Guide by Ken Schwaber and Jeff Sutherland.

They’ve also provided a separate document detailing some of the reasons behind the update. Scrum Update

July 3, 2011
TDD and bug fixes

Here is a quick tip:

When a bug is found, DO NOT run off to fix it immediately. Instead, I urge you to write a unit/acceptance test first which exposes the bug. Then, and only then, should you go and fix the bug.

By following this mantra you can ensure that all bugs which have appeared in your system are documented by specifications (tests) which prove that they no longer exist.

Here is a quick example. Sally, a system user, reports that the Change E-mail Address feature is saving data in the database with the @ symbol replaced by %40. This is causing serious problems and she has requested an urgent fix.

  • First, identify and simulate the problem (i.e. - get your app up and try to change e-mail address, step through the execution and try to pinpoint where the issue exists).
  • Next, automate your simulation steps through a test (unit testing, or front-end testing using Selenium, Watir/WatiN, White… whatever it takes to simulate the user’s experience of entering an e-mail address and producing a record in the database containing %40 instead of @).
  • Now, add a verification exposing the issue (query the database for your record and verify/assert that the e-mail address does not contain %40).
    This test should FAIL right now
    (you haven’t gone ahead and fixed the issue already, have you?).
  • Lastly, go ahead and fix the issue then re-run the test. Once the issue has been resolved your test should pass.
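The steps above might be sketched as an MSTest test like the one below. AccountService and EmailAddressRepository are hypothetical stand-ins for whatever service and data access code your application actually uses:

```csharp
[TestMethod]
public void Changing_email_address_stores_a_literal_at_sign()
{
    // Arrange: hypothetical service + repository pair.
    var repository = new EmailAddressRepository();
    var service = new AccountService(repository);

    // Act: simulate the user changing her address.
    service.ChangeEmailAddress(userId: 42, newAddress: "sally@example.com");

    // Assert: the stored value must not contain the URL-encoded %40.
    // This test should FAIL until the bug is fixed.
    string stored = repository.GetEmailAddress(42);
    Assert.IsFalse(stored.Contains("%40"));
    Assert.AreEqual("sally@example.com", stored);
}
```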

Do your best to refactor the tests such that they align with the feature’s behavior as opposed to its representation (i.e. if you are testing a web page, try not to rely on an element’s position in the HTML tree, as these positions are volatile and subject to change when the look and feel of the page is rearranged, while the behavior normally will not change).

If you are using continuous integration and your build/deployment pipeline executes these tests, your build should fail in the event that the issue reoccurs.

Happy coding!

July 3, 2011
Awesome series on Rx