Jun 18 2009

2008 Server Activation Error – 0x8007232B

If you are getting an activation error on your Windows Server 2008 machine, something like:

“Key management services (KMS) host could not be located in domain name system (DNS), please have your system administrator verify that a KMS is published in DNS.”

Don’t stress: it is the same problem / error that I blogged about earlier with the Vista activation problems, just with a different error message.

Here is the blog post for the Vista activation fix; it also works for Server 2008.
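For the impatient, the workaround generally comes down to pointing the server at your KMS host manually with slmgr, or switching to a MAK key. This is only a sketch; the host name, port and product key below are placeholders:

```shell
rem Point activation at a specific KMS host (host name and port are examples)
cscript %windir%\system32\slmgr.vbs /skms kms01.yourdomain.local:1688

rem Then try to activate again
cscript %windir%\system32\slmgr.vbs /ato

rem Or, if you have no KMS host, switch to a MAK key instead (use your own key)
cscript %windir%\system32\slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
```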

Jun 9 2009

IIS 7.5 – New Features

Internet Information Services (IIS) has some interesting new features in Windows Server 2008 R2, which justify the new version number, i.e. IIS 7.5. Most noteworthy is that you can now run ASP.NET applications on Server Core. Of course, it will reduce security if you install .NET on Server Core, but IIS without .NET doesn’t make much sense either. The performance-related improvements are not very exciting in my opinion. However, support for 256 logical processor cores makes Windows an interesting server OS for cloud computing.


Internet Information Services 7.5 (IIS)
  • PowerShell provider for IIS 7 has more than 50 new cmdlets
  • Administration Pack extensions: Database Manager (SQL Server management within IIS Manager), Configuration Editor (generate scripts with a GUI to automate administrative tasks), IIS Reports, Request Filtering (HTTP filtering, URL rewriting, etc.)
  • One-click publishing in Visual Studio 2010
  • Web Deployment Tool (formerly MS Deploy): Deployment, management, and migration of Web applications, sites, and entire servers
  • Configuration Tracing: track configuration changes to IIS and applications
  • New performance counters
  • .NET support for Server Core
  • WebDAV integration (was available before as a separate extension)
  • URLScan 3.0 integration: restricts the types of HTTP requests (was available before as a separate extension)
  • FTP server services: integrated in the IIS administration interface; new .NET XML-based *.config format; virtual host names for FTP sites; improved logging
  • Integrated extensions: new kind of extensions that appear to be an integral part of IIS
  • Support for up to 256 logical processor cores (Windows Server 2008 supports 64 logical cores)
  • Hyper-V virtual machines support up to 32 logical cores (Hyper-V 1.0 supports 4 processor cores)
  • IP Stickiness: Configure the time period (weeks or months) that a connection state to a specific cluster node persists
  • Performance: Reduced processor utilization for “wire speed” transmissions; improved input/output process (NTIO); up to 32 paths to a storage device are supported; improved iSCSI client; storage subsystem allows hardware vendors to optimize their storage mini-driver; better Chkdsk performance
  • Availability: Failover to alternative path; configuration snapshots (ability to restore a previous configuration)
  • Manageability: Automated deployment of configuration settings using Unattend.xml; improved monitoring (new performance counters, logging for storage drivers, health-based monitoring)

The above list is taken from http://4sysops.com/archives/windows-server-2008-r2-new-features-the-complete-list-part-3-iis-75-and-performance/

I highly recommend looking at Request Filtering; this is URLScan 3.0 built in.
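If you want to poke at it outside the GUI, the settings live in the requestFiltering configuration section and can be driven with appcmd. A rough sketch (the file extension and URL sequence below are just examples):

```shell
rem Block requests for .bak files (run from an elevated prompt)
%windir%\system32\inetsrv\appcmd set config /section:requestFiltering /+"fileExtensions.[fileExtension='.bak',allowed='false']"

rem Deny any URL containing a given sequence
%windir%\system32\inetsrv\appcmd set config /section:requestFiltering /+"denyUrlSequences.[sequence='..']"
```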


Another awesome feature is the inbuilt IIS reporting and SQL management.


May 22 2009

2008 Server DFSR Replication Problems

I have used DFSR for some time now and have had only great experiences with it; however, that was with low volumes of data. This time I was implementing it as a high-availability solution for a web farm, and we had about 40 gigs of data with a massive number of files to replicate… but in the scheme of things, 40 gigs of data is really not that much.

Anyway, cutting to the point: after reading articles, the best way to proceed with a quick initial replication was said to be to pre-copy the files to the destination server, so I did this via robocopy, keeping all attributes and permissions intact.

However, contrary to popular belief, this was what ultimately caused so much grief, as once replication started, my event log was filling up with:

Event Type: Information
Event Source: DFSR
Event Category: None
Event ID: 4412
Date: <Date>
Time: <Time>
User: N/A
Computer: <Computer name>
The DFS Replication service detected that a file was changed on multiple servers. A conflict resolution algorithm was used to determine the winning file. The losing file was moved to the Conflict and Deleted folder.
Additional Information:
Original File Path: <File path>
New Name in Conflict Folder: <Folder name>
Replicated Folder Root: <Folder name>
File ID: <File ID>
Replicated Folder Name: <Folder name>
Replicated Folder ID: <Folder ID>
Replication Group Name: <Replication group name>
Replication Group ID: <Replication group ID>
Member ID: <Member ID>

See more about this here: http://support.microsoft.com/kb/944804

After further investigation, this was because the file IDs of the files on the destination server differed from those on the source server… thanks, robocopy.

This was filling up my ConflictAndDeleted folder very quickly with a lot of what I thought was unnecessary junk. Nevertheless, I let it run for a few days, and I came back to find the below event log:

Source : DFSR

Category : None

Event ID : 2104

Type: Error

Description :

The DFS Replication service failed to recover from an internal database error on volume F:. Replication has been stopped for all replicated folders on this volume.

Additional Information: Error: 9203 (The database is corrupt (-1018)) Volume: DB587759-DC0B-11DC-940D-00304888DB13 Database: F:\System Volume Information\DFSR


Brilliant, I had a corruption.


Possible Solutions

Taken from a Google Groups post:

I recently had a spat with the "new and improved" DFSR and wanted to let everybody in on the procedure for resetting a DFSR member.

First off, removing everything using the GUI doesn’t help when the database is corrupt. DFSR keeps the database regardless of its membership status. So if for example you had a broken DFSR server and removed it from every replication group, when you added it back you’d still be out of luck.

To clear it completely, after the server is no longer a member of *any* DFSR replication group (i.e. remove it from all of them in the GUI and wait for AD replication to propagate the changes):

1. Stop the "DFS Replication" service.

2. On the drive(s) in question, grant yourself full permission to the hidden system "System Volume Information" folder.

3. Navigate into the folder and delete (or move to be extra careful) the DFSR folder.

4. Navigate to each replication group the server was a member of and delete (or move to be extra careful) each hidden system "DfsrPrivate" folder.

5. Start the "DFS Replication" service.

You may now treat the server as a brand new member of the replication groups. Now all you need to deal with is DFSR’s sloppy initial replication routines (hint: those missing files are in the "DfsrPrivate\PreExisting" folder).




However, this did not work: the folder would not rename under 2008 even with UAC off. This did work for me, though:

1. Click Start, right-click Command Prompt and click "Run as administrator" to open an elevated command prompt, then change to driveletter:\System Volume Information\DFSR and type the command below to rename it:

Ren "old folder name" "new folder name"

I did this on both servers participating in the replication. Further to this, I deleted the folder I was replicating on the destination server and let DFS do all of the creating.

After 2 days I had a fully functional DFSR setup, working the way it should!
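For reference, the whole reset sequence on each server looked roughly like this. This is only a sketch: F: and the folder names are examples, so use the volume that holds your replicated folders, and rename rather than delete to be extra careful:

```shell
rem Stop the DFS Replication service
net stop dfsr

rem Take ownership of the hidden DFSR database folder and grant yourself access
takeown /f "F:\System Volume Information\DFSR" /r /d y
icacls "F:\System Volume Information\DFSR" /grant administrators:F /t

rem Rename the folder so the service rebuilds its database from scratch
ren "F:\System Volume Information\DFSR" DFSR.old

rem Start the service again and let initial replication run
net start dfsr
```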

Some commands I found useful through the process:

dfsrdiag backlog /rgname:"cluster replication" /rfname:websites /rmem:RECEIVINGSERVER /smem:SENDINGSERVER >c:\backlog1.txt

You might also find the %windir%\debug folder useful (the DFSR debug logs are written there).


Good Luck.

Jan 30 2009

Prepare for 2007 Exchange Server install

I thought I would blog about this because we had so many issues, and I found Microsoft’s documentation to be rather poor on this occasion.

However, it does explain the steps that need to be done in order to prepare (just not how to execute them easily).

So it’s best to start with a quick read of the article.


If you’re not patient and want to skip ahead (a little like me), then in short the steps are:


  1. setup /PrepareLegacyExchangePermissions:<domain name here>
    setup /pl <domain name here>
  2. setup /PrepareSchema
    setup /ps
  3. setup /PrepareAD [/OrganizationName:<organization name>]
    setup /p [/on:<organization name>]
  4. setup /PrepareDomain
    setup /pd


In order to run these commands, you must run them from a command prompt on a server that is a part of the domain you are preparing. Ensure you have the Exchange 2007 CD in the DVD drive, change to <DVD drive letter>:, then run dir and ensure you can see the 'setup.exe' and 'setup.com' files; these should be in the root directory of the DVD.


Now run the commands.

If you are unlucky like me, you will get a few errors like:

"Exchange 2007 cannot be used with the version of Windows operating system running on this computer."

There are two possible reasons this could happen:

You are trying to run the setup file, which is a 64-bit executable, on an x86 system; you cannot do this.

The other reason is that you are running the setup commands on a 2008 server using the Exchange 2007 RTM install rather than Exchange 2007 SP1 (Server 2008 requires SP1).

So if you are like us, in the position where you have many x64 2008 servers but no x64 2003 servers, then you’re going to have to download the 32-bit Exchange 2007 management tools.

get it from here:


The download is an EXE, so I suggest installing WinRAR and extracting it rather than installing it (unless you will use the tools). Once extracted you will have the setup files, and you can now run all of your prep on an x86 system.


Another good resource is this: http://support.microsoft.com/default.aspx/kb/555854

Jan 15 2009

IIS7 logging in Central Location for web farms


OK, so it appears IIS7 has thought of almost everything this time, and IIS7 is perfect for web farms now that they have the ability to share the configuration file.

While we are in the midst of setting up a load-balanced environment with an IIS7 farm, we have everything working wonders. But when it comes to log files, it seems Microsoft missed this: they really should have added the functionality to combine log files in a central location. We can provide a UNC path and log to it, but only one server can do so.

The problems you are faced with in a web farm are these:

If you have shared your ApplicationHost.config in the web farm (which I highly recommend you do, to make your life that little bit easier), you will find that your log file location will be the same on all servers, which could be a problem:

If you have used a UNC path, all servers in the farm will try to log to this location and only one can win, as HTTP.SYS on the winning server will keep the file open, so the other servers will not be able to log.

If you have specified a local path, which, let’s face it, is really your only option, you MUST make 100% sure that the location (or that drive) exists on all of the servers in the web farm.

Now, if you don’t care so much for your log files, you will be happy to have them on each server and not collate them. But for people like us, we need them in a central location because: a) we like to find our data quickly and be able to see what is going on on our servers; b) we have web stats that we need to provide to our clients and internal staff; and c) for backup and management purposes it’s just easier to have them centralised.

So our requirement is to copy all of our log files to one location, keeping of course the web log folder structure, i.e. W3SVC1, W3SVC2, etc.

There is one other problem I should tell you about: all of the log files have the same names, so we can’t just schedule a move / copy script, because it would fail or overwrite the other log files; you MUST rename the files before moving. In hindsight we had to do this anyway, but it’s a good idea regardless, because it allows us to easily track down a problem on a server via the logs.

So we have come up with a script that can be scheduled on a daily basis, which renames all files to <servername>_<originalfilename>.log (it will look in all subfolders, i.e. W3SVC1, W3SVC2, etc.). Once the renaming is done, it then MOVEs the files to the central location.

A Big thanks to Alan Lee for this script, as he wrote the find and replace part (hats off)

rem Local IIS log folder root
set logs=C:\folder\folderagain
cd /d %logs%
rem List the W3SVC* subfolders into a temp file
dir /b %logs% > tmp.txt
rem In each subfolder, prefix every u*.log with this server's name
for /F "tokens=*" %%J in (tmp.txt) do cd /d %logs%\%%J && for %%i in (u*.log) do move %%i %computername%_%%i
del %logs%\tmp.txt
rem Move everything (keeping the folder structure) to the central share
robocopy %logs% \\servername\FolderName /mov /E /minage:0 /R:3 /W:5 /LOG:%logs%\logs.log

Of course the script uses robocopy, so you will need to make sure the .exe is available wherever you are executing this .bat from (robocopy ships with Server 2008, so on 2008 it is already in the path).

With Server 2008 you can also schedule emails with attachments; it might not be a bad idea to attach the log file so you can archive it away.
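To schedule the batch to run daily, schtasks does the job; the task name, script path and time below are examples:

```shell
rem Run the rename-and-move script every night at 1:00 AM as SYSTEM
schtasks /create /tn "MoveIISLogs" /tr "C:\scripts\movelogs.bat" /sc daily /st 01:00 /ru SYSTEM
```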

Feb 26 2008

Server 2008 out performs Server 2003 by a mile

After multiple tests, I can happily say, without any doubt in my mind, that Server 2008 blows Server 2003 out of the water in terms of web performance… How did I come to this conclusion, you ask?

If you have been reading my partner in crime’s (Alan Lee’s) blog and Craig Bailey’s blog, you will know we have been performing some extensive bench testing on the servers. Alan describes on his blog the method we used: in brief, a Linux command which does a simple wget mirror of the home page. The mirror copies all the content locally and then removes it after the crawl is complete; the mirror command also copies all the other links on the home page (there are roughly 13 links, which means it copies another 13 pages and their associated images).

One wget of the Elcom home page would retrieve a rough total of 6.4MB, which, with today’s bandwidth speeds and the abundance of images and multimedia everywhere we look, is not a huge number; it’s what we would call realistic.

The test: the script was set to run 20 concurrent connections 1,000 times. Again, 20 concurrent connections is very realistic; if anything, it may not be enough. However, we thought that given the page size and the connections this was a good benchmark.
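For the curious, the crawl and the load loop would look something like this. This is my reconstruction of the method described above, not Alan’s actual script, and the URL is a placeholder:

```shell
#!/bin/sh
# Mirror the home page plus the pages and images it links to,
# deleting the local copy once each crawl completes (URL is a placeholder)
URL=http://www.example.com/

# 20 concurrent clients, each performing 1,000 crawls
for i in $(seq 1 20); do
  (
    for j in $(seq 1 1000); do
      wget -q --mirror --page-requisites --delete-after "$URL"
    done
  ) &
done
wait
```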

After running the tests through an out-of-the-box 2003 Server running IIS 6.0 and SQL 2005, we posted a time of 3h:37min:59sec.
Then we ran the Server 2008 test, again out of the box with IIS7 and SQL 2005, and we posted a time of

JAW DROPPING, isn’t it? Yes, we ran the exact same tests; we had them configured the same, both out of the box with IIS installed as an additional component and .NET 2.0 installed on both machines; nothing different.

This worked out to be an amazing 17 times faster…

The test machine was the same; the only difference was the drive, as one was loaded with 08 and the other with 03. However, both drives were the same spec, the same manufacturer and even the same model…

Machine Specs
Intel Core 2 Duo 2.12 GHz, 2GB RAM, 160GB Seagate SATA drive, Gigabit LAN

Testing (Linux) machine: Intel P4 3.0 GHz, 1GB RAM, 80GB Seagate SATA drive, Gigabit LAN


However, during these tests we could see that for 2003 Server the bottleneck appeared to be the CPU, as it was using everything it could get. So we thought, OK, let’s try something bigger.

New Machine Specs
Intel Core Quad core 2.4 GHz, 4GB RAM, 160GB Seagate SATA drive, Gigabit LAN

Testing (Linux) machine: Intel P4 3.0 GHz, 1GB RAM, 80GB Seagate SATA drive, Gigabit LAN


Now that we had something bigger and better to test on, we thought, OK, let’s compare something a little smaller in size, to speed the process up and also see what happens in terms of speed.

So we deployed a vanilla Community Manager site, which reduced the wget to a small, manageable 70KB. The results are still astonishing:

Server 2003:
Start:        17:24:52
Finish:       18:15:53
Total time:   0h:51min:01sec

Server 2008:
Start:        12:09:51
Finish:       12:22:23
Total time:   0h:12min:32sec

2008 is still 4 times faster with something tiny and very manageable. There were no bottlenecks here at all; the CPU for 2003 was only varying between 70 and 80%, and the memory was very, very much under-utilised; it seemed to use only a lousy 350MB when it had gigs available.


Server 2008, on the other hand, made much better use of the memory and left more CPU free. See below for the stats yourself.


This shows that the SQL Server process (sqlservr) is taking few resources; there is obviously not much for it to do, as there are really no transactions for it to think about.


The w3wp.exe process, as you can see, is using a lot more resources, and rightfully so; it is using a lot of memory, which isn’t a bad thing considering how much it has (memory is cheaper than processing power).


Here is how the network card reacted to the mass of connections.


And here is the overview of all performance; as you can see the disk was getting a workout, but it doesn’t seem to be a bottleneck either.


To conclude… change all your servers to 2008. I know I can’t wait to start implementing a few more 2008 Servers across our network at work.

Feb 6 2008

Windows Server 2008 is NOW RTM

As of last night at 8:10pm, MSDN released the RTM versions of Server 2008 Enterprise and Standard editions.

Both x64 and x86 were released. We are downloading them as we speak (type); as soon as we get a chance we will be upgrading our RC0 and RC1 servers to the real thing.

Feb 4 2008

Turning on SQL08 Database Encryption, Putting it to the test

So first of all, you need to create your database if you haven’t already, then open a new query against the ‘master’ database.

This is very important: make sure you are on the master database and not the user database. (Of course I could have put ‘use master’, but I am outlining this so you know what is happening and are learning :) )

-- create a Database MasterKey

create master key encryption by password = 'password here';



-- Create a certificate to a database

Create certificate testingCert with subject = 'Testing Certificate for SQL'



Now, for this part you need to ‘use databasename’ and then paste the below script; or of course just open the query on the database itself and then copy and paste.

-- creates the key for the database selected
-- below I have used AES_128; this is more than enough, but there are others
-- such as: AES_192, AES_256 and TRIPLE_DES
create database encryption key with algorithm = AES_128
-- notice I am using the cert name I have just created
encryption by server certificate testingCert
-- use the database you are applying the encryption to:
alter database testing
set encryption on


You can now check that the encryption has been applied by simply going to the GUI.

This is found at the database level, under Tasks, toward the bottom.


Now for the testing of the data.

To show how small the overhead is; personally, I’d say it is little to none.

Here is what was used and added to both databases (I got this from http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=671045&SiteID=1, slightly modified):

create table employeesdemo (
recordno int identity(1,1),
employeeno varchar(20),
lastname varchar(50),
firstname varchar(50),
middlename varchar(50)
)

declare @ctr int
declare @ctrstr varchar(7)
select @ctr = 0

-- loop and insert some rows (the row count and values here are only an example)
while @ctr < 10000
begin
select @ctr = @ctr + 1
select @ctrstr = cast(@ctr as varchar(7))
insert into employeesdemo (employeeno, lastname, firstname, middlename)
values ('EMP' + @ctrstr, 'Green', 'Test' + @ctrstr, 'M')
end


This is applying the above script to the non encrypted database


This is applying the same script to the ENCRYPTED database, and the below images are what you should see once the above query has completed.



They both have the same table and the same number of rows.

Now it’s time to back up the databases. I am not going to tell you how to do that; I am hoping you can manage that one. Now, this is what I get in terms of file sizes:


As you can see there is little difference. Obviously the difference may grow a little with a bigger database, which is something I am yet to try (and will shortly :) ), but I doubt the file size will be all that different from the original backup file.

So here are the results. First, the non-encrypted one:


As you can see, I do a find for ‘green’, which is a value found in the table we created.

Now this is the encrypted one


And look at that: no ‘green’ found…

Conclusion: it works, and it works well, with little overhead in disk space. As for time frames, it’s hard to tell with such small databases; with a bigger one it might be easier to judge. But this is looking promising.

Since Elcom is on one of our SQL 2008 servers, I am going to be applying this to the production database.

We love security.

Feb 4 2008

Some of the new SQL 2008 features

I recently attended one of the sessions that MVP Craig Bailey runs on a regular basis. Peter Ward was kind enough to present some real-life demos of the awesome new features. Whilst Peter mentioned a stack of them, I must say I am very impressed with a few in particular.

In particular, transparent data encryption. What is it? Simple, really: SQL backups can now be encrypted, with NO overhead. That’s right, no overhead: it doesn’t seem to take any more time, it takes exactly the same space as the non-encrypted backup, and restore times also appear to be the same.

How does it work? Well, within the CTP it is currently activated via T-SQL scripting; I would suggest going here to have a look at what SQL scripting is required. I am sure at release time there will be some GUI associated with TDE. Now, let’s say the hard drives were stolen and this feature was enabled: it would be almost impossible to extract the data from the encrypted backups. Why? Well, once SQL 2008 is installed, a certificate is installed with it, and this cert is made up of all your hardware components’ details, meaning there is only one of every cert ever created. But then you’re about to ask: how do you move a backup from one SQL server to another? Well, apparently (I have not tried this for myself; I will be doing so when I get a free second, and will blog again, so stay posted) you can move the cert over with a password that is assigned to the backup.

I will be blogging more about this feature with some screenshots shortly; stay posted :)

Another feature that we all wanted years ago was IntelliSense; yes, it’s now finally here in SQL Server 2008. THANK YOU!

One last feature that I really like the sound of is the Resource Governor; more on that to follow soon, as I will have some real-life examples shortly.

Jan 24 2008

It’s Linux… No, it’s Microsoft!

Whilst Linux has been doing this for years, Microsoft has finally caught on: Server 2008 will have the ability to work in a command-line environment, which they have called ‘Server Core’. This allows you to install and configure only the modules / components you want the server to serve you with. Currently, Server Core allows the following services to be installed and configured:

  • Active Directory Domain Services (AD DS)
  • Active Directory Lightweight Directory Services (AD LDS)
  • DHCP Server
  • DNS Server
  • File Services
  • Print Services
  • Streaming Media Services
  • Windows Virtualization
Why would you want a command line when you can have such a pretty GUI? Well, there are many reasons:

    Reduced maintenance – The Server Core installation option installs only what is required, so there is less to patch and maintain.

    Reduced attack surface – Since Server Core installations are so minimal, there are considerably fewer applications running on the server, which in turn decreases the attack surface.

    Reduced management – This is a no-brainer; there are obviously fewer applications and services installed on the server to manage.

    Less disk space required – Same as above really; fewer items and only the essentials means less space.

    Better resource allocation / usage – There is no GUI and there are no background applications to start up, just what you have told it to run. With less demand for system resources, you could potentially use lower-spec machines to serve.

    Lower risk of bugs – Reducing the amount of code, applications, files, services (the list goes on) can help reduce the number of bugs.

    A quote from Microsoft

    “Customers will benefit from an extremely modular, low-footprint Web hosting platform on top of the already small Server Core,” Microsoft said in a press release explaining the addition. “Server Core is ideal for hosting the PHP scripting language and now runs 10 to 20 times faster than before as a result of improvements in IIS.”

    There is still said to be some GUI involved, such as the essentials: Task Manager, Notepad and parts of the Control Panel.

    There are obviously some limitations; from MS once again:

    “The minimal nature of Server Core creates limitations:

    • There is no Windows shell and very limited GUI functionality (the Server Core interface is a command prompt).
    • There is no managed code support in Server Core (all code must be native Windows API code).
    • There is limited MSI support (unattend mode only). “


    Oh, and the other limitation is that there is no upgrade path: if you decide you want to go to the full GUI, or you want to roll your 2003 Server to an 08 Server Core, you need to re-install cleanly.
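As a taste of what managing Server Core looks like, the roles listed above are installed from the command line. A sketch only; package names should be verified with oclist first, and the interface name and IP addresses are examples:

```shell
rem List available and installed roles/features on Server Core
oclist

rem Install the DNS Server role (ocsetup package names are case sensitive)
start /w ocsetup DNS-Server-Core-Role

rem Set a static IP with netsh
netsh interface ipv4 set address name="Local Area Connection" source=static address=192.168.1.10 mask=255.255.255.0 gateway=192.168.1.1
```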