Tim

Footprints in the snow of a warped mind

Server Maintenance


# Wednesday, February 13, 2013

How to set up passive FTP on a Windows Azure virtual machine

Wednesday, February 13, 2013 9:30:39 PM (GMT Standard Time, UTC+00:00)

This is more of a reminder for me than anything else. If you're looking for a great walkthrough on how to configure passive FTP on a Windows Azure VM, check out the walkthrough from Ronald here - it got us up and running.

The thing that takes the time to write out each time is the PowerShell script side of things, so this time I made some notes:

  1. Run Get-AzurePublishSettingsFile to get your .publishsettings file (save it somewhere easily accessible)
  2. Run Import-AzurePublishSettingsFile d:\Azure.publishsettings
  3. Run the Get-AzureVM calls listed below (you can copy/paste them in one go and PowerShell will work its way through them - it can take a few minutes). If you're not sure what <ServiceName> and <Name> should be, these are the names you configured for your VMs. There's also a loop sketch after the list. 
    1. To get the service name, run: Get-AzureVM
    2. To get the name of the server, run: Get-AzureVM -ServiceName '<ServiceName>' (using the service name from above)
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPData' -Protocol 'TCP' -LocalPort 20 -PublicPort 20 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTP' -Protocol 'TCP' -LocalPort 21 -PublicPort 21 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive00' -Protocol 'TCP' -LocalPort 7000 -PublicPort 7000 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive01' -Protocol 'TCP' -LocalPort 7001 -PublicPort 7001 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive02' -Protocol 'TCP' -LocalPort 7002 -PublicPort 7002 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive03' -Protocol 'TCP' -LocalPort 7003 -PublicPort 7003 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive04' -Protocol 'TCP' -LocalPort 7004 -PublicPort 7004 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive05' -Protocol 'TCP' -LocalPort 7005 -PublicPort 7005 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive06' -Protocol 'TCP' -LocalPort 7006 -PublicPort 7006 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive07' -Protocol 'TCP' -LocalPort 7007 -PublicPort 7007 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive08' -Protocol 'TCP' -LocalPort 7008 -PublicPort 7008 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive09' -Protocol 'TCP' -LocalPort 7009 -PublicPort 7009 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive10' -Protocol 'TCP' -LocalPort 7010 -PublicPort 7010 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive11' -Protocol 'TCP' -LocalPort 7011 -PublicPort 7011 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive12' -Protocol 'TCP' -LocalPort 7012 -PublicPort 7012 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive13' -Protocol 'TCP' -LocalPort 7013 -PublicPort 7013 | Update-AzureVM
    Get-AzureVM -ServiceName '<ServiceName>' -Name '<Name>' | Add-AzureEndpoint -Name 'FTPPassive14' -Protocol 'TCP' -LocalPort 7014 -PublicPort 7014 | Update-AzureVM
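
If you'd rather not paste fifteen near-identical lines, the passive range can be generated in a loop. A minimal sketch, assuming the same classic Azure PowerShell module and the <ServiceName>/<Name> placeholders above:

$serviceName = '<ServiceName>'
$vmName = '<Name>'

# Open FTPPassive00..FTPPassive14 (ports 7000-7014) in one pass
7000..7014 | ForEach-Object {
    $suffix = '{0:D2}' -f ($_ - 7000)
    Get-AzureVM -ServiceName $serviceName -Name $vmName |
        Add-AzureEndpoint -Name "FTPPassive$suffix" -Protocol 'TCP' -LocalPort $_ -PublicPort $_ |
        Update-AzureVM
}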
    
    
 

Don't forget to follow me on Twitter.

# Friday, September 02, 2011

NuGet server on IIS6 returns 404 when downloading packages after upgrade

Friday, September 02, 2011 8:03:30 AM (GMT Daylight Time, UTC+01:00)

[Screenshot: Nuget-Returns-404-Manager-Error]

We updated our NuGet server today and ran into a problem where, regardless of which package was selected, and whether it was requested through the NuGet Package Manager or the NuGet Package Explorer, the server returned 404 (File Not Found).

[Screenshot: Nuget-Returns-404-Explorer-Error]

What was odd about this was that the packages existed and the feed was valid:

[Screenshot: Nuget-Returns-404-Manager]

There are a couple of comments about this online (e.g. Codeplex Discussion 246387) but nothing helped. It took a while to work out that it was caused by a slight change to the way the NuGet server makes its calls. Comparing the IIS logs between the old and new versions, the previous version of NuGet server would redirect the user directly to the .nupkg file:

2011-04-21 08:47:46 W3SVC1759424837 192.168.1.1 GET /Packages/TheSiteDoctor.2.0.235.68.nupkg - 80 - 192.168.1.2 Package-Installer/1.2.20325.9034+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 200 0 0

The new version, however, passes the requests through a new MVC route, "download":

2011-09-01 11:31:16 W3SVC1759424837 192.168.1.1 GET /download/TheSiteDoctor/2.0.235.68 - 80 - 192.168.1.2 Package-Installer/1.2.20325.9034+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 404 0 3

This is fine if your NuGet server is running on IIS7 - it will "just work". If you're running IIS6, however, you'll need to make one additional change: mapping all requests through aspnet_isapi.dll (a wildcard mapping). This is easy enough:

  1. Open the site's properties in IIS6
  2. Navigate to the "Home Directory" tab
  3. Click the "Configuration" button:
    [Screenshot: Nuget-Returns-404-Explorer-IIS]
  4. This will then open the "Application Configuration" window:
    [Screenshot: Nuget-Returns-404-Explorer-IIS-Configuration]
  5. Now add the aspnet_isapi.dll mapping. The path will depend on whether you're running 64-bit Windows or not:
    32-bit Framework: c:\windows\microsoft.net\framework\v4.0.30319\aspnet_isapi.dll
    64-bit Framework: c:\windows\microsoft.net\framework64\v4.0.30319\aspnet_isapi.dll
    Make sure you uncheck the "Verify that file exists" checkbox
    [Screenshot: Nuget-Returns-404-Explorer-IIS-Wildcard]
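
If you'd rather script the change than click through the UI, the wildcard map can be appended to the site root's ScriptMaps metabase property via ADSI. A hedged sketch: 1759424837 is the site ID from the log lines above (swap in your own), and the trailing ,1 flag (allow script engine, no file-exists check) is an assumption worth verifying against an entry created through the UI:

# Append a wildcard mapping to the site root's ScriptMaps list
$root = [ADSI]'IIS://localhost/W3SVC/1759424837/root'
$maps = @($root.ScriptMaps)
$maps += '*,c:\windows\microsoft.net\framework\v4.0.30319\aspnet_isapi.dll,1'
$root.ScriptMaps = $maps
$root.SetInfo()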

Et voilà - your packages should all be working again.

Leave a comment if you need any additional help, or if this helped you.

 

Don't forget to follow me on Twitter.

# Tuesday, December 14, 2010

SQL Server: Warning: Fatal error 823 occurred at date / time. Note the error and time, and contact your system administrator.

Tuesday, December 14, 2010 12:46:42 PM (GMT Standard Time, UTC+00:00)

[Screenshot: error]

We were contacted the other day by a client having issues selecting data from one of their tables after a recent server crash (not hosted on our servers, nor a site we were involved in developing). The issue was easy enough to recreate: select records from the affected table after the crash and you'd get the error:

Warning: Fatal error 823 occurred at date / time. Note the error and time, and contact your system administrator.

A quick Google search suggested a physical disk drive error, and having a quick look at the damage, it wasn't pretty. Running:

DBCC CHECKDB('DatabaseName') WITH NO_INFOMSGS, ALL_ERRORMSGS

resulted in:

Msg 8909, Level 16, State 1, Line 5 
Table error: Object ID 0, index ID 12341, page ID (1:5880). The PageId in the page header = (9728:16777220). 
CHECKTABLE found 0 allocation errors and 1 consistency errors not associated with any single object. 


....


DBCC results for 'TableName'. 
Msg 8928, Level 16, State 1, Line 5 
Object ID 871674153, index ID 0: Page (1:5880) could not be processed. See other errors for details. 


....


There are 20993 rows in 584 pages for object 'TableName'. 
CHECKTABLE found 0 allocation errors and 8 consistency errors in table 'TableName' (object ID 871674153). 
Msg 8909, Level 16, State 1, Line 5 
Table error: Object ID 1109413712, index ID 24940, page ID (1:5883). The PageId in the page header = (25198:1632843825). 
CHECKTABLE found 0 allocation errors and 1 consistency errors in table '(Object ID 1109413712)' (object ID 1109413712). 

Most of the solutions found on Google involved some form of restore from backup, but that's no good in this instance as the backups only existed from after the problem was identified (great, eh!) and so were useless.

Although it's not an ideal solution, you can use DBCC CHECKTABLE, which in our case fixed the issue:

--Put the database into single user mode
ALTER DATABASE [DatabaseName] SET SINGLE_USER WITH NO_WAIT
--Check the errors and fix any issues found (that you can)
DBCC CHECKTABLE ('TableName', REPAIR_REBUILD)
--Put the database back into multi-user mode
ALTER DATABASE [DatabaseName] SET MULTI_USER WITH NO_WAIT
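
If you have more than one damaged table, the same sequence can be driven from PowerShell via sqlcmd (which ships with SQL Server). A minimal sketch, assuming a local default instance, Windows authentication and the placeholder names used above:

# Repair each damaged table inside a single-user window
$db = 'DatabaseName'
$tables = @('TableName')   # add further table names here

sqlcmd -S localhost -E -Q "ALTER DATABASE [$db] SET SINGLE_USER WITH NO_WAIT"
foreach ($t in $tables) {
    sqlcmd -S localhost -E -d $db -Q "DBCC CHECKTABLE ('$t', REPAIR_REBUILD)"
}
sqlcmd -S localhost -E -Q "ALTER DATABASE [$db] SET MULTI_USER WITH NO_WAIT"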

 

I'd be interested to know what other solutions people may have to this issue.

Note to readers: check that your hosting provider performs regular backups and regularly checks the health of your server, to avoid this happening to you.

 

Don't forget to follow me on Twitter.

# Thursday, June 17, 2010

Set Umbraco Folder Permissions with PowerShell

Thursday, June 17, 2010 2:47:22 PM (GMT Daylight Time, UTC+01:00)

[Image: PowerShell logo]

If you're not configuring Umbraco through the web installer, your installs have been in place for years without the permissions ever being checked, or whoever set the permissions up was lazy and gave IIS write access to the entire folder, there will come a time when you want to restrict modify access to just those user(s) who should have it.

You can find a (pretty) complete list of the files/folders that the Umbraco install should have access to here, but assigning them across 101 different installs is a PITA. Thanks to a helpful PowerShell script to set folder permissions from PowerShell.nu, you can easily automate the process.

For those of you not familiar with PowerShell (like me), complete instructions are below. For the rest, here's the command:

Get-ChildItem -path ##PATH TO YOUR INSTALL## |
Where { $_.name -eq "Bin" -or $_.name -eq "Config" -or $_.name -eq "Css" -or $_.name -eq "Data" -or $_.name -eq "Masterpages" -or $_.name -eq "Media" -or $_.name -eq "Scripts" -or $_.name -eq "Umbraco" -or $_.name -eq "Umbraco_client" -or $_.name -eq "UserControls" -or $_.name -eq "Xslt" } |
ForEach {./SetFolderPermission.ps1 -path $_.Fullname -Access "NETWORK SERVICE" -Permission Modify}

 

Instructions:

  1. Save the SetFolderPermission.ps1 script to your server
  2. Open your PowerShell console (I think it's installed by default; if not, you can download PowerShell here)
  3. Copy the above PowerShell command into notepad
  4. Update "##PATH TO YOUR INSTALL##" to your Umbraco install
  5. If your IIS install doesn't use NETWORK SERVICE as the default user, update it to your user
  6. Make sure it's all on a single line (or that each line ends with the pipe character, as shown above)
  7. Copy/Paste/Run in PowerShell
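
If you'd rather not rely on the SetFolderPermission.ps1 helper at all, the same permissions can be granted directly with icacls. A hedged alternative sketch, assuming the same folder list and the NETWORK SERVICE account:

# Grant Modify (inherited by subfolders and files) on the standard Umbraco folders
$folders = 'Bin','Config','Css','Data','Masterpages','Media',
           'Scripts','Umbraco','Umbraco_client','UserControls','Xslt'
Get-ChildItem -Path '##PATH TO YOUR INSTALL##' |
    Where-Object { $folders -contains $_.Name } |
    ForEach-Object { icacls $_.FullName /grant 'NETWORK SERVICE:(OI)(CI)M' }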

Bonus

If you're uber lazy and just have a web folder full of Umbraco installs, you can set the path to that folder and use:

Get-ChildItem -path ##PATH TO YOUR FOLDER## -recurse |
Where { $_.name -eq "Bin" -or $_.name -eq "Config" -or $_.name -eq "Css" -or $_.name -eq "Data" -or $_.name -eq "Masterpages" -or $_.name -eq "Media" -or $_.name -eq "Scripts" -or $_.name -eq "Umbraco" -or $_.name -eq "Umbraco_client" -or $_.name -eq "UserControls" -or $_.name -eq "Xslt" } |
ForEach {./SetFolderPermission.ps1 -path $_.Fullname -Access "NETWORK SERVICE" -Permission Modify}

 

I've not tried this, mind you, and can't recommend it but hey, it's there if you want it ;)

 

Don't forget to follow me on Twitter.

# Saturday, July 05, 2008

Deleting SVN directories with PowerShell

Saturday, July 05, 2008 4:25:32 PM (GMT Daylight Time, UTC+01:00)

I've been re-working our new SVN structures recently as I'm now starting to understand how it all works, but one of the issues I had was moving the files/folders over from a previous SVN directory.

PowerShell is great if you understand it (which I'm also still learning), so I thought I'd share this little script with you. It just loops through the files/folders and removes all those named _svn. I found the original script from Wyatt Lyon Preul, who complained about its length, but from what I can tell you can condense it down to:

gci $folder -fil '_svn' -r -fo | ? {$_.psIsContainer} | ri -fo -r

I'm not that great with PowerShell yet but I hope that helps someone :)
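
For anyone who finds the aliases cryptic, here's the same pipeline expanded with full cmdlet and parameter names - functionally identical as far as I can tell:

# Find every folder named _svn (recursively, including hidden ones)
# and delete it along with its contents
Get-ChildItem $folder -Filter '_svn' -Recurse -Force |
    Where-Object { $_.PSIsContainer } |
    Remove-Item -Force -Recurse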

WARNING: As ever, in case I'm wrong (it happens!), test it first on a folder that you don't mind losing!
 

Don't forget to follow me on Twitter.

# Tuesday, October 16, 2007

Identify which application pool is associated with which W3WP.exe process

Tuesday, October 16, 2007 11:18:11 AM (GMT Daylight Time, UTC+01:00)

Today I needed to identify a site that was causing the W3WP.exe process to run at 100% CPU. I had hoped there was some clever way of identifying the site from the process ID, but no such luck. The issue was complicated because we have multiple sites under each application pool. It was done like this to keep the overheads minimal (each W3WP.exe process needs circa 25MB to run) but it makes identifying rogue code difficult.

If you need to identify which W3WP.exe relates to which Application Pool, open CMD, navigate to your System32 directory and type:

cscript iisapp.vbs

That'll then list the relevant W3WP.exe processes, their process IDs and their app pool names :) - simple and useful, just the way I like it!
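
If you prefer PowerShell, the same mapping can be pulled from WMI: each w3wp.exe is started with its pool name on the command line as -ap "<PoolName>", so it can be parsed out. A rough sketch under that assumption:

# List each w3wp.exe process ID alongside its application pool name
Get-WmiObject Win32_Process -Filter "Name='w3wp.exe'" |
    ForEach-Object {
        if ($_.CommandLine -match '-ap "([^"]+)"') {
            "{0}`t{1}" -f $_.ProcessId, $Matches[1]
        }
    }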

 

Don't forget to follow me on Twitter.

# Monday, June 18, 2007

Server management 101 - part 1: Website directory structures and identifying folder sizes

Monday, June 18, 2007 10:24:35 AM (GMT Daylight Time, UTC+01:00)

Since getting our own dedicated server a couple of years ago we’ve had a fairly steep learning curve which, a lot of the time, has been a tad hit-and-miss (never to the detriment of our customers, I might add). Luckily we’ve had the superb support of Rackspace behind us but, as others may not be so lucky, I thought I would post up a few nuggets we’ve received over the years. As I remember more, I’ll add additional posts.

Domain/Folder organisation

One of the first issues we came across (and I’m sure many people have already got into this position) was the structure of the folders on both the server and development machines. The solution we came up with was to have a common folder - for argument’s sake let’s call it “WebsitesFolder”. Within “WebsitesFolder” you then create a new directory for each domain name and finally, within that, a folder for each subdomain i.e. www, blogs etc.

By creating a new folder for each subdomain, you’re able to quickly find the correct folder for the domain. Locally, you’re then able to store the source files outside of the site’s root, which will (or should) speed up your FTP transfer process as you won’t need to select which files to upload1. The structures might then look like this:
Development server

  • /domain.com
    • /www/
    • /subdomain/
    • /Source Imagery/
    • /Some Irrelevant Folder/
  • /domain2.com
    • /www/

Production server

  • /domain.com
    • /www/
    • /subdomain/
  • /domain2.com
    • /www/

1It might also be worth checking out SyncBackSE, an excellent FTP client that only uploads files you have changed since the last transfer. It also has the added advantage of customisable filters, allowing you to ignore source files and folders such as _notes, .cs, .vb etc. http://www.2brightsparks.com/syncback/sbse.html

Finding large directories

The other day I noticed that one of our servers was running a little low on disk space but, as far as I was aware, there was plenty of space left. As we tend to store all client data within set folders, I was able to quickly identify that it wasn’t the client folders taking all the room - so what was?

When you don’t know which folders are taking the space, there are a couple of tools you may find useful. The first I was told about was TreeSize (http://www.jam-software.com/freeware/index.shtml) -a free program that gives you a graphical representation of each folder’s usage:

It then allows you to quickly traverse the directory structure and identify the offending directory. There’s a load more information available through the easy-to-use interface but, if all you want is a number, it’s a little overkill.

The alternative to TreeSize

A heading? Just for this? Yes - this little tool is the Mac Daddy of directory size info as far as I’m concerned. It’s a free (we like free ;)) command-line tool found on Microsoft’s site called “Directory Disk Usage” - DIRUSE.

DIRUSE is really easy to use: simply load up CMD and type in:
diruse /m /* c:\
and you’ll get a report of your chosen folder’s subfolders, their sizes and a count of the files within them. OK, its iteration can be a little slow, but it gives you all the information you need quickly and easily.

The syntax is as follows:
DIRUSE [/S | /V] [/M | /K | /B] [/C] [/,] [/Q:# [/L] [/A] [/D] [/O]] [/*] DIRS

/S
Specifies whether subdirectories are included in the output.
/V
Output progress reports while scanning subdirectories.  Ignored if /S is specified.
/M
Displays disk usage in megabytes.
/K
Displays disk usage in kilobytes.
/B
Displays disk usage in bytes (default).
/C
Use Compressed size instead of apparent size.
/,
Use thousand separator when displaying sizes.
/L
Output overflows to logfile .\DIRUSE.LOG.
/*
Uses the top-level directories residing in the specified DIRS
/Q:#
Mark directories that exceed the specified size (#) with a "!".
(If /M or /K is not specified, then bytes is assumed.)
/A
Specifies that an alert is generated if specified sizes are exceeded. (The Alerter service must be running.)
/D
Displays only directories that exceed specified sizes.
/O
Specifies that subdirectories are not checked for specified size overflow.
DIRS
Specifies a list of the paths to check –you can use semicolons, commas, or spaces to separate multiple directories if required.

Note: parameters can be typed in any order, and the '-' symbol can be used in place of the '/' symbol.

Also, if /Q is specified, the return code is ONE if any directories are found that exceed the specified sizes; otherwise the return code is ZERO.

Example: diruse /s /m /q:1.5 /l /* c:\websitesfolder
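
If DIRUSE isn’t to hand, here’s a rough PowerShell equivalent of the /m example above - a sketch of mine rather than anything official - reporting each top-level folder’s size in megabytes and its file count:

# Roughly "diruse /m /* c:\websitesfolder": size in MB and file count per folder
Get-ChildItem 'c:\websitesfolder' | Where-Object { $_.PSIsContainer } |
    ForEach-Object {
        $files = @(Get-ChildItem $_.FullName -Recurse -Force | Where-Object { -not $_.PSIsContainer })
        $bytes = ($files | Measure-Object -Property Length -Sum).Sum
        if (-not $bytes) { $bytes = 0 }
        "{0}`t{1:N2} MB`t{2} files" -f $_.Name, ($bytes / 1MB), $files.Count
    }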

 

Don't forget to follow me on Twitter.