Wednesday, 28 January 2009

Using Powershell to Find Free Space in Exchange 2003 databases

One regular task for Exchange admins is reclaiming free space within Exchange mailbox stores, either after a large amount of data has been removed or after free space has simply built up over time for various reasons. For those of you who don't know, to reclaim the space on disk the mailbox store has to be taken offline and the database defragged.
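For reference, the offline defrag itself is done with the eseutil utility once the store is dismounted - the database path below is just an example, so adjust it for your environment:

```powershell
# Offline defrag of a dismounted mailbox store (example path - adjust for your environment)
& eseutil /d "D:\Exchsrvr\MDBDATA\priv1.edb"
```

Bear in mind eseutil builds a temporary copy of the database while it defrags, so make sure you have enough free disk space before you start.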

Finding good candidates for defragging, especially in a large environment with multiple databases and Exchange servers, can be a pretty tedious task. Exchange records an entry in the Application Event Log (Event ID 1221) after online maintenance has taken place which tells you how much free space is in the database - this means trawling through the event logs on each server and recording which databases you think are worth defragging. So naturally I wrote a Powershell script to save wasting time on this task!

Using WMI the script queries the Application log on each of the Exchange servers you choose, looking for 1221 events in the last day. It sucks out the name of the mailbox store and the amount of free space in the database from the message field of the event and, if the free space is greater than a particular figure (3GB in the example below), adds the info to a csv file.

You could obviously change the figure to meet your needs, and if you remove the if statement it will report on every database, giving you a report of free space across all databases.

Tip: If you have clustered mailbox servers then you only need to point the script at one of the servers in the cluster since it will contain all of the event log entries for each server in the cluster.

#Check to see if csv exists and if so remove it
If (Test-Path "FreeSpaceGreaterThan1GB.csv"){
Remove-Item "FreeSpaceGreaterThan1GB.csv"
}

#Set the columns for the csv file
$rows = "Servername," + "Mailbox Store," + "Free Space (MB),"
Add-Content FreeSpaceGreaterThan1GB.csv $rows

$ExchServer = 'server1','server2'
foreach ($Server in $ExchServer){

#Get the time 1 day ago in the right format for WMI query
$WmidtQueryDT = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime([DateTime]::Now.AddDays(-1))
#Perform WMI query of Event 1221 in Application log in the last day
$1221 = Get-WmiObject -computer $Server -query ("Select * from Win32_NTLogEvent Where Logfile='Application' and Eventcode = '1221' and TimeWritten >='" + $WmidtQueryDT + "'")

foreach ($event in $1221){

#Get the name of the Mailbox Store
$MBXStoreLocationStart = $event.Message.IndexOf("Storage Group") + 16
$MBXStoreLocationFinish = $event.Message.IndexOf("has") - 2
$MBXStoreLocation = $event.Message.SubString($MBXStoreLocationStart, $MBXStoreLocationFinish - $MBXStoreLocationStart)

#Get the free space figure and convert it to an integer
$MBLocationStart = $event.Message.IndexOf("has") + 4
$MBLocationFinish = $event.Message.IndexOf("megabytes") - 1
$MBLocation = $event.Message.SubString($MBLocationStart, $MBLocationFinish - $MBLocationStart)
$result = [int]$MBLocation

$ComputerName = $event.ComputerName

#If free space > 3GB, add the details to the csv file
if ($result -ge 3072){

$rowline = "$ComputerName," + "$MBXStoreLocation," + "$MBLocation,"
Add-Content FreeSpaceGreaterThan1GB.csv $rowline
}
}
}


Sunday, 25 January 2009

Using Powershell to Monitor VMware Guests - on a Budget...

...i.e. a budget of £0.

(Update 28/01/09 - some feedback about this post and the reason we are not using the built in alerts in the VI client is because the CPU alerts in this case were not granular enough for us.)

So this all stemmed from trying to track down which process was causing particular servers' CPU to hit 100% for a period. First of all my colleague and Get-Scripting co-host Alan Renouf and I traded a script back and forth, which ended up as the CheckHighCPU function - it is now pretty cool and comes back with a list of processes sorted by how much CPU they are using and, very importantly for our circumstances, the owner of each process.

The servers in question all belong to a particular cluster in ESX. Rather than constantly having to monitor the console, wait for a VM to turn red and then run the script to track down the process, we decided to try to monitor them with a Powershell script, kick off the CheckHighCPU function when a server's CPU hit 100% for a significant enough period, and email a warning through with the process details - and so the script below was born.

OK, it's not Operations Manager and to be honest it's not really production quality, but it does the job we need it to.

The VI Toolkit from VMware is a great set of cmdlets you can plug into your Powershell console to manage your VMware environment. You can use the Get-Cluster and Get-VM cmdlets to return a list of all the VMs in that cluster as objects. You can then use the very handy Get-Stat cmdlet to retrieve performance data for each VM.

In this case we check the cpu.usage.average statistic over a period of the last few minutes (watch out for the -IntervalMins parameter - it can produce a period slightly different from what you would expect) and if it's over 99% run the CheckHighCPU function and send the results by email.

We then make the script sleep for a short time period so that we are not constantly bombarded with alerts if a warning is triggered.

Obviously if we wanted to make it production quality we would add in some error checking and a test to see if an alert had recently been sent, but for the time being it's doing a great job for the requirements that exist.

You could obviously re-use the script below to monitor different statistics offered by Get-Stat, like disk or memory.
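For instance, a quick sketch of a memory version of the check - mem.usage.average is the equivalent counter, the 90% threshold is just an illustration, and $vm is assumed to have been retrieved with Get-VM as in the script below:

```powershell
# Same approach with a memory counter instead of CPU - illustrative threshold
$memStats = Get-Stat -Entity $vm -IntervalMins 2 -Stat mem.usage.average -MaxSamples 1
if ($memStats.Value -ge 90){
    Write-Host "Warning! High memory on $($vm.Name)" -ForegroundColor red
}
```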

Function EmailWarning {
param ($ServerName,$Attachment)
#Email warning
Write-Output "Creating E-Mail Structure"

$smtpServer = "servername"

$msg = new-object Net.Mail.MailMessage
$att = new-object Net.Mail.Attachment($attachment)
$smtp = new-object Net.Mail.SmtpClient($smtpServer)

$msg.From = "sender"
$msg.To.Add("recipient")
$msg.Subject = "Server Warning - High CPU on $Servername"
$msg.Body = "$Servername has a CPU value of $HighCPU %"
$msg.Attachments.Add($att)

Write-Output "Send E-Mail"
$smtp.Send($msg)
}


Function CheckHighCPU {
param ($Target)

$procs_total = Get-WmiObject -Class Win32_PerfRawData_PerfProc_Process -Filter 'name="_total"' -ComputerName $Target
$procs = Get-WmiObject -Class Win32_PerfRawData_PerfProc_Process -Filter 'name<>"_total"' -ComputerName $Target

[int64]$totalpercentuser = 0
foreach ($proc in $procs_total){
$totalpercentuser = $totalpercentuser + $proc.PercentUserTime
}

[decimal]$perc = [System.Convert]::ToDecimal($totalpercentuser)

$myCol = @()
Foreach ($proc in $procs){
$proc_perct = (($proc.PercentUserTime / $perc) * 100)
if ($proc_perct -gt 1){
$Process = Get-WmiObject win32_process -ComputerName $target | where {$_.ProcessID -eq $proc.IDProcess}
$MYInfo = "" | select-Object Name, CPUUsage,Owner, ProcessID
$MYInfo.Name = $proc.Name
$MYInfo.ProcessID = $proc.IdProcess
$MYInfo.CPUUsage = [Math]::Round($proc_perct, 0)
$MYInfo.Owner = $process.GetOwner().user
$myCol += $MYInfo
}
}

$myCol | Sort-Object CPUUsage -Descending | Out-File $file
EmailWarning $VMname $file
}

Connect-VIServer servername
$vms = Get-Cluster clustername | get-vm
$time = Get-Date

do {

foreach ($vm in $vms){

$VMname = $vm.Name
$filename = $VMname + '.txt'
$file = "C:\Scripts\$filename"
$stats = Get-Stat -entity $vm -IntervalMins 2 -stat cpu.usage.average -MaxSamples 1
write-host $VMname

if ($stats.value -ge 99){
$HighCPU = $stats.value
Write-Host "Warning!" -ForegroundColor red


Start-Sleep -Seconds 30

until ($time.hour -ge 17)

Presenting at MM&M User Group UK - Wednesday 18th February

So I was lucky enough to receive an invite from Nathan Winters who runs the MM&M User Group UK (aka Exchange) to present at their next meeting on Wednesday 18th February at Microsoft in London.

It will be an evening around using Powershell to manage Exchange; the agenda is below:

18:15 - 18:40 Arrival

18:40 - 18:45 Introduction to speakers and the aims of the group

18:45 - 19:30 1st session; Jonathan Medd, Introduction to PowerShell and Using PowerShell to manage Exchange 2003!

19:30 - 19:50 Food!

19:50 - 20:45 2nd session; Will Rawlings, Causing no harm with PowerShell, and using PowerShell on a large Exchange environment

20:45 - 21:00 Summing up and suggestions for future meetings.

21:00 The End!

If you want to attend you can sign up here

Tuesday, 13 January 2009

UK Powershell User Group - January Meeting

The January meeting of the UK Powershell User Group takes place Wednesday 21st Jan 2009 6.30pm GMT.

Memphis Room

Building 3

Microsoft Campus TVP

Reading UK

We have a Live Meeting with Jeffrey Snover talking about PowerShell v2

Pizza break

Then Jeremy Pack from HP will be doing a PowerShell demo - exact topic to be confirmed

It should be a great evening. Jeffrey is obviously the man to ask if you have any burning questions about Powershell, particularly V2 for this event. Jeremy is a long time member of the user group and is incredibly knowledgeable about Powershell so I'm really looking forward to seeing what he is going to talk about.

If you want to turn up please contact Richard Siddaway at the below website, where you can also find details of the webcast if you wish to watch from afar.

Sunday, 11 January 2009

Modifying AD accounts with Powershell after an Exchange 2003 dial-tone restore

Recently I've been testing out some different disaster recovery scenarios for Exchange 2003, one of which involved a dial-tone method - i.e. create some new mailbox servers with blank databases to get users up and running quickly and then merge the restored data back in later. One of the types of dial tone method we used was to create new server names rather than re-use existing Exchange server names.

So for example to re-create a four node (3 active, 1 passive) cluster with new names, instead of

ExchangeServer1, ExchangeServer2, ExchangeServer3

you would now use something like

ExchangeServer1New, ExchangeServer2New, ExchangeServer3New

Then you would need to amend the AD user accounts for users on those Exchange servers to point to the new locations - the following properties need to be changed:

homeMDB
msExchHomeServerName
homeMTA
None of these properties can be changed through ADUC; you would need to use ADSIEdit if you wanted to use a GUI. Of course those smart people among you would choose to use Powershell anyway.

So naturally I turned to my trusty friend the Quest AD cmdlets to help me out.

First of all we get all the users who have a mailbox based on one of the original servers; depending on your naming convention you may need to adjust this filter to make sure you are matching the correct people. The three properties mentioned are not returned by default from Get-QADUser so we have to specify them.

We then loop through each user and, using the Switch statement, if we match ExchangeServer1, 2 or 3 we amend the text of each variable to be the new Exchange servername (note: homemta will be the same for all of these users) and then use the Set-QADUser cmdlet to change these properties on the account.

$users = Get-QADUser -ldapFilter '(msExchHomeServerName=*ExchangeServer*)' -IncludedProperties homemdb,msexchhomeservername,homemta -sizelimit 0

foreach($user in $users){

$homemdb = $user.homemdb
$msexchhomeservername = $user.msexchhomeservername
$newhomemta = 'CN=Microsoft MTA,CN=ExchangeServer1New,CN=Servers,CN=Exchange,CN=Administrative Groups,CN=Springfield,CN=Microsoft Exchange,CN=Services,CN=Configuration,DC=springfield,DC=local'

switch -wildcard ($homemdb)
{
"*ExchangeServer1*" {$newhomemdblocation = $homemdb.replace("ExchangeServer1","ExchangeServer1New"); $newmsexchhomeservername = $msexchhomeservername.replace("ExchangeServer1","ExchangeServer1New"); Set-QADUser $user -objectAttributes @{homemdb=$newhomemdblocation;msexchhomeservername=$newmsexchhomeservername;homemta=$newhomemta}; break}
"*ExchangeServer2*" {$newhomemdblocation = $homemdb.replace("ExchangeServer2","ExchangeServer2New"); $newmsexchhomeservername = $msexchhomeservername.replace("ExchangeServer2","ExchangeServer2New"); Set-QADUser $user -objectAttributes @{homemdb=$newhomemdblocation;msexchhomeservername=$newmsexchhomeservername;homemta=$newhomemta}; break}
"*ExchangeServer3*" {$newhomemdblocation = $homemdb.replace("ExchangeServer3","ExchangeServer3New"); $newmsexchhomeservername = $msexchhomeservername.replace("ExchangeServer3","ExchangeServer3New"); Set-QADUser $user -objectAttributes @{homemdb=$newhomemdblocation;msexchhomeservername=$newmsexchhomeservername;homemta=$newhomemta}; break}
default {"Nothing for this user"}
}
}

I was also interested to see the resulting performance of this script and was pleasantly surprised to see it change 6000+ accounts in only 10 mins.
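If you want to spot-check an account after the run, something like the below shows the three attributes in question (the username is a made-up example):

```powershell
# Check the new values on a single account - 'jbloggs' is a hypothetical username
Get-QADUser jbloggs -IncludedProperties homemdb,msexchhomeservername,homemta |
Format-List name,homemdb,msexchhomeservername,homemta
```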

A sidenote to this method is that you won't actually see the mailboxes appear in Exchange System Manager until either they receive an email or a user logs on to them. To prove that this method had worked I created a quick Distribution Group, used the below one-liner to populate it with all of the above users and then sent an email to this group.

Get-QADUser -ldapFilter '(msExchHomeServerName=*ExchangeServer*)' -sizelimit 0 | Add-QADGroupMember TestGroup

There are of course many different ways to carry out Exchange DR, but this proved a useful exercise.

Tuesday, 6 January 2009

Powershell Active Directory One-Liners

Recently I blogged about some scripts I left behind at my previous employment for managing AD - really a lot of them were just quick one-liners. Not that that's necessarily a bad thing; one of the best things for me about Powershell is the way you can get great information with very little effort. Of course I am using my good friend the Quest AD cmdlets.

I thought I'd share a few of them:

Find Expired Users:

On the theme of cleaning out AD, find user accounts which have expired.

Get-QADUser -searchroot 'domain.local/resources/users' -SizeLimit 0 -ldapFilter '(pwdlastset=0)' | ft name,passwordlastset

Find Users Not Logged in Since X Days:

On the same theme, supply X ('how many days to go back') and find users who haven't logged in during that time. (OK, I cheated a bit on the one-liner with this one.)

$now=get-date; $daysSinceLastLogon = X; Get-QADUser -sizeLimit 0 -SearchRoot 'domain.local/resources/users' | where {$_.lastlogontimestamp.value -and (($now-$_.lastlogontimestamp.value).days -gt $daysSinceLastLogon)} | ft name,lastlogontimestamp

Note: X needs to be more than 14 days to allow for the lastlogontimestamp attribute to have replicated.

Find Users Whose Password is set to Not Expire:

Keep tabs on those naughty administrators who think they can exempt themselves from the corporate password policy - you know who you are!

Get-QADUser -Sizelimit 0 -SearchRoot 'domain.local/resources/users' -PasswordNeverExpires $True | ft name

How Many Users in Active Directory?

Need to keep track of an expanding user population? Need to figure out how many CALs you need? Easy.

Get-QADUser -DontUseDefaultIncludedProperties -SearchRoot 'domain.local/resources/users' -SizeLimit 0 | Measure-Object


Monday, 5 January 2009

PowerGUI webcast - Exchange 2003 Powerpack demo

I was recently invited to record a webcast by the Product Manager at Quest for PowerGUI, Darin Pendergraft, demoing the Exchange 2003 Powerpack I made for PowerGUI.

They came up with the idea to make some videos / webcasts giving some community members the opportunity to show what PowerGUI can do. A lot of people primarily use it only as a script editor, but the management console side of things is brilliant once you get into it - hopefully these examples will help inspire more people to make some powerpacks.

I had a lot of fun recording the interview, it was funny to be on the other end of the questions for once, normally I'm asking them for the Get-Scripting podcast.

It runs for about 15 - 20 mins, I hope you enjoy it.

Get-Scripting Podcast Episode 6

So we got Episode 6 of the Get-Scripting Podcast out last week. You can get it from here:

Download it here, subscribe in iTunes or via a different feed reader

Normally Alan, Matt and I all arrange to be in the same place to record the show. Given we live in completely different parts of the UK this is not easy and is probably the main reason we keep it to one show per month. We could do a Skype type thing, but we prefer to keep it a face-to-face recording. Now that I work in the same office as Alan, though, this time we took the opportunity to record the show during a lunchtime and I got Matt to post-edit the result when we met up for family time over Christmas.

End result: I think the show was a bit more relaxed because we just let the tape roll and edited afterwards, rather than stopping and starting (normally we sometimes re-record bits); logistically it was much easier; it still took about the same time to edit.

The interview with Ben Pearce which I had recorded a few weeks earlier was an absolute blast. Ben is such a fun guy to be around and made for a really easy interview because he doesn't stop talking; I think we got some great tips from him for Powershell beginners too.