Making Citrix Stats Work for You – part 3

This post is part of a 6 part series. Jump to [part 1] [part 2] [part 4] [part 5] [part 6].

Okay put your propeller hats on …

As I mentioned in my last post, I am going to run the PowerShell script periodically to gather new data.  This is easily accomplished with a batch file and a Windows scheduled task.  Here’s the batch file:

@echo off
powershell.exe -noninteractive w:\qfarm\Count-CitrixSession.ps1
exit

I’m going to run this batch file every 3 minutes with Windows Scheduled Tasks.  It takes about a minute to run, so we’ll get updated data roughly every 4 minutes.   To import this data, we’ll have to create a database and table to hold it.   In this case, I’m using MS SQL Server 2005, so I’ll open Management Studio.
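If you prefer to create the task from the command line instead of the Scheduled Tasks GUI, something along these lines should work (the task name and the batch-file path are placeholders — substitute your own):

```bat
schtasks /create /tn "CountCitrixSessions" /tr "w:\qfarm\run-count.bat" /sc minute /mo 3
```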

bi-part1

Next, we’ll create a database and table (in this example, the database is already created).  The table will consist of a time/date stamp field and 4 fields that are from the output of the PoSH script.

bi-part22

The little bit of cleverness in this table lies with the msgdatetime column.  Its default value (which you set when you create the table) is the getdate() function in MS SQL.  That way, it will always get the current date and time that the row was created.  The other columns will be a plain text data type, which we will have to convert when we do certain queries.
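As a sketch, the table definition might look like the following.  Only the msgdatetime column (with its getdate() default) comes straight from the post; the table name, the other column names, and the sizes are my own guesses — adjust them to match your environment:

```sql
CREATE TABLE dbo.CitrixSessions (
    msgdatetime  DATETIME     NOT NULL DEFAULT (GETDATE()),  -- auto-stamped on insert
    username     VARCHAR(255) NULL,
    appname      VARCHAR(255) NULL,
    servername   VARCHAR(255) NULL,
    sessionstate VARCHAR(50)  NULL
);
```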

bi-part33

Now we have the text file that’s being updated every 4 minutes (approximately) and a table to hold the data.  I’m going to use Visual Studio 2005 to actually do the import.   Run VS and create a new Integration Services Project.

bi-part4

Now drag a Data Flow Task into your design window.

bi-part5

Double-click the new Data Flow Task and drag over a flat file source (to read the text files) and an OLE DB Destination (to communicate with the database).

bi-part6

Double-click the Flat File Source and create a new Flat file connection manager.  Give the connection manager a name and click Browse to point it to the text file that the PoSH script is creating.

bi-part7

Now, we have to set the properties of the connection manager so that it correctly parses the text file.  This will allow easy import of the data into our database table.   In this case, we can use a semicolon as the column delimiter and {CR}{LF} as the row delimiter.  This gives us 4 columns: the user name, the application name, the server name, and the session state.
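For reference, a semicolon-delimited row in the text file would look something like this (the values here are made up for illustration):

```
jdoe;Microsoft Word 2003;CTXSRV01;Active
```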

bi-part8

Next, drag the green arrow from the Flat File Source to the OLE DB Destination.  Double-click the OLE DB Destination and create a new OLE DB Connection manager.  Enter the connection information for the database server and select the database we want to use.

bi-part9

Now we can select the table we created before.

bi-part10

Select Mappings to correlate the fields in the text file with the fields in the table.

bi-part11

Now, to test the data flow, click the green triangle in the toolbar or use the Debug menu and select Start Debugging.  You should get confirmation that a number of rows were imported.  A quick query of the table in SQL can confirm that it has data.
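That quick sanity check might look like this in Management Studio (again assuming the table is named CitrixSessions — use whatever name you created):

```sql
-- most recent imports first, thanks to the msgdatetime default
SELECT TOP 10 * FROM dbo.CitrixSessions ORDER BY msgdatetime DESC;
```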

bi-part12

The next post will deal with uploading the project to a SQL Server and creating a schedule that runs automatically, importing new data into the database on a recurring basis.

Thanks,
Alain

Making Citrix Stats Work for You – part 2

This post is part of a 6 part series. Jump to [part 1] [part 3] [part 4] [part 5] [part 6].

For the next part of this series, I just want to get the data I’m looking for into text files for later use. Here’s the modified PoSH script (with comments) with the text file output:

# Create log files
$logonly = $false

# The $logonly variable is set to $false to prevent output to the screen.  If you wish to get visual feedback while the script runs, set it to $true.
$sessLog = "w:\qfarm\totSessions.txt"
$appLog = "w:\qfarm\top20apps.txt"
$serverLog = "w:\qfarm\totSessionsServer.txt"
$multiLog = "w:\qfarm\multipleSessions.txt"
New-Item $sessLog -Type file -Force | Out-Null
New-Item $appLog -Type file -Force | Out-Null
New-Item $serverLog -Type file -Force | Out-Null
New-Item $multiLog -Type file -Force | Out-Null

# Here, I'm defining several different text files depending on the data.  This could just as easily be one
# file, but for my purposes it will be easier if they are separate.  The intention is to run this script
# periodically and dump the info into a database, so we do not need to keep the data.  Therefore, I am
# forcing the script to clear out old data before we write to the files again.
$livesessions = 0
$disconnected = 0
$farm = New-Object -com "MetaframeCOM.MetaframeFarm"
$farm.Initialize(1)

# Load Up Array for a snapshot of current sessions
$sessionAry = @($farm.Sessions | select UserName,AppName,ServerName,SessionState)

foreach ($sess in $sessionAry) {
if ($sess.SessionState -eq "5") {$disconnected = $disconnected + 1}
else {$livesessions++}
}

if ($logonly) {Write-Host "The number of active citrix sessions is" $livesessions -fore red}
if ($logonly) {Write-Host "The number of disconnected citrix sessions is" $disconnected -fore red}
Add-Content $sessLog "$livesessions"
Add-Content $sessLog "$disconnected"

if ($logonly) {Write-Host " "}

# Displays a list of published apps and the number of users on each
if ($logonly) {Write-Host "Total users on top 20 citrix applications" -fore yellow}
$sessionAry | group AppName | sort Count -desc | select Count,name -first 20 | ft -auto | Out-File $appLog

if ($logonly) {Write-Host " "}

# List of citrix servers and total number of sessions on each one
if ($logonly) {write-host "Total sessions on each citrix server" -fore yellow}
$sessionAry | group ServerName | sort name | select Count,Name | ft -auto | Out-File $serverLog

if ($logonly) {write-host " "}

# To see which users have more than one session open
if ($logonly) {write-host "First 20 Users with more than one citrix session" -fore yellow}
$sessionAry | group UserName | Sort Count -desc | select Count,Name -first 20 | ft -auto | Out-File $multiLog

That’s it. I’ll cover importing this data into a database in the next post in this series.

Thanks,
Alain

Making Citrix Stats Work for You – part 1

This post is part of a 6 part series.  Jump to [part 2] [part 3] [part 4] [part 5] [part 6].

Overview: There’s quite a bit of data that we can gather via queries to MFCom.  With this series I want to demonstrate how to use PowerShell, MS SQL, SQL Reporting Services and Visual Studio to gather real-time stats and present them in a dashboard that is easy to read and even easier to present to management.
I’m presenting today an edited PowerShell script that I grabbed from http://synjunkie.blogspot.com/2008/12/powershell-retrieving-useful-citrix.html.  Syn is also writing a hacking Citrix series of articles that is proving to be very interesting.

Here’s the original script:

#Count-CitrixSession.ps1

# Count citrix sessions and other useful info
$farm = new-Object -com "MetaframeCOM.MetaframeFarm"
$farm.Initialize(1)

# Displays a list of published apps and the number of users on each
write-host "Total users on each citrix application" -fore yellow
$farm.sessions | select UserName,AppName | group AppName | Sort Count -desc | select Count,Name | ft -auto
$livesessions = @($farm.sessions).Count
write-host "The number of current citrix sessions is" $livesessions -fore red
write-host " "

# List of Citrix servers and total number of sessions on each one
write-host "Total sessions on each citrix server" -fore yellow
$farm.sessions | select ServerName,AppName | group ServerName | sort name | select Count,Name | ft -auto
write-host " "

# To see which users have more than one session open
write-host "First 15 Users with more than one citrix session" -fore yellow
$farm.sessions | select UserName,AppName | group UserName | Sort Count -desc | select Count,Name -first 15 | ft -auto

This script is useful, but I found that it took several minutes to run in our environment because it iterates through all our Citrix sessions 3 times.  To speed the script up, and to ensure the data did not change while the script was running, I loaded everything into an array and then replicated the data output.

$livesessions = 0
$disconnected = 0
$farm = New-Object -com "MetaframeCOM.MetaframeFarm"
$farm.Initialize(1)

# Load Up Array for a snapshot of current sessions in CITGO
$sessionAry = @($farm.Sessions | select UserName,AppName,ServerName,SessionState)
foreach ($sess in $sessionAry) {
if ($sess.SessionState -eq "5") {$disconnected = $disconnected + 1}
else {$livesessions++}
}
Write-Host "The number of active citrix sessions is" $livesessions -fore red
Write-Host "The number of disconnected citrix sessions is" $disconnected -fore red
Write-Host " "

# Displays a list of published apps and the number of users on each
Write-Host "Total users on top 20 citrix applications" -fore yellow
$sessionAry | group AppName | sort Count -desc | select Count,name -first 20 | ft -auto
Write-Host " "

# List of citrix servers and total number of sessions on each one
write-host "Total sessions on each citrix server" -fore yellow
$sessionAry | group ServerName | sort name | select Count,Name | ft -auto
write-host " "

# To see which users have more than one session open
write-host "First 20 Users with more than one citrix session" -fore yellow
$sessionAry | group UserName | Sort Count -desc | select Count,Name -first 20 | ft -auto

Now the script runs 3 times as fast.

Thanks,
Alain
