Making Citrix Stats Work for You – part 4

This post is part of a 6 part series. Jump to [part 1] [part 2] [part 3] [part 5] [part 6].

To sum up how we got here:

  1. We used PowerShell to gather some specific session stats from Citrix MFCom and output them to a text file. [part 1] [part 2]
  2. We then created a database and table to hold this data. [part 3]
  3. Following this, we created a job in Visual Studio to parse the text file and insert it into the above table. [part 3]

Now, I will show how to save the Visual Studio (SSIS) package directly to an MS SQL server and have that server run a job to insert new data on a periodic basis.

With your Visual Studio package open, go to File and select Save Copy of Package As…

Select SQL Server for the package location, enter the server name, and set the authentication type and credentials for the DB owner.

vs-1

Choose and name a location for the Package Path and hit OK.

vs-2

Change the protection level to “Rely on server storage and roles for access control” and click OK to save.

Now log in to the SQL server with Management Studio and expand SQL Server Agent and Jobs.

vs-31

Right-click on Jobs and select New Job.  Select the Steps page and click New.

vs-4

Hit the Type drop-down and select SQL Server Integration Services Package.  Enter the server name and browse for the package; it will have the same name you gave it when you exported it to the SQL server.  Give the step a name and click OK.

vs-51

Now select the Schedules page and create a schedule to run this job every 5 minutes.

vs-6
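If you would rather script the job than click through the Steps and Schedules dialogs, T-SQL along these lines should produce the same result.  This is only a sketch: the job, step, package, and server names below are placeholders I made up for this example, so substitute whatever names you actually used when you saved the package.

USE msdb;
GO

-- Placeholder job name; call it whatever you like.
EXEC dbo.sp_add_job @job_name = N'Import Citrix Stats';

-- One step that runs the SSIS package we saved to the server.
-- "CitrixStatsImport" and "MYSQLSERVER" are placeholders for the package path and server name.
EXEC dbo.sp_add_jobstep
    @job_name  = N'Import Citrix Stats',
    @step_name = N'Run Citrix import package',
    @subsystem = N'SSIS',
    @command   = N'/SQL "\CitrixStatsImport" /SERVER "MYSQLSERVER"';

-- Run the job every 5 minutes, all day.
EXEC dbo.sp_add_jobschedule
    @job_name             = N'Import Citrix Stats',
    @name                 = N'Every 5 minutes',
    @freq_type            = 4,   -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 4,   -- the sub-day unit is minutes
    @freq_subday_interval = 5;

-- Point the job at the local server so the Agent actually picks it up.
EXEC dbo.sp_add_jobserver @job_name = N'Import Citrix Stats';
GO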

Click OK, name the job, and open the Job Activity Monitor to confirm the job kicks off.  If you get a Succeeded result, you can query the table to ensure it is receiving new data (a sample query is below).  So now the SQL server is kicking off the Visual Studio data flow job every 5 minutes to read the flat text file and insert the data into the table.  Next post, we’ll pull the data out of the table in some meaningful ways and place it in MS SQL Reporting Services.
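For that quick check, I just grab the most recent rows.  The table and column names here are only the placeholders I use in this example, so match them to the table you created in part 3:

SELECT TOP 20 msgdatetime, username, appname, servername, sessionstate
FROM dbo.CitrixSessions
ORDER BY msgdatetime DESC;

If the msgdatetime values keep climbing every 5 minutes, the job is doing its work.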

Thanks,
Alain

Making Citrix Stats Work for You – part 3

This post is part of a 6 part series. Jump to [part 1] [part 2] [part 4] [part 5] [part 6].

Okay put your propeller hats on …

As I mentioned in my last post, I am going to run the PowerShell script periodically to gather new data.  This is easily accomplished with a batch file and a Windows scheduled task.  Here’s the batch file:

@echo off
powershell.exe -noninteractive w:\qfarm\Count-CitrixSession.ps1
exit

I’m going to run this batch file every 3 minutes with Windows Scheduled Tasks.  It takes about a minute to run, so we’ll get updated data roughly every 4 minutes.  To import this data, we will have to create a database and table to hold it.  In this case, I’m using MS SQL 2005, so I’ll open Management Studio.

bi-part1

Next, we’ll create a database and table (in this example, the database is already created).  The table will consist of a time/date stamp field and 4 fields that are from the output of the PoSH script.

bi-part22

The little bit of cleverness in this table lies with the msgdatetime column.  Its default value (which you set when you create the table) is the getdate() function in MS SQL.  That way, it will always get the current date and time that the row was created.  The other columns will be a plain text data type, which we will have to convert when we do certain queries.

bi-part33
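If you would rather script the table than build it in the designer, the T-SQL equivalent looks roughly like this.  The table and column names are only placeholders I use for this example (your screenshots may differ), and I’m using a generous varchar for the plain text columns:

CREATE TABLE dbo.CitrixSessions (
    msgdatetime  datetime NOT NULL DEFAULT (getdate()),  -- stamps each row when it is inserted
    username     varchar(255) NULL,
    appname      varchar(255) NULL,
    servername   varchar(255) NULL,
    sessionstate varchar(255) NULL
);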

Now we have the text file that’s being updated every 4 minutes (approximately) and a table to hold the data.  I’m going to use Visual Studio 2005 to actually do the import.   Run VS and create a new Integration Services Project.

bi-part4

Now drag a Data Flow Task into your design window.

bi-part5

Double-click the new Data Flow Task and drag over a flat file source (to read the text files) and an OLE DB Destination (to communicate with the database).

bi-part6

Double-click the Flat File Source and create a new Flat file connection manager.  Give the connection manager a name and click Browse to point it to the text file that the PoSH script is creating.

bi-part7

Now, we have to set the properties of the connection manager to correctly parse the text file.  This will allow easy import of the data into our database table.  In this case, we can use a semicolon as the column delimiter and {CR}{LF} as the row delimiter.  This gives us 4 columns with the user name, the application name, the server name, and the session state.

bi-part8

Next, drag the green arrow from the Flat File Source to the OLE DB Destination.  Double-click on the OLE DB Destination and create a new OLE DB connection manager.  Enter the connection information (server and authentication) and select the database we want to use.

bi-part9

Now we can select the table we created before.

bi-part10

Select Mappings to correlate the fields in the text file with the fields in the table.

bi-part11

Now, to test the data flow, click the green triangle in the toolbar or use the Debug menu and Start Debugging.  You should get confirmation that a number of rows were imported.  A quick query of the table in SQL can confirm that it has data.

bi-part12
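Something as simple as this will do for that quick query; again, the table name is just the placeholder from my example:

SELECT COUNT(*) AS total_rows, MAX(msgdatetime) AS last_import
FROM dbo.CitrixSessions;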

The next post will deal with uploading the project to an SQL server and creating a job that runs automatically, importing new data into the database on a recurring schedule.

Thanks,
Alain

Making Citrix Stats Work for You – part 2

This post is part of a 6 part series. Jump to [part 1] [part 3] [part 4] [part 5] [part 6].

For the next part of this series, I just want to get the data I’m looking for into text files for later use. Here’s the modified PoSH script (with comments) with the text file output:

# Screen output toggle.  The $logonly variable is set to $false to prevent output to the
# screen.  If you wish to get visual feedback while the script runs, set it to $true.
$logonly = $false

# Create log files
$sessLog = "w:\qfarm\totSessions.txt"
$appLog = "w:\qfarm\top20apps.txt"
$serverLog = "w:\qfarm\totSessionsServer.txt"
$multiLog = "w:\qfarm\multipleSessions.txt"
if (Test-Path $sessLog) {New-Item $sessLog -Type file -force | Out-Null}
if (Test-Path $appLog) {New-Item $appLog -Type file -force | Out-Null}
if (Test-Path $serverLog) {New-Item $serverLog -Type file -force | Out-Null}
if (Test-Path $multiLog) {New-Item $multiLog -Type file -force | Out-Null}

# Above, I'm defining several different text files, one for each type of data.  This could just
# as easily be one file, but for my purposes it will be easier if they are separate.  The
# intention is to run this script periodically and dump the info into a database, so we do not
# need to keep old data.  Therefore, I am forcing the script to clear out the old contents
# before we write to the files again.
$livesessions = 0
$disconnected = 0
$farm = New-Object -com "MetaframeCOM.MetaframeFarm"
$farm.Initialize(1)

# Load Up Array for a snapshot of current sessions
$sessionAry = @($farm.Sessions | select UserName,AppName,ServerName,SessionState)

foreach ($sess in $sessionAry) {
    if ($sess.SessionState -eq "5") {$disconnected = $disconnected + 1}
    else {$livesessions = $livesessions + 1}
}

if ($logonly) {Write-Host "The number of active citrix sessions is" $livesessions -fore red}
if ($logonly) {Write-Host "The number of disconnected citrix sessions is" $disconnected -fore red}
Add-Content $sessLog "$livesessions"
Add-Content $sessLog "$disconnected"

if ($logonly) {Write-Host " "}

# Displays a list of published apps and the number of users on each
if ($logonly) {Write-Host "Total users on top 20 citrix applications" -fore yellow}
$sessionAry | group AppName | sort Count -desc | select Count,name -first 20 | ft -auto | Out-File $appLog

if ($logonly) {Write-Host " "}

# List of citrix servers and total number of sessions on each one
if ($logonly) {write-host "Total sessions on each citrix server" -fore yellow}
$sessionAry | group ServerName | sort name | select Count,Name | ft -auto | Out-File $serverLog

if ($logonly) {write-host " "}

# To see which users have more than one session open
if ($logonly) {write-host "First 20 Users with more than one citrix session" -fore yellow}
$sessionAry | group UserName | Sort Count -desc | select Count,Name -first 20 | ft -auto | Out-File $multiLog

That’s it. I’ll cover importing this data into a database in the next post in this series.

Thanks,
Alain