PowerShell: List as ItemsSource for WPF DataGrid – Does Not Work

I have been working on a PowerShell module which required a GUI. I had already built, tested and validated all of the functions, so this should have been a simple case of creating a pretty little front end for something functional.
The problem I had is that my code defines some custom types. One of these types includes a property that is a list of another custom type: a parent-child relationship. This was defined in C# and included in the PS code. It looked like this (with a whole load of extra stuff removed):

$TypeDef = @"
public class LogEntry {
    public System.DateTime LogTime;
    public string LogDetail;
}

public class UserDataDetails {
    public string samAccountName;
    public System.Collections.Generic.List<LogEntry> Log;
}
"@
Add-Type -TypeDefinition $TypeDef

Each UserDataDetails object can have a number of LogEntry objects associated with it. This is fine, and all other code was able to add, query, sort and display as required.
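For illustration, creating one of these parent objects and attaching a log entry looks something like this (a minimal sketch; the account name is made up):

$Entry = New-Object LogEntry
$Entry.LogTime = Get-Date
$Entry.LogDetail = "Account created"

$User = New-Object UserDataDetails
$User.samAccountName = "jbloggs"   # hypothetical account
$User.Log = New-Object "System.Collections.Generic.List[LogEntry]"
$User.Log.Add($Entry)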
I am using WPF to create a simple form with a DataGrid to display this list of information. The WPF definition looked like this:


Add-type -AssemblyName PresentationCore
Add-type -AssemblyName PresentationFramework
[xml]$XAML = @"
<Window x:Class="System.Windows.Window"
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
Title="MainWindow" Width="867" WindowStartupLocation="CenterScreen" WindowState="Maximized">
<Grid>
<DataGrid Name="dgv1" Margin="3" VerticalAlignment="Top" CanUserReorderColumns="False" Grid.ColumnSpan="2" MinHeight="50" AutoGenerateColumns="True">
<DataGrid.Columns>
<DataGridTextColumn Header="Test" Binding="{Binding Path=LogDetail}"/>
</DataGrid.Columns>
</DataGrid>
</Grid>
</Window>
"@
$Reader=(New-Object System.Xml.XmlNodeReader $xaml)
$Window=[Windows.Markup.XamlReader]::Load( $Reader )
$dgv1 = $Window.FindName("dgv1")

Nothing complex, a single DataGrid on a form, with one additional column bound to the LogDetail field.
So, this is where I lost a couple of hours of my life. Try this code, and you will see the application display two log entries:

$Log1 = New-Object LogEntry
$Log1.LogTime = Get-Date
$Log1.LogDetail = "Test Log 1"
$Log2 = New-Object LogEntry
$Log2.LogTime = Get-Date
$Log2.LogDetail = "Test Log 2"

$LogCollection = @()
$LogCollection += $Log1
$LogCollection += $Log2

$dgv1.ItemsSource = $LogCollection

However, supplying a List as the ItemsSource, such as this:

$LogList = New-Object System.Collections.Generic.List``1[LogEntry]
$LogList.Add($Log1)
$LogList.Add($Log2)

$dgv1.ItemsSource = $LogList

And we get nothing:

I have tried various conversions to arrays to try and solve this, but I have not found any direct way of supplying a list to the ItemsSource property.
The workaround that I have currently come up with is:

$BuildArray = @()
$LogList | % {$BuildArray += $_}
$dgv1.ItemsSource = $BuildArray

This works and supplies the correct values. It just seems a shame to do it this way round and have to build a new collection when the data already exists in an enumerable state.
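A shorter route to the same result, assuming an array snapshot of the list is all that is needed, is to let the list (or PowerShell) do the copy. These are untested sketches, but both produce a plain array like the workaround above:

$dgv1.ItemsSource = $LogList.ToArray()
# or
$dgv1.ItemsSource = @($LogList)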
I have not found anything either way which says that this should or shouldn't work. It seems strange that it does not, though, as you can supply a List as the ItemsSource of a DataGrid in C#. It must be something in the way PowerShell or WPF handles the list.


PowerShell: Running processes independently of a PS Session on Remote Machines

PowerShell remoting is a great way of utilising the commands and processing power of remote systems, all from one console. It is also good at pulling information from remote systems and collating it together. There are plenty of examples of using PSSessions and the Invoke-Command cmdlet to manipulate remote machines, bring down remote modules to work with locally, and so on.

One of the shortcomings that I have come across is the apparent inability to create a long-running job in a remote session.

For example, I have a function that performs some processing, which can take anywhere from 20 minutes to 6 hours, depending on the amount of information that is supplied. This job is self-contained and reports its results by email, so once it is started there is no further interaction.

I attempted to create a PSSession and use Invoke-Command. This started the remote job successfully; however, when I closed my local instance of the shell window, the remote process also stopped.

Using Invoke-Command to start a process on the remote machine, something like the snippet below, exhibited the same result.

$Script = {Start-Process -FilePath C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -ArgumentList "-Command Get-Service"}
Invoke-Command -ComputerName remotepc -ScriptBlock $Script

I tried a number of variations of this, exhausting all of the options relating to both sessions and invoked commands, but nothing I found actually achieved my goal.
Looking outside of these cmdlets, I found that WMI exposes the Win32_Process class, which includes a Create method. PowerShell interacts with WMI well, so after some quick testing I found that this method created a new process on the remote machine which did not terminate when my local client disconnected.
I was able to wrap this up into a nice little function that can be re-used. It exposes the computer name, credentials and command options. The example included shows how you can start a new instance of PowerShell on the remote machine which can then run a number of commands. This could be changed to run any number of commands, or, if the script gets too long you could just get PowerShell to run a pre-created script file.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Starts a process on a remote computer that is not bound to the local PowerShell Session
#
# VERSION     DATE         USER                DETAILS
# 1           17/04/2015   Craig Tolley        First version
#                                              
# ----------------------------------------------------------------------------------------------------------

<#
.Synopsis
    Starts a process on the remote computer that is not tied to the PowerShell session that called this command.
    Unlike Invoke-Command, the session that creates the process does not need to be maintained.
    Any processes should be designed such that they will end themselves, else they will continue running in the background until the targeted machine is restarted.
.EXAMPLE
    Start-RemoteProcess -ComputerName remotepc -Command notepad.exe
    Starts Notepad on the remote computer called remotepc using the current session credentials
.EXAMPLE
    Start-RemoteProcess -ComputerName remotepc -Command "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command ""Get-Process | Out-File C:\Processes.txt"" " -Credential DOMAIN\Username
    Starts PowerShell on the remote PC, running the Get-Process command which will write output to C:\Processes.txt using the supplied credentials
#>
function Start-RemoteProcess {
    Param(
        [Parameter(Mandatory=$true, Position =0)]
        [String]$ComputerName,
                
        [Parameter(Mandatory=$true, Position =1)]
        [String]$Command,

        [Parameter(Position = 2)]
        [System.Management.Automation.CredentialAttribute()]$Credential = [System.Management.Automation.PSCredential]::Empty
    )

    #Test that we can connect to the remote machine
    Write-Host "Testing Connection to $ComputerName"
    If ((Test-Connection $ComputerName -Quiet -Count 1) -eq $false) {
        Write-Error "Failed to ping the remote computer. Please check that the remote machine is available"
        Return
        }

    #Create a parameter collection to include the credentials parameter
    $ProcessParameters = @{}
    $ProcessParameters.Add("ComputerName", $ComputerName)
    $ProcessParameters.Add("Class", "Win32_Process")
    $ProcessParameters.Add("Name", "Create")
    $ProcessParameters.Add("ArgumentList", $Command)
    #Only add the credential parameter if credentials were actually supplied
    if ($Credential -ne [System.Management.Automation.PSCredential]::Empty) { $ProcessParameters.Add("Credential", $Credential) }

    #Start the actual remote process
    Write-Host "Starting the remote process."
    Write-Host "Command: $Command" -ForegroundColor Gray

    $RemoteProcess = Invoke-WmiMethod @ProcessParameters

    if ($RemoteProcess.ReturnValue -eq 0) 
        { Write-Host "Successfully launched command on $ComputerName with a process id of $($RemoteProcess.ProcessId)" }
    else 
        { Write-Error "Failed to launch command on $ComputerName. The Return Value is $($RemoteProcess.ReturnValue)" }
} 

One caveat of this approach is the expansion of variables. Every variable is expanded before the command is passed to WMI. For straight values (strings, integers, dates) that is all fine. However, any objects need to be created as part of the script in the remote session. Remember that the new PowerShell session is just that: new. Everything that you want to use must be defined.
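As an illustration of that expansion, any local value that the remote process needs has to be baked into the command string before it is sent. In this hedged example, the hypothetical $ReportPath variable only exists locally; the remote process just sees the literal path:

$ReportPath = "C:\Temp\Services.txt"
$Command = "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command ""Get-Service | Out-File $ReportPath"""
Start-RemoteProcess -ComputerName remotepc -Command $Command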

This code can be used to run any process. Generally you will want to ensure that you specify the full path to any executables. Remember that any paths are relative to the remote server, so be careful when you specify them.

Should you use this code to run PowerShell commands or scripts, you will need to keep a check on any punctuation that you use when specifying the command. Quotes, for example, will need to be doubled to escape them. This requires testing.

Also be aware that this code will start a process, but there is nothing to stop it. Any process should either be self-terminating, or you will need another method of terminating it. If you start a PowerShell session, it will generally terminate once the specified commands have completed.
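If you do need to stop one of these processes later, the same WMI class can be used from your local machine. A sketch, substituting in the process id that Start-RemoteProcess reported:

Get-WmiObject -Class Win32_Process -ComputerName remotepc -Filter "ProcessId = 1234" | ForEach-Object { $_.Terminate() }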


PowerShell: Using AlphaFS to list files and folder longer than 260 characters and checking access

PowerShell is great. However, it has a couple of limitations, either by design or inheritance, that are annoying to say the least. One commonly documented failing, inherited from the .NET Framework, is its inability to access files that have a total path length over 260 characters. Another limitation is the linear nature in which commands are executed.

The first issue is a major one, particularly when working with network file systems, roaming profiles or any area where longer path lengths exist. Having Mac or Linux users on your network means that path lengths over 260 characters are more likely, as both of those systems support longer path names.

The AlphaFS library is a very good library which can help overcome the 260 character limit. It implements most of the .NET Framework functions for accessing files and folders, without the path length limitation. It's a great addition to any project that accesses files and folders.

I have been working on a project to migrate users who are still using roaming profiles to folder redirection. Some scripting has been required to automate the process and minimise user interaction. This is being done using PowerShell. One of the components of the script involves finding how many files and folders exist, how big they are, and whether or not we have access to read them.

PowerShell could do this.

Get-ChildItem $path -Recurse -Force

can list all the files and the sizes (Length property). Piping that list to a

Get-Content -Tail 1 -ErrorAction SilentlyContinue -ErrorVariable ReadErrors | Out-Null

will give you a variable that lists all files that have any errors. All good.
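Put together, that approach looks something like the sketch below (assuming $Path holds the profile root; folders are filtered out so that only files are measured and read):

$Files = Get-ChildItem $Path -Recurse -Force | Where-Object { -not $_.PSIsContainer }
$Files | Get-Content -Tail 1 -ErrorAction SilentlyContinue -ErrorVariable ReadErrors | Out-Null
"Total size (MB): {0:N2}" -f (($Files | Measure-Object -Property Length -Sum).Sum / 1MB)
"Files that could not be read: $($ReadErrors.Count)"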

This command is susceptible to the path limit though. It is also slow. Each item is processed in order, one at a time. Whilst getting just the end of a file is quick, the whole command still takes time. Running against a 200MB user profile, it took over 2 minutes to list all files with sizes into a variable and give me a list of files where access was denied. With over 2TB of user profiles to migrate, that was too long.

With this method out of the window, I looked at using some C# code that I could import. The .NET Framework offers a host of solutions for processing this sort of data. I ended up with the function below. It uses the AlphaFS library to get details of the files and directories, which removes the path length limitation. Also, as I was using the .NET Framework, I could use File.Open(), which just opens the file without reading it. It still throws an access denied error if the file cannot be read, just more quickly. The whole process could then be combined into a parallel ForEach loop, so directories and files are recursed concurrently. The result was a scan of a 200MB profile in around 10 seconds – a much more acceptable time.

The code could be used in a C# project, or in the format below it can be included in a PowerShell script. You will need to download the AlphaFS library and put it in an accessible location so that it can be included in your script.

# Start of File Details Definition
$RecursiveTypeDef = @"
using System;
using System.Collections;
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using System.Diagnostics;
using System.Linq;

public class FileDetails
{
    public List<FileInfo> GetRecursiveFileFolderList(string RootDirectory)
    {
        m_FileFolderList = new List<FileInfo>();
        m_GetFileDetails(RootDirectory);
        return m_FileFolderList;
    }

    private List<FileInfo> m_FileFolderList = new List<FileInfo>();

    private void m_GetFileDetails(string DirectoryName)
    {
        List<string> AllFiles = new List<string>();
        List<string> AllFolders = new List<string>();

        FileInfo FI = new FileInfo();
        FI.FileName = DirectoryName;
        FI.Type = Type.Directory;
        FI.FileSize = 0;
        FI.ReadSuccess = true;
        try {
            AllFiles = Alphaleonis.Win32.Filesystem.Directory.GetFiles(DirectoryName).ToList();
        } catch {
            FI.ReadSuccess = false;
        }
        try {
            AllFolders = Alphaleonis.Win32.Filesystem.Directory.GetDirectories(DirectoryName).ToList();
        } catch {
            FI.ReadSuccess = false;
        }
        lock (m_FileFolderList) {
            m_FileFolderList.Add(FI);
        }

        Parallel.ForEach(AllFiles, File =>
        {
            FileInfo FileFI = new FileInfo();
            FileFI.FileName = File;
            FileFI.Type = Type.File;
            try {
                FileFI.FileSize = Alphaleonis.Win32.Filesystem.File.GetSize(File);
                FileFI.ReadSuccess = true;
            } catch {
                FileFI.ReadSuccess = false;
            }
            lock (m_FileFolderList) {
                m_FileFolderList.Add(FileFI);
            }
        });

        Parallel.ForEach(AllFolders, Folder => { m_GetFileDetails(Folder); });
    }

    public struct FileInfo
    {
        public long FileSize;
        public string FileName;
        public Type Type;
        public bool ReadSuccess;
    }

    public enum Type
    {
        Directory,
        File
    }
}
"@

#Update the following lines to point to your AlphaFS.dll file.
Add-Type -Path $PSScriptRoot\AlphaFS.dll
Add-Type -TypeDefinition $RecursiveTypeDef -ReferencedAssemblies "$PSScriptRoot\AlphaFS.dll", System.Data

# End of File Details Definition

# Use of the function: 
$FileInfo = New-Object FileDetails
$Info = $FileInfo.GetRecursiveFileFolderList("C:\Windows")
$Info | Format-Table -Autosize -Wrap

This will output a full file and directory list of the C:\Windows directory. The property ReadSuccess is true if the file could be opened for reading.
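From there it is straightforward to pull out just the problem items or total up the sizes. A quick sketch using the ReadSuccess and FileSize members returned by the function:

$Info | Where-Object { -not $_.ReadSuccess } | Select-Object Type, FileName
"Total size (MB): {0:N2}" -f (($Info | Measure-Object -Property FileSize -Sum).Sum / 1MB)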

There is plenty of scope to modify this to meet your own needs, but it is an example of how you can bring the power of the .NET Framework into PowerShell to really boost some of your scripts.


‘You Have Been Logged On With a Temporary Profile’ when all profiles have been redirected to a specific location

This is a very strange issue, which I think will only affect a handful of people, and only those who have the right mix of configurations as described below.

Users logging on to a Windows 7 machine received the following popup:

This message implied that there would be some informative details in the Event Log; unfortunately, in this situation there was nothing. No errors, no warnings, no information.

On this particular machine we were using the following GPO setting to force users to a specific roaming profile location. The machines all sit inside a controlled network, so access to the normal profile location was not allowed.

Computer Configuration –> Administrative Templates –> System –> User Profiles –> Set roaming profile path for all users logging onto this computer

In the ProfileList key in the registry you can see the location that has been configured for the Central Profile (i.e. the server copy of the roaming profile). Checking the key for the specific user showed the following. The value can be found at HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\<SID>.

The GPO was only configured with \\server\profiles$\%username% though. The addition of the domain component into the path was unexpected.
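A quick way to inspect this value for every profile on a machine (a small example; the CentralProfile value only exists where a roaming profile path has been set) is:

Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList' |
    Get-ItemProperty |
    Select-Object PSChildName, ProfileImagePath, CentralProfile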

Thinking that something must be corrupt, I cleared all the profiles from the local machine and rebooted, but the issue recurred. Running ProcMon against the system at boot time and tracking changes to this key showed the User Profile Service creating the CentralProfile value and populating it with the wrong value from the start.

This machine is quite heavily managed, and this involves running a couple of PowerShell scripts as scheduled tasks at startup. We had configured the tasks to run as local only, as they did not require any access to network resources. They were configured as below:

For some reason, even though this task was set to run locally, it was influencing the location of the roaming profile. Most strangely, it was not just influencing the profile path for the account configured in the scheduled task; it was influencing every user account that logged on to the machine.

The fix for us was fortunately very simple. The job that the task was doing could quite easily be achieved by using the local SYSTEM account. After changing the task credentials, I did have to clear out all of the profiles from the system to remove the incorrect values, but since this change, the accounts have all loaded the correct profiles from the correct locations.


VB.net: Highlighting a search term in a DataGridView

I’m building a search form into an application that has a database back end. I managed to configure a nice little search which takes some user input, and then modifies the results shown in a DataGridView control. However, not being satisfied with just showing a subset of results, I wanted to be able to highlight the values that had matched so that it was clearer for the end user to see why the records still in the view were there.

This, it turns out, is not as simple as I hoped. However, I have now got it working. The form has a text box with a sub for validating the input, which runs the actual search. The bit we are interested in here, though, is handling the CellPainting event on the DataGridView control and customising the painting of the cells to meet our needs.

To give you an idea of what the highlighted form looks like, the screenshot below shows the search term highlighted in a DataGridView.

This is the code that does the work. It is designed to not be case sensitive, and to pick up multiple occurrences of a string in the cell. It is well commented so that you can see what is going on:

    ''' <summary>
    ''' Highlight the currently entered search filter in the results to show how it was matched
    ''' </summary>
    ''' <param name="sender"></param>
    ''' <param name="e"></param>
    ''' <remarks></remarks>
    Private Sub dgv_Results_CellPainting(sender As Object, e As DataGridViewCellPaintingEventArgs) Handles dgv_Results.CellPainting

        'If there is no search string, no rows, or nothing in this cell, then get out. 
        If txt_SearchFilter.Text = String.Empty Then Return
        If (e.Value Is Nothing) Then Return
        If e.RowIndex < 0 Or e.ColumnIndex < 0 Then Return

        e.Handled = True
        e.PaintBackground(e.CellBounds, True)

        'Get the value of the text in the cell, and the search term. Work with everything in lowercase for more accurate highlighting
        Dim str_SearchTerm As String = txt_SearchFilter.Text.Trim.ToLower
        Dim str_CellText As String = DirectCast(e.FormattedValue, String).ToLower

        'Create a list of the character ranges that need to be highlighted. We need to know the start index and the length
        Dim HLRanges As New List(Of CharacterRange)
        Dim SearchIndex As Integer = str_CellText.IndexOf(str_SearchTerm)
        Do Until SearchIndex = -1
            HLRanges.Add(New CharacterRange(SearchIndex, str_SearchTerm.Length))
            SearchIndex = str_CellText.IndexOf(str_SearchTerm, SearchIndex + str_SearchTerm.Length)
        Loop

        ' We also work with the original cell text which has not been converted to lowercase, else the measured sizes are incorrect
        str_CellText = DirectCast(e.FormattedValue, String)

        ' Choose your colours. A different colour is used on the currently selected rows
        Dim HLColour As SolidBrush
        If ((e.State And DataGridViewElementStates.Selected) <> DataGridViewElementStates.None) Then
            HLColour = New SolidBrush(Color.DarkGoldenrod)
        Else
            HLColour = New SolidBrush(Color.BurlyWood)
        End If

        'Loop through all of the found instances and draw the highlight box
        For Each HLRange In HLRanges

            ' Create the rectangle. It should start just underneath the top of the cell, and go to just above the bottom
            Dim HLRectangle As New Rectangle()
            HLRectangle.Y = e.CellBounds.Y + 2
            HLRectangle.Height = e.CellBounds.Height - 5

            ' Determine the size of the text before the area to highlight, and the size of the text to highlight. 
            ' We need to know the size of the text before so that we know where to start the highlight rectangle
            Dim TextBeforeHL As String = str_CellText.Substring(0, HLRange.First)
            Dim TextToHL As String = str_CellText.Substring(HLRange.First, HLRange.Length)
            Dim SizeOfTextBeforeHL As Size = TextRenderer.MeasureText(e.Graphics, TextBeforeHL, e.CellStyle.Font, e.CellBounds.Size)
            Dim SizeOfTextToHL As Size = TextRenderer.MeasureText(e.Graphics, TextToHL, e.CellStyle.Font, e.CellBounds.Size)

            'Set the width of the rectangle, a little wider to make the highlight clearer
            If SizeOfTextBeforeHL.Width > 5 Then
                HLRectangle.X = e.CellBounds.X + SizeOfTextBeforeHL.Width - 6
                HLRectangle.Width = SizeOfTextToHL.Width - 6
            Else
                HLRectangle.X = e.CellBounds.X + 2
                HLRectangle.Width = SizeOfTextToHL.Width - 6
            End If

            'Paint the highlight area
            e.Graphics.FillRectangle(HLColour, HLRectangle)
        Next

        'Paint the rest of the cell as usual
        e.PaintContent(e.CellBounds)

    End Sub


How to Create a Specific Customized Logon Page for Each VPN vServer based on FQDN without breaking Email Based Discovery

Citrix have published a guide (http://support.citrix.com/article/CTX123736) on creating a customised logon page for each virtual server, based on the FQDN received. The article works, and true to its intended aim, the sites respond on their respective FQDNs and return the correctly customised login page for each of the vServers.

Once this has been completed, though, the vServer that has been configured with a Responder policy on the NetScaler will no longer be able to use email-based discovery or automatic configuration using the external store name. The error we were getting in Receiver was this:

“Your account cannot be added using this server address. Make sure you entered it correctly. You may need to enter your email address instead.”

The same error was displayed whether using the email address or the FQDN of the vServer.

Disabling the Responder rule that was created following the KB allowed the configuration to work. Based on this, I fully removed the Responder policy and instead started looking for other ways to accomplish the customisation.

These are the steps that I took to enable the rewrite rule:

I am running NetScaler 10.5

Using the GUI:

1. Check that rewrite is enabled in System –> Settings –> Configure Basic Features.

2. Go to AppExpert –> Rewrite –> Actions. Create a new Action. Enter a name and set the type to Replace. In the 'Expression to choose target location' field, enter HTTP.REQ.URL. In the expression to replace with, enter the full web address of the newly created custom logon page. In this example I have entered "https://myseconddomain.co.uk/vpn/index_custom.html". Click Create when you are done.
3. Go to AppExpert –> Rewrite –> Policy. Create a new Policy. Enter a name and set the Action to the name of the action created in step 2. The Undefined-Result-Action should be set to 'Global-undefined-result-action'. In the expression, enter the following, substituting in your own FQDN: HTTP.REQ.HOSTNAME.CONTAINS("myseconddomain.co.uk") && HTTP.REQ.URL.CONTAINS("index.html")
4. Finally, we need to bind this policy to the Global HTTP Request receiver. Go to AppExpert –> Rewrite –> Policy. Select the policy that you just created, and then click Policy Manager at the top. Accept the default settings for the Bind Point and click Continue. Select Add Binding, then choose the policy that you created in step 3. The other details can be left at their defaults. Click Bind, then click Done in the Policy Manager.
5. Test, and hopefully all will work.

Using the CLI:
1. enable feature rewrite
2. add rewrite action REWRITE_ACT replace "HTTP.REQ.URL" "\"https://myseconddomain.co.uk/vpn/index_custom.html\""
3. add rewrite policy REWRITE_POL "HTTP.REQ.HOSTNAME.CONTAINS(\"myseconddomain.co.uk\") && HTTP.REQ.URL.CONTAINS(\"index.html\")" REWRITE_ACT
4. bind rewrite global REWRITE_POL 1 END -type REQ_DEFAULT
5. Test

Following this, both the custom page redirection and email-based discovery work as they should.


XenDesktop 7 Move Database

Unfortunately once again the documentation provided by Citrix on moving databases from one SQL Server to another is incomplete. The documentation provided here: http://support.citrix.com/article/CTX140319 misses out the configuration of one of the services.

The listing misses the configuration of the Citrix Analytics Service. In my environment this meant that after the services were restarted, it tried to perform a license server upgrade (I forgot to take a screenshot).

Secondly, the PowerShell supplied on the site removes the DB configuration for the MonitorDBConnection and LogDBConnection before removing the configuration from the DataStore of the respective services. This causes an error when running the commands in the order listed.

My corrected version is below. This assumes that the database has already been moved to the new SQL server and the necessary logins created.

$OldSQLServer = "OLDSERVER\OLDINSTANCE
$NewSQLServer = "NEWSERVER\NEWINSTANCE"

Write-Host "Stopping Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-LogSite -State "Disabled"

Write-Host "Updating Connection String" -ForegroundColor Black -BackgroundColor Yellow
$ConnStr = Get-ConfigDBConnection
Write-Host "Old Connection String: $ConnStr"
$ConnStr = $ConnStr.Replace($OldSQLServer,$NewSQLServer)
Write-Host "New Connection String: $ConnStr"

Write-Host "Clearing all current DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-ConfigDBConnection -DBConnection $null
Set-AcctDBConnection -DBConnection $null
Set-AnalyticsDBConnection -DBConnection $null
Set-HypDBConnection -DBConnection $null
Set-ProvDBConnection -DBConnection $null
Set-BrokerDBConnection -DBConnection $null
Set-EnvTestDBConnection -DBConnection $null
Set-SfDBConnection -DBConnection $null
Set-MonitorDBConnection -DataStore Monitor -DBConnection $null
Set-MonitorDBConnection -DBConnection $null
Set-LogDBConnection -DataStore Logging -DBConnection $null
Set-LogDBConnection -DBConnection $null
Set-AdminDBConnection -DBConnection $null

Write-Host "Configuring new DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-AdminDBConnection -DBConnection $ConnStr
Set-AnalyticsDBConnection -DBConnection $ConnStr
Set-ConfigDBConnection -DBConnection $ConnStr
Set-AcctDBConnection -DBConnection $ConnStr
Set-HypDBConnection -DBConnection $ConnStr
Set-ProvDBConnection -DBConnection $ConnStr
Set-BrokerDBConnection -DBConnection $ConnStr
Set-EnvTestDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DataStore Logging -DBConnection $ConnStr
Set-MonitorDBConnection -DBConnection $ConnStr
Set-MonitorDBConnection -DataStore Monitor -DBConnection $ConnStr
Set-SfDBConnection -DBConnection $ConnStr

Write-Host "Testing new DB Connections..." -ForegroundColor Black -BackgroundColor Yellow
Test-AdminDBConnection -DBConnection $ConnStr
Test-AnalyticsDBConnection -DBConnection $ConnStr
Test-ConfigDBConnection -DBConnection $ConnStr
Test-AcctDBConnection -DBConnection $ConnStr
Test-HypDBConnection -DBConnection $ConnStr
Test-ProvDBConnection -DBConnection $ConnStr
Test-BrokerDBConnection -DBConnection $ConnStr
Test-EnvTestDBConnection -DBConnection $ConnStr
Test-LogDBConnection -DBConnection $ConnStr
Test-MonitorDBConnection -DBConnection $ConnStr
Test-SfDBConnection -DBConnection $ConnStr

Write-Host "Re-enabling Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-MonitorConfiguration -DataCollectionEnabled $true
Set-LogSite -State "Enabled"

Write-Host "Restarting all Citrix Services..." -ForegroundColor Black -BackgroundColor Yellow
Get-Service Citrix* | Stop-Service -Force
Get-Service Citrix* | Start-Service
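Once the services are back up, it is worth reading the connection strings back to confirm that everything now points at the new SQL Server. A quick check using the Get-* counterparts of the cmdlets above (run from the same admin session):

Get-ConfigDBConnection
Get-BrokerDBConnection
Get-LogDBConnection
Get-MonitorDBConnection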


NetScaler 10.5 53.9c StoreFront Monitor uses NSIP, not the SNIP

I believe I have come across a bug in the implementation of the StoreFront monitor in the Citrix NetScaler 10.5. The issue may also exist in previous versions, but I have not tested it.

The NetScaler I was working on was sited in a secure network, with a firewall between the NetScaler and the internal network. Shown below:

NetScaler Layout

I had the following firewall rules in place:

Source              | Destination        | Port
Subnet IP           | StoreFront Servers | 443
Management Machines | NetScaler IP       | 443

The two StoreFront servers, in a load balanced configuration, were constantly showing as down. Expanding the Service Group and looking at the probe results just showed ‘Probe Failed’. To verify connectivity, I created a HTTPS monitor for the same pair of servers, but strangely this monitor always showed as Up.

Running a Wireshark trace filtered on the SNIP showed no HTTPS requests being sent from the SNIP. Running the same trace against the NSIP address showed multiple regular requests from the NetScaler. Another rule was added to the firewall, as per the table below, and the StoreFront monitors changed state to Up almost immediately.

Source              | Destination        | Port
Subnet IP           | StoreFront Servers | 443
NetScaler IP        | StoreFront Servers | 443
Management Machines | NetScaler IP       | 443

I don’t believe that this is by design from Citrix, as their documentation for the NetScaler clearly states that the SNIP should be responsible for the monitoring of services and communication with backend services. Hopefully this bug/issue will be resolved in a future release.

“The NetScaler ADC uses the subnet IP address as a source IP address to proxy client connections to servers. It also uses the subnet IP address when generating its own packets, such as packets related to dynamic routing protocols, or to send monitor probes to check the health of the servers.”

http://support.citrix.com/proddocs/topic/ns-system-10-map/ns-nw-ipaddrssng-confrng-snips-tsk.html

Thankfully in the situation I am currently working on, allowing the NSIP the ability to see the StoreFront servers and subsequently monitor them was not a problem (although another hole in the firewall) but I can imagine that a number of deployments may not have the flexibility to configure their network or firewall in this way.


Back Up SharePoint Farm using PowerShell

We have a small SharePoint Foundation 2010 site here, which contains some information which we wanted to protect. The size of the site did not particularly warrant the purchase of a license for any specialist backup software, so we were looking for a way to back it up, send some notifications and retain a specific number of backups.

As SharePoint natively includes a backup and restore function, which backs up everything including the databases, it made sense to use this. We scheduled the script below to run daily, using an account with the following privileges:

  • Local Administrator on the server running SharePoint Foundation 2010
  • SysAdmin in the SQL Server instance.

All you need to do is configure the email settings, location of the backups and the desired retention at the top of the script.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Backup all the SharePoint Farm and retain a specified number of backups
#
# VERSION     DATE         USER                DETAILS
# 1           19/11/2014   Craig Tolley        First version
# 1.1         19/11/2014   Craig Tolley        Added in verification that all backups in the backup TOC can be found
#                                              Changed backup retention to a number of backups rather than a specific number of days.
# 1.2         24/11/2014   Craig Tolley        Corrected reference to backup location for directory sizing
#					       Corrected Start Time reference in report
#
# ----------------------------------------------------------------------------------------------------------

#This function is the exit point for the script. Called from a number of different points depending on the outcomes. 
function Send-NotificationEmailAndExit
{
    $smtpServer = "your.mailserver.co.uk"
    $smtpFrom = "sharepointbackup@yourdomain.co.uk"
    $smtpTo = "recipient@yourdomain.co.uk"
    $messageSubject = "SharePoint Farm - Backup - $((Get-Date).ToShortDateString())"
    $messageBody = "The SharePoint Farm backup script has completed running.`r`n`r`n" + [string]::join("`r`n",$Output)
    Send-MailMessage -From $smtpFrom -To $smtpTo -Subject $messageSubject -Body $messageBody -SmtpServer $smtpServer
    Exit 0
}

#Output Variable for notifications for the email
$Output = @()

#Specify the number of backups that you want to retain in the destination folder
[int]$RetainBackups = 7

#Destination Backup Folder
[string]$BackupPath = "\\servername\sharename"

#Check that that SharePoint PowerShell SnapIn is available, check it is loaded. 
If ((Get-PSSnapIn -Registered | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "ERROR: SharePoint Server PowerShell is not installed on targeted machine"
    Send-NotificationEmailAndExit
    }
If ((Get-PSSnapIn | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "INFO: Adding SharePoint Server PowerShell SnapIn..."
    Add-PSSnapIn -Name "Microsoft.Sharepoint.Powershell" | Out-Null
    }

#Test that the Backup Path Exists
If ((Test-Path $BackupPath) -eq $false) {
    $Output += "ERROR: The specified backup path ($BackupPath) does not exist. Create the destination before using it for SharePoint backups."
    Send-NotificationEmailAndExit
    }

#Perform a Backup of the SharePoint Farm. Script locks until this process completes.
$Output += "INFO: Starting Backup at $(Get-Date)"
#$BackupPath
Backup-SPFarm -BackupMethod Full -Directory "$($BackupPath)"
$Output += "INFO: Completed Backup at $(Get-Date)"

#Check the status of the last backup
$BackupStatus = Get-SPBackupHistory -Directory $BackupPath | Sort StartTime | Select -Last 1
$Output += "INFO: Size of the Backup: {0:N2}" -f ($(Get-ChildItem $BackupStatus.Directory -recurse | Measure-Object -property length -sum).sum / 1MB) + " MB"
$Output += "INFO: Backup Warnings: $($BackupStatus.WarningCount)"
$Output += "INFO: Backup Errors: $($BackupStatus.ErrorCount)"
If ($BackupStatus.IsFailure -eq $true) {
    $Output += "ERROR: The backup was not successful. The details of the backup operation are: $($BackupStatus | fl) "
    Send-NotificationEmailAndExit
    }
$Output += "INFO: Backup Report: $($BackupStatus | fl | Out-String)"

#Get the SPBackup Table of Contents
$spbrtoc = "$BackupPath\spbrtoc.xml"
[xml]$spbrtocxml = Get-Content $spbrtoc

#Find the old backups in spbrtoc.xml
$OldBackups = if ($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count -gt $RetainBackups) 
                {$spbrtocxml.SPBackupRestoreHistory.SPHistoryObject | Sort StartTime -Descending | Select -Last $($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count - $RetainBackups)}
                else {$OldBackups = $null} 
                    
if ($OldBackups -eq $Null) {
    $Output += "INFO: There is not more than $RetainBackups backups in the specified backup directory"
    }

#Delete the backup reference from the XML Table of Contents and delete the physical backup file.
ForEach ($BackupRef in $OldBackups) {
    $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)

    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $true) {
        $Output += "INFO: Removing the SP Backup Directory: $($BackupRef.SPBackupDirectory)"
        Remove-Item $BackupRef.SPBackupDirectory -Recurse
        }
    Else {
        $Output += "ERROR: Backup directory $($BackupRef.SPBackupDirectory) not found."
        }
    $Output += "INFO: Removed old backup reference from: $($BackupRef.SPStartTime)"
    }

#Verify all other items referenced in the backup file are present
$Output += "INFO: Started checking for orphaned backup records"
ForEach ($BackupRef in $spbrtocxml.SPBackupRestoreHistory.SPHistoryObject) {
    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $false) {
        $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)
        $Output += "INFO: Removed reference to non-existent backup at location: $($BackupRef.SPBackupDirectory)"
        }
    }
$Output += "INFO: Checking for orphaned backup records complete"

#Save the new Sharepoint backup report xml file
$spbrtocxml.Save($spbrtoc)
$Output += "INFO: Completed removal of old backups."

#All done. Send notification. 
$Output += "INFO: SharePoint Backup Script Completed"
Send-NotificationEmailAndExit
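To run the script automatically, it can be registered as a daily scheduled task. A hypothetical example using schtasks (adjust the script path and the run-as account to one with the privileges listed above):

schtasks /Create /TN "SharePoint Farm Backup" /SC DAILY /ST 01:00 /RU DOMAIN\svc_spbackup /RP /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\SharePointFarmBackup.ps1"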


PowerShell: Move SnapMirror Destinations to a New Aggregate (7-Mode)

We got some new disk shelves in to replace some old ones on our NetApp filer. The old disks held a lot of SnapMirror destination volumes, and all of these needed migrating to the new disks. Initially we were planning on creating a new set of SnapMirrors from the source volumes to the destinations and, once these were complete, removing the original mirrors. This seemed like a long-winded, wasteful process: we did not particularly want 30TB of data re-transmitted over the network, especially as a fully internal transfer would run at a significantly higher speed.

Also, we wanted to automate this process as much as possible, as this provided the smallest scope for error.

Hence the following script was born.

It runs through quite a simple process:

  • Load the DataONTAP PowerShell module if it is not available
  • Connect to the specified filer
  • Check that the source volume and the destination aggregate exist
  • Get details of the NFS exports on the volume and remove all NFS exports
  • Break the current SnapMirror relationship
  • Perform a Volume Move of the now writeable SnapMirror destination
  • Re-create the NFS exports
  • Re-sync the SnapMirror

The NFS exports have to be removed, as volume moves are not supported on volumes that have any exports configured. As long as the script runs to the end, the exports will be recreated once everything has completed.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Moves a SnapMirror Destination to a new aggregate, without re-initializing the SnapMirror from scratch. 
#
# VERSION     DATE         USER                DETAILS
# 1           27/10/2014   Craig Tolley        First version
# 1.1         14/11/2014   Craig Tolley        Added new parameter to pass in credentials, so that scripting multiple moves is easier and without prompts
#
# ----------------------------------------------------------------------------------------------------------

<#
.Synopsis
   Moves the specified volume to a new aggregate.

.EXAMPLE
   Move-SnapmirrorDestination -VolumeToMove Volume1 -DestinationAggr Aggr2 -FilerName Filer1

#>
function Move-SnapmirrorDestination
{
    [CmdletBinding()]
    Param
    (
        # The volume that we want to move
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=0)]
        [String]$VolumeToMove,

        # The destination aggregate for the new volume
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=1)]
        [String]$DestinationAggr,

        # The filer name to connect to 
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=2)]
        [String]$FilerName,

        [Parameter(Position=3)]
        [System.Management.Automation.PSCredential]$FilerCredentials

    )

    #Check that that DataOnTap Module is available, check it is loaded. 
    Write-Host "Checking for DataONTAP Module..."
    if ((Get-Module -ListAvailable | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Error "DataONTAP not installed on targeted machine"
        Exit 1
        }
    if ((Get-Module | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Host "Importing DataONTAP Module..."
        Import-Module -Name "DataONTAP" | Out-Null
        }

    #If we have not been passed credentials, then prompt for them. 
    If ($FilerCredentials -eq $null)
        {$FilerCredentials = Get-Credential -Message "Please supply credentials for $FilerName"}

    #Connect to the Filer.
    Write-Host "Connecting to $FilerName" -BackgroundColor Yellow -ForegroundColor Black
    $Error.Clear()
    Connect-NaController -Name $FilerName -Credential $FilerCredentials
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error connecting to the filer. Please check your credentials and try again."
        Break
        }
    Write-Host ""

    #Get the Source Volume
    Write-Host "Getting details of Volume: $VolumeToMove" -BackgroundColor Yellow -ForegroundColor Black
    $SrcVolume = Get-NaVol $VolumeToMove
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Break
        }
    $SrcVolume | ft
  
    #Get the Destination Aggregate
    Write-Host "Getting details of the destination aggregate: $DestinationAggr" -BackgroundColor Yellow -ForegroundColor Black
    $DestAggr = Get-NaAggr $DestinationAggr
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Break
        }
    $DestAggr | ft

    #Get the NFS Exports for the Volume and Remove them
    Write-Host "Getting details of the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    $NFSExports = Get-NaNfsExport | Where {$_.Pathname -like "*$($SrcVolume.Name)"}
    If (($NFSExports).Count -gt 0)
        {
        ForEach ($Exp in $NFSExports)
            {Remove-NaNfsExport $Exp}
        }
    Else 
        {Write-Host "No NFS Exports are configured for this volume"}
    Write-Host ""

    #Break all of the snapmirrors which are configured
    Write-Host "Breaking existing Snapmirrors" -BackgroundColor Yellow -ForegroundColor Black
    $SrcSnapMirrors = Get-NaSnapmirror $SrcVolume
    $SrcSnapMirrors | ft
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorBreak -Confirm:$false | Out-Null}
        }
    Else 
        {Write-Host "No Snapmirrors are configured for this volume"}
    Write-Host ""

    #Start the actual volume move. 
    Write-Host "Starting the Vol Move (Update every 15 seconds)" -BackgroundColor Yellow -ForegroundColor Black
    Start-NaVolMove -Volume $SrcVolume -DestAggr $DestAggr

    #Keep Running Until the Vol Move completes
    Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft
    Do
        {
        Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft -HideTableHeaders
        Start-Sleep 15
        }
    Until ((Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name}).Count -eq 0)

    #Recreate the NFS Exports
    Write-Host "Recreating the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    ForEach ($Exp in $NFSExports)
        {Add-NaNfsExport $Exp}
    Write-Host ""

    #Resync all of the snapmirrors which are configured
    Write-Host "Re-synching all SnapMirrors" -BackgroundColor Yellow -ForegroundColor Black
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorResync -Confirm:$false | Out-Null}
        }

    #Complete
    Write-Host "  --  Completed Volume Move  -- "
}

If you wanted to use this to move a bunch of destinations to a new aggregate, then a short snippet like this does the job for you:

$VolsToMove = "volA", "volB"
$FilerLogin = Get-Credential
ForEach($v in $VolsToMove)
    {Move-SnapmirrorDestination -VolumeToMove $v -DestinationAggr newAggr -FilerName Filer1 -FilerCredentials $FilerLogin}
