‘You Have Been Logged On With a Temporary Profile’ when all profiles have been redirected to a specific location

This is a very strange issue, which I think will only affect a handful of people, and only those who have the right mix of configurations as described below.

Users logging on to a Windows 7 machine received the following popup:

This message implied that there would be some informative details in the Event Log; unfortunately, in this situation there was nothing. No errors, no warnings, no information.

On this particular machine we were using the following GPO setting to force users to a specific roaming profile location. The machines all sit inside a controlled network, so access to the normal profile location was not allowed.

Computer Configuration –> Administrative Templates –> System –> User Profiles –> Set roaming profile path for all users logging onto this computer

In the ProfileList key in the registry you can see the location that has been configured for the Central Profile (i.e. the server copy of the roaming profile). The value can be found at: HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\<SID>. Checking the key for the affected user showed a CentralProfile path that included the domain name.
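If you want to check this value for every profile on a machine, a minimal sketch like the following will list them from an elevated PowerShell prompt (PowerShell 3.0 or later; the property names are the standard ProfileList values):

#List the CentralProfile and local profile path for every SID on the machine
Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList' |
    ForEach-Object {
        $p = Get-ItemProperty $_.PSPath
        [PSCustomObject]@{
            SID            = $_.PSChildName
            CentralProfile = $p.CentralProfile
            ProfilePath    = $p.ProfileImagePath
        }
    } | Format-Table -AutoSize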

The GPO was only configured with \\server\profiles$\%username% though. The addition of the domain component into the path was unexpected.

After clearing all the profiles from the local machine and rebooting, thinking that something must be corrupt, the issue recurred. Running ProcMon against the system at boot time and tracking changes to this key showed the User Profile Service creating the CentralProfile value and populating it with the wrong value from the start.

This machine is quite heavily managed, and this involves running a couple of PowerShell scripts as scheduled tasks at startup. We had configured the tasks to run as local only, as they did not require any access to network resources. They were configured as below:

For some reason, even though this task was set to run locally, it was influencing the location of the roaming profile. Most strangely, it wasn’t just influencing the profile path for the account configured in the scheduled task; it was influencing every user account that logged on to the machine.

The fix for us was fortunately very simple. The job that the task was doing could quite easily be achieved by using the local SYSTEM account. After changing the task credentials, I did have to clear out all of the profiles from the system to remove the incorrect values, but since this change, the accounts have all loaded the correct profiles from the correct locations.
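If you hit the same issue, repointing an existing startup task at the SYSTEM account can be done with schtasks ('StartupConfig' here is a placeholder for your own task name):

schtasks /Change /TN "StartupConfig" /RU "SYSTEM"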


VB.net: Highlighting a search term in a DataGridView

I’m building a search form into an application that has a database back end. I managed to configure a nice little search which takes some user input, and then modifies the results shown in a DataGridView control. However, not being satisfied with just showing a subset of results, I wanted to be able to highlight the values that had matched so that it was clearer for the end user to see why the records still in the view were there.

This, it turns out, is not as simple as I had hoped. However, I now have it working. The form has a text box with a sub for validating the input, which runs the actual search. The part we are interested in here, though, is that we handle the CellPainting event on the DataGridView control and then customise the painting of the cells to meet our needs.

To give you an idea of what the highlighted form looks like: each match in the grid is drawn with a coloured highlight behind it.

This is the code that does the work. It is designed to not be case sensitive, and to pick up multiple occurrences of a string in the cell. It is well commented so that you can see what is going on:

    ''' <summary>
    ''' Highlight the currently entered search filter in the results to show how it was matched
    ''' </summary>
    ''' <param name="sender"></param>
    ''' <param name="e"></param>
    ''' <remarks></remarks>
    Private Sub dgv_Results_CellPainting(sender As Object, e As DataGridViewCellPaintingEventArgs) Handles dgv_Results.CellPainting

        'If there is no search string, nothing in this cell, or this is a header cell, then get out. 
        If txt_SearchFilter.Text = String.Empty Then Return
        If (e.Value Is Nothing) Then Return
        If e.RowIndex < 0 Or e.ColumnIndex < 0 Then Return

        e.Handled = True
        e.PaintBackground(e.CellBounds, True)

        'Get the value of the text in the cell, and the search term. Work with everything in lowercase for more accurate highlighting
        Dim str_SearchTerm As String = txt_SearchFilter.Text.Trim.ToLower
        Dim str_CellText As String = DirectCast(e.FormattedValue, String).ToLower

        'Create a list of the character ranges that need to be highlighted. We need to know the start index and the length
        Dim HLRanges As New List(Of CharacterRange)
        Dim SearchIndex As Integer = str_CellText.IndexOf(str_SearchTerm)
        Do Until SearchIndex = -1
            HLRanges.Add(New CharacterRange(SearchIndex, str_SearchTerm.Length))
            SearchIndex = str_CellText.IndexOf(str_SearchTerm, SearchIndex + str_SearchTerm.Length)
        Loop

        ' We also work with the original cell text which has not been converted to lowercase, otherwise the measured sizes are incorrect
        str_CellText = DirectCast(e.FormattedValue, String)

        ' Choose your colours. A different colour is used on the currently selected rows
        Dim HLColour As SolidBrush
        If ((e.State And DataGridViewElementStates.Selected) <> DataGridViewElementStates.None) Then
            HLColour = New SolidBrush(Color.DarkGoldenrod)
        Else
            HLColour = New SolidBrush(Color.BurlyWood)
        End If

        'Loop through all of the found instances and draw the highlight box
        For Each HLRange In HLRanges

            ' Create the rectangle. It should start just underneath the top of the cell, and go to just above the bottom
            Dim HLRectangle As New Rectangle()
            HLRectangle.Y = e.CellBounds.Y + 2
            HLRectangle.Height = e.CellBounds.Height - 5

            ' Determine the size of the text before the area to highlight, and the size of the text to highlight. 
            ' We need to know the size of the text before so that we know where to start the highlight rectangle
            Dim TextBeforeHL As String = str_CellText.Substring(0, HLRange.First)
            Dim TextToHL As String = str_CellText.Substring(HLRange.First, HLRange.Length)
            Dim SizeOfTextBeforeHL As Size = TextRenderer.MeasureText(e.Graphics, TextBeforeHL, e.CellStyle.Font, e.CellBounds.Size)
            Dim SizeOfTextToHL As Size = TextRenderer.MeasureText(e.Graphics, TextToHL, e.CellStyle.Font, e.CellBounds.Size)

            'Position the rectangle, compensating for the padding that MeasureText includes in its measurements
            If SizeOfTextBeforeHL.Width > 5 Then
                HLRectangle.X = e.CellBounds.X + SizeOfTextBeforeHL.Width - 6
                HLRectangle.Width = SizeOfTextToHL.Width - 6
            Else
                HLRectangle.X = e.CellBounds.X + 2
                HLRectangle.Width = SizeOfTextToHL.Width - 6
            End If

            'Paint the highlight area
            e.Graphics.FillRectangle(HLColour, HLRectangle)
        Next

        'Paint the rest of the cell as usual
        e.PaintContent(e.CellBounds)

        'Dispose of the brush to avoid leaking GDI handles
        HLColour.Dispose()

    End Sub


How to Create a Specific Customized Logon Page for Each VPN vServer based on FQDN without breaking Email Based Discovery

Citrix have published a guide (http://support.citrix.com/article/CTX123736) on creating a customised logon page for each virtual server, based on the FQDN received. The article works and, true to its intended aim, the sites respond on the relevant FQDN and return the correctly customised login page for each of the vServers.

Once this has been completed, though, the vServer that has been configured with a Responder on the NetScaler will no longer be able to use email-based discovery or automatic configuration using the external store name. The error we were getting in Receiver was this:

“Your account cannot be added using this server address. Make sure you entered it correctly. You may need to enter your email address instead.”

The same error was displayed whether using the email address or the FQDN of the vServer.

Disabling the Responder rule that was created following the KB allowed the configuration to work. Based on this, I fully removed the Responder and started looking for other ways to accomplish the customisation, settling on a Rewrite policy instead.

These are the steps that I took to create the rewrite rule:

I am running NetScaler 10.5

Using the GUI:

1. Check that rewrite is enabled in System –> Settings –> Configure Basic Features.

2. Go to AppExpert –> Rewrite –> Actions. Create a new Action. Enter a name and set the type to Replace. In the ‘Expression to choose target location’ field, enter ‘HTTP.REQ.URL’. In the ‘Expression to replace with’ field, enter the full web address of the newly created custom logon page. In this example I have entered “https://myseconddomain.co.uk/vpn/index_custom.html”. Click Create when you are done.
3. Go to AppExpert –> Rewrite –> Policy. Create a new Policy. Enter a name and set the Action to the name of the action created in step 2. The Undefined-Result-Action should be set to ‘Global-undefined-result-action’. In the expression, enter the following, substituting in your own FQDN: ‘HTTP.REQ.HOSTNAME.CONTAINS(“myseconddomain.co.uk”) && HTTP.REQ.URL.CONTAINS(“index.html”)’
4. Finally, we need to bind this policy to the global HTTP request bind point. Go to AppExpert –> Rewrite –> Policy. Select the policy that you just created, and then click Policy Manager at the top. Accept the default settings for the Bind Point and click Continue. Select Add Binding, then choose the policy that you created in step 3. The other details can be left at their defaults; click Bind, then click Done in the Policy Manager.
5. Test, and hopefully all will work.

Using the CLI:
1. enable feature rewrite
2. add rewrite action REWRITE_ACT replace “HTTP.REQ.URL” “\”https://myseconddomain.co.uk/vpn/index_custom.html\””
3. add rewrite policy REWRITE_POL “HTTP.REQ.HOSTNAME.CONTAINS(\”myseconddomain.co.uk\”) && HTTP.REQ.URL.CONTAINS(\”index.html\”)” REWRITE_ACT
4. bind rewrite global REWRITE_POL 1 END -type REQ_DEFAULT
5. Test

Following this, both the custom page redirection and email-based discovery work as they should.


XenDesktop 7 Move Database

Unfortunately, once again the documentation provided by Citrix on moving databases from one SQL Server to another is incomplete. The article here (http://support.citrix.com/article/CTX140319) misses out the configuration of the Citrix Analytics Service. In my environment this meant that after the services restarted, it tried to do a license server upgrade (I forgot to take a screenshot).

Secondly, the PowerShell supplied on the site removes the DB configuration for the MonitorDBConnection and LogDBConnection before removing the configuration from the DataStore of the respective services. This causes an error when running the commands in the order listed.

My corrected version is below. This assumes that the database has already been moved to the new SQL server and the necessary logins created.
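Run it from an elevated PowerShell prompt on a Delivery Controller. The Citrix cmdlets used below come from the Citrix snap-ins, so load them first if they are not already present:

#Load all of the Citrix PowerShell snap-ins
Add-PSSnapin Citrix*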

$OldSQLServer = "OLDSERVER\OLDINSTANCE"
$NewSQLServer = "NEWSERVER\NEWINSTANCE"

Write-Host "Stopping Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-LogSite -State "Disabled"

Write-Host "Updating Connection String" -ForegroundColor Black -BackgroundColor Yellow
$ConnStr = Get-ConfigDBConnection
Write-Host "Old Connection String: $ConnStr"
$ConnStr = $ConnStr.Replace($OldSQLServer,$NewSQLServer)
Write-Host "New Connection String: $ConnStr"

Write-Host "Clearing all current DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-ConfigDBConnection -DBConnection $null
Set-AcctDBConnection -DBConnection $null
Set-AnalyticsDBConnection -DBConnection $null
Set-HypDBConnection -DBConnection $null
Set-ProvDBConnection -DBConnection $null
Set-BrokerDBConnection -DBConnection $null
Set-EnvTestDBConnection -DBConnection $null
Set-SfDBConnection -DBConnection $null
Set-MonitorDBConnection -DataStore Monitor -DBConnection $null
Set-MonitorDBConnection -DBConnection $null
Set-LogDBConnection -DataStore Logging -DBConnection $null
Set-LogDBConnection -DBConnection $null
Set-AdminDBConnection -DBConnection $null

Write-Host "Configuring new DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-AdminDBConnection -DBConnection $ConnStr
Set-AnalyticsDBConnection -DBConnection $ConnStr
Set-ConfigDBConnection -DBConnection $ConnStr
Set-AcctDBConnection -DBConnection $ConnStr
Set-HypDBConnection -DBConnection $ConnStr
Set-ProvDBConnection -DBConnection $ConnStr
Set-BrokerDBConnection -DBConnection $ConnStr
Set-EnvTestDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DataStore Logging -DBConnection $ConnStr
Set-MonitorDBConnection -DBConnection $ConnStr
Set-MonitorDBConnection -DataStore Monitor -DBConnection $ConnStr
Set-SfDBConnection -DBConnection $ConnStr

Write-Host "Testing new DB Connections..." -ForegroundColor Black -BackgroundColor Yellow
Test-AdminDBConnection -DBConnection $ConnStr
Test-AnalyticsDBConnection -DBConnection $ConnStr
Test-ConfigDBConnection -DBConnection $ConnStr
Test-AcctDBConnection -DBConnection $ConnStr
Test-HypDBConnection -DBConnection $ConnStr
Test-ProvDBConnection -DBConnection $ConnStr
Test-BrokerDBConnection -DBConnection $ConnStr
Test-EnvTestDBConnection -DBConnection $ConnStr
Test-LogDBConnection -DBConnection $ConnStr
Test-MonitorDBConnection -DBConnection $ConnStr
Test-SfDBConnection -DBConnection $ConnStr

Write-Host "Re-enabling Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-MonitorConfiguration -DataCollectionEnabled $true
Set-LogSite -State "Enabled"

Write-Host "Restarting all Citrix Services..." -ForegroundColor Black -BackgroundColor Yellow
Get-Service Citrix* | Stop-Service -Force
Get-Service Citrix* | Start-Service


NetScaler 10.5 53.9c StoreFront Monitor uses NSIP, not the SNIP

I believe I have come across a bug in the implementation of the StoreFront monitor in the Citrix NetScaler 10.5. The issue may also exist in previous versions, but I have not tested it.

The NetScaler I was working on was sited in a secure network, with a firewall between the NetScaler and the internal network.

I had the following firewall rules in place:

Source                 Destination          Port
Subnet IP              StoreFront Servers   443
Management Machines    NetScaler IP         443

The two StoreFront servers, in a load-balanced configuration, were constantly showing as Down. Expanding the Service Group and looking at the probe results just showed ‘Probe Failed’. To verify connectivity, I created an HTTPS monitor for the same pair of servers, but strangely this monitor always showed as Up.

Running a Wireshark trace on the SNIP did not show any HTTPS requests being sent from the SNIP. Running the same trace against the NSIP address showed multiple regular requests from the NetScaler. Another rule was added to the firewall, as per the table below, and the StoreFront monitors changed state to Up almost immediately.

Source                 Destination          Port
Subnet IP              StoreFront Servers   443
NetScaler IP           StoreFront Servers   443
Management Machines    NetScaler IP         443
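If you need to rule out basic connectivity while diagnosing something similar, a quick check from any allowed Windows host (Windows 8.1/Server 2012 R2 or later; the server name here is a placeholder) is:

#Confirm the StoreFront server accepts connections on 443
Test-NetConnection -ComputerName storefront01.yourdomain.local -Port 443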

I don’t believe that this is by design from Citrix, as their documentation for the NetScaler clearly states that the SNIP should be responsible for monitoring services and for communication with backend servers. Hopefully this bug/issue will be resolved in a future release.

“The NetScaler ADC uses the subnet IP address as a source IP address to proxy client connections to servers. It also uses the subnet IP address when generating its own packets, such as packets related to dynamic routing protocols, or to send monitor probes to check the health of the servers.”

http://support.citrix.com/proddocs/topic/ns-system-10-map/ns-nw-ipaddrssng-confrng-snips-tsk.html

Thankfully, in the situation I am currently working on, allowing the NSIP to see the StoreFront servers and subsequently monitor them was not a problem (although it is another hole in the firewall), but I can imagine that a number of deployments may not have the flexibility to configure their network or firewall in this way.


Back Up SharePoint Farm using PowerShell

We have a small SharePoint Foundation 2010 site here, which contains some information which we wanted to protect. The size of the site did not particularly warrant the purchase of a license for any specialist backup software, so we were looking for a way to back it up, send some notifications and retain a specific number of backups.

As SharePoint natively includes a backup and restore function, which backs up everything including the databases, it made sense to use this. We scheduled the script below to run daily, using an account with the following privileges:

  • Local Administrator on the server running SharePoint Foundation 2010
  • SysAdmin on the SQL Server instance

All you need to do is configure the email settings, location of the backups and the desired retention at the top of the script.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Backup all the SharePoint Farm and retain a specified number of backups
#
# VERSION     DATE         USER                DETAILS
# 1           19/11/2014   Craig Tolley        First version
# 1.1         19/11/2014   Craig Tolley        Added in verification that all backups in the backup TOC can be found
#                                              Changed backup retention to a number of backups rather than a specific number of days.
# 1.2         24/11/2014   Craig Tolley        Corrected reference to backup location for directory sizing
#					       Corrected Start Time reference in report
#
# ----------------------------------------------------------------------------------------------------------

#This function is the exit point for the script. Called from a number of different points depending on the outcomes. 
function Send-NotificationEmailAndExit
{
    $smtpServer = "your.mailserver.co.uk"
    $smtpFrom = "sharepointbackup@yourdomain.co.uk"
    $smtpTo = "recipient@yourdomain.co.uk"
    $messageSubject = "SharePoint Farm - Backup - $((Get-Date).ToShortDateString())"
    $messageBody = "The SharePoint Farm backup script has completed running.`r`n`r`n" + [string]::join("`r`n",$Output)
    Send-MailMessage -From $smtpFrom -To $smtpTo -Subject $messageSubject -Body $messageBody -SmtpServer $smtpServer
    Exit 0
}

#Output Variable for notifications for the email
$Output = @()

#Specify the number of backups that you want to retain in the destination folder
[int]$RetainBackups = 7

#Destination Backup Folder
[string]$BackupPath = "\\servername\sharename"

#Check that the SharePoint PowerShell SnapIn is available, and that it is loaded. 
If ((Get-PSSnapIn -Registered | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "ERROR: SharePoint Server PowerShell is not installed on targeted machine"
    Send-NotificationEmailAndExit
    }
If ((Get-PSSnapIn | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "INFO: Adding SharePoint Server PowerShell SnapIn..."
    Add-PSSnapIn -Name "Microsoft.Sharepoint.Powershell" | Out-Null
    }

#Test that the Backup Path Exists
If ((Test-Path $BackupPath) -eq $false) {
    $Output += "ERROR: The specified backup path ($BackupPath) does not exist. Create the destination before using it for SharePoint backups."
    Send-NotificationEmailAndExit
    }

#Perform a Backup of the SharePoint Farm. Script locks until this process completes.
$Output += "INFO: Starting Backup at $(Get-Date)"
Backup-SPFarm -BackupMethod Full -Directory "$($BackupPath)"
$Output += "INFO: Completed Backup at $(Get-Date)"

#Check the status of the last backup
$BackupStatus = Get-SPBackupHistory -Directory $BackupPath | Sort StartTime | Select -Last 1
$Output += ("INFO: Size of the Backup: {0:N2} MB" -f ($(Get-ChildItem $BackupStatus.Directory -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB))
$Output += "INFO: Backup Warnings: $($BackupStatus.WarningCount)"
$Output += "INFO: Backup Errors: $($BackupStatus.ErrorCount)"
If ($BackupStatus.IsFailure -eq $true) {
    $Output += "ERROR: The backup was not successful. The details of the backup operation are: $($BackupStatus | fl | Out-String)"
    Send-NotificationEmailAndExit
    }
$Output += "INFO: Backup Report: $($BackupStatus | fl | Out-String)"

#Get the SPBackup Table of Contents
$spbrtoc = "$BackupPath\spbrtoc.xml"
[xml]$spbrtocxml = Get-Content $spbrtoc

#Find the old backups in spbrtoc.xml
$OldBackups = if ($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count -gt $RetainBackups) 
                {$spbrtocxml.SPBackupRestoreHistory.SPHistoryObject | Sort SPStartTime -Descending | Select -Last ($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count - $RetainBackups)}
                else {$null} 
                    
if ($OldBackups -eq $Null) {
    $Output += "INFO: There are no more than $RetainBackups backups in the specified backup directory"
    }

#Delete the backup reference from the XML Table of Contents and delete the physical backup file.
ForEach ($BackupRef in $OldBackups) {
    $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)

    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $true) {
        $Output += "INFO: Removing the SP Backup Directory: $($BackupRef.SPBackupDirectory)"
        Remove-Item $BackupRef.SPBackupDirectory -Recurse
        }
    Else {
        $Output += "ERROR: Backup directory $($BackupRef.SPBackupDirectory) not found."
        }
    $Output += "INFO: Removed old backup reference from: $($BackupRef.SPStartTime)"
    }

#Verify all other items referenced in the backup file are present
$Output += "INFO: Started checking for orphaned backup records"
ForEach ($BackupRef in @($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject)) { #Copy to an array so nodes can be removed while looping
    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $false) {
        $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)
        $Output += "INFO: Removed reference to non-existent backup at location: $($BackupRef.SPBackupDirectory)"
        }
    }
$Output += "INFO: Checking for orphaned backup records complete"

#Save the new Sharepoint backup report xml file
$spbrtocxml.Save($spbrtoc)
$Output += "INFO: Completed removal of old backups."

#All done. Send notification. 
$Output += "INFO: SharePoint Backup Script Completed"
Send-NotificationEmailAndExit
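One way to schedule the script to run daily is with schtasks; a minimal sketch, in which the task name, script path and account are placeholders (/RP with no value prompts for the password):

schtasks /Create /TN "SharePoint Farm Backup" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Backup-SPFarm.ps1" /SC DAILY /ST 01:00 /RU YOURDOMAIN\spbackup /RP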


PowerShell: Move SnapMirror Destinations to a New Aggregate (7-Mode)

We got some new disk shelves in to replace some old ones on our NetApp filer. The old disks held a lot of SnapMirror destination volumes, all of which needed migrating to the new disks. Initially we were planning on creating a new set of SnapMirrors from the source volumes to new destinations and, once these were complete, removing the original mirrors. This seemed like a long-winded, wasteful process – we did not particularly want 30 TB of data re-transmitted over the network, especially as a fully internal transfer would run at a significantly higher speed.

Also, we wanted to automate this process as much as possible, as this provided the smallest scope for error.

Hence the following script was born.

It runs through quite a simple process:

  • Load the DataONTAP PowerShell module if it is not available
  • Connect to the specified filer
  • Check that the source volume and the destination aggregate exist
  • Get details of the NFS exports on the volume and remove all NFS exports
  • Break the current SnapMirror relationship
  • Perform a Volume Move of the now writeable SnapMirror destination
  • Re-create the NFS exports
  • Re-sync the SnapMirror

The NFS exports have to be removed, as volume moves are not supported on volumes that have any exports configured. As long as the script runs to the end, the exports will be recreated once everything has completed.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Moves a SnapMirror Destination to a new aggregate, without re-initializing the SnapMirror from scratch. 
#
# VERSION     DATE         USER                DETAILS
# 1           27/10/2014   Craig Tolley        First version
# 1.1         14/11/2014   Craig Tolley        Added new parameter to pass in credentials, so that scripting multiple moves is easier and without prompts
#
# ----------------------------------------------------------------------------------------------------------

<#
.Synopsis
   Moves the specified volume to a new aggregate.

.EXAMPLE
   Move-SnapmirrorDestination -VolumeToMove Volume1 -DestinationAggr Aggr2 -FilerName Filer1

#>
function Move-SnapmirrorDestination
{
    [CmdletBinding()]
    Param
    (
        # The volume that we want to move
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=0)]
        [String]$VolumeToMove,

        # The destination aggregate for the new volume
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=1)]
        [String]$DestinationAggr,

        # The filer name to connect to 
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=2)]
        [String]$FilerName,

        [Parameter(Position=3)]
        [System.Management.Automation.PSCredential]$FilerCredentials

    )

    #Check that the DataONTAP module is available, and that it is loaded. 
    Write-Host "Checking for DataONTAP Module..."
    if ((Get-Module -ListAvailable | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Error "DataONTAP not installed on targeted machine"
        Exit 1
        }
    if ((Get-Module | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Host "Importing DataONTAP Module..."
        Import-Module -Name "DataONTAP" | Out-Null
        }

    #If we have not been passed credentials, then prompt for them. 
    If ($FilerCredentials -eq $null)
        {$FilerCredentials = Get-Credential -Message "Please supply credentials for $FilerName"}

    #Connect to the Filer.
    Write-Host "Connecting to $FilerName" -BackgroundColor Yellow -ForegroundColor Black
    $Error.Clear()
    Connect-NaController -Name $FilerName -Credential $FilerCredentials
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error connecting to the filer. Please check your credentials and try again."
        Break
        }
    Write-Host ""

    #Get the Source Volume
    Write-Host "Getting details of Volume: $VolumeToMove" -BackgroundColor Yellow -ForegroundColor Black
    $SrcVolume = Get-NaVol $VolumeToMove
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Break
        }
    $SrcVolume | ft
  
    #Get the Destination Aggregate
    Write-Host "Getting details of the destination aggregate: $DestinationAggr" -BackgroundColor Yellow -ForegroundColor Black
    $DestAggr = Get-NaAggr $DestinationAggr
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the aggregate. Please check that the aggregate name is correct and that it is online."
        Break
        }
    $DestAggr | ft

    #Get the NFS Exports for the Volume and Remove them
    Write-Host "Getting details of the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    $NFSExports = Get-NaNfsExport | Where {$_.Pathname -like "*$($SrcVolume.Name)"}
    If (($NFSExports).Count -gt 0)
        {
        ForEach ($Exp in $NFSExports)
            {Remove-NaNfsExport $Exp}
        }
    Else 
        {Write-Host "No NFS Exports are configured for this volume"}
    Write-Host ""

    #Break all of the snapmirrors which are configured
    Write-Host "Breaking existing Snapmirrors" -BackgroundColor Yellow -ForegroundColor Black
    $SrcSnapMirrors = Get-NaSnapmirror $SrcVolume
    $SrcSnapMirrors | ft
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorBreak -Confirm:$false | Out-Null}
        }
    Else 
        {Write-Host "No Snapmirrors are configured for this volume"}
    Write-Host ""

    #Start the actual volume move. 
    Write-Host "Starting the Vol Move (Update every 15 seconds)" -BackgroundColor Yellow -ForegroundColor Black
    Start-NaVolMove -Volume $SrcVolume -DestAggr $DestAggr

    #Keep Running Until the Vol Move completes
    Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft
    Do
        {
        Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft -HideTableHeaders
        Start-Sleep 15
        }
    Until ((Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name}).Count -eq 0)

    #Recreate the NFS Exports
    Write-Host "Recreating the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    ForEach ($Exp in $NFSExports)
        {Add-NaNfsExport $Exp}
    Write-Host ""

    #Resync all of the snapmirrors which are configured
    Write-Host "Re-synching all SnapMirrors" -BackgroundColor Yellow -ForegroundColor Black
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorResync -Confirm:$false | Out-Null}
        }

    #Complete
    Write-Host "  --  Completed Volume Move  -- "
}

If you wanted to use this to move a bunch of destinations to a new aggregate, then a short snippet like this does the job for you:

$VolsToMove = "volA", "volB"
$FilerLogin = Get-Credential
ForEach($v in $VolsToMove)
    {Move-SnapmirrorDestination -VolumeToMove $v -DestinationAggr newAggr -FilerName Filer1 -FilerCredentials $FilerLogin}


Assigning Permissions to Assign Networks to VM in vSphere

If you need to allow a specific user or group the permission to change the connected network on a virtual machine in vSphere, then permissions have to be granted in a couple of places. This provides very granular control over the machines and the networks that a person can use; however, it may not be immediately apparent when you are trying to get it working (it wasn’t apparent to me until I had thought about the problem for a while).

Two distinct permissions are required:

Against the Virtual Machine that they want to edit

  • Virtual Machine –> Configuration –> Modify Device Settings
  • Virtual Machine –> Configuration –> Settings

Against the Network objects that they can assign

  • Network –> Assign Network

If the user or group in question does not have the Assign Network permission applied to a network object, then that object does not appear in the list of selectable networks. If there are no networks on which the user has the Assign Network permission, then the option to change the network assigned to a VM will not be available to that user at all. A PowerCLI sketch of granting both halves is shown below.
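If you manage permissions with PowerCLI, something like the following grants both halves. This is a minimal sketch: the role name, VM, portgroup and principal are all placeholders, and I am using a distributed portgroup here; the privilege IDs are the standard ones behind the settings listed above.

#Create a role containing just the privileges needed to re-assign networks
$role = New-VIRole -Name 'NetworkChanger' -Privilege (Get-VIPrivilege -Id 'VirtualMachine.Config.EditDevice','VirtualMachine.Config.Settings','Network.Assign')

#Grant the role on the VM, and on each network the user is allowed to assign
New-VIPermission -Entity (Get-VM 'TestVM01') -Principal 'YOURDOMAIN\NetworkEditors' -Role $role
New-VIPermission -Entity (Get-VDPortgroup 'VM Network') -Principal 'YOURDOMAIN\NetworkEditors' -Role $role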


IISCrypto – Making SSL/TLS Configuration Easier

Following the recent Poodle vulnerability, and the general best practice that you should always use the most secure protocols available, I have been spending some time reconfiguring servers.

Setting the order of ciphers and enabling Forward Secrecy in Windows requires editing the registry – a lot. The process is manual, so it is susceptible to errors. It also doesn’t really give you a holistic picture of the before and after settings.
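To give a flavour of what is involved, disabling a single protocol such as SSL 3.0 means creating SCHANNEL registry keys like these (a minimal sketch of the documented locations; a reboot is needed for the change to take effect):

#Disable SSL 3.0 for the server side of connections
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server'
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name 'Enabled' -Value 0 -PropertyType DWord -Force | Out-Null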

I stumbled across IISCrypto, a tool from Nartac Software – a free tool that shows you the current settings that you have for SSL/TLS, and gives you a quick and easy way to change the active protocols and re-order the ciphers.

It is speedy and accurate. Perfect for updating a number of servers/systems manually.

Get it here: https://www.nartac.com/Products/IISCrypto/Default.aspx


Automating pfSense Backups

Just found a fantastic tool which is so simple and just works.

https://knowledge.zomers.eu/pfsense/Pages/How-to-automate-pfSense-backup.aspx

Downloaded, tested, in place and working within 15 minutes. Perfect!
