XenDesktop 7 Move Database

Unfortunately, once again the documentation provided by Citrix on moving databases from one SQL Server to another is incomplete. The article at http://support.citrix.com/article/CTX140319 omits the configuration of one of the services.

Specifically, the listing omits the configuration of the Citrix Analytics Service. In my environment this meant that after the service was restarted, it tried to do a license server upgrade (I forgot to take a screenshot).

Secondly, the PowerShell supplied in the article clears the MonitorDBConnection and LogDBConnection before clearing the data store configuration of the respective services, which causes an error when the commands are run in the order listed.

My corrected version is below. This assumes that the database has already been moved to the new SQL server and the necessary logins created.
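If you still need to do the database move itself, the sketch below shows one way of doing it with the SQL Server PowerShell cmdlets. Treat it as an outline only: the database name, backup share and controller account are placeholders, it assumes a reasonably recent SQLPS/SqlServer module, and you may prefer to do this step in Management Studio instead.

# Sketch only: copy the site database to the new server and recreate the logins.
Import-Module SQLPS -DisableNameChecking

$OldSQLServer = "OLDSERVER\OLDINSTANCE"
$NewSQLServer = "NEWSERVER\NEWINSTANCE"
$Database     = "CitrixSiteDB"                          # placeholder database name
$BackupFile   = "\\fileserver\share\CitrixSiteDB.bak"   # placeholder backup location

# Back up on the old instance and restore on the new one
# (add -RelocateFile arguments if the file paths differ on the new server)
Backup-SqlDatabase  -ServerInstance $OldSQLServer -Database $Database -BackupFile $BackupFile
Restore-SqlDatabase -ServerInstance $NewSQLServer -Database $Database -BackupFile $BackupFile

# Recreate a login for each Delivery Controller machine account
# (and grant it the appropriate database roles afterwards)
Invoke-Sqlcmd -ServerInstance $NewSQLServer -Query 'CREATE LOGIN [DOMAIN\CONTROLLER1$] FROM WINDOWS'

With the database in place on the new server, the corrected reconfiguration script follows.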

$OldSQLServer = "OLDSERVER\OLDINSTANCE"
$NewSQLServer = "NEWSERVER\NEWINSTANCE"

Write-Host "Stopping Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-LogSite -State "Disabled"

Write-Host "Updating Connection String" -ForegroundColor Black -BackgroundColor Yellow
$ConnStr = Get-ConfigDBConnection
Write-Host "Old Connection String: $ConnStr"
$ConnStr = $ConnStr.Replace($OldSQLServer,$NewSQLServer)
Write-Host "New Connection String: $ConnStr"

Write-Host "Clearing all current DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-ConfigDBConnection -DBConnection $null
Set-AcctDBConnection -DBConnection $null
Set-AnalyticsDBConnection -DBConnection $null
Set-HypDBConnection -DBConnection $null
Set-ProvDBConnection -DBConnection $null
Set-BrokerDBConnection -DBConnection $null
Set-EnvTestDBConnection -DBConnection $null
Set-SfDBConnection -DBConnection $null
Set-MonitorDBConnection -DataStore Monitor -DBConnection $null
Set-MonitorDBConnection -DBConnection $null
Set-LogDBConnection -DataStore Logging -DBConnection $null
Set-LogDBConnection -DBConnection $null
Set-AdminDBConnection -DBConnection $null

Write-Host "Configuring new DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-AdminDBConnection -DBConnection $ConnStr
Set-AnalyticsDBConnection -DBConnection $ConnStr
Set-ConfigDBConnection -DBConnection $ConnStr
Set-AcctDBConnection -DBConnection $ConnStr
Set-HypDBConnection -DBConnection $ConnStr
Set-ProvDBConnection -DBConnection $ConnStr
Set-BrokerDBConnection -DBConnection $ConnStr
Set-EnvTestDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DataStore Logging -DBConnection $ConnStr
Set-MonitorDBConnection -DBConnection $ConnStr
Set-MonitorDBConnection -DataStore Monitor -DBConnection $ConnStr
Set-SfDBConnection -DBConnection $ConnStr

Write-Host "Testing new DB Connections..." -ForegroundColor Black -BackgroundColor Yellow
Test-AdminDBConnection -DBConnection $ConnStr
Test-AnalyticsDBConnection -DBConnection $ConnStr
Test-ConfigDBConnection -DBConnection $ConnStr
Test-AcctDBConnection -DBConnection $ConnStr
Test-HypDBConnection -DBConnection $ConnStr
Test-ProvDBConnection -DBConnection $ConnStr
Test-BrokerDBConnection -DBConnection $ConnStr
Test-EnvTestDBConnection -DBConnection $ConnStr
Test-LogDBConnection -DBConnection $ConnStr
Test-MonitorDBConnection -DBConnection $ConnStr
Test-SfDBConnection -DBConnection $ConnStr

Write-Host "Re-enabling Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-MonitorConfiguration -DataCollectionEnabled $true
Set-LogSite -State "Enabled"

Write-Host "Restarting all Citrix Services..." -ForegroundColor Black -BackgroundColor Yellow
Get-Service Citrix* | Stop-Service -Force
Get-Service Citrix* | Start-Service
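
Once the services are back up, it is worth a quick sanity check that each service is talking to the new database. A short loop like the one below, using the Get-<Service>ServiceStatus cmdlets from the same SDK, does the job; it is only a sketch (it assumes the Citrix snap-ins are still loaded), so trim the service list to match your deployment.

Write-Host "Checking service status against the new database..." -ForegroundColor Black -BackgroundColor Yellow
$Services = "Admin","Analytics","Config","Acct","Hyp","Prov","Broker","EnvTest","Log","Monitor","Sf"
ForEach ($Svc in $Services) {
    $Status = & "Get-$($Svc)ServiceStatus"
    Write-Host "$Svc service status: $($Status.ServiceStatus)"
}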

Posted in Citrix, SQL Server

NetScaler 10.5 53.9c StoreFront Monitor uses NSIP, not the SNIP

I believe I have come across a bug in the implementation of the StoreFront monitor in Citrix NetScaler 10.5 (build 53.9). The issue may also exist in previous versions, but I have not tested them.

The NetScaler I was working on was sited in a secure network, with a firewall between the NetScaler and the internal network, as shown below:

NetScaler Layout

I had the following firewall rules in place:

Source               Destination         Port
Subnet IP            StoreFront Servers  443
Management Machines  NetScaler IP        443

The two StoreFront servers, in a load-balanced configuration, were constantly showing as down. Expanding the service group and looking at the probe results just showed ‘Probe Failed’. To verify connectivity, I created an HTTPS monitor for the same pair of servers, but strangely this monitor always showed as Up.

A Wireshark trace on the SNIP showed no HTTPS requests being sent from the SNIP, while the same trace against the NSIP address showed regular requests coming from the NetScaler. Another rule was added to the firewall, as per the table below, and the StoreFront monitors changed state to Up almost immediately.

Source               Destination         Port
Subnet IP            StoreFront Servers  443
NetScaler IP         StoreFront Servers  443
Management Machines  NetScaler IP        443

I don’t believe that this is by design from Citrix, as their documentation for the NetScaler clearly states that the SNIP should be responsible for the monitoring of services and communication with backend services. Hopefully this bug/issue will be resolved in a future release.

“The NetScaler ADC uses the subnet IP address as a source IP address to proxy client connections to servers. It also uses the subnet IP address when generating its own packets, such as packets related to dynamic routing protocols, or to send monitor probes to check the health of the servers.”

http://support.citrix.com/proddocs/topic/ns-system-10-map/ns-nw-ipaddrssng-confrng-snips-tsk.html

Thankfully, in the situation I was working on, allowing the NSIP to reach the StoreFront servers and monitor them was not a problem (although it is another hole in the firewall), but I can imagine that a number of deployments will not have the flexibility to configure their network or firewall in this way.

Posted in Citrix

Back Up SharePoint Farm using PowerShell

We have a small SharePoint Foundation 2010 site here, which contains some information which we wanted to protect. The size of the site did not particularly warrant the purchase of a license for any specialist backup software, so we were looking for a way to back it up, send some notifications and retain a specific number of backups.

As SharePoint natively includes a backup and restore function, which backs up everything including the databases, it made sense to use this. We scheduled the script below to run daily, using an account with the following privileges:

  • Local Administrator on the server running SharePoint Foundation 2010
  • SysAdmin in the SQL Server instance.

All you need to do is configure the email settings, location of the backups and the desired retention at the top of the script.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Backup all the SharePoint Farm and retain a specified number of backups
#
# VERSION     DATE         USER                DETAILS
# 1           19/11/2014   Craig Tolley        First version
# 1.1         19/11/2014   Craig Tolley        Added in verification that all backups in the backup TOC can be found
#                                              Changed backup retention to a number of backups rather than a specific number of days.
# 1.2         24/11/2014   Craig Tolley        Corrected reference to backup location for directory sizing
#					       Corrected Start Time reference in report
#
# ----------------------------------------------------------------------------------------------------------

#This function is the exit point for the script. Called from a number of different points depending on the outcomes. 
function Send-NotificationEmailAndExit
{
    $smtpServer = "your.mailserver.co.uk"
    $smtpFrom = "sharepointbackup@yourdomain.co.uk"
    $smtpTo = "recipient@yourdomain.co.uk"
    $messageSubject = "SharePoint Farm - Backup - $((Get-Date).ToShortDateString())"
    $messageBody = "The SharePoint Farm backup script has completed running.`r`n`r`n" + [string]::join("`r`n",$Output)
    Send-MailMessage -From $smtpFrom -To $smtpTo -Subject $messageSubject -Body $messageBody -SmtpServer $smtpServer
    Exit 0
}

#Output Variable for notifications for the email
$Output = @()

#Specify the number of backups that you want to retain in the destination folder
[int]$RetainBackups = 7

#Destination Backup Folder
[string]$BackupPath = "\\servername\sharename"

#Check that the SharePoint PowerShell snap-in is available and that it is loaded. 
If ((Get-PSSnapIn -Registered | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "ERROR: SharePoint Server PowerShell is not installed on targeted machine"
    Send-NotificationEmailAndExit
    }
If ((Get-PSSnapIn | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "INFO: Adding SharePoint Server PowerShell SnapIn..."
    Add-PSSnapIn -Name "Microsoft.Sharepoint.Powershell" | Out-Null
    }

#Test that the Backup Path Exists
If ((Test-Path $BackupPath) -eq $false) {
    $Output += "ERROR: The specified backup path ($BackupPath) does not exist. Create the destination before using it for SharePoint backups."
    Send-NotificationEmailAndExit
    }

#Perform a Backup of the SharePoint Farm. Script locks until this process completes.
$Output += "INFO: Starting Backup at $(Get-Date)"
#$BackupPath
Backup-SPFarm -BackupMethod Full -Directory "$($BackupPath)"
$Output += "INFO: Completed Backup at $(Get-Date)"

#Check the status of the last backup
$BackupStatus = Get-SPBackupHistory -Directory $BackupPath | Sort StartTime | Select -Last 1
$Output += "INFO: Size of the Backup: {0:N2}" -f ($(Get-ChildItem $BackupStatus.Directory -recurse | Measure-Object -property length -sum).sum / 1MB) + " MB"
$Output += "INFO: Backup Warnings: $($BackupStatus.WarningCount)"
$Output += "INFO: Backup Errors: $($BackupStatus.ErrorCount)"
If ($BackupStatus.IsFailure -eq $true) {
    $Output += "ERROR: The backup was not successful. The details of the backup operation are: $($BackupStatus | fl) "
    Send-NotificationEmailAndExit
    }
$Output += "INFO: Backup Report: $($BackupStatus | fl | Out-String)"

#Get the SPBackup Table of Contents
$spbrtoc = "$BackupPath\spbrtoc.xml"
[xml]$spbrtocxml = Get-Content $spbrtoc

#Find the old backups in spbrtoc.xml
$OldBackups = if ($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count -gt $RetainBackups) 
                {$spbrtocxml.SPBackupRestoreHistory.SPHistoryObject | Sort SPStartTime -Descending | Select -Last $($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count - $RetainBackups)}
                else {$null} 
                    
if ($OldBackups -eq $Null) {
    $Output += "INFO: There are no more than $RetainBackups backups in the specified backup directory; nothing to remove."
    }

#Delete the backup reference from the XML Table of Contents and delete the physical backup file.
ForEach ($BackupRef in $OldBackups) {
    $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)

    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $true) {
        $Output += "INFO: Removing the SP Backup Directory: $($BackupRef.SPBackupDirectory)"
        Remove-Item $BackupRef.SPBackupDirectory -Recurse
        }
    Else {
        $Output += "ERROR: Backup directory $($BackupRef.SPBackupDirectory) not found."
        }
    $Output += "INFO: Removed old backup reference from: $($BackupRef.SPStartTime)"
    }

#Verify all other items referenced in the backup file are present
$Output += "INFO: Started checking for orphaned backup records"
ForEach ($BackupRef in $spbrtocxml.SPBackupRestoreHistory.SPHistoryObject) {
    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $false) {
        $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef)
        $Output += "INFO: Removed reference to non-existent backup at location: $($BackupRef.SPBackupDirectory)"
        }
    }
$Output += "INFO: Checking for orphaned backup records complete"

#Save the new Sharepoint backup report xml file
$spbrtocxml.Save($spbrtoc)
$Output += "INFO: Completed removal of old backups."

#All done. Send notification. 
$Output += "INFO: SharePoint Backup Script Completed"
Send-NotificationEmailAndExit
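
To run the script daily we used a basic scheduled task. Something like the command below registers one; it is only a sketch, with the script path, start time and service account as placeholders, and the account needs the privileges listed above.

# Sketch: register a daily 01:00 run of the backup script (path and account are placeholders)
schtasks.exe /Create /TN "SharePoint Farm Backup" `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Backup-SPFarm.ps1" `
    /SC DAILY /ST 01:00 /RU "DOMAIN\svc-spbackup" /RP * /RL HIGHEST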

Posted in Powershell, SharePoint

PowerShell: Move SnapMirror Destinations to a New Aggregate (7-Mode)

We got some new disk shelves in to replace some old ones on our NetApp filer. The old disks held a lot of SnapMirror destination volumes, all of which needed migrating to the new disks. Initially we were planning on creating a new set of SnapMirrors from the source volumes to the new destinations and, once these had completed, removing the original mirrors. This seemed like a long-winded, wasteful process; we did not particularly want 30 TB of data re-transmitted over the network, especially as a fully internal transfer would run at a significantly higher speed.

Also, we wanted to automate this process as much as possible, as this provided the smallest scope for error.

Hence the following script was born.

It runs through quite a simple process:

  • Load the DataONTAP PowerShell module if it is not available
  • Connect to the specified filer
  • Check that the source volume and the destination aggregate exist
  • Get details of the NFS exports on the volume and remove all NFS exports
  • Break the current SnapMirror relationship
  • Perform a Volume Move of the now writeable SnapMirror destination
  • Re-create the NFS exports
  • Re-sync the SnapMirror

The NFS exports have to be removed, as volume moves are not supported on volumes that have any exports configured. As long as the script runs to the end, the exports will be recreated once everything has completed.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Moves a SnapMirror Destination to a new aggregate, without re-initializing the SnapMirror from scratch. 
#
# VERSION     DATE         USER                DETAILS
# 1           27/10/2014   Craig Tolley        First version
# 1.1         14/11/2014   Craig Tolley        Added new parameter to pass in credentials, so that scripting multiple moves is easier and without prompts
#
# ----------------------------------------------------------------------------------------------------------

<#
.Synopsis
   Moves the specified volume to a new aggregate.

.EXAMPLE
   Move-SnapmirrorDestination -VolumeToMove Volume1 -DestinationAggr Aggr2 -FilerName Filer1

#>
function Move-SnapmirrorDestination
{
    [CmdletBinding()]
    Param
    (
        # The volume that we want to move
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=0)]
        [String]$VolumeToMove,

        # The destination aggregate for the new volume
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=1)]
        [String]$DestinationAggr,

        # The filer name to connect to 
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$false,
                   Position=2)]
        [String]$FilerName,

        [Parameter(Position=3)]
        [System.Management.Automation.PSCredential]$FilerCredentials

    )

    #Check that the DataONTAP module is available and that it is loaded. 
    Write-Host "Checking for DataONTAP Module..."
    if ((Get-Module -ListAvailable | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Error "DataONTAP not installed on targeted machine"
        Exit 1
        }
    if ((Get-Module | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Host "Importing DataONTAP Module..."
        Import-Module -Name "DataONTAP" | Out-Null
        }

    #If we have not been passed credentials, then prompt for them. 
    If ($FilerCredentials -eq $null)
        {$FilerCredentials = Get-Credential -Message "Please supply credentials for $FilerName"}

    #Connect to the Filer.
    Write-Host "Connecting to $FilerName" -BackgroundColor Yellow -ForegroundColor Black
    $Error.Clear()
    Connect-NaController -Name $FilerName -Credential $FilerCredentials
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error connecting to the filer. Please check your credentials and try again."
        Break
        }
    Write-Host ""

    #Get the Source Volume
    Write-Host "Getting details of Volume: $VolumeToMove" -BackgroundColor Yellow -ForegroundColor Black
    $SrcVolume = Get-NaVol $VolumeToMove
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Break
        }
    $SrcVolume | ft
  
    #Get the Destination Aggregate
    Write-Host "Getting details of the destination aggregate: $DestinationAggr" -BackgroundColor Yellow -ForegroundColor Black
    $DestAggr = Get-NaAggr $DestinationAggr
    If ($Error.Count -gt 0)
        {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Break
        }
    $DestAggr | ft

    #Get the NFS Exports for the Volume and Remove them
    Write-Host "Getting details of the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    $NFSExports = Get-NaNfsExport | Where {$_.Pathname -like "*$($SrcVolume.Name)"}
    If (($NFSExports).Count -gt 0)
        {
        ForEach ($Exp in $NFSExports)
            {Remove-NaNfsExport $Exp}
        }
    Else 
        {Write-Host "No NFS Exports are configured for this volume"}
    Write-Host ""

    #Break all of the snapmirrors which are configured
    Write-Host "Breaking existing Snapmirrors" -BackgroundColor Yellow -ForegroundColor Black
    $SrcSnapMirrors = Get-NaSnapmirror $SrcVolume
    $SrcSnapMirrors | ft
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorBreak -Confirm:$false | Out-Null}
        }
    Else 
        {Write-Host "No Snapmirrors are configured for this volume"}
    Write-Host ""

    #Start the actual volume move. 
    Write-Host "Starting the Vol Move (Update every 15 seconds)" -BackgroundColor Yellow -ForegroundColor Black
    Start-NaVolMove -Volume $SrcVolume -DestAggr $DestAggr

    #Keep Running Until the Vol Move completes
    Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft
    Do
        {
        Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft -HideTableHeaders
        Start-Sleep 15
        }
    Until ((Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name}).Count -eq 0)

    #Recreate the NFS Exports
    Write-Host "Recreating the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    ForEach ($Exp in $NFSExports)
        {Add-NaNfsExport $Exp}
    Write-Host ""

    #Resync all of the snapmirrors which are configured
    Write-Host "Re-synching all SnapMirrors" -BackgroundColor Yellow -ForegroundColor Black
    If (($SrcSnapMirrors).Count -gt 0)
        {
        ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NASnapMirror $Snapmirror.Destination | Invoke-NaSnapmirrorResync -Confirm:$false | Out-Null}
        }

    #Complete
    Write-Host "  --  Completed Volume Move  -- "
}

If you want to use this to move a bunch of destinations to a new aggregate, then a short snippet like this does the job for you:

$VolsToMove = "volA", "volB"
$FilerLogin = Get-Credential
ForEach($v in $VolsToMove)
    {Move-SnapmirrorDestination -VolumeToMove $v -DestinationAggr newAggr -FilerName Filer1 -FilerCredentials $FilerLogin}

Posted in NetApp, Storage

Assigning Permissions to Assign Networks to VM in vSphere

If you need to allow a specific user or group the permission to change the connected network on a virtual machine in vSphere, then permissions have to be granted in a couple of places. This provides very granular control over the machines and the networks that a person can use; however, it may not be immediately apparent when you are trying to get it working (it wasn't apparent to me until I had thought about the problem for a while).

Two distinct permissions are required:

Against the Virtual Machine that they want to edit

  • Virtual Machine –> Configuration –> Modify Device Settings
  • Virtual Machine –> Configuration –> Settings

Against the Network objects that they can assign

  • Network –> Assign Network

If the user or group in question does not have the Assign Network permission applied to a network object, then that object does not appear in the list of selectable networks. If there are no networks at all that the user has the Assign Network permission on, then the option to change the network assigned to a VM will not be available to that user.
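
If you prefer to set this up from PowerCLI rather than the vSphere Client, a sketch along these lines should work. The vCenter name, role name, group and object names are placeholders, and the privilege IDs are the ones I believe map to the options above, so confirm them with Get-VIPrivilege before relying on this.

# Sketch only: create a role holding the three privileges and assign it in both places.
Connect-VIServer -Server "vcenter.example.local"

$PrivIds = "VirtualMachine.Config.EditDevice",
           "VirtualMachine.Config.Settings",
           "Network.Assign"
$Role = New-VIRole -Name "VM Network Assigners" -Privilege (Get-VIPrivilege -Id $PrivIds)

# Grant against the VM(s) the group is allowed to edit...
New-VIPermission -Entity (Get-VM "TestVM01") -Principal "DOMAIN\VM-NetworkAdmins" -Role $Role

# ...and against the network objects they are allowed to assign
# (here the default network folder; individual network objects work too)
New-VIPermission -Entity (Get-Folder -Type Network -Name "network") -Principal "DOMAIN\VM-NetworkAdmins" -Role $Role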

Posted in VMWare

IISCrypto – Making SSL/TLS Configuration Easier

Following the recent POODLE vulnerability, and the general best practice that you should always use the most secure protocols available, I have been spending some time reconfiguring servers.

Setting the order of ciphers, and enabling Forward Secrecy in Windows requires editing the registry – a lot. This is susceptible to errors, as the process is manual. Also, it doesn’t really give you a holistic picture of the before and after settings.

I stumbled across this tool from Nartac Software – IISCrypto. A free tool that shows you the current settings that you have for SSL/TLS, and a quick and easy way to change the active protocols and re-order the ciphers.

It is speedy and accurate. Perfect for updating a number of servers/systems manually.

Get it here: https://www.nartac.com/Products/IISCrypto/Default.aspx

Posted in Internet Security, Server 2003, Server 2008, Server 2012

Automating pfSense Backups

Just found a fantastic tool which is so simple and just works.

https://knowledge.zomers.eu/pfsense/Pages/How-to-automate-pfSense-backup.aspx

Downloaded, tested, in place and working within 15 minutes. Perfect!

Posted in Internet Security

SQL Server Reporting Services – Rebuild Performance Counters

I had a lot (20+) of this sort of error in my event log.

[Screenshot: SQL Reporting Services performance counter errors]

There is a guide on the Microsoft Wiki about this here: http://social.technet.microsoft.com/wiki/contents/articles/1916.how-to-rebuild-the-report-server-performance-counters-ssrs.aspx; however, this alone was not enough to fix the issue.

The commands that I ran from an elevated command prompt to get mine back up and running were:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe /u "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin\ReportingServiceLibrary.dll"
c:\windows\system32\lodctr.exe /r
C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin\ReportingServiceLibrary.dll"
net stop ReportServer
net start ReportServer

After this point, the errors no longer occurred when the server was rebooted.

Posted in SQL Server

SharePoint 2013: Renaming the Database Server

There are a number of posts on this online, but none of them exactly described the steps that I needed to take in order to rename the server that was hosting the SQL databases for our SharePoint installation.

I was in the fortunate position that the system had not yet gone live, so rebooting and changing configuration was easy, however it had all been configured, so blowing it away and starting again was not an option.

Our setup is simple: two servers. In-SharePoint is the SharePoint 2013 server running the websites, and In-SharePointSQL is the database server, running SQL Server.

Unfortunately we found that In-SharePointSQL was too long a name, so it was actually truncated to In-SharePointSQ by the 15-character NetBIOS limit (note the missing L). This caused some issues with NetApp SnapManager for SQL, so for clarity and correctness I decided it would be better if the server was called In-SharePointDB.

Our environment is virtualised, so before starting I took a snapshot and a backup. I don't need to tell you to take a backup before you make any changes :-)

1. We don't want the SharePoint site to be accessible, so first off, stop all of the SharePoint services on the SharePoint server using PowerShell:

Get-Service -DisplayName SharePoint* | Stop-Service
Get-Service IISAdmin, W3SVC | Stop-Service

2. On the database server, perform the rename of server to the desired new name. Reboot when prompted.

3. Open up SQL Server Management Studio and run the following query, substituting in your server names:

sp_dropserver 'IN-SHAREPOINTSQL'
go
sp_addserver 'IN-SHAREPOINTDB', 'local'
go

4. Restart SQL Server on the database server

5. Verify that SQL is now returning the correct server name using this query:

Select @@SERVERNAME As 'ServerName'

6. We now need to correct the SQL Server Reporting Services configuration. On the SQL Server, open up the Reporting Services Configuration Manager and connect to the Report Server using the new SQL Server name.

7. Select Database Configuration on the left, then click Change Database. Choose the Existing Report Server Database option, and follow through the wizard, specifying your new SQL Server name and choosing the existing Report Server database.

8. Open the file %ProgramFiles%\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\rsreportserver.config in Notepad. Search for the URLRoot and ReportServerURL tags and update them if they are defined. In my case URLRoot was present and needed updating; the ReportServerURL tag was blank and could be left alone.

9. Restart the SQL Server Reporting Services service on the SQL Server.

10. Go to your SharePoint 2013 web server and open cliconfg.exe. On the Alias tab click Add, type the old server name at the top, select TCP/IP on the left and then enter the new server name on the right.
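
If you would rather script this step (or want to cover the 32-bit client stack on a 64-bit server as well), the alias is just a registry value. A sketch is below; the server names are this post's examples, so substitute your own.

# Sketch: create the same SQL alias from PowerShell instead of cliconfg.exe.
$AliasKeys = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
             "HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo"   # 32-bit clients on a 64-bit OS

ForEach ($Key in $AliasKeys) {
    If (-not (Test-Path $Key)) { New-Item -Path $Key -Force | Out-Null }
    # "DBMSSOCN" = TCP/IP; the value data points the old name at the new server
    New-ItemProperty -Path $Key -Name "IN-SHAREPOINTSQL" -Value "DBMSSOCN,IN-SHAREPOINTDB" -PropertyType String -Force | Out-Null
}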

11. Restart the SharePoint services:

Get-Service IISAdmin, W3SVC | Start-Service
Get-Service -DisplayName SharePoint* | Start-Service

12. Open the SharePoint Management Shell and run the following command to update all of the databases to the new SQL instance.

Get-SPDatabase | ForEach {$_.ChangeDatabaseInstance("IN-SHAREPOINTDB")}

13. As a personal belt-and-braces measure, I then restarted the SharePoint server.

14. On the SharePoint server, open cliconfg.exe, go to the Alias tab and remove the alias that you created.

Complete!

Everything should now be working using the new server name for the SQL Server.

 

Posted in Server 2012, SharePoint, SQL Server

Modifying GPO Printer Deployment using PowerShell

We use the Deployed Printers feature in the Print Management console to deploy printers to users. As part of the printer migration that I have been working on, I needed to modify all of these GPOs so that all of the policies directed all of the users to the new print server.

There are over 200 such policies in existence. Editing them by hand would be a lot of manual work, and prone to errors.

Also, I needed to be able to change over print servers in around an hour. Editing 200 policies in an hour would be impossible.

The script below is the result of this conundrum. It goes out to find all of the deployed printers in all GPOs in the domain, then looks for the name of the old print server and replaces it with the name of the new print server.

The script fully supports the -WhatIf switch, and a report of all of the printers that were found and the changes that were made is output at the end of the process. I suggest piping the output to a variable, which can then be formatted as a table or exported into something more readable; the script just provides the raw data, and you can do with it as you wish. A usage example is shown after the script.

You will need the ActiveDirectory module loaded in your PowerShell session, and you will also need to run this script as a user that has sufficient permissions to modify GPOs in your domain.

As always, test on a lab before running this on your production servers!

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Modify all GPOs which have Pushed Printer Connections to use a new print server name.
#
# VERSION     DATE         USER                DETAILS
# 1           22/08/2014   Craig Tolley        First version
#
#
# ----------------------------------------------------------------------------------------------------------

#The print server names passed in should include the leading \\ so that matches only occur at the start of the UNC path.
Function Modify-PushedPrinterConnections
{
[cmdletbinding(SupportsShouldProcess=$True)]

Param
    (
        #The name of the Old Print Server. This string will be searched for in order to be replaced.
        [Parameter(Mandatory=$true)]
        [string]$OldPrintServerName,

        #The name of the New Print Server. This will replace the Old Print Server value.
        [Parameter(Mandatory=$true)]
        [string]$NewPrintServerName
    )

#Collection detailing all of the work
$GPOPrinterDetails = @()

#Get all of the GPO objects in the domain.
$GPOs = Get-GPO -All
Write-Host "GPOs Retrieved: $($GPOs.Count)"


ForEach ($GPO in $GPOs)
{
    #Find the deployed printer connection objects within this GPO.
    #NOTE: Replace the DC= components of the SearchBase with your own domain's distinguished name.
    $PrintObjects = Get-ADObject -SearchBase "CN={$($GPO.Id)},CN=Policies,CN=System,DC=medlan,DC=cam,DC=ac,DC=uk" -Filter {objectClass -eq "msPrint-ConnectionPolicy"} -SearchScope Subtree
    
    ForEach ($PCO in $PrintObjects)
    {
        #Get the properties of the Print Connection Object that we actually need.
        $PrintConnection = Get-ADObject $PCO.DistinguishedName -Properties printerName, serverName, uNCName
    
        #Log details of the policy that we have found    
        $GPOPrinterDetail = @{
                    GPOId = $GPO.Id
                    GPOName = $GPO.DisplayName
                    PrintConnectionID = $PrintConnection.ObjectGUID
                    PrinterName = $PrintConnection.printerName
                    OriginalPrintServer = $PrintConnection.serverName
                    OriginalUNCName = $PrintConnection.uNCName
                    NewPrintServer = $null
                    NewUNCName = $null
                    ChangeStatus = "NotEvaluated"
                    }
        
        #Find out if we need to make a change or not.
        If ($PrintConnection.serverName.ToLower() -eq $OldPrintServerName.ToLower())
        {
            #Change the local instance
            $PrintConnection.serverName = $NewPrintServerName
            $PrintConnection.uNCName = $PrintConnection.uNCName.Replace($OldPrintServerName,$NewPrintServerName)
            
            #Update our reporting collection
            $GPOPrinterDetail.NewPrintServer = $PrintConnection.serverName
            $GPOPrinterDetail.NewUNCName = $PrintConnection.uNCName
            $GPOPrinterDetail.ChangeStatus = "ChangePending"
                        
            #Write the changes and catch any errors
            Try
                {Set-ADObject -Instance $PrintConnection -Verbose -ErrorAction Stop
                $GPOPrinterDetail.ChangeStatus = "ChangeSuccess"}
            Catch
                {$GPOPrinterDetail.ChangeStatus = "ChangeFailed"}
                
        }
        Else
        {
            $GPOPrinterDetail.ChangeStatus = "NoChange"
        }

        #Update the table
        $GPOPrinterDetails += New-Object PSObject -Property $GPOPrinterDetail
    }
 
}

#Finally write out the changes
Write-Output $GPOPrinterDetails

}
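
As a usage example (the server names are placeholders), do a dry run first with -WhatIf, review the report, then run it for real:

#Dry run: report what would change without touching anything
$Report = Modify-PushedPrinterConnections -OldPrintServerName "\\OldPrintServer" -NewPrintServerName "\\NewPrintServer" -WhatIf
$Report | Format-Table GPOName, PrinterName, OriginalUNCName, NewUNCName, ChangeStatus -AutoSize

#When happy, run it again without -WhatIf and keep the results for your records
$Report = Modify-PushedPrinterConnections -OldPrintServerName "\\OldPrintServer" -NewPrintServerName "\\NewPrintServer"
$Report | Export-Csv C:\Temp\PrinterGPOChanges.csv -NoTypeInformation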

Posted in Active Directory, Powershell, Server 2012