How to Create a Specific Customized Logon Page for Each VPN vServer based on FQDN without breaking Email Based Discovery

Citrix have published a guide on creating a customised logon page for each virtual server, based on the FQDN received. The article works and, true to its intended aim, the sites respond on their relative FQDNs and return the correctly customised login page for each of the vServers.

Once this has been completed, though, a vServer that has been configured with a Responder on the NetScaler will no longer be able to use email-based discovery or automatic configuration using the external store name. The error we were getting in Receiver was this:

“Your account cannot be added using this server address. Make sure you entered it correctly. You may need to enter your email address instead.”

The same error was displayed whether using the email address or the FQDN of the vServer.

Disabling the Responder rule that was created following the KB allowed the configuration to work. Based on this, I fully removed the Responder and started looking for other ways to accomplish the customisation.

These are the steps that I took to implement the same customisation using a Rewrite rule instead:

I am running NetScaler 10.5.

Using the GUI:

1. Check that rewrite is enabled in System –> Settings –> Configure Basic Features.

2. Go to AppExpert –> Rewrite –> Actions. Create a new Action. Enter a name and set the type to Replace. In the ‘Expression to choose target location’ field, enter ‘HTTP.REQ.URL’. In the ‘Expression to replace with’ field, enter the full web address of the newly created custom logon page. In this example I have entered “”. Click Create when you are done.
3. Go to AppExpert –> Rewrite –> Policies. Create a new Policy. Enter a name and set the Action to the name of the action created in step 2. The Undefined-Result-Action should be set to ‘Global-undefined-result-action’. In the expression, enter the following, substituting in your FQDN: ‘HTTP.REQ.HOSTNAME.CONTAINS("") && HTTP.REQ.URL.CONTAINS("index.html")’
4. Finally, we need to bind this policy to the Global HTTP Request bind point. Go to AppExpert –> Rewrite –> Policies. Select the policy that you just created, and then click Policy Manager at the top. Accept the default settings for the Bind Point. Click Continue. Select Add Binding, then choose the policy that you created in step 3. The other details can be left at their defaults. Click Bind, then click Done in the Policy Manager.
5. Test, and hopefully all will work.

Using the CLI (step 3 mirrors the policy created in the GUI steps; substitute in your FQDN):
1. enable ns feature rewrite
2. add rewrite action REWRITE_ACT replace "HTTP.REQ.URL" "\"\""
3. add rewrite policy REWRITE_POL "HTTP.REQ.HOSTNAME.CONTAINS(\"\") && HTTP.REQ.URL.CONTAINS(\"index.html\")" REWRITE_ACT
4. bind rewrite global REWRITE_POL 1 END -type REQ_DEFAULT
5. Test

Following this, both the custom page redirection and email-based discovery work as they should.


XenDesktop 7 Move Database

Unfortunately, once again the documentation provided by Citrix on moving databases from one SQL Server to another is incomplete: it misses out the configuration of one of the services.

The listing omits the configuration of the Citrix Analytics Service. In my environment this meant that after the services were restarted, it tried to perform a license server upgrade (I forgot to take a screenshot).

Secondly, the PowerShell that is supplied on the site removes the DB configuration for the MonitorDBConnection and LogDBConnection before removing the DataStore configuration of the respective services. This causes an error when running the commands in the order listed.

My corrected version is below. This assumes that the database has already been moved to the new SQL server and the necessary logins created.


Add-PSSnapin Citrix* # Load the Citrix PowerShell snap-ins

# The old and new SQL Server names - update these for your environment
$OldSQLServer = "OldSQLServerName"
$NewSQLServer = "NewSQLServerName"

Write-Host "Stopping Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-LogSite -State "Disabled"
Set-MonitorConfiguration -DataCollectionEnabled $false

Write-Host "Updating Connection String" -ForegroundColor Black -BackgroundColor Yellow
$ConnStr = Get-ConfigDBConnection
Write-Host "Old Connection String: $ConnStr"
$ConnStr = $ConnStr.Replace($OldSQLServer,$NewSQLServer)
Write-Host "New Connection String: $ConnStr"

Write-Host "Clearing all current DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-ConfigDBConnection -DBConnection $null
Set-AcctDBConnection -DBConnection $null
Set-AnalyticsDBConnection -DBConnection $null
Set-HypDBConnection -DBConnection $null
Set-ProvDBConnection -DBConnection $null
Set-BrokerDBConnection -DBConnection $null
Set-EnvTestDBConnection -DBConnection $null
Set-SfDBConnection -DBConnection $null
Set-MonitorDBConnection -DataStore Monitor -DBConnection $null
Set-MonitorDBConnection -DBConnection $null
Set-LogDBConnection -DataStore Logging -DBConnection $null
Set-LogDBConnection -DBConnection $null
Set-AdminDBConnection -DBConnection $null

Write-Host "Configuring new DB Connections" -ForegroundColor Black -BackgroundColor Yellow
Set-AdminDBConnection -DBConnection $ConnStr
Set-AnalyticsDBConnection -DBConnection $ConnStr
Set-ConfigDBConnection -DBConnection $ConnStr
Set-AcctDBConnection -DBConnection $ConnStr
Set-HypDBConnection -DBConnection $ConnStr
Set-ProvDBConnection -DBConnection $ConnStr
Set-BrokerDBConnection -DBConnection $ConnStr
Set-EnvTestDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DBConnection $ConnStr
Set-LogDBConnection -DataStore Logging -DBConnection $ConnStr
Set-MonitorDBConnection -DBConnection $ConnStr
Set-MonitorDBConnection -DataStore Monitor -DBConnection $ConnStr
Set-SfDBConnection -DBConnection $ConnStr

Write-Host "Testing new DB Connections..." -ForegroundColor Black -BackgroundColor Yellow
Test-AdminDBConnection -DBConnection $ConnStr
Test-AnalyticsDBConnection -DBConnection $ConnStr
Test-ConfigDBConnection -DBConnection $ConnStr
Test-AcctDBConnection -DBConnection $ConnStr
Test-HypDBConnection -DBConnection $ConnStr
Test-ProvDBConnection -DBConnection $ConnStr
Test-BrokerDBConnection -DBConnection $ConnStr
Test-EnvTestDBConnection -DBConnection $ConnStr
Test-LogDBConnection -DBConnection $ConnStr
Test-MonitorDBConnection -DBConnection $ConnStr
Test-SfDBConnection -DBConnection $ConnStr

Write-Host "Re-enabling Logging" -ForegroundColor Black -BackgroundColor Yellow
Set-MonitorConfiguration -DataCollectionEnabled $true
Set-LogSite -State "Enabled"

Write-Host "Restarting all Citrix Services..." -ForegroundColor Black -BackgroundColor Yellow
Get-Service Citrix* | Stop-Service -Force
Get-Service Citrix* | Start-Service


NetScaler 10.5 53.9c StoreFront Monitor uses NSIP, not the SNIP

I believe I have come across a bug in the implementation of the StoreFront monitor in the Citrix NetScaler 10.5. The issue may also exist in previous versions, but I have not tested it.

The NetScaler I was working on was sited in a secure network, with a firewall between the NetScaler and the internal network. Shown below:

NetScaler Layout

I had the following firewall rules in place:

Source               Destination          Port
Subnet IP            StoreFront Servers   443
Management Machines  NetScaler IP         443

The two StoreFront servers, in a load balanced configuration, were constantly showing as down. Expanding the Service Group and looking at the probe results just showed ‘Probe Failed’. To verify connectivity, I created an HTTPS monitor for the same pair of servers, but strangely this monitor always showed as Up.
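For reference, a StoreFront monitor of the kind described can be created and bound from the CLI along these lines. The monitor, store and service group names here are examples only, not taken from the environment above:

```
add lb monitor mon_storefront STOREFRONT -storename Store -secure YES
bind serviceGroup svcgrp_storefront -monitorName mon_storefront
```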

Running a Wireshark capture filtered on the SNIP showed no HTTPS requests being sent from the SNIP. Running the same capture against the NSIP address showed regular requests from the NetScaler. Another rule was added to the firewall, as per the table below, and the StoreFront monitors changed state to Up almost immediately.

Source               Destination          Port
Subnet IP            StoreFront Servers   443
NetScaler IP         StoreFront Servers   443
Management Machines  NetScaler IP         443

I don’t believe that this is by design from Citrix, as their documentation for the NetScaler clearly states that the SNIP should be responsible for the monitoring of services and communication with backend services. Hopefully this bug/issue will be resolved in a future release.

“The NetScaler ADC uses the subnet IP address as a source IP address to proxy client connections to servers. It also uses the subnet IP address when generating its own packets, such as packets related to dynamic routing protocols, or to send monitor probes to check the health of the servers.”

Thankfully, in the situation I am currently working on, allowing the NSIP to reach the StoreFront servers and subsequently monitor them was not a problem (although it is another hole in the firewall), but I can imagine that a number of deployments may not have the flexibility to configure their network or firewall in this way.


Back Up SharePoint Farm using PowerShell

We have a small SharePoint Foundation 2010 site here, which contains some information which we wanted to protect. The size of the site did not particularly warrant the purchase of a license for any specialist backup software, so we were looking for a way to back it up, send some notifications and retain a specific number of backups.

As SharePoint natively includes a backup and restore function, which backs up everything including the databases, it made sense to use this. We scheduled the script below to run daily, using an account with the following privileges:

  • Local Administrator on the server running SharePoint Foundation 2010
  • SysAdmin in the SQL Server instance.

All you need to do is configure the email settings, location of the backups and the desired retention at the top of the script.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Backup all the SharePoint Farm and retain a specified number of backups
# VERSION     DATE         USER                DETAILS
# 1           19/11/2014   Craig Tolley        First version
# 1.1         19/11/2014   Craig Tolley        Added in verification that all backups in the backup TOC can be found
#                                              Changed backup retention to a number of backups rather than a specific number of days.
# 1.2         24/11/2014   Craig Tolley        Corrected reference to backup location for directory sizing
#                                              Corrected Start Time reference in report
# ----------------------------------------------------------------------------------------------------------

#This function is the exit point for the script. Called from a number of different points depending on the outcomes. 
function Send-NotificationEmailAndExit
{
    $smtpServer = ""
    $smtpFrom = ""
    $smtpTo = ""
    $messageSubject = "SharePoint Farm - Backup - $((Get-Date).ToShortDateString())"
    $messageBody = "The SharePoint Farm backup script has completed running.`r`n`r`n" + [string]::join("`r`n",$Output)
    Send-MailMessage -From $smtpFrom -To $smtpTo -Subject $messageSubject -Body $messageBody -SmtpServer $smtpServer
    Exit 0
}

#Output variable collecting notifications for the email
$Output = @()

#Specify the number of backups that you want to retain in the destination folder
[int]$RetainBackups = 7

#Destination Backup Folder
[string]$BackupPath = "\\servername\sharename"

#Check that the SharePoint PowerShell SnapIn is available, then check it is loaded. 
If ((Get-PSSnapIn -Registered | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "ERROR: SharePoint Server PowerShell is not installed on the targeted machine"
    Send-NotificationEmailAndExit
}
If ((Get-PSSnapIn | Where {$_.Name -eq "Microsoft.Sharepoint.Powershell"} | Measure).Count -ne 1) {
    $Output += "INFO: Adding SharePoint Server PowerShell SnapIn..."
    Add-PSSnapIn -Name "Microsoft.Sharepoint.Powershell" | Out-Null
}

#Test that the Backup Path exists
If ((Test-Path $BackupPath) -eq $false) {
    $Output += "ERROR: The specified backup path ($BackupPath) does not exist. Create the destination before using it for SharePoint backups."
    Send-NotificationEmailAndExit
}

#Perform a Backup of the SharePoint Farm. The script blocks until this process completes.
$Output += "INFO: Starting Backup at $(Get-Date)"
Backup-SPFarm -BackupMethod Full -Directory "$($BackupPath)"
$Output += "INFO: Completed Backup at $(Get-Date)"

#Check the status of the last backup
$BackupStatus = Get-SPBackupHistory -Directory $BackupPath | Sort StartTime | Select -Last 1
$Output += "INFO: Size of the Backup: " + ("{0:N2}" -f ($(Get-ChildItem $BackupStatus.Directory -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB)) + " MB"
$Output += "INFO: Backup Warnings: $($BackupStatus.WarningCount)"
$Output += "INFO: Backup Errors: $($BackupStatus.ErrorCount)"
If ($BackupStatus.IsFailure -eq $true) {
    $Output += "ERROR: The backup was not successful. The details of the backup operation are: $($BackupStatus | fl | Out-String)"
    Send-NotificationEmailAndExit
}
$Output += "INFO: Backup Report: $($BackupStatus | fl | Out-String)"

#Get the SPBackup Table of Contents
$spbrtoc = "$BackupPath\spbrtoc.xml"
[xml]$spbrtocxml = Get-Content $spbrtoc

#Find the old backups in spbrtoc.xml
$OldBackups = if ($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count -gt $RetainBackups) 
                {$spbrtocxml.SPBackupRestoreHistory.SPHistoryObject | Sort SPStartTime -Descending | Select -Last $($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject.Count - $RetainBackups)}
                else {$null} 
if ($OldBackups -eq $null) {
    $Output += "INFO: There are not more than $RetainBackups backups in the specified backup directory"
    Send-NotificationEmailAndExit
}

#Delete the backup reference from the XML Table of Contents and delete the physical backup folder.
ForEach ($BackupRef in $OldBackups) {

    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $true) {
        $Output += "INFO: Removing the SP Backup Directory: $($BackupRef.SPBackupDirectory)"
        Remove-Item $BackupRef.SPBackupDirectory -Recurse
    }
    Else {
        $Output += "ERROR: Backup directory $($BackupRef.SPBackupDirectory) not found."
    }
    $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef) | Out-Null
    $Output += "INFO: Removed old backup reference from: $($BackupRef.SPStartTime)"
}

#Verify all other items referenced in the backup file are present
$Output += "INFO: Started checking for orphaned backup records"
ForEach ($BackupRef in @($spbrtocxml.SPBackupRestoreHistory.SPHistoryObject)) {
    If ((Test-Path $BackupRef.SPBackupDirectory) -eq $false) {
        $spbrtocxml.SPBackupRestoreHistory.RemoveChild($BackupRef) | Out-Null
        $Output += "INFO: Removed reference to non-existent backup at location: $($BackupRef.SPBackupDirectory)"
    }
}
$Output += "INFO: Checking for orphaned backup records complete"

#Save the new SharePoint backup Table of Contents xml file
$spbrtocxml.Save($spbrtoc)
$Output += "INFO: Completed removal of old backups."

#All done. Send notification. 
$Output += "INFO: SharePoint Backup Script Completed"
Send-NotificationEmailAndExit


PowerShell: Move SnapMirror Destinations to a New Aggregate (7-Mode)

We got some new disk shelves in to replace some old ones on our NetApp filer. The old disks held a lot of SnapMirror destination volumes, all of which needed migrating to the new disks. Initially we were planning on creating a new set of SnapMirrors from the source volumes to new destinations and, once these were complete, removing the original mirrors. This seemed like a long-winded, wasteful process – we did not particularly want 30 TB of data re-transmitted over the network, especially as a fully internal transfer would run at a significantly higher speed.

Also, we wanted to automate this process as much as possible, as this provided the smallest scope for error.

Hence the following script was born.

It runs through quite a simple process:

  • Load the DataONTAP PowerShell module if it is not available
  • Connect to the specified filer
  • Check that the source volume and the destination aggregate exist
  • Get details of the NFS exports on the volume and remove all NFS exports
  • Break the current SnapMirror relationship
  • Perform a Volume Move of the now writeable SnapMirror destination
  • Re-create the NFS exports
  • Re-sync the SnapMirror

The NFS exports have to be removed, as volume moves are not supported on volumes that have any exports configured. As long as the script runs to the end, the exports will be recreated once everything has completed.

# ----------------------------------------------------------------------------------------------------------
# PURPOSE:    Moves a SnapMirror Destination to a new aggregate, without re-initializing the SnapMirror from scratch. 
# VERSION     DATE         USER                DETAILS
# 1           27/10/2014   Craig Tolley        First version
# 1.1         14/11/2014   Craig Tolley        Added new parameter to pass in credentials, so that scripting multiple moves is easier and without prompts
# ----------------------------------------------------------------------------------------------------------

<#
.SYNOPSIS
   Moves the specified volume to a new aggregate.

.EXAMPLE
   Move-SnapmirrorDestination -VolumeToMove Volume1 -DestinationAggr Aggr2 -FilerName Filer1
#>
function Move-SnapmirrorDestination
{
    Param
    (
        # The volume that we want to move
        [Parameter(Mandatory=$true)]
        [string]$VolumeToMove,

        # The destination aggregate for the new volume
        [Parameter(Mandatory=$true)]
        [string]$DestinationAggr,

        # The filer name to connect to 
        [Parameter(Mandatory=$true)]
        [string]$FilerName,

        # Optional credentials for the filer, so that multiple moves can be scripted without prompts
        [System.Management.Automation.PSCredential]$FilerCredentials
    )

    #Check that the DataONTAP module is available, then check it is loaded. 
    Write-Host "Checking for DataONTAP Module..."
    if ((Get-Module -ListAvailable | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Error "DataONTAP not installed on targeted machine"
        Exit 1
    }
    if ((Get-Module | Where {$_.Name -eq "DataONTAP"}).Count -ne 1) {
        Write-Host "Importing DataONTAP Module..."
        Import-Module -Name "DataONTAP" | Out-Null
    }

    #If we have not been passed credentials, then prompt for them. 
    If ($FilerCredentials -eq $null)
        {$FilerCredentials = Get-Credential -Message "Please supply credentials for $FilerName"}

    #Connect to the Filer.
    $Error.Clear()
    Write-Host "Connecting to $FilerName" -BackgroundColor Yellow -ForegroundColor Black
    Connect-NaController -Name $FilerName -Credential $FilerCredentials
    If ($Error.Count -gt 0) {
        Write-Host "There was an error connecting to the filer. Please check your credentials and try again."
        Exit 1
    }
    Write-Host ""

    #Get the Source Volume
    Write-Host "Getting details of Volume: $VolumeToMove" -BackgroundColor Yellow -ForegroundColor Black
    $SrcVolume = Get-NaVol $VolumeToMove
    If ($Error.Count -gt 0) {
        Write-Host "There was an error getting the details of the Volume. Please check that the volume name is correct and the volume is online."
        Exit 1
    }
    $SrcVolume | ft

    #Get the Destination Aggregate
    Write-Host "Getting details of the destination aggregate: $DestinationAggr" -BackgroundColor Yellow -ForegroundColor Black
    $DestAggr = Get-NaAggr $DestinationAggr
    If ($Error.Count -gt 0) {
        Write-Host "There was an error getting the details of the aggregate. Please check that the aggregate name is correct and the aggregate is online."
        Exit 1
    }
    $DestAggr | ft

    #Get the NFS Exports for the Volume and remove them
    Write-Host "Getting details of the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    $NFSExports = Get-NaNfsExport | Where {$_.Pathname -like "*$($SrcVolume.Name)"}
    If (($NFSExports).Count -gt 0)
        {ForEach ($Exp in $NFSExports)
            {Remove-NaNfsExport $Exp}}
    Else
        {Write-Host "No NFS Exports are configured for this volume"}
    Write-Host ""

    #Break all of the snapmirrors which are configured
    Write-Host "Breaking existing Snapmirrors" -BackgroundColor Yellow -ForegroundColor Black
    $SrcSnapMirrors = Get-NaSnapmirror $SrcVolume
    $SrcSnapMirrors | ft
    If (($SrcSnapMirrors).Count -gt 0)
        {ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NaSnapmirror $Snapmirror.Destination | Invoke-NaSnapmirrorBreak -Confirm:$false | Out-Null}}
    Else
        {Write-Host "No Snapmirrors are configured for this volume"}
    Write-Host ""

    #Start the actual volume move. 
    Write-Host "Starting the Vol Move (Update every 15 seconds)" -BackgroundColor Yellow -ForegroundColor Black
    Start-NaVolMove -Volume $SrcVolume -DestAggr $DestAggr

    #Keep running until the Vol Move completes
    Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft
    Do {
        Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name} | ft -HideTableHeaders
        Start-Sleep 15
    }
    Until ((Get-NaVolMove | Where {$_.SourceVolumeName -eq $SrcVolume.Name}).Count -eq 0)

    #Recreate the NFS Exports
    Write-Host "Recreating the NFS Exports" -BackgroundColor Yellow -ForegroundColor Black
    ForEach ($Exp in $NFSExports)
        {Add-NaNfsExport $Exp}
    Write-Host ""

    #Resync all of the snapmirrors which are configured
    Write-Host "Re-synching all SnapMirrors" -BackgroundColor Yellow -ForegroundColor Black
    If (($SrcSnapMirrors).Count -gt 0)
        {ForEach ($Snapmirror in $SrcSnapMirrors)
            {Get-NaSnapmirror $Snapmirror.Destination | Invoke-NaSnapmirrorResync -Confirm:$false | Out-Null}}

    Write-Host "  --  Completed Volume Move  -- "
}

If you wanted to use this to move a bunch of destinations to a new aggregate, then a short snippet like this does the job for you:

$VolsToMove = "volA", "volB"
$FilerLogin = Get-Credential
ForEach($v in $VolsToMove)
    {Move-SnapmirrorDestination -VolumeToMove $v -DestinationAggr newAggr -FilerName Filer1 -FilerCredentials $FilerLogin}


Assigning Permissions to Assign Networks to VM in vSphere

If you need to allow a specific user or group permission to change the connected network on a virtual machine in vSphere, then permissions have to be granted in a couple of places. This provides very granular control over the machines and networks that a person can use; however, it may not be totally apparent when you are trying to get it working (it wasn’t apparent to me until I had thought about the problem for a while).

Two distinct permissions are required:

Against the Virtual Machine that they want to edit

  • Virtual Machine –> Configuration –> Modify Device Settings
  • Virtual Machine –> Configuration –> Settings

Against the Network objects that they can assign

  • Network –> Assign Network

If the user or group in question does not have the Assign Network permission applied to a network object, then that object does not appear in the list of selectable networks. If there are no networks on which the user has the Assign Network permission, then the option to change the network assigned to a VM is not available to that user.
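If you prefer to script this, a custom role containing just these privileges can be created with PowerCLI. This is only a sketch: the role name, VM, port group and principal are example values, though the privilege IDs (VirtualMachine.Config.EditDevice, VirtualMachine.Config.Settings and Network.Assign) are the standard identifiers for the permissions listed above.

```powershell
# Build a role holding only the privileges needed to change a VM's network
$privs = Get-VIPrivilege -Id "VirtualMachine.Config.EditDevice",
                             "VirtualMachine.Config.Settings",
                             "Network.Assign"
New-VIRole -Name "VM Network Assigner" -Privilege $privs

# Grant the role on the VM to be edited...
New-VIPermission -Entity (Get-VM "MyVM") -Principal "DOMAIN\GroupName" -Role "VM Network Assigner" -Propagate:$false

# ...and on each network the group may assign (a distributed port group in this example)
New-VIPermission -Entity (Get-VDPortgroup "VLAN10") -Principal "DOMAIN\GroupName" -Role "VM Network Assigner"
```

In practice you might split this into two roles (one for the VM privileges, one holding only Network.Assign for the network objects), but a single role applied in both places works too.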


IISCrypto – Making SSL/TLS Configuration Easier

Following the recent POODLE vulnerability, and the general best practice that you should always use the most secure protocols available, I have been spending some time reconfiguring servers.

Setting the order of ciphers and enabling Forward Secrecy in Windows requires editing the registry – a lot. Being manual, the process is susceptible to errors. It also doesn’t really give you a holistic picture of the before and after settings.
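As an illustration of the sort of registry edits involved, disabling SSL 3.0 for incoming connections (the POODLE mitigation) looks something like the sketch below – and this is just one of the many Schannel keys that a full hardening pass touches:

```powershell
# Disable the SSL 3.0 protocol for server-side (incoming) connections via Schannel
$key = "HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server"
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name "Enabled" -Value 0 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $key -Name "DisabledByDefault" -Value 1 -PropertyType DWord -Force | Out-Null
# A reboot is required before Schannel picks up the change
```

Multiply that by every protocol, cipher and hash algorithm and the appeal of a tool that does it all in one screen is obvious.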

I stumbled across this tool from Nartac Software – IISCrypto. A free tool that shows you the current settings that you have for SSL/TLS, and a quick and easy way to change the active protocols and re-order the ciphers.

It is speedy and accurate. Perfect for updating a number of servers/systems manually.

Get it here:


Automating pfSense Backups

Just found a fantastic tool which is so simple and just works.

Downloaded, tested, in place and working within 15 minutes. Perfect!


SQL Server Reporting Services – Rebuild Performance Counters

Got a lot (20+) of this sort of error in my event log.

There is a guide on the Microsoft Wiki about this; however, it alone is not enough to fix the issue.

The commands that I ran in order to get mine back up and running, from an elevated command prompt were:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe /u "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin\ReportingServiceLibrary.dll"
c:\windows\system32\lodctr.exe /r
C:\Windows\Microsoft.NET\Framework\v4.0.30319\installutil.exe "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin\ReportingServiceLibrary.dll"
net stop ReportServer
net start ReportServer

The errors no longer occurred when rebooting the server after this point.


SharePoint 2013: Renaming the Database Server

There are a number of posts on this online, but none of them exactly described the steps that I needed to take in order to rename the server that was hosting the SQL databases for our SharePoint installation.

I was in the fortunate position that the system had not yet gone live, so rebooting and changing configuration was easy, however it had all been configured, so blowing it away and starting again was not an option.

Our setup is simple: two servers. In-SharePoint is the SharePoint 2013 server running the websites, and In-SharePointSQL is the database server, running SQL Server.

Unfortunately we found that In-SharePointSQL was too long a name (NetBIOS computer names are limited to 15 characters), so the name was actually truncated to In-SharepointSQ <– note the missing L. This caused some issues with NetApp SnapManager for SQL, so for clarity and correctness I decided it would be better if the server was called In-SharePointDB.

Our environment is virtualised, so before starting I took a snapshot and a backup. I don’t need to tell you to take a backup before you make any changes :-)

1. We don’t want the SharePoint site to be accessible, so first off, stop all the SharePoint services on the SharePoint server. Using PowerShell:

Get-Service -DisplayName SharePoint* | Stop-Service
Get-Service IISAdmin, W3SVC | Stop-Service

2. On the database server, perform the rename of server to the desired new name. Reboot when prompted.

3. Open up SQL Server Management Studio and run the following query, substituting in your server names:

sp_dropserver 'IN-SHAREPOINTSQL'
GO
sp_addserver 'IN-SHAREPOINTDB', 'local'
GO

4. Restart SQL Server on the database server

5. Verify that SQL is now returning the correct server name using this query:

Select @@SERVERNAME As 'ServerName'

6. We now need to correct the SQL Server Reporting Services configuration. On the SQL Server, open up SQL Server Reporting Services Configuration Manager. Connect to the Report Server using the new SQL Server name.

7. Select Database Configuration on the left, then click Change Database. Choose the Existing Report Server Database option, and follow through the wizard, specifying your new SQL Server name and choosing the existing Report Server database.

8. Open the file %ProgramFiles%\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\rsreportserver.config in Notepad. Search for tags called URLRoot and ReportServerURL and update them if they are defined. In my case URLRoot was present and defined, requiring updating. The ReportServerURL tag was just blank, in which case it can be left.
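If you prefer to script the config change, rsreportserver.config is plain XML, so a few lines of PowerShell do the same job. A sketch only: the path assumes the default instance used in this post, and the URL is an example – take a copy of the file first:

```powershell
# Update the URLRoot element in rsreportserver.config (back the file up first)
$cfg = "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\rsreportserver.config"
[xml]$xml = Get-Content $cfg
$xml.Configuration.URLRoot = "http://in-sharepointdb/ReportServer"   # example value - use your own URL
$xml.Save($cfg)
```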

9. Restart the SQL Server Reporting Services service on the SQL Server.

10. Go to your SharePoint 2013 web server. Open cliconfg.exe. On the Alias tab, click Add. Type the old server name at the top, select TCP/IP on the left, and then enter the new server name on the right.
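The cliconfg alias is just a registry value, so the same alias can be created with PowerShell if you prefer (server names as per this example; note that on 64-bit Windows the 32-bit client libraries read the equivalent key under Wow6432Node):

```powershell
# Create a SQL client alias: old name -> TCP/IP (DBMSSOCN) connection to the new server
$key = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name "IN-SHAREPOINTSQL" -Value "DBMSSOCN,IN-SHAREPOINTDB" -PropertyType String -Force | Out-Null
```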

11. Restart the SharePoint services:

Get-Service IISAdmin, W3SVC | Start-Service
Get-Service -DisplayName SharePoint* | Start-Service

12. Open the SharePoint Management Shell and run the following command to update all of the databases to the new SQL instance.

Get-SPDatabase | ForEach {$_.ChangeDatabaseInstance("IN-SHAREPOINTDB")}

13. As a personal belt-and-braces step, I then restarted the SharePoint server.

14. On the SharePoint server, open cliconfg.exe, go to the Alias tab and remove the alias that you created.


Everything should now be working using the new server name for the SQL Server.

