
MDT Tutorial Part 11: Troubleshooting Part 2: Windows could not parse or process unattend answer file [C:\windows\Panther\unattend.xml] for pass [specialize].  The answer file is invalid.

Living Table of Contents

 

What These Guides Are:
A guide to help give you some insight into the troubleshooting process in general.

What These Guides Are Not:
A guide to fix all issues you’re going to encounter.

We’re going to role-play a bunch of scenarios and try to work through them.  Remember in math where you had to show your work?  Well, what follows is like that, which is why this post is [more than] a [little] lengthy.

Windows could not parse or process unattend answer file [C:\windows\Panther\unattend.xml] for pass [specialize].  The answer file is invalid.

You boot your special VM, click the ‘Run the Deployment Wizard to install a new Operating System’ button and it immediately starts.  Excellent!  It applies the OS, reboots and you’re faced with this error:

Title: Windows Setup Body: Windows could not parse or process unattend answer file [C:\windows\Panther\unattend.xml] for pass [specialize].  The answer file is invalid.


Well this is strange, because you didn’t touch the unattend.xml so what gives?
Fortunately, this dialog provides some meaningful insight:

    • The unattend file is C:\Windows\Panther\unattend.xml
    • The specific area is the specialize pass

Press SHIFT+F10 here to open a command prompt, then open C:\Windows\Panther\unattend.xml with Notepad:

Troubleshoot-005

You search for ‘specialize’ and, after taking a very close look, see that your computer name is incorrect.  It should be a two- or three-character prefix, not %OfficeCode%.

Troubleshoot-006
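For context, the offending portion of the specialize pass looks something like the sketch below.  The element and component names are standard Unattend schema; the surrounding attributes are typical boilerplate, and the literal %OfficeCode% value is the tell-tale sign that MDT never substituted the variable:

```xml
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <!-- %OfficeCode% was never replaced with a real value, making the name invalid -->
    <ComputerName>%OfficeCode%</ComputerName>
  </component>
</settings>
```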

Since that is set via the CS.INI, you run the CustomSettings.ini test again and now you see what was missed before:

Troubleshoot-007.PNG
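As a reminder, the CS.INI test here means re-running the MDT gather step from the deployment share’s Scripts directory, then reviewing ZTIGather.log:

```powershell
# Re-run the gather process with debug output, then check ZTIGather.log for the result
cscript.exe ZTIGather.wsf /debug:true
```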

You review the CS.INI and find your problems:

  1. You didn’t define the OfficeCode property: it was never added to the Properties line.
  2. You didn’t set a value for OfficeCode.
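The corrected CS.INI covers both points; a minimal sketch (the OfficeCode value, the computer-name pattern and the section layout are illustrative — your rules will differ):

```ini
[Settings]
Priority=Default
Properties=OfficeCode

[Default]
; Declared above in Properties, and given a value here
OfficeCode=NYC
OSDComputerName=%OfficeCode%-PC01
```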

With that fixed, you run the test again, the variable is populated and as you reimage the machine, you see it is named correctly in the logs.

Copypasta Closing

Hopefully these examples will help give you an idea of the overall troubleshooting process.  Most of the time the problems you’ll encounter will be caused by a typo, order of operations or a ‘known issue’ that requires a specific process to be followed.

As you make changes to your environment, here’s what I recommend:

  • Be diligent about keeping a change log so you can easily backtrack
  • Backup your CS.INI or Bootstrap.ini before you make any changes
  • Backup your ts.xml or unattend.xml (in DeploymentShare\Control\TaskSequenceID) before you make any changes
  • Introduce small changes one at a time with set checkpoints in between, and set milestone markers where you back up core files (e.g. CS.INI, Bootstrap.ini, ts.xml, unattend.xml) to help minimize frustration while troubleshooting.
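The backup bullets above lend themselves to a tiny script; here’s a minimal sketch in PowerShell, assuming a deployment share at a placeholder path and a task sequence ID of TS001 (both hypothetical — adjust for your environment):

```powershell
# Timestamped backups of the core MDT files (path and task sequence ID are placeholders)
$DeployShare = 'D:\DeploymentShare'
$TaskSequenceID = 'TS001'
$Stamp = Get-Date -Format 'yyyyMMdd-HHmmss'

$CoreFiles = @("$DeployShare\Control\CustomSettings.ini",
               "$DeployShare\Control\Bootstrap.ini",
               "$DeployShare\Control\$TaskSequenceID\ts.xml",
               "$DeployShare\Control\$TaskSequenceID\unattend.xml")

foreach($File in $CoreFiles)
    {
        if(Test-Path -Path $File) { Copy-Item -Path $File -Destination "$File.$Stamp.bak" }
        else { Write-Host "Skipping missing file [$File]" }
    }
```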

And when you do run into some turbulence, upload relevant logs (at least smsts.log, but be prepared to submit others depending on the issue) to a file sharing service like OneDrive, post on TechNet, then give a shout to your resources on Twitter.

Good Providence to you!


Synology Cloud Station Client Can’t Handle Read-Only Files

Script Updated 2017-03-01

I picked up a Synology NAS (DS411+II) years ago and it’s still alive and kicking today, mostly due to the rock-solid hardware and the amazing improvements in DSM.

I’ve been slowly working on weaning myself off services like DropBox, OneDrive and Google Drive to rely fully on the NAS, but it certainly hasn’t been without its struggles.

I set up Cloud Station on the NAS and installed the Cloud Station Client (CSC) on my home machines to sync key directories to the NAS.  Initially things worked well, but as I added more data – like 300GB of it – I noticed things were not synchronizing correctly.  Upon further investigation, the CSC was having trouble processing read-only files.

So here’s my operation:

  1. I work on system A, generating files in a synchronized directory
  2. The CSC syncs the data up to the NAS and all is well.
  3. I jump to system B and set up CSC, which syncs the data down from the NAS

Both systems now have identical sets of data.

  1. I modify a handful of read-only files on system B which gets synced up to the NAS
  2. I switch to system A and the read-only files are now updated with copies from the NAS.
  3. I make changes to the same read-only files on system A which get synced up to the NAS
  4. I switch back to system B and CSC is up in arms because “FILENAME cannot be synced due to access permission denied, or it is in use.”

In this state, a file sync queue builds because CSC is hung up on the handful of files that it cannot overwrite locally because they’re read-only.  I thought it was going to fix itself so I left it in this state for a while until I realized changes I made were not reaching the other system.  That’s when I noticed the file sync queue was something like 90k and CSC needed some help.
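If you want to gauge whether you’re in the same boat, a one-off count of read-only files under a synced folder is a decent smoke test (the path is a placeholder for your Cloud Station directory):

```powershell
# Count read-only files under the synced directory (path is a placeholder)
$SyncRoot = 'D:\CloudStation'
@(Get-ChildItem -Path $SyncRoot -Recurse -File -Force |
    Where-Object { $_.IsReadOnly }).Count
```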

The other cloud sync services don’t fall prey to this issue, so it’s a little strange that the CSC isn’t able to handle it.  I took to the Synology forums and not only found I wasn’t the only one experiencing this problem, but that someone wrote a small script to address this issue.  Score!

I ran the script, which mostly worked, but I ran into some odd problems.  Since I had nothing better to do, I expanded on it, adding some checks & balances and visual feedback, and I tried to add some error handling.

It may not be perfect, but it resolved the issues I was running into and provides meaningful output.
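If you save it as, say, Fix-CSCReadOnlyFiles.ps1 (the filename is my invention — use whatever you like), the defaults should just work, and the timeouts can be tuned per the parameter comments:

```powershell
# Run with the defaults
.\Fix-CSCReadOnlyFiles.ps1

# Or tighten the restart delay and turn on debug output
.\Fix-CSCReadOnlyFiles.ps1 -RestartFixTimeOut 15 -DebugEnabled $true
```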


[CmdletBinding()]
Param
    (
        # How long we want to give the client to catch up
        # Note, this is a calculated timeout based on this + the number of files it found that needed fixing
        # In my testing, this worked better than a fixed number that was either
        #     insanely high for a low number of files
        #     too low for a high number of files
        [Parameter(Mandatory=$false)]
            [int]$CloudStationClientCatchUpTimeOut = 60,

        # How long we wait before restarting the loop
        [Parameter(Mandatory=$false)]
            [int]$RestartFixTimeOut = 30,

        # Creation of this file will allow you to:
        #    safely stop the script, and
        #    un-fix (read: re-apply the read-only attribute) on files it already processed
        [Parameter(Mandatory=$false)]
            [string]$StopMonitoringFlagFile = "$env:LOCALAPPDATA\CloudStation\log\StopMonitoring.txt",

        # This is the path to the CloudStation log file; you shouldn't have to change this.
        [Parameter(Mandatory=$false)]
            [string]$CSDaemonLog = $env:LOCALAPPDATA + '\CloudStation\log\daemon.log',

        # Use this for debugging purposes only.
        [Parameter(Mandatory=$false)]
            [bool]$DebugEnabled = $false
    )

###############################################################
###          DON'T CHANGE ANYTHING BELOW THIS LINE          ###
###############################################################

##*=============================================
##* VARIABLE DECLARATION
##*=============================================
#region VariableDeclaration

# Master Counter - Number of times the script ran
[int]$Global:Loop = 0

# Counter for number of files fixed
[int]$TotalNumberOfFilesFixed = 0

# Global Variable for Debug Mode
[bool]$Global:DebugEnabled = $DebugEnabled
if($Global:DebugEnabled -eq $true) { write-host "$(Get-Date -Format s) - [$Global:Loop] :: DEBUG :: DEBUG MODE HAS BEEN ENABLED [$Global:DebugEnabled]" }

#endregion VariableDeclaration
##*=============================================
##* END VARIABLE DECLARATION
##*=============================================

##*=============================================
##* FUNCTION LISTINGS
##*=============================================
#region FunctionListings

#region Function Pause-Script
Function Pause-Script
    {
        Param
            (
                [Parameter(Mandatory=$false)]
                    [string]$MSG,

                [Parameter(Mandatory=$false)]
                    [string]$Title
            )

        if([string]::IsNullOrEmpty($MSG)) { $MSG = 'Script Paused.  Press any key to continue.' }
        if([string]::IsNullOrEmpty($Title)) { $Title = 'Script Paused.  Press any key to continue.' }

        if($host.Name -notmatch 'ISE')
            {
                write-host `r`n`r`n$MSG
                $HOST.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown") | Out-Null
                $HOST.UI.RawUI.Flushinputbuffer()
            }
        Else
            {
                [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
                [System.Windows.Forms.MessageBox]::Show($MSG,$Title,[System.Windows.Forms.MessageBoxButtons]::OK) | Out-Null
            }
    }
#endregion #region Function Pause-Script

#region Function Toggle-ReadOnly
Function Toggle-ReadOnly
    {
        [CmdletBinding()]
        Param
            (
	            [Parameter(Mandatory=$true)]
                [ValidateScript({If(Test-Path $_){$true}else{Throw "Invalid path given: [$_]"}})]
                    [string[]]$File,

                [Parameter(Mandatory=$false)]
                    [switch]$MakeReadOnly
            )

        Begin {  }

        Process
            {
                Foreach($Item in $File)
                    {
                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Item [$Item]" }
                        [string]$ItemFixed = $Item.ToString().Replace('"','')
                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: ItemFixed [$ItemFixed]" }

                        If(Test-Path -Path $ItemFixed)
                            {
                                Try { (Get-ChildItem -Path "$ItemFixed" -Force -ErrorAction Stop).IsReadOnly = $MakeReadOnly; $Return = $true } Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR TOGGLING READONLY ATTRIBUTE ON [$ItemFixed]: [$_]"; $Return = $_ }
                            }
                        Else { write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR DID NOT FIND [$ItemFixed]" }
                    }
            }

        End { return $Return }
    }
#endregion Function Toggle-ReadOnly

#endregion FunctionListings
##*=============================================
##* END FUNCTION LISTINGS
##*=============================================

##*=============================================
##* SCRIPT BODY
##*=============================================
#region ScriptBody

Do
    {
        # Increment Counter
        $Global:Loop++

        # Reset these at the start of each loop
        # This is the abort flag
        [bool]$Abort = $false

        # This is the error flag
        [bool]$CriticalErrorEncountered = $false

        # This is the number of files we need to fix this time around
        [int]$CountOfFilesToFix = 0

        # This is the array that holds the files we've successfully processed
        [string]$arrProcessed = $null;[System.Collections.ArrayList]$arrProcessed = @()

        if(Test-Path -Path $StopMonitoringFlagFile -PathType Leaf) { [bool]$Abort = $true }

        if(!(Test-Path $CSDaemonLog -PathType Leaf)) { write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR MISSING CLOUD STATION DAEMON LOG [$CSDaemonLog]"; $CriticalErrorEncountered = $true; continue }

        Try
            {
                write-host "$(get-date -Format s) - [$Global:Loop] :: Ingesting log [$CSDaemonLog]"
                $CSDaemonLogFullContent = $null;$CSDaemonLogFullContent = Get-Content $CSDaemonLog -ErrorAction Stop

                ##############################################################################
                #                             SUPER DEBUG                                    #
                # Only enable this if you're seeing //really// strange or unexpected results #
                <#
                if($Global:DebugEnabled -eq $true)
                    {
                        write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogFullContent BEGIN>>>>"
                        write-host $CSDaemonLogFullContent
                        write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogFullContent END<<<<"
                    }
                #>
                ##############################################################################

                Try
                    {
                        write-host "$(get-date -Format s) - [$Global:Loop] :: Checking ingested log content for errors"
                        # Only grab lines that contain '[Error]' since those are the lines we want to focus on.
                        $CSDaemonLogContent = $null; [System.Collections.ArrayList]$CSDaemonLogContent = {$CSDaemonLogFullContent | ? { $_ -match [regex]::Escape('[Error]') }}.Invoke()

                        ##############################################################################
                        #                             SUPER DEBUG                                    #
                        # Only enable this if you're seeing //really// strange or unexpected results #
                        <#
                        if($Global:DebugEnabled -eq $true)
                            {
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Number of items in `$CSDaemonLogContent BEFORE refinement [$($CSDaemonLogContent.Count)]"
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent BEGIN>>>>"
                                for($i=0; $i -lt $CSDaemonLogContent.Count;$i++) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent[$i] $($CSDaemonLogContent[$i])" }
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent END<<<<"
                            }
                        #>
                        ##############################################################################

                        # However, there are certain types of errors we can likely ignore since they don't specifically speak to the read-only issue this script aims to resolve.
                        $arrErrorsToIgnore = @('*channel error while connecting to server*',
                                               '*Failed to remove local signature*',
                                               '*Failed to write magic*',
                                               '*Failed to send protocol header.*',
                                               '*Failed to send protocol.*'
                                               )
                                               <#
                                               ,
                                               'Failed to prepare file block for',
                                               'Failed to read file for'
                                               #>

                        # Loop through the errors to ignore and remove them from the array if they match
                        # At the same time we'll eliminate any lines that don't look like they contain a real drive letter (*:\*)
                        foreach($Line in @($CSDaemonLogContent))
                            {
                                [bool]$IgnoreLine = $false
                                foreach($ErrorToIgnore in $arrErrorsToIgnore)
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: [$i][$ErrorToIgnore][$Line]" }
                                        ##############################################################################

                                        if(($Line -like $ErrorToIgnore) -eq $true)
                                            {
                                                $IgnoreLine = $true
                                                ##############################################################################
                                                #                             SUPER DEBUG                                    #
                                                # Only enable this if you're seeing //really// strange or unexpected results #
                                                #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Ignoring [$IgnoreLine] because [$ErrorToIgnore] was found in: [$Line] " }
                                                ##############################################################################
                                            }
                                        else
                                            {
                                                ##############################################################################
                                                #                             SUPER DEBUG                                    #
                                                # Only enable this if you're seeing //really// strange or unexpected results #
                                                #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Proceeding normally because [$ErrorToIgnore] was not found in: [$Line]" }
                                                ##############################################################################
                                            }
                                    }

                                # No need to check if we're already ignoring the line
                                if($IgnoreLine -ne $true)
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Performing drive check (*:\*) because IgnoreLine is not true [$IgnoreLine]" }
                                        ##############################################################################

                                        # Check to ensure the line contains :\ which is more than likely a real path, otherwise we can skip it
                                        if($Line -notlike '*:\*')
                                            {
                                                $IgnoreLine = $true
                                                ##############################################################################
                                                #                             SUPER DEBUG                                    #
                                                # Only enable this if you're seeing //really// strange or unexpected results #
                                                #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Ignoring [$IgnoreLine] because it doesn't appear to correspond to a local path: [$Line] " }
                                                ##############################################################################
                                            }
                                        else
                                            {
                                                ##############################################################################
                                                #                             SUPER DEBUG                                    #
                                                # Only enable this if you're seeing //really// strange or unexpected results #
                                                #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Proceeding normally because it appears to correspond to a local path: [$Line]" }
                                                ##############################################################################

                                                # Extract the quoted path from the line; if there is no quoted token, or it has no slashes, it's not a file path so we can ignore it
                                                if(($Line -notmatch ".+'(.*?.+?)'") -or (-not $Matches[1].Contains('/')))
                                                    {
                                                        $IgnoreLine = $true
                                                        ##############################################################################
                                                        #                             SUPER DEBUG                                    #
                                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Ignoring [$IgnoreLine] because it doesn't contain a '/': [$Line]" }
                                                        ##############################################################################
                                                    }
                                                else
                                                    {
                                                        ##############################################################################
                                                        #                             SUPER DEBUG                                    #
                                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Proceeding normally because it does contain a '/': [$Line]" }
                                                        ##############################################################################
                                                    }
                                            }
                                    }
                                else
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Skipping Drive (*:\*) check because IgnoreLine is already true [$IgnoreLine]" }
                                        ##############################################################################
                                    }

                                if($IgnoreLine -eq $true)
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: IgnoreLine is true [$IgnoreLine]: Removing from `$CSDaemonLogContent [$Line]" }
                                        ##############################################################################

                                        # Remove that line/log entry from the array since we deemed it wasn't valid.
                                        $CSDaemonLogContent.Remove($Line) | Out-Null
                                    }
                                else
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: IgnoreLine is false [$IgnoreLine]: Leaving `$CSDaemonLogContent [$Line]" }
                                        ##############################################################################
                                    }
                            }

                        ##############################################################################
                        #                             SUPER DEBUG                                    #
                        # Only enable this if you're seeing //really// strange or unexpected results #
                        <#
                        if($Global:DebugEnabled -eq $true)
                            {
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Number of items in `$CSDaemonLogContent AFTER refinement [$($CSDaemonLogContent.Count)]"
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent BEGIN>>>>"
                                for($i=0; $i -lt $CSDaemonLogContent.Count;$i++) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent[$i] $($CSDaemonLogContent[$i])" }
                                write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CSDaemonLogContent END<<<<"
                            }
                        #>
                        ##############################################################################

                        # Here we check if the array contains content (read: errors) and if not, we pause then start the loop again.
                        if(([string]::Isnullorempty($CSDaemonLogContent)) -or ($CSDaemonLogContent.Count -eq 0))
                            {
                                write-host "$(get-date -Format s) - [$Global:Loop] :: Congratulations - No file errors found!"
                                if(($CriticalErrorEncountered -ne $true) -and ($Abort -ne $true) -and  (!(Test-Path -Path $StopMonitoringFlagFile -PathType Leaf)))
                                    {
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: Restarting in [$RestartFixTimeOut] seconds.`r`n"
                                        Start-Sleep -Seconds $RestartFixTimeOut
                                    }
                                continue
                            }

                        # Otherwise there //are// errors so we need to dive into that further.
                        write-host "$(get-date -Format s) - [$Global:Loop] :: `tWARNING: Found [$($CSDaemonLogContent.Count)] error(s) in the log [$CSDaemonLog]`r`n"

                        ##############################################################################
                        #                             SUPER DEBUG                                    #
                        # Only enable this if you're seeing //really// strange or unexpected results #
                        #if($Global:DebugEnabled -eq $true) { for($i=0; $i -ne $CSDaemonLogContent.Count;$i++) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Error $($i+1) of $($CSDaemonLogContent.Count): $($CSDaemonLogContent[$i])" } }
                        ##############################################################################

                        for($i=0; $i -lt $CSDaemonLogContent.Count;$i++)
                            {
                                [bool]$IgnoreError = $false

                                ##############################################################################
                                #                             SUPER DEBUG                                    #
                                # Only enable this if you're seeing //really// strange or unexpected results #
                                #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: `$CSDaemonLogContent[$i] [$($CSDaemonLogContent[$i])]" }
                                ##############################################################################

                                # This is a regex that aims to extract just the file name bits from the 'error' line entry.
                                # The regex looks for what ever is in between the single quotes '' and extracts just that
                                # Example
                                #     Mar 01 11:15:48 [ERROR] upload-local-handler.cpp(624): Failed to read file for 'path:\some/nested/directory/fi.le', try it again 0 潎攠牲牯
                                #     Mar 01 11:15:48 [ERROR] upload-local-handler.cpp(435): Failed to prepare file block for 'path:\some/nested/directory/fi.le'. System error.
                                # Leaving us with just: 'path:\some/nested/directory/fi.le'
                                $CSDaemonLogContent[$i] -match ".+'(.*?.+?)'" | Out-Null

                                # Matches[0] is the entire line
                                # Matches[1] is regex result

                                # Checks to see if $Matches[1] is null or empty; hopefully shouldn't happen at this point
                                if([string]::IsNullOrEmpty($Matches[1]) -eq $true)
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: WARNING Matches[1] is null or empty [$($Matches[1])]" }
                                        ##############################################################################
                                        continue
                                    }
                                else
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Matches[1] is not null or empty [$($Matches[1])]" }
                                        ##############################################################################
                                    }

                                # If it does not have slashes, it's not a file path so we can ignore it
                                if(-not $Matches[1].Contains('/'))
                                    {
                                        ##############################################################################
                                        #                             SUPER DEBUG                                    #
                                        # Only enable this if you're seeing //really// strange or unexpected results #
                                        #if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Matches[1] [$($Matches[1])] DOES NOT contain a '/' [$($Matches[1].Contains('/'))]" }
                                        ##############################################################################
                                        continue
                                    }

                                # Unix to Windows path fix
                                $FilePathFixed = $null; [string]$FilePathFixed = $Matches[1].Replace('/','\')
                                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: FilePathFixed: [$FilePathFixed]" }

                                # If $Matches[1] contains something and it's not already in the array $arrProcessed, we need to check it out
                                if(([string]::IsNullOrEmpty($Matches[1]) -eq $false) -and ($arrProcessed.Contains($FilePathFixed) -eq $false))
                                    {
                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Matches[1] is not null or empty [$($Matches[1])] AND `$arrProcessed does not contain [$FilePathFixed] [$($arrProcessed.Contains($FilePathFixed))]" }

                                        write-host "$(get-date -Format s) - [$Global:Loop] :: The Synology Cloud Station Client had trouble processing file [$FilePathFixed]"

                                        if(Test-Path -Path "$FilePathFixed")
                                            {
                                                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Found FilePathFixed [$FilePathFixed]" }

                                                Try
                                                    {
                                                        # Determine if the file is read-only or not
                                                        [bool]$IsReadOnly = (Get-ChildItem -Path "$FilePathFixed" -Force -ErrorAction Stop ).IsReadOnly
                                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: IsReadOnly is [$IsReadOnly]" }

                                                        If ($IsReadOnly -eq $true)
                                                            {
                                                                write-host "$(get-date -Format s) - [$Global:Loop] :: REASON: Possibly because the file is read-only; Removing read-only attribute..."
                                                                $DisableReadOnly = Toggle-ReadOnly -File "$FilePathFixed"
                                                                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: DisableReadOnly is [$DisableReadOnly]" }

                                                                if($DisableReadOnly -eq $true)
                                                                    {
                                                                        write-host "$(get-date -Format s) - [$Global:Loop] :: Successfully Toggled Read-Only Attribute on [$FilePathFixed]"
                                                                        Try { $arrProcessed.Add($FilePathFixed) | Out-Null; [int]$CountOfFilesToFix++ | Out-Null; [int]$TotalNumberOfFilesFixed++ | Out-Null } Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR ADDING TO COLLECTION [$FilePathFixed]: $_" }
                                                                    }
                                                                else { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR REMOVING READ-ONLY ATTRIBUTE ON FILE [$FilePathFixed]: [$DisableReadOnly]" }
                                                            }
                                                        else { write-host "$(get-date -Format s) - [$Global:Loop] :: `tWARNING: Unsure why the Synology Cloud Station Client had trouble with file [$FilePathFixed] since it's not read-only [$IsReadOnly]." }
                                                    }
                                                Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR CHECKING READ-ONLY ATTRIBUTES ON FILE [$FilePathFixed]: $_" }
                                            }
                                        else { write-host "$(get-date -Format s) - [$Global:Loop] :: REASON: Possibly because the file is missing [$FilePathFixed]" }
                                    }
                                Elseif(([string]::IsNullOrEmpty($Matches[1]) -eq $false) -and ($arrProcessed.Contains($FilePathFixed) -eq $true)) { write-host "$(get-date -Format s) - [$Global:Loop] :: Skipping Processed File [$FilePathFixed]"  }
                                else
                                    {
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: SOMETHING UNEXPECTED HAPPENED:"
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: `$Matches[1] [$($Matches[1])]"
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: `$arrProcessed.Contains($($Matches[1])) [$($arrProcessed.Contains($Matches[1]))]"
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: `$FilePathFixed [$FilePathFixed]"
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: `$arrProcessed.Contains($FilePathFixed) [$($arrProcessed.Contains($FilePathFixed))]"
                                        write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR: DUE TO THE UNEXPECTED NATURE OF THE ERROR ABOVE WE ARE BREAKING OUT OF THE FIXING LOOP"
                                        $CriticalErrorEncountered = $true
                                    }

                                # Check for the flag file that will stop the script
                                if(Test-Path -Path $StopMonitoringFlagFile -PathType Leaf)
                                    {
                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: WARNING Stop monitoring flag file found [$StopMonitoringFlagFile]; BREAKING OUT OF THE FIXING LOOP" }
                                        [bool]$Abort = $true
                                    }
                                else { if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Stop monitoring flag file not found [$StopMonitoringFlagFile]; Continuing execution" } }

                                if(($CriticalErrorEncountered -eq $true) -or ($Abort -eq $true))
                                    {
                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: ERROR EITHER CRITICALERRORENCOUNTERED IS TRUE [$CriticalErrorEncountered] OR ABORT IS TRUE [$Abort]; BREAKING OUT OF THE FIXING LOOP" }
                                        break
                                    }
                                else
                                    {
                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Both CriticalErrorEncountered [$CriticalErrorEncountered] and Abort [$Abort] are false; Continuing execution" }
                                        Write-Host
                                    }
                            }

                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: CountOfFilesToFix is [$CountOfFilesToFix]" }

                        if ($CountOfFilesToFix -gt 0)
                            {
                                write-host "$(get-date -Format s) - [$Global:Loop] :: Finished processing [$CountOfFilesToFix] file(s)."
                                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: `$CountOfFilesToFix.ToString().Length is [$([string]$CountOfFilesToFix.ToString().Length)]" }

                                # I find that the more files we fix, the longer we have to wait for the Cloud Station Client to catch up.
                                # So this is some simple logic based on my [limited] testing with a little over 125000 files to figure out how long we should wait.
                                switch([string]$CountOfFilesToFix.ToString().Length)
                                    {
                                        { ($_ -eq 1) } { $CalculatedTimeOut = $CloudStationClientCatchUpTimeOut }
                                        { ($_ -eq 2) } { $CalculatedTimeOut = $CloudStationClientCatchUpTimeOut + ([math]::Round($CountOfFilesToFix/10)*10) }
                                        { ($_ -eq 3) } { $CalculatedTimeOut = $CloudStationClientCatchUpTimeOut + ([math]::Round($CountOfFilesToFix/100)*10) }
                                        { ($_ -ge 4) } { $CalculatedTimeOut = $CloudStationClientCatchUpTimeOut + ([math]::Round($CountOfFilesToFix/1000)*10) }
                                    }
                                write-host "$(get-date -Format s) - [$Global:Loop] :: Waiting [$CalculatedTimeOut] seconds for the Cloud Station Client to resync the [$CountOfFilesToFix] file(s) we fixed."
                                Start-Sleep -Seconds $CalculatedTimeOut

                                write-host "$(get-date -Format s) - [$Global:Loop] :: Unfixing read-only attributes on [$CountOfFilesToFix] changed file(s)."
                                foreach($ProcessedFile in @($arrProcessed))
                                    {
                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: ProcessedFile is [$ProcessedFile]" }
                                        if(Test-Path -Path "$ProcessedFile")
                                            {
                                                Try
                                                    {
                                                        [bool]$IsReadOnly = (Get-ChildItem -Path "$ProcessedFile" -Force -ErrorAction Stop ).isreadonly
                                                        if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: IsReadOnly is [$IsReadOnly]" }

                                                        If ($IsReadOnly -eq $false)
                                                            {
                                                                write-host "$(get-date -Format s) - [$Global:Loop] :: Re-adding read-only attribute on file [$ProcessedFile]..."
                                                                $EnableReadOnly = Toggle-ReadOnly -File $ProcessedFile -MakeReadOnly
                                                                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: EnableReadOnly is [$EnableReadOnly]" }

                                                                if($EnableReadOnly -eq $true)
                                                                    {
                                                                        write-host "$(get-date -Format s) - [$Global:Loop] :: Successfully re-set read-only attribute on file [$ProcessedFile]"
                                                                        Try { $arrProcessed.Remove($ProcessedFile) | Out-Null } Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: ERROR REMOVING FILE [$ProcessedFile] FROM COLLECTION: $_" }
                                                                    }
                                                                else { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR TOGGLING READ-ONLY ATTRIBUTE ON FILE [$ProcessedFile]: [$EnableReadOnly]" }
                                                            }
                                                        else { write-host "$(get-date -Format s) - [$Global:Loop] :: `tWARNING File [$ProcessedFile] is already set to read-only [$IsReadOnly]; Skipping..." }
                                                    }
                                                Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR CHECKING READ-ONLY ATTRIBUTES ON FILE [$ProcessedFile]: $_" }
                                            }
                                        else { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR FILE MISSING: [$ProcessedFile]" }

                                        write-host
                                    }
                            }
                        else { write-host "$(get-date -Format s) - [$Global:Loop] :: Congratulations - No [$CountOfFilesToFix] files needed processing!"}
                    }
                Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR AN UNKNOWN INNER ERROR OCCURRED: $_"; $CriticalErrorEncountered = $true }
            }
        Catch { write-host "$(get-date -Format s) - [$Global:Loop] :: `tERROR GETTING CONTENT OF [$CSDaemonLog]: $_"; $CriticalErrorEncountered = $true }

        if(Test-Path -Path $StopMonitoringFlagFile -PathType Leaf)
            {
                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: WARNING Stop monitoring flag file found [$StopMonitoringFlagFile]" }
                [bool]$Abort = $true
            }
        Else { if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: Stop monitoring flag file not found [$StopMonitoringFlagFile]; Continuing execution" } }

        if(($CriticalErrorEncountered -ne $true) -and ($Abort -ne $true))
            {
                write-host "$(get-date -Format s) - [$Global:Loop] :: Loop Completed - waiting [$RestartFixTimeOut] seconds before restarting.`r`n"
                Start-Sleep -Seconds $RestartFixTimeOut
            }
        else
            {
                if($Global:DebugEnabled -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: DEBUG :: ERROR EITHER CRITICALERRORENCOUNTERED IS TRUE [$CriticalErrorEncountered] OR ABORT IS TRUE [$Abort]; BREAKING OUT OF THE MAIN LOOP" }
                if($Abort -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: WARNING Process aborted due to presence of flag file [$StopMonitoringFlagFile]" }
                if($CriticalErrorEncountered -eq $true) { write-host "$(get-date -Format s) - [$Global:Loop] :: CRITICAL ERROR ENCOUNTERED; SCRIPT QUITTING!" }
            }
    }
Until(($Abort -eq $true ) -or ($CriticalErrorEncountered -eq $true))

write-host "$(get-date -Format s) - [$Global:Loop] :: Total number of files fixed this session: [$TotalNumberOfFilesFixed]`r`n"

Pause-Script

#endregion ScriptBody
##*=============================================
##* END SCRIPT BODY
##*=============================================

Running fsck/e2fsck on Synology NAS

I opened a ticket with Synology Support because some ‘Hyper Backup’ data backup tasks were failing.  The tech said there were errors on the volume, which is odd because it hadn’t been powered off incorrectly and I had created it maybe 2-3 weeks prior.  The recommendation: move the data off the volume, delete the volume and recreate it.
No thanks.

It really irks me when they don’t even bother to troubleshoot.

Synology doesn’t provide any command line support, so you’re really at the mercy of the tech you get and the community forum.  And since the box lacks lsof and fuser (unless you install ipkg, which I’m not interested in doing just yet), tracking down which process is tying up the volume is a real pain.
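For what it’s worth, you can approximate fuser with nothing but /proc: each process exposes its working directory and open files as symlinks there.  A rough sketch, assuming a stock Linux /proc (find_users is my own throwaway helper, not a Synology tool):

```shell
# A rough stand-in for `fuser -m`: every /proc/<pid>/cwd and
# /proc/<pid>/fd/* entry is a symlink into the filesystem, so listing
# them and grepping for the volume path approximates an open-file check.
find_users() {
    for pid in /proc/[0-9]*; do
        if ls -l "$pid/cwd" "$pid"/fd/* 2>/dev/null | grep -q "$1"; then
            echo "${pid##*/}"
        fi
    done
}

# e.g. print the PIDs of processes with something open under /volume1
find_users /volume1
```

You'll only see processes you have permission to inspect, so run it as root to get the full picture.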

Enter syno_poweroff_task.

TL;DR

This process worked well for me on DSM 6 Beta 2 but your mileage may vary.

I recommend you run e2fsck in read-only mode first before attempting to fix anything.

 

# reboot the NAS
# when it comes up, ssh into it
# login & su to root
# run the following and sit tight; reconnect if you're disconnected
syno_poweroff_task -d

# activate the volume
vgchange -ay

# play it safe - replace with the correct logical volume
e2fsck -fnvtt /dev/vg1/volume_1

# run e2fsck for real
e2fsck -Dfttvy -C 0 /dev/vg1/volume_1

# go order a pizza and watch a movie
# run the command again to be sure everything is clean
# reboot and all should be fine
# if you're having //serious// issues, you can try the following
# this is more of an overnight + work day + weekend task.
e2fsck -ccDfttvy -C 0 /dev/vg1/volume_1

Running e2fsck cleared the volume errors but the issues I was having continued.  I ended up recreating the data backup task in Hyper-Backup and it worked.

How Did I Get Here?

During my 2014 Thanksgiving I had a massive data loss scare that brought to light my lack of an adequate backup solution for my – well – backup.  I was able to recover and I’ve since implemented a bare bones backup strategy (rsync to a local NAS) and I’m working up to storing on Azure.

After upgrading my DS411+ii to DSM 6 Beta 2, I’ve been kicking the tires on Hyper Backup, Synology’s ‘Backup and Restore’ replacement, which so far impresses.  And by impresses I mean it seems to be working.  After running the tasks nightly for maybe a week or two, one task failed.  A little while later, another task failed; then another.  Clearly, something is wrong somewhere with the data I’m trying to back up, and because logs are slim pickins, it’s impossible for me to make any real progress.

This is why I opened the ticket with Synology to hopefully have them troubleshoot further but they darn near shotgunned it.

 

After fixing some minor issues, the tasks still failed so I resorted to removing the task, deleting the existing data and starting over.  This worked, so far, but I’m concerned there may be a larger underlying issue here.  All drives were replaced within the last year with one failing 10 days before the 1 year mark.  Hopefully it’s something silly like a strangely named file – we’ll see.

Good Providence to you if you go down this route!

 

Backing Up a Synology NAS to a WD My Cloud NAS

The built-in Synology Backup & Replication feature has not been very reliable for me, prompting me to explore other options.  Since the underlying backup process was rsync based, I kept with the theme and rolled my own script; really convenient since I did something similar in ESXi years ago with the ‘poor-man’s VMotion’.

TL;DR

The script below does have a few requirements:

  • You need rsyncd running on the remote host
  • The poor man’s script will need:
    • The volume where your data resides (probably /volume1)
    • The rsync user you setup on the remote host
    • The remote host be it via IP or hostname
    • The remote path to store your data
    • Your backup jobs at the bottom in the form of:
      StartBackup ‘sharename’
  • Please note a dry-run is enabled by default; when you’re ready to execute for real, delete the -n at the end of the rsync command line.

Setup rsync on WD MyCloud

The steps below require a combination of access to the WD MyCloud web portal and ssh access to the box.  For setting up ssh I’ll refer you to the Internets (aka search for it) and WD’s own documentation.

Two big things I want to point out if you don’t already know this.

  1. You could seriously break something so be careful.  I recommend starting with a clean slate on your WD, or at least backing up whatever data you have on there someplace else in the event you break something.
  2. Linux/Unix is a case SeNsItIve world.  Everything from file names, to user names, to directory names and commands is case sensitive.  So where possible, err on the side of caution and keep everything in lowercase to be sure.
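Point 2 is easy to see in a throwaway directory on any Linux box:

```shell
# Two names differing only in case are two distinct files
# on a case-sensitive filesystem
tmp=$(mktemp -d)
touch "$tmp/Backup" "$tmp/backup"
ls "$tmp" | wc -l    # 2
```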

Create a New Share via the WD MyCloud Portal

Login to your WD MyCloud and create a new share (e.g.: synologynas) specifically for this purpose.

Create an rsyncd.secrets file in /etc

Keep it simple, like this rsyncd.secrets in /etc:


synologynas:super sekret 1337 passw0rd!
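One note from the rsync daemon's documentation: with the default `strict modes` setting, rsyncd rejects a secrets file that group/other can read, so restrict it to the owner.  Sketched below on a scratch path (the real file being /etc/rsyncd.secrets):

```shell
# Scratch path for illustration; on the WD the file is /etc/rsyncd.secrets.
# The user name must match the "auth users" entry in rsyncd.conf.
f=$(mktemp)
printf '%s\n' 'synologynas:super sekret 1337 passw0rd!' > "$f"
chmod 600 "$f"    # rsyncd's "strict modes" requires owner-only access
ls -l "$f"
```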

Create an rsyncd.conf file in /etc


pid file = /var/run/rsyncd.pid
lock file = /var/run/rsync.lock
log file = /var/log/rsync.log

# This is the rsync module name you're going to access
[SynologyNAS Share for rsync]
comment = "My rsync share for Synology"
# This is the local path on the WD MyCloud
path = /shares/synologynas
# Or if you prefer
#path = /DataVolumes/shares
use chroot = true
uid = root
gid = share
read only = no
list = yes
# add the user you specified via the WD MyCloud portal
auth users = synologynas
# put the path to your secrets file here
secrets file = /etc/rsyncd.secrets
# OPTIONAL But Recommended: Put the IP of your Synology NAS here & uncomment (meaning remove the # in front of the line below)
#hosts allow = Synology.NAS.IP.ADDRESS

Enable rsync in /etc/default/rsync

Edit /etc/default/rsync and change RSYNC_ENABLE=false to RSYNC_ENABLE=true.
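That edit can be done in one line with sed; demonstrated here on a scratch copy rather than the live /etc/default/rsync:

```shell
# Demonstrated on a scratch copy; on the WD the file is /etc/default/rsync
f=$(mktemp)
echo 'RSYNC_ENABLE=false' > "$f"
sed -i 's/^RSYNC_ENABLE=false$/RSYNC_ENABLE=true/' "$f"
cat "$f"    # RSYNC_ENABLE=true
```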

Start rsync

You could reboot your WD MyCloud, but since you already have a shell open just start or restart the service:


# Check to see if the service is running

/etc/init.d/rsync status

# If not running, start it

/etc/init.d/rsync start

# If it is running, reload it

/etc/init.d/rsync reload

# or force-reload it but you shouldn't have to

/etc/init.d/rsync force-reload

# or just restart it

/etc/init.d/rsync restart

At this point, rsync is setup and ready to receive connections.

Bonus: Create a New User via the WD MyCloud Portal

This is more of an optional step.  I spend a lot of time in DSM so I decided to mount the remote CIFS folder I setup on the WD MyCloud so I could access it in DSM.  To do that, login to your WD MyCloud and create a new user specifically for this purpose and grant that user full rights to the share you setup above.  To keep things simple, you could use the same username and password you setup above in the secrets file.

The Script

I tested this script in DSM 5.2 through DSM 6 Beta 2 and an early build of DSM 6 when it went GA.

Keep in mind:

  • I’m no rsync or [ba|k]sh guru, but it works perfectly for my needs.
  • Minimal error checking in this version
  • You’re free to edit to suit your needs. (obviously)
  • The rsync options are what Synology uses by default for its rsync-based backup tasks; feel free to alter them as they’re not necessarily ideal.

#!/bin/sh

Volume=/volumeN
RsyncPasswordFile=/path/to/passwordFile
RemoteUser=synologynas
RemoteHost=127.0.0.1
RemotePath=Backups

LogDir=/var/log
LogFile=$LogDir/MyCustomBackup.log

echo "vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv" >> $LogFile
echo "Started Main Script" >> $LogFile

StartBackup() {
 if [ -z "$1" ]; then
      echo "ERROR 1 IS EMPTY [$1]" >> $LogFile
      return
 fi

 local SourceDir=$1
 echo "SourceDir is [$SourceDir]" >> $LogFile

 local DestinationDir=$SourceDir
 echo "DestinationDir is [$DestinationDir]" >> $LogFile

 if [ ! -d "$Volume/$SourceDir" ]; then
      echo "ERROR: SourceDir [$SourceDir] was NOT found" >> $LogFile
      return
 fi

 local JobLog=$LogDir/MyCustomBackup-$(echo $SourceDir | sed -e 's/\//_/g').log
 echo "Rsync log is [$JobLog]" >> $LogFile

 echo ">>>>>> Starting Backup Job for [$SourceDir] to [$DestinationDir]">> $LogFile

 # Remember - dry-run is enabled by default
 /usr/syno/bin/rsync --timeout=600 -rlt -p --chmod=ugo=rwx -H -W --password-file=$RsyncPasswordFile $Volume/$SourceDir $RemoteUser@$RemoteHost::$RemotePath/$DestinationDir --exclude=/*/#recycle/ -Phriivv --stats --log-file=$JobLog -n

 echo "Finished Backup Job for [$SourceDir] to [$DestinationDir] <<<<<<" >> $LogFile
}

StartBackup 'photos'
StartBackup 'music'
StartBackup 'video'

echo "Finished Main Script" >> $LogFile
echo "^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^" >> $LogFile
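For completeness, here's how I'd wire it up to run nightly (the /volume1/scripts path is just an example; on DSM you could also use Task Scheduler instead of a cron entry):

```shell
# Scratch stand-in for the real script (on the NAS, something like
# /volume1/scripts/mybackup.sh)
script=$(mktemp)
printf '%s\n' '#!/bin/sh' 'echo backup-ok' > "$script"
chmod +x "$script"
"$script"    # backup-ok

# A nightly 2am cron entry would then look like:
# 0 2 * * * /volume1/scripts/mybackup.sh
```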

I have customized this script further because I’m a crazy man, so if you’re interested in seeing that version, let me know and I’ll put up a link here.

Good Providence to you!

How Did I Get Here?

I have an aging, but still fast & reliable, Synology DS411+ii that we, and several members of our family, rely on.  I won’t list all the features, just know that it does a lot and is capable of much more.  As with most NAS, a built-in backup feature exists so you might be wondering why I’m not using that.

Synology does have a ‘Backup and Restore’ application package that can be used to backup data to a variety of sources:

  • Amazon S3
  • Microsoft Azure
  • CHT hicloud
  • SFR NAS BACKUP
  • Another Synology
  • An rsync-compatible server
  • Local storage
  • External storage
  • Network share

And that is where I began.  I created a new share on the WD My Cloud, added a user and gave them rights to said share, setup rsyncd on the WD My Cloud, then added the WD My Cloud as an ‘rsync-compatible server’ backup destination on the Synology as part of the backup task.

However, although my initial test backup completed successfully, after creating 6 or 7 other backup tasks (basically one for each share on the Synology) the results were hit or miss:

  • One or two data backup tasks seemed to always complete successfully
  • Most data backup tasks seemed to fail most of the time but succeeded on the second or third attempt
  • Some data backup tasks failed more often than they succeeded
  • One or two data backup tasks always failed; they never completed successfully once

I thought maybe I screwed something up so I deleted all the data backup tasks that were failing and re-created them one at a time, testing between each.  I initially had better success the second time around so I pressed on, but a few kept failing.  Thinking maybe there was something wrong with the existing backed up data on the WD My Cloud, I deleted everything and recreated it.  Still, a couple tasks wouldn’t work, and after a few days things that were successful were now failing.  Bonkers.

Since the WD My Cloud isn’t exactly a workhorse, I looked there next, thinking it was struggling.  Aside from the indexing services bogging down the system, which I nerfed:


for i in wdphotodbmergerd wdmcserverd
do
     for j in $( ls /etc/rc2.d/*$i ); do
          $j stop
     done
     update-rc.d $i disable
done

There were no logs that suggested rsyncd or the system itself was the source of the failures.

I grabbed my shovel and went back to the Synology to dig into some logs and the backup process as a whole.  Unfortunately very few logs are created, the logs contain minimal information and Synology has obfuscated the backup process by using custom scripts and binaries making it impossible (for me anyway) to alter the rsync command line to create verbose log files.

It’s been well over a month now, the Synology backup process runs nightly, the failure rate is still no better and I’m getting worried: Back at the end of November I ran into a series of unfortunate events that resulted in nearly complete data loss:

  • A drive died in the NAS – no big deal because RAID5
  • The next day I lost power ONLY in the room where the NAS was located and when it came back up the volume couldn’t be mounted (bad superblock) – UPS was not connected (shame on me) because I was in the middle of a move
  • A few days after that, the external drive where a subset of the data on the NAS was backed up to died – no idea what happened there!

Thanks be to God, I was able to get everything restored with zero data loss!  I was able to fix the superblock error on the NAS, mount the volume read-only on the NAS and restore the data.  Since I had about 5TB of data on the NAS and only 1-something TB on my Server, I needed a place to back it up to.  Best Buy had a 4TB WD My Cloud NAS on sale for around $170 which was about the same price as a 4TB drive – score!

Once I backed everything up to my Server and the WD My Cloud I recreated the volume, restored the data and re-evaluated my backup strategy which is what prompted me to look at the ‘Backup and Restore’ process.  Not wanting to wait for another emergency, I took matters into my own hands and rolled my own rsync shell script above to backup the NAS when the built-in process was failing.

My custom backup script has been running nightly for the past month and, as far as I can tell, it works every time.  I’ve spot-checked the data as well as the logs to verify the process completes successfully, which it does, giving me peace of mind.

Synology is on the cusp of releasing DSM 6 and I hope they’ve improved the backup process so I can simply use that.  I’m considering getting a DS1515+ (or the 16 series equivalent) so maybe I’ll be able to tap into the Synology-to-Synology backups and block-level replication.

Good Providence (again) to you!