prod@blog:~$

Safe and Sound: Preserving VPN Logs Against Unauthorized Changes

I’ve previously written about how to query audit and activity logs on Absolute Secure Access on-premises servers, focusing on extracting meaningful events from the Mobility event logs for troubleshooting and investigation. That approach works well when logs are present and intact.

It is important to clarify, however, that Absolute Secure Access on-premises does not write its audit or activity events to the Windows Event Log. The only entries you may see there are basic service start or stop messages. All meaningful session and audit data is written exclusively to the Secure Access log files under the server’s Logs directory. This is why collecting and archiving these files off the source servers is critical — the Windows Event Log cannot be relied upon for forensic or investigative purposes.
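
This is easy to confirm on your own servers. A minimal check, assuming the service display name contains "NetMotion" or "Secure Access" (adjust the pattern to match your installation):

# Service Control Manager writes event ID 7036 when a service starts or
# stops -- these notices are the only Secure Access traces in the Event Log.
Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 7036 } |
    Where-Object { $_.Message -match 'NetMotion|Secure Access' } |
    Select-Object TimeCreated, Message -First 10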

Even with the best procedures in place, there is always a risk that someone with administrative access could inadvertently or deliberately remove logs, whether by cleaning up disk space, rolling files over aggressively, or attempting to hide their actions. This script and process were implemented long before I published guidance on querying logs, as a methodical approach to preserving evidence rather than relying on luck.

Understanding the Source and Event Flow

Absolute Secure Access on-premises writes its authoritative audit and activity data to local log files on each server. This means that any log deletions or rollovers on the source server directly affect what is visible. To protect against this, it is crucial to treat each server purely as a producer of logs and to copy them to a separate management server that is restricted from general administrative access.
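
Restricting the archive is ordinary NTFS permissions work. A minimal sketch, assuming the archive root used later in this post; the account and group names are placeholders for whatever your environment uses:

# Remove inherited ACEs so source-server administrators gain no implicit
# access, then grant only the collection account and the management
# server's own administrators. "CONTOSO\svc-logcollect" is a placeholder.
icacls "D:\SecureAccessLogs" /inheritance:r
icacls "D:\SecureAccessLogs" /grant "CONTOSO\svc-logcollect:(OI)(CI)M"
icacls "D:\SecureAccessLogs" /grant "BUILTIN\Administrators:(OI)(CI)F"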

Defining Source Servers and Log Paths

The first step is identifying which servers to monitor and where the logs are located. Each server has a dedicated path:

$ServerPaths = @{
    "st1w3335" = "c`$\Program Files\NetMotion Server\Logs"
    "st1w3336" = "c`$\Program Files\NetMotion Server\Logs"
    "st1w3373" = "c`$\Program Files\Secure Access Server\Logs"
}

By explicitly defining each server, I ensure the script only touches intended sources. This eliminates ambiguity and reduces the risk of copying unintended files.

Creating a Central Archive

On the management server, logs are stored in a central archive with a folder per server:

$LocalRoot = "D:\SecureAccessLogs"
$DestPath   = Join-Path $LocalRoot $Server

Each server’s logs are kept separately, which makes it easy to attribute log files to the source system. It also allows for later correlation across servers without mixing sources.

Filtering by Date and Selecting Files

Rather than copying everything, the script identifies only the relevant Mobility event logs that have been modified recently. This reduces network traffic and ensures incremental updates are efficient.

$CutoffDate = (Get-Date).AddDays(-7)

$FileList = Get-ChildItem -Path $SourcePath -Filter "Mobility_events*" -File |
            Where-Object { $_.LastWriteTime -ge $CutoffDate }

Here, $CutoffDate defines how far back we want to go. In this case, I keep a rolling window of seven days for efficiency, but the script can be adjusted to a longer window if desired. Get-ChildItem retrieves the candidate files, and Where-Object ensures only those modified since $CutoffDate are considered.
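
If you expect to change the window regularly, it can be exposed as a script parameter instead of a hard-coded value; a small sketch ($RetentionDays is my own name, not part of the original script):

# Accept the retention window on the command line, defaulting to seven days.
param([int]$RetentionDays = 7)
$CutoffDate = (Get-Date).AddDays(-$RetentionDays)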

Building the File List for Robocopy

Robocopy has no switch for reading a list of file names from a separate file; instead, any file names placed after the source and destination paths restrict the copy to exactly those files. The script builds that list from the filtered results:

$FileNames = $FileList | ForEach-Object { $_.Name }

When this array is passed to Robocopy, PowerShell expands it into individual arguments. That keeps each run scoped to its own selection: only files that passed the date filter are considered, and nothing else in the Logs directory is touched.

Copying Files with Robocopy

Robocopy is invoked with parameters designed to copy files safely and incrementally:

robocopy $SourcePath $DestPath $FileNames `
    /R:2 `
    /W:2 `
    /Z `
    /COPY:DAT `
    /DCOPY:T `
    /FFT `
    /LOG+:"$LocalRoot\Robocopy_$Server.log"

Breaking down each option:

  • $SourcePath and $DestPath: The source server path and the destination folder on the management server.

  • $FileNames: Listing the selected file names as arguments restricts the copy to exactly those files.

  • /R:2 and /W:2: Minimal retries and wait times handle transient network errors without long delays.

  • /Z: Restartable mode allows partial copies to resume if interrupted.

  • /COPY:DAT: Preserves data, attributes, and timestamps. Essential for audit integrity.

  • /DCOPY:T: Preserves directory timestamps.

  • /FFT: Accounts for slight differences in timestamp resolution between filesystems.

  • /LOG+:<path>: Appends output to a log file on the management server for traceability.

Because the script deliberately avoids /MIR and /PURGE, deletions on the source server never remove files from the archive. This safeguard ensures historical logs remain intact.
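
One detail worth handling: Robocopy reports through a bitmask exit code rather than a simple zero for success, so a wrapper check keeps real failures from passing silently. A minimal sketch, placed immediately after the call above:

# Robocopy exit codes are a bitmask: 1 = files copied, 2 = extra files,
# 4 = mismatched files or directories; values of 8 or above mean failures.
if ($LASTEXITCODE -ge 8) {
    Write-Warning "Robocopy reported failures for $Server (exit code $LASTEXITCODE)"
}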

Iterating Across Multiple Servers

The script loops through each server defined in $ServerPaths, testing connectivity and creating destination folders if necessary:

foreach ($Server in $ServerPaths.Keys) {
    $SourcePath = "\\$Server\$($ServerPaths[$Server])"
    $DestPath   = Join-Path $LocalRoot $Server

    if (-not (Test-Path $SourcePath)) { continue }
    if (-not (Test-Path $DestPath)) { New-Item -ItemType Directory -Path $DestPath | Out-Null }

    # select files and copy with Robocopy (full version assembled below)
}

Each server is handled independently. If a server is unreachable, the script continues with the others. Creating the destination folder dynamically ensures the archive structure is maintained.
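
For completeness, here is the loop body with the selection and copy steps from the earlier sections assembled in order:

foreach ($Server in $ServerPaths.Keys) {
    $SourcePath = "\\$Server\$($ServerPaths[$Server])"
    $DestPath   = Join-Path $LocalRoot $Server

    # Skip unreachable servers; create the per-server archive folder if needed.
    if (-not (Test-Path $SourcePath)) { continue }
    if (-not (Test-Path $DestPath)) { New-Item -ItemType Directory -Path $DestPath | Out-Null }

    # Select recently modified Mobility event logs and copy them by name.
    $FileNames = Get-ChildItem -Path $SourcePath -Filter "Mobility_events*" -File |
                 Where-Object { $_.LastWriteTime -ge $CutoffDate } |
                 ForEach-Object { $_.Name }

    if ($FileNames) {
        robocopy $SourcePath $DestPath $FileNames /R:2 /W:2 /Z `
            /COPY:DAT /DCOPY:T /FFT /LOG+:"$LocalRoot\Robocopy_$Server.log"
    }
}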

Logging and Traceability

Every Robocopy run appends to a log file for the server:

/LOG+:"$LocalRoot\Robocopy_$Server.log"

This provides a historical record of which files were copied, when, and with what result. Combined with the server folder structure, this allows reconstruction of exactly what was collected and when, supporting both operational troubleshooting and audit requirements.
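
When something looks off, the per-server logs can be scanned in a single pass; a quick sketch using Select-String:

# Surface any error lines Robocopy recorded across all per-server logs.
Select-String -Path "$LocalRoot\Robocopy_*.log" -Pattern 'ERROR' |
    Select-Object Filename, LineNumber, Line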

Handling Rollovers and Deleted Files

The archive is intentionally append-only. If a source file is deleted or rolled over, it remains in the management server’s archive. New files are copied, existing files are updated if changed, and deleted source files do not remove anything in the archive. Over time, this strategy builds a complete historical record of all logs, independent of the source.
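
An optional extension that pairs well with an append-only archive is a hash manifest, so later tampering or corruption can be detected by recomputing and comparing. A sketch, assuming SHA-256; the manifest file name is my own choice:

# Record a SHA-256 hash for every archived file in this server's folder,
# excluding the manifest itself so re-runs do not hash a file mid-write.
Get-ChildItem -Path $DestPath -File |
    Where-Object { $_.Name -ne 'manifest.csv' } |
    Get-FileHash -Algorithm SHA256 |
    Export-Csv -Path (Join-Path $DestPath 'manifest.csv') -NoTypeInformation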

Conclusion

By separating log generation from log retention and using an incremental, append-only copy process, I can preserve Secure Access logs against accidental or deliberate deletion. The script uses simple, well-understood tooling, preserves timestamps and attributes, and produces a traceable log of its own activity. 

This approach removes operational risk without introducing complexity, ensuring historical records remain intact, verifiable, and auditable. It is methodical, reproducible, and robust, providing confidence that the data will remain available when it is needed.