Recently, I needed to create an automated system that would monitor a specific network folder for new directories and then process them through a three-step workflow: ZIP creation, OneDrive upload, and email notification. What started as a simple file monitoring task evolved into a robust automation solution that handles edge cases and ensures reliable processing.
The Challenge
The requirement was straightforward: whenever someone creates a new folder in R:\Secure\Packages, the script needed to:
- Detect the new folder immediately
- Create a ZIP file of the folder contents
- Upload the ZIP to OneDrive
- Send email notifications with the OneDrive links
However, as with most automation tasks, the devil was in the details. The script needed to handle network drive quirks, avoid duplicate processing, and manage the handoff between different processing scripts.
Solving the Detection Problem
The first challenge was reliably detecting new folders on a network drive. File system watchers can be unreliable on network shares, so I implemented a dual approach:
Primary Detection: File System Watcher
# Watch the top level of the monitored path for newly created directories
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $MonitorPath
$watcher.IncludeSubdirectories = $false
$watcher.NotifyFilter = [System.IO.NotifyFilters]::DirectoryName -bor [System.IO.NotifyFilters]::CreationTime
# Enable last, once the filters are configured
$watcher.EnableRaisingEvents = $true
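The watcher only raises events once a handler is registered for them. A minimal sketch of that wiring, assuming an action block that just logs what it saw (the SourceIdentifier name is mine, and the real handler also verifies the path and queues the folder, as described later):
# Sketch only: register a handler for the Created event
Register-ObjectEvent -InputObject $watcher -EventName Created -SourceIdentifier PackageFolderCreated -Action {
    $name = $Event.SourceEventArgs.Name
    $path = $Event.SourceEventArgs.FullPath
    Write-Host "[$(Get-Date -Format 'HH:mm:ss')] File system event detected:"
    Write-Host "  Type: $($Event.SourceEventArgs.ChangeType)"
    Write-Host "  Name: $name"
} | Out-Null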
The script watches for directory creation events and logs detailed information about what it detects:
[12:04:08] File system event detected:
Type: Created
Name: WhamBam2
Path: R:\Secure\Packages
Checking if path is a directory...
✓ Confirmed: Directory detected
Backup Detection: Periodic Scanning
To catch anything the file system watcher might miss, the script also performs periodic scans every 30 seconds:
$timer = New-Object System.Timers.Timer
$timer.Interval = 30000 # 30 seconds
$timer.AutoReset = $true
This ensures that even if the primary detection fails, folders won't go unprocessed for long.
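The timer raises an Elapsed event on each interval. A minimal sketch of wiring it up, following the same detect-and-flag pattern described in the threading section below ($global:ScanRequested is my naming, not the script's):
# Sketch only: the timer action sets a flag; the main loop performs the actual scan
Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier BackupScan -Action {
    $global:ScanRequested = $true
} | Out-Null
$timer.Start()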
Managing State with JSON Tracking
One crucial requirement was avoiding duplicate processing. The script maintains a JSON log file that tracks which folders have been processed:
{
  "ProcessedFolders": {
    "WigWham": {
      "ProcessedDate": "2025-09-17 12:15:23",
      "FolderPath": "R:\\Secure\\Packages",
      "Status": "Completed"
    }
  },
  "LastUpdated": "2025-09-17 12:15:23"
}
Before processing any folder, the script checks this log to ensure it hasn't already been handled. This prevents unnecessary work and avoids sending duplicate emails.
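The helpers that read and update this log aren't shown above. A rough sketch of what they might look like, assuming the log is loaded with ConvertFrom-Json and that a hypothetical $ProcessedLogPath points at the JSON file (Mark-FolderProcessed does appear in the later excerpts, so only its body is guesswork here):
# Sketch only: the real functions may differ in shape
function Test-FolderProcessed {
    param($FolderPath, $LogData)
    $folderName = Split-Path $FolderPath -Leaf
    return $null -ne $LogData.ProcessedFolders.$folderName
}

function Mark-FolderProcessed {
    param($FolderPath, $LogData, $Status = "Completed")
    $folderName = Split-Path $FolderPath -Leaf
    $entry = [PSCustomObject]@{
        ProcessedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
        FolderPath    = Split-Path $FolderPath -Parent
        Status        = $Status
    }
    $LogData.ProcessedFolders | Add-Member -NotePropertyName $folderName -NotePropertyValue $entry -Force
    $LogData.LastUpdated = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $LogData | ConvertTo-Json -Depth 5 | Set-Content $ProcessedLogPath
}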
The Three-Step Processing Workflow
Once a new folder is detected and confirmed as unprocessed, the script executes a carefully orchestrated workflow:
Step 1: ZIP Creation
The script first checks if a ZIP file already exists to avoid unnecessary work:
function Test-ZipExists {
    param($FolderPath)
    $folderName = Split-Path $FolderPath -Leaf
    # Check in the parent directory (where ZIP files are actually created)
    $parentPath = Split-Path $FolderPath -Parent
    $zipPath = Join-Path $parentPath "$folderName.zip"
    return Test-Path $zipPath
}
If no ZIP exists, it calls the existing 7Zip-Compress.ps1 script to create one.
Step 2: OneDrive Upload
This step had a critical gotcha: the ZIP files are created in the parent directory, not within the folder itself. The script needed to pass the correct path to the upload script:
# Step 2: Upload to OneDrive (pass parent directory where ZIP files are located)
$parentPath = Split-Path $FolderPath -Parent
if (-not (Invoke-ProcessingScript $UploadScript -Arguments $parentPath -StepName "OneDrive Upload")) {
    Mark-FolderProcessed $FolderPath $LogData "Failed at OneDrive Upload"
    return $false
}
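Invoke-ProcessingScript is the small wrapper every step goes through. It isn't shown in the excerpts, but a minimal sketch, assuming it simply runs the child script and treats an exception or non-zero exit code as failure:
# Sketch only: the real wrapper may capture output or apply different success criteria
function Invoke-ProcessingScript {
    param($ScriptPath, $Arguments, $StepName)
    Write-Host "Running step: $StepName" -ForegroundColor Cyan
    try {
        & $ScriptPath $Arguments
        if ($LASTEXITCODE -and $LASTEXITCODE -ne 0) {
            Write-Warning "$StepName exited with code $LASTEXITCODE"
            return $false
        }
        return $true
    }
    catch {
        Write-Warning "$StepName failed: $_"
        return $false
    }
}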
Step 3: Email Notification
The final step sends email notifications, but it required clearing the Links.txt file between runs to avoid sending stale information:
function Clear-LinksFile {
    param([string]$LinksFilePath = "C:\Quarantine\OneDrive-Uploads\Links.txt")
    try {
        if (Test-Path $LinksFilePath) {
            Clear-Content $LinksFilePath -Force
            Write-Host "Cleared Links.txt file" -ForegroundColor Green
        }
    }
    catch {
        Write-Warning "Could not clear Links.txt file: $_"
    }
}
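The email script itself isn't reproduced here; the idea is that the upload step writes the fresh OneDrive links into Links.txt and the notification step reads them back. A minimal sketch of that read-and-send, assuming Send-MailMessage with placeholder addresses and SMTP server:
# Sketch only: sender, recipients, subject, and SMTP server are placeholders
$links = Get-Content "C:\Quarantine\OneDrive-Uploads\Links.txt" -ErrorAction SilentlyContinue
if ($links) {
    Send-MailMessage -From "automation@example.com" -To "team@example.com" `
        -Subject "New package available on OneDrive" `
        -Body ("The following links are ready:`n`n" + ($links -join "`n")) `
        -SmtpServer "smtp.example.com"
}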
Handling the Threading Challenge
Initially, I tried to handle processing within the event handler script blocks, but this created scope issues: the action blocks run in their own scope, so functions defined in the main script weren't accessible. The solution was to use a simple communication pattern:
- Event handlers detect folders and add them to a global array
- Main thread continuously checks the array and processes any new entries
- All processing happens in the main thread context where functions are accessible
# In event handler - just detect and flag
$global:NewFoldersDetected += @($path)

# In main loop - do the actual work
if ($global:NewFoldersDetected.Count -gt 0) {
    $foldersToProcess = $global:NewFoldersDetected
    $global:NewFoldersDetected = @()

    foreach ($folderPath in $foldersToProcess) {
        Process-NewFolder $folderPath $currentLogData
    }
}
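For completeness, the surrounding main loop is just a polling loop; a minimal sketch (the five-second sleep is my choice, not necessarily the script's):
# Sketch only: poll the shared array, then sleep briefly before checking again
while ($true) {
    if ($global:NewFoldersDetected.Count -gt 0) {
        # ... processing block shown above ...
    }
    Start-Sleep -Seconds 5
}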
Error Handling and Recovery
The script includes comprehensive error handling at each step. If any step fails, it's recorded in the JSON log with the specific failure point:
if (-not (Invoke-ProcessingScript $ZipScript -Arguments $FolderPath -StepName "ZIP Creation")) {
    Mark-FolderProcessed $FolderPath $LogData "Failed at ZIP Creation"
    return $false
}
This makes troubleshooting much easier and prevents failed folders from being forgotten.