In the world of DevOps and automation, "weaponization" doesn't mean anything malicious. Instead, it refers to the process of hardening and optimizing scripts for production deployment. Today, I'm going to walk you through how I transformed a continuous monitoring system into a single-run, scheduled-task-ready automation tool.
The Challenge: From Always-On to Just-In-Time
Originally, I was designed as a continuous monitoring script, constantly watching a network folder for changes, consuming resources 24/7 like a hungry bear that never hibernates. But in production environments, this approach has drawbacks:
- Constant resource consumption
- Potential memory leaks over long runtimes
- Harder maintenance and debugging
- More complex error recovery
The solution? Transform it into a single-run process that executes on a schedule whenever it is called by the website interface described in the linked post.
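For context, here's a minimal sketch of how a single-run script like this can be registered with Windows Task Scheduler from PowerShell; the script path and task name below are placeholders, not the production values:

# Illustrative only: the file path and task name are assumptions
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Quarantine\automation-monitor\Batch-Processor.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At '14:00'

# Register the task once; the website interface can then start it on demand
Register-ScheduledTask -TaskName 'BEAR-Batch-Processor' -Action $action -Trigger $trigger
Start-ScheduledTask -TaskName 'BEAR-Batch-Processor'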
The "Bear" Folder Name Prefix
Before diving into the technical details, let's understand the unique folder structure I process. Each folder follows a specific pattern:
BEAR1544678 - Lee
BEAR2891045 - Johnson
BEAR9876543 - Smith
BEAR3456789 - Williams
The magic here is in the numbers. That 7-digit sequence after "BEAR" isn't just an identifier - it's also the password for the encrypted ZIP file. Clever, right? Like a bear leaving tracks that only other bears can follow.
The Four-Stage Process
Stage 1: Folder Discovery and Validation
When I wake up (triggered by Windows Task Scheduler), my first job is to scan the configured directory:
$MonitorPath = "<share-Unc-path>"
$folders = Get-ChildItem -Path $MonitorPath -Directory
$unprocessedFolders = @()
foreach ($folder in $folders) {
    if (-not (Test-FolderProcessed $folder.FullName $LogData)) {
        Write-RuntimeLog "Found unprocessed folder: $($folder.Name)" -Level 'Info'
        $unprocessedFolders += $folder
    }
}
I maintain a JSON-based processing log to track which folders have already been handled, preventing duplicate processing - like a bear marking its territory.
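The exact shape of that log isn't important, but a minimal sketch of the idea, assuming a JSON file of processed folder paths (the file name and helper names here are illustrative), looks like this:

# Sketch of the processing-log helpers; file path and property names are assumptions
$ProcessingLogFile = "C:\Quarantine\automation-monitor\processed-folders.json"

function Get-ProcessingLog {
    if (Test-Path $ProcessingLogFile) {
        return @(Get-Content $ProcessingLogFile -Raw | ConvertFrom-Json)
    }
    return @()
}

function Test-FolderProcessed {
    param([string]$FolderPath, $LogData)
    # A folder counts as processed if its full path is already in the log
    return [bool]($LogData | Where-Object { $_.Path -eq $FolderPath })
}

function Add-ProcessedFolder {
    param([string]$FolderPath)
    $log = Get-ProcessingLog
    $log += [pscustomobject]@{ Path = $FolderPath; ProcessedAt = (Get-Date).ToString('s') }
    $log | ConvertTo-Json | Set-Content -Path $ProcessingLogFile -Encoding UTF8
}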
Stage 2: 7-Zip Compression with Automatic Password Protection
Here's where the BEAR naming convention shines. I extract the password from the folder name and use 7-Zip to create secure archives:
# Extract the 7-digit password from the folder name
$folderName = "BEAR1544678 - Lee"
if ($folderName -match 'BEAR(\d{7})') {
    $password = $matches[1]
}

# Build the password-protected ZIP command
$zipCommand = "7z a -tzip -p$password `"$folderName.zip`" `"$folderName`""
The actual 7-Zip command executed would be:
7z a -tzip -p1544678 "BEAR1544678 - Lee.zip" "BEAR1544678 - Lee"
This happens for all discovered folders in a batch operation - efficient and systematic.
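Putting Stages 1 and 2 together, the batch loop looks roughly like this; the 7z.exe location is an assumed default install path:

# Sketch of the batch ZIP loop over the folders found in Stage 1
$SevenZip = "C:\Program Files\7-Zip\7z.exe"   # assumed install location

foreach ($folder in $unprocessedFolders) {
    if ($folder.Name -match 'BEAR(\d{7})') {
        $password = $matches[1]
        $zipPath  = Join-Path $folder.Parent.FullName "$($folder.Name).zip"

        & $SevenZip a -tzip "-p$password" $zipPath $folder.FullName

        if ($LASTEXITCODE -eq 0) {
            Write-RuntimeLog "ZIP Creation for $($folder.Name) completed successfully" -Level 'Success'
        }
        else {
            Write-RuntimeLog "7-Zip failed for $($folder.Name) (exit code $LASTEXITCODE)" -Level 'Error'
        }
    }
    else {
        Write-RuntimeLog "Skipping $($folder.Name): name does not match the BEAR pattern" -Level 'Warning'
    }
}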
Stage 3: OneDrive Upload and Link Generation
Once all ZIP files are created, I trigger a single upload operation to OneDrive:
# Clear previous links
Clear-Content "C:\Quarantine\automation-monitor\Links.txt" -Force
# Execute upload script
& "C:\Quarantine\onedrive-uploader-linkemailer\Onedrive-Uploader.ps1" $parentPath
The upload script uses a pre-configured token to authenticate with OneDrive and generates shareable links, which are stored in Links.txt. These links are configured as view-only, ensuring security while maintaining accessibility.
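The uploader's internals aren't shown here, but assuming it talks to the Microsoft Graph API, requesting a view-only sharing link for an uploaded file looks roughly like this ($accessToken and $itemId are placeholders supplied by the real script):

# Sketch only: create a view-only sharing link via Graph's createLink action
$headers = @{ Authorization = "Bearer $accessToken" }
$body    = @{ type = 'view'; scope = 'organization' } | ConvertTo-Json

$response = Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/v1.0/me/drive/items/$itemId/createLink" `
    -Headers $headers -ContentType 'application/json' -Body $body

# Append the resulting URL to Links.txt for the email stage
Add-Content -Path "C:\Quarantine\automation-monitor\Links.txt" -Value $response.link.webUrl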
Example Links.txt content:
https://bearco.sharepoint.com/:u:/s/Uploads/EaBcDeFgHiJkLmNoPqRsTuVwX
https://bearco.sharepoint.com/:u:/s/Uploads/EbCdEfGhIjKlMnOpQrStUvWxY
https://bearco.sharepoint.com/:u:/s/Uploads/EcDeFgHiJkLmNoPqRsTuVwXyZ

Stage 4: Professional Email Notification
Finally, I send a professionally formatted HTML email to notify recipients:
& "C:\Quarantine\sarc-automation-monitor\mail-thelinks.ps1" "C:\Quarantine\sarc-automation-monitor\Links.txt"
The email includes:
- File names (without revealing the password structure)
- OneDrive download links
- Professional formatting
- Clear instructions for access
Importantly, the email never includes the actual passwords - those remain encoded in the folder names for authorized personnel only.
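The mailer script itself isn't reproduced here; a minimal sketch of the idea, assuming an internal SMTP relay and using placeholder addresses, would look like this:

# Sketch of mail-thelinks.ps1: build an HTML body from Links.txt and send it
# SMTP server and addresses are placeholders, not the production values
param([string]$LinksFile)

$links    = Get-Content $LinksFile | Where-Object { $_.Trim() }
$linkRows = ($links | ForEach-Object { "<li><a href='$_'>$_</a></li>" }) -join "`n"

$htmlBody = @"
<html><body>
<p>The following files are ready for download (view-only links):</p>
<ul>
$linkRows
</ul>
<p>Access passwords are provided separately by authorized personnel.</p>
</body></html>
"@

Send-MailMessage -From 'automation@example.com' -To 'recipients@example.com' `
    -Subject 'New secure files available' -BodyAsHtml -Body $htmlBody `
    -SmtpServer 'smtp.example.com'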
The Runtime Logging System
One crucial aspect of weaponization is comprehensive logging. I write detailed entries to runtime.log:
[2025-12-03 14:23:45] [Info] === Batch Processor Started ===
[2025-12-03 14:23:45] [Info] Monitor Path: \\stwater.intra\stw\SharedData\CR\Secure\CRAA\SARS\Lee - One Drive Links
[2025-12-03 14:23:45] [Info] Scanning for unprocessed folders...
[2025-12-03 14:23:46] [Info] Found unprocessed folder: BEAR4729183 - Anderson
[2025-12-03 14:23:46] [Info] Found unprocessed folder: BEAR8502916 - Thompson
[2025-12-03 14:23:46] [Info] === Starting Batch Processing ===
[2025-12-03 14:23:47] [Success] ZIP Creation for BEAR4729183 - Anderson completed successfully
[2025-12-03 14:23:48] [Success] ZIP Creation for BEAR8502916 - Thompson completed successfully
[2025-12-03 14:23:52] [Success] OneDrive Batch Upload completed successfully
[2025-12-03 14:23:55] [Success] Email Notification completed successfully
[2025-12-03 14:23:55] [Success] === Batch Processor Completed ===
The logging function is basic but effective:
function Write-RuntimeLog {
    param(
        [string]$Message,

        [ValidateSet('Info', 'Warning', 'Error', 'Success')]
        [string]$Level = 'Info'
    )

    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logEntry  = "[$timestamp] [$Level] $Message"

    Add-Content -Path $RuntimeLogFile -Value $logEntry -Encoding UTF8
}
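Write-RuntimeLog assumes a script-scoped $RuntimeLogFile variable defined near the top of the script; after that, logging anywhere is a one-liner:

# Assumed script-level log path, matching the runtime.log excerpt above
$RuntimeLogFile = "C:\Quarantine\automation-monitor\runtime.log"

Write-RuntimeLog "=== Batch Processor Started ===" -Level 'Info'
Write-RuntimeLog "ZIP Creation for BEAR4729183 - Anderson completed successfully" -Level 'Success'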
Exit Codes and Error Handling
Proper weaponization means predictable behavior. I use standard exit codes:
try {
    Start-BatchProcessing
    exit 0   # Success
}
catch {
    Write-RuntimeLog "Unexpected error during execution: $_" -Level 'Error'
    exit 1   # Failure
}
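Task Scheduler surfaces that process exit code as the task's Last Run Result, and any wrapper script can react to it too (the processor path below is a placeholder):

# Sketch: call the processor from a wrapper and check its exit code
& "C:\Quarantine\automation-monitor\Batch-Processor.ps1"

if ($LASTEXITCODE -ne 0) {
    Write-Warning "Batch processor reported failure (exit code $LASTEXITCODE)"
}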