When organizations implement enterprise Active Directory management solutions, they often unknowingly create a blind spot in their security monitoring. While these platforms provide excellent governance, workflow, and change management capabilities, their audit trails frequently remain invisible to Security Information and Event Management (SIEM) systems such as Microsoft Sentinel.
The Service Account Visibility Gap
Most enterprise AD management platforms don't write changes directly to Active Directory using the administrator's credentials. Instead, they operate through a dedicated service account that acts as an intermediary. This architectural design creates a significant observability challenge for security teams.
When changes are made through these management platforms:
- Administrator initiates a change through the management interface
- Management platform logs the operation to its internal database
- Service account executes the change in Active Directory
- SIEM only sees the service account performing the action
This means your SIEM logs show something like BEAR\svc-manager making user modifications, but you lose the crucial context of who actually initiated the change. Without this attribution, you cannot establish proper accountability, investigate incidents effectively, or maintain compliance with audit requirements.
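To see the gap concretely, here is a minimal sketch of what the native AD audit trail exposes, assuming user-account-change auditing (event ID 4738) is enabled on your domain controllers; the domain and service account names are just the example placeholders used above:
# Illustrative only: pull a few recent "user account was changed" events (4738) and show
# the acting account - in this scenario it is the service account (e.g. BEAR\svc-manager),
# not the administrator who actually made the change in the management platform.
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4738 } -MaxEvents 5 |
    Select-Object TimeCreated,
        @{Name='ActingAccount'; Expression={
            ([xml]$_.ToXml()).Event.EventData.Data |
                Where-Object { $_.Name -eq 'SubjectUserName' } |
                Select-Object -ExpandProperty '#text'
        }}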
One Identity Active Roles: A Case Study
In this example, we'll focus on One Identity Active Roles Server (formerly Dell Active Roles Server), a popular enterprise AD management solution. Active Roles provides a centralized management interface that connects to a service that writes data to Active Directory using a dedicated service account.
The platform maintains detailed operational logs in its own database, including:
- Who initiated each operation (the actual administrator)
- What changes were made (attribute modifications, group memberships)
- When operations occurred (precise timestamps)
- Target objects (which users, groups, or OUs were affected)
However, these logs remain trapped within the Active Roles database, invisible to security monitoring tools that rely on Windows Event Logs or AD audit events.
Scripting the CSV Export
Fortunately, Active Roles includes a comprehensive PowerShell interface that allows direct querying of its operational database. By leveraging this interface, we can extract the detailed audit information and format it for SIEM ingestion.
Data Extraction
The core command for retrieving operational data is straightforward:
# Load the Active Roles Management Shell
. 'C:\Program Files\One Identity\Active Roles\Shell\ManagementShell.ps1'
# Connect to the Active Roles service
Connect-QADService -proxy
# Retrieve recent operations
Get-QARSOperation -SizeLimit 5000
However, the raw output contains complex .NET objects that don't export cleanly to CSV format. The challenge lies in properly extracting and formatting the nested object properties.
Data Structure Challenges
When you attempt a basic CSV export, you encounter serialization issues:
# This produces unusable output
Get-QARSOperation -SizeLimit 5000 | Export-Csv output.csv -NoTypeInformation
The resulting CSV contains .NET type names instead of actual data:
"System.Collections.Generic.List`1[ActiveRoles.ManagementShell.BusinessLogic.ManagementHistory.IAttributeChangeInfo]"
Precise Data Extraction
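Before building the export, it helps to inspect the object graph so you know which nested properties need flattening. A minimal inspection sketch, assuming the property names match those used in the export below (verify against your own output):
# Inspect one operation to see which nested properties need flattening
$op = Get-QARSOperation -SizeLimit 1
$op | Get-Member -MemberType Property                       # top-level properties
$op.InitiatorInfo | Format-List NTAccountName, DN           # nested initiator details
$op.AttributeChanges | Format-List AttributeName, NewValue  # nested attribute changes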
The solution requires carefully selecting and transforming the object properties:
Get-QARSOperation -SizeLimit 5000 |
Select-Object ID, OperationGuid, Type, Status,
@{Name='InitiatedOn'; Expression={$_.Initiated}},
@{Name='Completed'; Expression={$_.Completed}},
@{Name='InitiatorAccount'; Expression={$_.InitiatorInfo.NTAccountName}},
@{Name='InitiatorDN'; Expression={$_.InitiatorInfo.DN}},
@{Name='TargetDN'; Expression={$_.TargetObjectInfo.DN}},
TasksCount,
@{Name='AttributeChanges'; Expression={($_.AttributeChanges | ForEach-Object {"$($_.AttributeName)=$($_.NewValue)"}) -join '; '}} |
Export-Csv c:\temp\output.csv -NoTypeInformation
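A quick sanity check confirms the calculated properties came out as expected (the path matches the one used above):
# Spot-check the first few rows of the export
Import-Csv c:\temp\output.csv |
    Select-Object -First 3 |
    Format-Table InitiatorAccount, TargetDN, Type, Status -AutoSize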
Remote Execution Challenges
While the PowerShell commands work perfectly in an interactive session, remote execution introduces significant technical hurdles. The primary obstacle is the Active Roles Management Shell's console initialization requirements.
The Console Sizing Problem
The ManagementShell.ps1 script attempts to set specific console window dimensions during initialization:
$ui.WindowSize = $windowSize
$ui.BufferSize = $bufferSize
Remote execution environments impose strict limitations on console dimensions. When run remotely, the script fails with errors like:
Exception setting "WindowSize": "Window cannot be taller than 44.
Parameter name: value.Height
Actual value was 50."
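One workaround worth attempting first is to resize the remote console yourself before dot-sourcing the shell, so the hard-coded dimensions fit within the host's limits. This is only a sketch, assuming the remote host allows resizing at all; in many remote contexts it does not, which is why the file-modification approach described next is used.
# Attempt to give the remote host a console large enough for ManagementShell.ps1
# (may fail in constrained remote hosts - hence the file-modification fallback below)
try {
    $raw = $Host.UI.RawUI
    $raw.BufferSize = New-Object System.Management.Automation.Host.Size(120, 3000)
    $raw.WindowSize = New-Object System.Management.Automation.Host.Size(120, 40)
} catch {
    Write-Warning "Could not adjust console dimensions: $_"
}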
PSExec with File Modification
The successful approach combines PSExec remote execution with temporary file modification:
# 1. Create a script that modifies the ManagementShell.ps1 file
$backupAndRunScript = @'
# Backup original file
Copy-Item "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1" "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1.backup"
# Read and modify content
$content = Get-Content "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1" -Raw
$content = $content -replace '\$ui\.WindowSize = \$windowSize', '# $ui.WindowSize = $windowSize'
$content = $content -replace '\$ui\.BufferSize = \$bufferSize', '# $ui.BufferSize = $bufferSize'
# Apply the fix
$content | Out-File "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1" -Encoding UTF8
try {
# Run the actual export
. "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1"
Connect-QADService -proxy
Get-QARSOperation -SizeLimit 5000 |
Select-Object ID, OperationGuid, Type, Status,
@{Name='InitiatedOn'; Expression={$_.Initiated}},
@{Name='Completed'; Expression={$_.Completed}},
@{Name='InitiatorAccount'; Expression={$_.InitiatorInfo.NTAccountName}},
@{Name='InitiatorDN'; Expression={$_.InitiatorInfo.DN}},
@{Name='TargetDN'; Expression={$_.TargetObjectInfo.DN}},
TasksCount,
@{Name='AttributeChanges'; Expression={($_.AttributeChanges | ForEach-Object {"$($_.AttributeName)=$($_.NewValue)"}) -join '; '}} |
Export-Csv c:\temp\output.csv -NoTypeInformation
"SUCCESS" | Out-File "c:\temp\status.txt"
} finally {
# Always restore the original file
Copy-Item "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1.backup" "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1" -Force
Remove-Item "C:\Program Files\One Identity\Active Roles\7.3\Shell\ManagementShell.ps1.backup" -Force
}
'@
# 2. Copy script to remote server and execute with PSExec
$backupAndRunScript | Out-File -FilePath "\\server\c$\temp\backup-run.ps1" -Encoding UTF8
$output = cmd /c "psexec \\server -s powershell.exe -ExecutionPolicy Bypass -File c:\temp\backup-run.ps1"
# 3. Retrieve the CSV file
Copy-Item -Path "\\server\c$\temp\output.csv" -Destination ".\output.csv"
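Optionally, check the status marker written by the remote script before ingesting the CSV (the paths follow the ones used above):
# Confirm the remote run reported success before handing the CSV to the SIEM pipeline
$status = Get-Content "\\server\c$\temp\status.txt" -ErrorAction SilentlyContinue
if ($status -ne "SUCCESS") {
    Write-Warning "Remote export did not complete cleanly (status: '$status')"
}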
Initial Run: Weekly Reports, Split by Day
We need the initial run to contain the last 7 days of data to get SIEM integration working, which requires a one-time loop operation:
# Loop through the last 7 days
for ($i = 1; $i -le 7; $i++) {
    $targetDate = (Get-Date).AddDays(-$i)
    $dateStamp = $targetDate.ToString("yyyy-MM-dd")
    Write-Host "Processing operations for date: $dateStamp"
    # Get operations for this specific day
    $operations = Get-QARSOperation -CompletedOn $targetDate -SizeLimit 10000
    # Export the day's operations using the same property selection shown earlier,
    # into a per-day file (name pattern mirrors the hourly export below)
    $operations |
        Select-Object ID, OperationGuid, Type, Status, @{Name='InitiatedOn'; Expression={$_.Initiated}}, @{Name='Completed'; Expression={$_.Completed}}, @{Name='InitiatorAccount'; Expression={$_.InitiatorInfo.NTAccountName}},
            @{Name='InitiatorDN'; Expression={$_.InitiatorInfo.DN}}, @{Name='TargetDN'; Expression={$_.TargetObjectInfo.DN}}, TasksCount, @{Name='AttributeChanges'; Expression={($_.AttributeChanges | ForEach-Object {"$($_.AttributeName)=$($_.NewValue)"}) -join '; '}} |
        Export-Csv "c:\temp\ARS-Operations-$dateStamp.csv" -NoTypeInformation
}
This exports all of the data for the last week, one CSV per day, as the "initial" run.
Hourly Run: Hour by Hour
If, however, you are looking for hour-by-hour exports, you will need to get a little more custom with the code, which looks like this:
# Get previous hour operations
$now = Get-Date
$previousHour = $now.AddHours(-1)
$hourStart = Get-Date -Year $previousHour.Year -Month $previousHour.Month -Day $previousHour.Day -Hour $previousHour.Hour -Minute 0 -Second 0
$hourEnd = $hourStart.AddHours(1).AddSeconds(-1)
$hourStamp = $hourStart.ToString("yyyy-MM-dd-HH")
$operations = Get-QARSOperation -CompletedAfter $hourStart -CompletedBefore $hourEnd -SizeLimit 10000
if ($operations.Count -gt 0) {
$operations |
Select-Object ID, OperationGuid, Type, Status,
@{Name='InitiatedOn'; Expression={$_.Initiated}},
@{Name='Completed'; Expression={$_.Completed}},
@{Name='InitiatorAccount'; Expression={$_.InitiatorInfo.NTAccountName}},
@{Name='InitiatorDN'; Expression={$_.InitiatorInfo.DN}},
@{Name='TargetDN'; Expression={$_.TargetObjectInfo.DN}},
TasksCount,
@{Name='AttributeChanges'; Expression={($_.AttributeChanges | ForEach-Object {"$($_.AttributeName)=$($_.NewValue)"}) -join '; '}} |
Export-Csv "c:\temp\ARS-Operations-$hourStamp.csv" -NoTypeInformation
} else {
# Create empty file for hours with no operations
"ID,OperationGuid,Type,Status,InitiatedOn,Completed,InitiatorAccount,InitiatorDN,TargetDN,TasksCount,AttributeChanges" | Out-File "c:\temp\ARS-Operations-$hourStamp.csv" -Encoding UTF8
}
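To keep the hourly export running unattended, one option is a scheduled task on the Active Roles server. A minimal sketch, assuming the hourly script above is saved as C:\Scripts\Export-ARSOperations.ps1 (the script path and task name are placeholders, and the repetition settings may need adjusting for your OS version and run-as account):
# Register an hourly run of the export script (names and paths are assumptions)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-ExecutionPolicy Bypass -File C:\Scripts\Export-ARSOperations.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).Date.AddHours((Get-Date).Hour + 1) `
           -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName 'ARS-Hourly-Export' -Action $action -Trigger $trigger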
Once you have the CSV export, integration with your SIEM becomes straightforward. The extracted data provides the missing attribution context needed to validate alerts and investigate incidents effectively.
Conclusion
Enterprise Active Directory management platforms provide essential governance and control capabilities, but their audit trails often remain invisible to security monitoring systems. By leveraging the PowerShell interfaces provided by these platforms, security teams can extract detailed operational data and integrate it with their SIEM systems.
This approach transforms opaque service account activities into detailed, attributable audit trails that enhance your organization's security posture and compliance capabilities.