Notice: Due to size constraints and loading performance considerations, scripts referenced in blog posts are not attached directly. To request access, please complete the following form: Script Request Form Note: A Google account is required to access the form.
Disclaimer: I do not accept responsibility for any issues arising from scripts being run without adequate understanding. It is the user's responsibility to review and assess any code before execution.

Real-Time Password Reset Monitoring System: From Log Files to Interactive Dashboard

I recently wrote about the password reset application, and once it was in use I found myself constantly needing to monitor user activity, troubleshoot failed attempts, and track security events. The default logging provided by most password reset tools gives you raw log files that are difficult to parse and analyze quickly. That's when I decided to build a comprehensive monitoring solution that transforms these logs into an interactive, real-time dashboard.

In this post, I'll walk you through how I created a complete monitoring system that includes:

  • A PowerShell script that parses log files in real-time
  • A JSON data feed for web consumption
  • An interactive HTML dashboard with filtering and search capabilities
  • Real-time statistics and monitoring alerts

Visual Results

This is what the end result looks like when served from a web server; note that the JSON file needs to be in the same folder as the dashboard page.


You also get a live feed of the events under the "All" option; here you can see someone entering the wrong current password:





The Challenge: Making Sense of Password Reset Logs

Most enterprise password reset applications generate logs that look like this:

2025-06-13 08:05:56 - Password reset attempt initiated for user 'bear.user@bear.local' 
from IP 10.245.186.135 by authenticated user 'Bear\bear.user2'
2025-06-13 08:05:56 - Authentication successful for user: bear.user
2025-06-13 08:05:56 - Password reset successful for user: bear.user
2025-06-13 09:37:22 - Password reset attempt initiated for user 'bear.user2@bear.local' 
from IP 10.24.1.188 by authenticated user 'Bear\bear.user'
2025-06-13 09:37:22 - ERROR shown to user: Current credentials are invalid.

While these logs contain valuable information, they're not easy to analyze at scale. I needed a way to:

  • Track success/failure rates in real-time
  • Identify problematic user accounts
  • Monitor for suspicious activity patterns
  • Provide administrators with actionable insights

The Solution: A Three-Part Monitoring System

I designed a solution with three main components:

  1. PowerShell Log Parser: Monitors log files and converts entries to structured JSON
  2. JSON Data Feed: Provides real-time data in a web-friendly format
  3. Interactive Dashboard: Displays data with filtering, search, and statistics

Part 1: The PowerShell Log Parser

The foundation of my monitoring system is a PowerShell script that continuously monitors the password reset log file and converts raw log entries into structured JSON data.

Here's the core parsing function:

function Parse-LogEntry {
    param([string[]]$LogLines)
    
    $entries = @()
    $currentEntry = @{}
    
    foreach ($line in $LogLines) {
        if ($line -match "^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})") {
            # If we have a current entry, add it to entries
            if ($currentEntry.Count -gt 0) {
                $entries += [PSCustomObject]$currentEntry
            }
            
            # Start new entry
            $timestamp = $matches[1]
            $currentEntry = @{
                Timestamp = $timestamp
                IP = ""
                Username = ""
                AuthenticatedUser = ""
                Result = ""
                Reason = ""
                RawMessage = $line
            }
            
            # Extract IP address (log format: "from IP 10.245.186.135")
            if ($line -match "from IP\s+([0-9\.]+)") {
                $currentEntry.IP = $matches[1]
            }
            
            # Extract authenticated user (log format: "by authenticated user 'Bear\user'")
            if ($line -match "by authenticated user '([^']+)'") {
                $currentEntry.AuthenticatedUser = $matches[1]
            }
            
            # Extract username from various patterns
            if ($line -match "for user '([^']+)'") {
                $currentEntry.Username = $matches[1]
            } elseif ($line -match "for user: ([^\s]+)") {
                $currentEntry.Username = $matches[1]
            }
            
            # Determine result and reason
            if ($line -match "Password reset SUCCESSFUL") {
                $currentEntry.Result = "SUCCESS"
                $currentEntry.Reason = "Password reset completed successfully"
            } elseif ($line -match "ERROR.*Current credentials are invalid") {
                $currentEntry.Result = "FAILURE"
                $currentEntry.Reason = "Invalid current password"
            } elseif ($line -match "Password reset page accessed") {
                $currentEntry.Result = "ACCESS"
                $currentEntry.Reason = "Page accessed"
            } elseif ($line -match "Password reset attempt initiated") {
                $currentEntry.Result = "ATTEMPT"
                $currentEntry.Reason = "Reset attempt initiated"
            }
        }
    }
    
    # Add the last entry
    if ($currentEntry.Count -gt 0) {
        $entries += [PSCustomObject]$currentEntry
    }
    
    return $entries
}

Key Features of the Parser:

  1. Non-blocking File Access: Uses .NET FileStream with shared read access to avoid locking the log file
  2. Incremental Processing: Only processes new log entries since the last check
  3. Pattern Recognition: Uses regex to extract usernames, IP addresses, and error reasons
  4. Structured Output: Converts unstructured log text into JSON objects
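If you want to sanity-check the extraction patterns before committing them to the PowerShell script, the same logic can be sketched in JavaScript and run in a browser console or Node. This is a minimal sketch, not the production parser; it mirrors the field names of the JSON feed and assumes the log format shown earlier:

```javascript
// Minimal JavaScript sketch of the same extraction patterns,
// mirroring the field names of the JSON feed.
function cap(line, re) {
  const m = line.match(re);
  return m ? m[1] : '';
}

function parseLogLine(line) {
  const ts = line.match(/^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/);
  if (!ts) return null; // skip wrapped continuation lines

  const entry = {
    Timestamp: ts[1],
    IP: cap(line, /from IP\s+([0-9.]+)/),
    Username: cap(line, /for user '([^']+)'/) || cap(line, /for user: (\S+)/),
    AuthenticatedUser: cap(line, /by authenticated user '([^']+)'/),
    Result: '',
    Reason: '',
    RawMessage: line
  };

  if (/Password reset successful/i.test(line)) {
    entry.Result = 'SUCCESS';
    entry.Reason = 'Password reset completed successfully';
  } else if (/ERROR.*Current credentials are invalid/.test(line)) {
    entry.Result = 'FAILURE';
    entry.Reason = 'Invalid current password';
  } else if (/Password reset page accessed/.test(line)) {
    entry.Result = 'ACCESS';
    entry.Reason = 'Page accessed';
  } else if (/Password reset attempt initiated/.test(line)) {
    entry.Result = 'ATTEMPT';
    entry.Reason = 'Reset attempt initiated';
  }
  return entry;
}
```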

The script runs continuously, checking for new log entries every 30 seconds:

# Main monitoring loop
while ($true) {
    try {
        if (Test-Path $LogFilePath) {
            $currentFileSize = (Get-Item $LogFilePath).Length
            
            # Only process if file has grown
            if ($currentFileSize -gt $lastFileSize) {
                Write-Host "Processing log file changes..."
                
                # Read entire file content
                $logLines = Read-LogFileNonLocking -FilePath $LogFilePath
                
                if ($logLines.Count -gt 0) {
                    # Parse all entries
                    $newEntries = Parse-LogEntry -LogLines $logLines
                    
                    # Filter out entries we've already recorded, then save
                    $existingMessages = @($allEntries | ForEach-Object { $_.RawMessage })
                    $uniqueNewEntries = @($newEntries | Where-Object { $existingMessages -notcontains $_.RawMessage })
                    $allEntries += $uniqueNewEntries
                    Save-Data -Data $allEntries -OutputPath $OutputPath
                }
                
                $lastFileSize = $currentFileSize
            }
        }
        
        Start-Sleep -Seconds $IntervalSeconds
    }
    catch {
        Write-Error "Error in monitoring loop: $_"
        Start-Sleep -Seconds $IntervalSeconds
    }
}

Part 2: The JSON Data Structure

The PowerShell script outputs structured JSON data that looks like this:

[
  {
    "Username": "bear.user@bear.local",
    "AuthenticatedUser": "BEAR\\bear.user",
    "Timestamp": "2025-06-13 08:05:56",
    "RawMessage": "2025-06-13 08:05:56 - Password reset successful for user: bear.user",
    "IP": "10.245.186.135",
    "Reason": "Password reset completed successfully",
    "Result": "SUCCESS"
  },
  {
    "Username": "bear.user2@bear.local",
    "AuthenticatedUser": "",
    "Timestamp": "2025-06-13 09:37:22",
    "RawMessage": "2025-06-13 09:37:22 - ERROR shown to user: Current credentials are invalid.",
    "IP": "",
    "Reason": "Invalid current password",
    "Result": "FAILURE"
  }
]

This structured format makes it easy for the web dashboard to consume and display the data effectively.
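Because the feed is a plain JSON array, computing headline numbers on the consuming side is trivial. As a quick sketch (assuming the `Result` values shown above), this counts entries per type and derives a success rate over completed attempts:

```javascript
// Count entries per Result type and derive a success rate.
// Input: the parsed JSON array from the data feed.
function summarize(entries) {
  const counts = { SUCCESS: 0, FAILURE: 0, ACCESS: 0, ATTEMPT: 0 };
  for (const e of entries) {
    if (e.Result in counts) counts[e.Result]++;
  }
  const completed = counts.SUCCESS + counts.FAILURE;
  const successRate = completed > 0 ? counts.SUCCESS / completed : null;
  return { counts, successRate };
}
```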

Part 3: The Interactive Dashboard

The final component is a comprehensive HTML dashboard that provides real-time monitoring capabilities. Here are the key features I implemented:

Real-Time Statistics Dashboard:

function updateStats() {
    const stats = {
        SUCCESS: 0,
        FAILURE: 0,
        ACCESS: 0,
        ATTEMPT: 0
    };

    allData.forEach(item => {
        if (stats.hasOwnProperty(item.Result)) {
            stats[item.Result]++;
        }
    });

    document.getElementById('successCount').textContent = stats.SUCCESS;
    document.getElementById('failureCount').textContent = stats.FAILURE;
    document.getElementById('accessCount').textContent = stats.ACCESS;
    document.getElementById('attemptCount').textContent = stats.ATTEMPT;
}

Advanced Filtering and Search:

function filterAndDisplayData() {
    const searchTerm = document.getElementById('searchBox').value.toLowerCase();
    
    filteredData = allData.filter(item => {
        // Filter by result type
        if (currentFilter !== 'all' && item.Result !== currentFilter) {
            return false;
        }
        
        // Filter by search term
        if (searchTerm) {
            return (
                (item.Username || '').toLowerCase().includes(searchTerm) ||
                (item.IP || '').toLowerCase().includes(searchTerm) ||
                (item.Reason || '').toLowerCase().includes(searchTerm) ||
                (item.AuthenticatedUser || '').toLowerCase().includes(searchTerm)
            );
        }
        
        return true;
    });

    displayData();
}

Auto-Refresh Functionality:

// Auto-refresh every 30 seconds
setInterval(loadData, 30000);

async function loadData() {
    try {
        const response = await fetch(DATA_FILE_PATH + '?t=' + Date.now());
        if (!response.ok) {
            throw new Error('Failed to load data');
        }
        
        const data = await response.json();
        allData = Array.isArray(data) ? data : [];
        updateStats();
        filterAndDisplayData();
        updateLastUpdatedTime();
    } catch (error) {
        console.error('Error loading data:', error);
        showError('Failed to load data. Please check if the data file exists and is accessible.');
    }
}

Implementation and Deployment

To deploy this monitoring system:

Install the PowerShell Script

# Run the monitoring script
.\LogMonitor.ps1 -LogFilePath "PasswordReset.log" `
    -OutputPath ".\password_reset_data.json" -IntervalSeconds 30

Copy the Files to the Web Server
# Define source and destination paths
$sourcePath = "."
$destPath = "\\wwwsrv1.bear.local\LiveLog"

# Ensure destination directory exists
if (!(Test-Path $destPath)) {
    New-Item -ItemType Directory -Path $destPath -Force
    Write-Host "Created directory: $destPath"
}

# Copy LiveLog.html to index.html (only if index.html doesn't exist)
$sourceHtml = Join-Path $sourcePath "LiveLog.html"
$destHtml = Join-Path $destPath "index.html"

if (!(Test-Path $destHtml)) {
    if (Test-Path $sourceHtml) {
        Copy-Item $sourceHtml $destHtml -Force
        Write-Host "Copied LiveLog.html to index.html"
    } else {
        Write-Warning "Source file LiveLog.html not found"
    }
} else {
    Write-Host "index.html already exists, skipping copy"
}

# Copy password_reset_data.json only if source is newer
$sourceJson = Join-Path $sourcePath "password_reset_data.json"
$destJson = Join-Path $destPath "password_reset_data.json"

if (Test-Path $sourceJson) {
    $sourceTime = (Get-Item $sourceJson).LastWriteTime
    
    if (Test-Path $destJson) {
        $destTime = (Get-Item $destJson).LastWriteTime
        
        if ($sourceTime -gt $destTime) {
            Copy-Item $sourceJson $destJson -Force
            Write-Host "Copied password_reset_data.json (source is newer: $sourceTime vs $destTime)"
        } else {
            Write-Host "Destination password_reset_data.json is up to date (dest: $destTime, source: $sourceTime)"
        }
    } else {
        Copy-Item $sourceJson $destJson -Force
        Write-Host "Copied password_reset_data.json (destination didn't exist)"
    }
} else {
    Write-Warning "Source file password_reset_data.json not found"
}

File Locking Considerations

Initially, I encountered issues with the PowerShell script locking the log file. I solved this by using .NET FileStream with shared read access:

function Read-LogFileNonLocking {
    param([string]$FilePath)
    
    try {
        $fileStream = New-Object System.IO.FileStream(
            $FilePath,
            [System.IO.FileMode]::Open,
            [System.IO.FileAccess]::Read,
            [System.IO.FileShare]::ReadWrite
        )
        $streamReader = New-Object System.IO.StreamReader($fileStream)
        $content = $streamReader.ReadToEnd()
        $streamReader.Close()
        $fileStream.Close()
        
        return $content -split '\r?\n'
    }
    catch {
        Write-Warning "Error reading log file: $_"
        return @()
    }
}

Performance Optimization

To prevent the JSON file from growing too large, I implemented a rolling window approach:

# Keep only the last 1000 entries to prevent the file from growing too large
if ($allEntries.Count -gt 1000) {
    $allEntries = @($allEntries | Select-Object -Last 1000)
}
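The same cap can be applied defensively on the dashboard side, so rendering stays fast even if the feed is larger than expected. A minimal sketch, assuming entries are appended in chronological order:

```javascript
// Keep only the newest maxEntries items, assuming chronological append order.
function trimToNewest(entries, maxEntries) {
  return entries.length > maxEntries ? entries.slice(-maxEntries) : entries;
}
```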

Error Handling

Robust error handling ensures the monitoring system continues running even when issues occur:

try {
    const response = await fetch(DATA_FILE_PATH + '?t=' + Date.now());
    if (!response.ok) {
        throw new Error('Failed to load data');
    }
    // Process data...
} catch (error) {
    console.error('Error loading data:', error);
    showError('Failed to load data. Please check if the data file exists and is accessible.');
}

Conclusion

Building this password reset monitoring system has transformed how I manage the password reset logs.

The combination of PowerShell for log processing, JSON for data exchange, and HTML/JavaScript for the dashboard creates a powerful yet lightweight solution that can be adapted to monitor virtually any log-based application.

