
Taming the PowerShell Memory Monster: Fixing Exchange Online Leaks in Long-Running Scripts


Have you ever noticed your PowerShell scripts, especially those interacting with Exchange Online, gradually consuming more and more RAM until your system warns of a low memory condition? You're not alone. This is a common and frustrating issue, particularly with long-running scripts that loop indefinitely.

If you are seeing your memory usage climb to 90% or higher, and it only resets when you physically close the PowerShell window, you are dealing with a Managed Heap Leak. Here is how to diagnose and kill it for good.

The Problem: Why the Leak Happens

PowerShell is built on the .NET framework, which uses a "Garbage Collector" (GC) to manage memory. Think of the GC as a janitor. In a normal script, the janitor cleans up after the script finishes.

However, in a continuous loop, the script never technically "finishes." When you combine this with the Exchange Online (EXO) module, which creates complex session objects and temporary files, the "janitor" gets confused. It sees the script is still running and assumes you might still need all that data from three hours ago, so it never throws it away.

Identifying the Leak: Your Diagnostic Tools

Before applying the fix, you can use these tools to confirm the leak is happening within the PowerShell engine:

  1. Process Explorer (Sysinternals): Look for your powershell.exe process.

    Add the column for Private Bytes. If this number only goes up and never comes down, you have a leak. Look at the Handles in the lower pane; an ever-increasing list of "NamedPipes" is a usual suspect for Exchange session leaks. (A quick in-console alternative is sketched after this list.)

  2. VMMap: This tool shows you exactly what is holding the memory. If the Managed Heap is the largest section and is growing, it’s a PowerShell object issue. If the Heap (native) is growing, it’s likely a bug in a DLL or the EXO module itself.
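
As a lightweight alternative, you can also poll the process from PowerShell itself. This is a minimal sketch (the PID value 1234 is a placeholder; run it in a second window and point it at the window hosting your looping script):

# Placeholder: replace 1234 with the process ID of the PowerShell window running your script
$targetPid = 1234
while ($true) {
    $proc = Get-Process -Id $targetPid
    $privateMB = [Math]::Round($proc.PrivateMemorySize64 / 1MB, 2)
    Write-Host ("{0:HH:mm:ss}  Private Bytes: {1} MB  Handles: {2}" -f (Get-Date), $privateMB, $proc.HandleCount)
    Start-Sleep -Seconds 60
}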

The Fix: Forcing PowerShell to Clean Up

To stop the memory from climbing indefinitely, you need to manually "call the janitor" at the end of every cycle.

Step 1: The Garbage Collection Hammer

By default, the Garbage Collector waits for "memory pressure" before it works. In a loop, we want to force it to run every time we disconnect from Exchange. Add these lines at the very end of your loop:

# Force .NET to release unused memory back to the OS
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
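
A common .NET pattern is to follow the wait with a second Collect(), which picks up any objects that were only released once their finalizers had run:

[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
# Second pass reclaims objects freed by the finalizers above
[System.GC]::Collect()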

Step 2: Use Script Blocks for Scoping

One of the best ways to help the Garbage Collector is to make sure your variables "expire." By wrapping your logic in a script block (& { ... }), every variable created inside the block goes out of scope as soon as the block ends, so nothing keeps a reference alive and the Garbage Collector is free to reclaim the memory.
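
Here is a minimal sketch of that pattern; the data below is just filler standing in for your real workload:

& {
    # $report exists only inside this script block; once the closing brace is
    # reached nothing references it, so the Garbage Collector can reclaim it.
    $report = 1..50000 | ForEach-Object { "Row $_ " + ("X" * 1KB) }
    Write-Host "Processed $($report.Count) rows inside the block"
}
# Out here, $report is out of scope and eligible for collection
[System.GC]::Collect()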

Step 3: Clear the Exchange Temp Files

The EXO module is notorious for leaving "session artifacts" in your Temp folder. Manually clearing these prevents the session from bloating over time.
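
If you want to see what would be removed before committing to it, you can dry-run the cleanup with -WhatIf (a minimal sketch; the tmpEXO_* pattern is the one used in the template below, so adjust it if your module version names its artifacts differently):

Get-ChildItem -Path "$env:TEMP\tmpEXO_*" -ErrorAction SilentlyContinue |
    Remove-Item -Recurse -Force -WhatIf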

The "Leak-Proof" Script Template

Here is how your loop should look to prevent memory bloat:

while($true) {
    # 1. Wrap logic in a Script Block to isolate memory
    & {
        Write-Host "Starting Exchange Cycle..." -ForegroundColor Cyan
        
        Connect-ExchangeOnline -UserPrincipalName "admin@yourdomain.com" -ShowBanner:$false
        
        # Retrieve your data
        $data = Get-EXOMailbox -ResultSize Unlimited
        
        # [Your Processing Logic Goes Here]

        # Always disconnect within the block
        Disconnect-ExchangeOnline -Confirm:$false
    }

    # 2. Clear out the EXO temporary session files
    Get-ChildItem -Path "$env:TEMP\tmpEXO_*" -ErrorAction SilentlyContinue | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue

    # 3. Force the Garbage Collector to reclaim the RAM
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()

    Write-Host "Cycle complete. Memory reclaimed. Sleeping..." -ForegroundColor Green
    Start-Sleep -Seconds 300
}
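
To confirm the fix is holding, it can help to log the process footprint at the end of each cycle. A minimal sketch of a couple of lines you could drop in just before the Start-Sleep:

# Optional: report the post-cleanup footprint of the current PowerShell process
$privateMB = [Math]::Round((Get-Process -Id $PID).PrivateMemorySize64 / 1MB, 2)
Write-Host "Post-cleanup footprint: $privateMB MB" -ForegroundColor Yellow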

Simulation Script: No Leak? Can I Try This?

Yes, you can. The script below leaks memory in the right place (the managed heap) gradually over time, although it can be slow to watch:


This is the code for that simulation; to collect meaningful diagnostics you will need to leave it running for a while.

$LeakBucket = @() # This is the "hole" in the bucket that prevents cleanup
while($true) {
    # 1. Create a "heavy" string (50M characters, roughly 100 MB in memory since .NET strings are UTF-16)
    $heavyData = "X" * 50MB
    
    # 2. To simulate a leak, we add it to a global list 
    # This keeps the reference "alive" so the janitor (GC) can't touch it.
    $LeakBucket += $heavyData
    
    # 3. Report current status
    $mem = [Math]::Round((Get-Process -Id $PID).PrivateMemorySize64 / 1MB, 2)
    Write-Host "Current Memory: $mem MB | Items in Bucket: $($LeakBucket.Count)" -ForegroundColor Red
    
    Start-Sleep -Seconds 1
}

However, if you want a quicker leak, the code below is more aggressive:

$LeakBucket = New-Object System.Collections.Generic.List[Byte[]]
Write-Host "Monitoring RAM... This will climb rapidly." -ForegroundColor Cyan
while($true) {
    # Allocate a 200MB byte array
    $byteArray = New-Object Byte[] 200MB
    
    # Fill it with random data so Windows can't optimize/deduplicate it
    (New-Object Random).NextBytes($byteArray)
    
    # Add to the list so it can't be cleaned up
    $LeakBucket.Add($byteArray)
    
    $memGB = [Math]::Round((Get-Process -Id $PID).PrivateMemorySize64 / 1GB, 2)
    Write-Host "Memory Locked in RAM: $memGB GB" -ForegroundColor Red
}

This consumes memory far more quickly than the previous script; within a couple of seconds we have over 4.62 GB locked into the simulation:


This is the process in Process Explorer, where we can see it is using 5.2 GB of memory with the fake leak:


Then, if we look under the Performance tab, we can see all of that memory in use:


The Performance Graph shows the sharp increase in private bytes; this is the first, rising part of the saw-tooth pattern:


Now that the simulation is no longer running as a PowerShell task, let's run the garbage collector with the command below, which should free up all of those resources:
[System.GC]::Collect()
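
If the memory does not drop in your own test, remember that the GC can only reclaim objects nothing references any more; if the $LeakBucket variable is still alive in your session it keeps every array pinned, so release it first (a minimal sketch):

# Drop the only live reference to the leaked arrays, then force a full collection
Remove-Variable -Name LeakBucket -ErrorAction SilentlyContinue
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
[System.GC]::Collect()
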
When this command was executed in PowerShell, after a couple of seconds you could see the drop in resource usage and the memory pressure was over; this is the Performance tab with the working set no longer elevated:


Then, on the Performance Graph, you can see the memory being released and everything returning to normal, which is the falling edge of the saw-tooth:


Summary

If you must run a PowerShell script indefinitely, you cannot rely on the default memory management to stay lean. By using Explicit Garbage Collection, Script Block Scoping, and temp-file cleanup, you can take a script that previously climbed to 12 GB of RAM down to a steady 500 MB.
