Fixing Windows Server 2019 Update Hell: A Deep Dive Into Component Store Corruption


Anyone who has managed Windows Server environments has likely encountered the dreaded situation where updates simply refuse to install. What begins as a routine maintenance task spirals into a multi-day troubleshooting odyssey through the labyrinthine depths of the Windows Component Store. This post documents our exhaustive battle with a Windows Server 2019 domain controller that had been stubbornly rejecting updates since its deployment, presenting a technical post-mortem of the errors, investigation techniques, and the surgical precision required to fix component store corruption without risking domain controller integrity.

The Initial Situation: A Server Frozen in Time

Our subject was a Windows Server 2019 Core domain controller (build 17763.1) - essentially frozen at RTM (Release to Manufacturing) without a single successful cumulative update installation since deployment. Every attempt to update through conventional channels produced cryptic errors, leaving the server increasingly vulnerable as years passed without security patches.

The server wasn't just missing a few updates - it had never successfully installed any cumulative updates. This presented both a significant security risk and a technical puzzle, as the update chain had been completely broken from the start.

To add complexity, this wasn't just any server - it was a production domain controller, which meant:

  • Downtime had to be minimized
  • An in-place upgrade carried significant risk
  • A clean OS reinstall would require complex AD promotion/demotion procedures
  • Any corruption fix had to preserve all AD DS functionality

The Investigative Process: Dissecting the Update Mechanism

Initial Diagnostic Deep Dive

When conventional methods fail, you need to go deeper. I started by analyzing the verbose logging provided by Windows servicing components:

# Get the last 500 lines of the CBS log
Get-Content -Path "C:\Windows\Logs\CBS\CBS.log" -Tail 500 | Out-File C:\Temp\CBSrecent.log

# Check for specific error patterns
Select-String -Path "C:\Windows\Logs\CBS\CBS.log" -Pattern "error", "fail",
 "0x800f0986", "corruption"

The CBS.log (Component-Based Servicing log) revealed our first significant clue:

Error: 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED

This error code is crucial - it specifically indicates a failure in the Windows delta compression mechanism. Windows updates often use delta compression to reduce download sizes, where only the changes (deltas) between the current and new versions are downloaded rather than complete files. For this to work, the base files must be in an expected state.

Digging deeper, we found more detailed errors:

2025-05-08 19:53:47, Error CSI 00000011 (F) Hydration failed for component 
Microsoft-Windows-Audio-DSound, version 10.0.17763.4131, arch amd64, nonSxS, 
pkt {l:8 b:31bf3856ad364e35} on file dsound.dll with NTSTATUS -2146498170[gle=0x80004005]

2025-05-08 19:53:47, Error CSI 00000012@2025/5/8:18:53:47.450 (F) 
onecore\base\wcp\rtllib\win32lib\delta_library.cpp(287): 
Error NTSTATUS_FROM_WIN32(ERROR_INVALID_DATA) originated in function 
Windows::Rtl::DeltaDecompressBuffer expression: g_pfnApplyDeltaB(( 
(DELTA_FLAG_TYPE)0x00000000 ), ReferenceInput, CompressedInput, &UncompressedOutput)
[gle=0x80004005]

This was the first major breakthrough. The logs revealed a critical detail - the "hydration" process was failing. In Windows servicing terminology, "hydration" refers to the process of unpacking compressed components and preparing them for installation. The specific component failing was Microsoft-Windows-Audio-DSound and the file causing the issue was dsound.dll.

The error ERROR_INVALID_DATA indicated a fundamental mismatch between what Windows expected to find and what was actually there, suggesting corruption in the component store.
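
As an aside, CBS.log reports NTSTATUS values as signed decimals (-2146498170 above). If you ever need to map one back to the familiar hex form, a quick PowerShell one-liner does it:

# Convert the signed decimal status from CBS.log to hex
'0x{0:X8}' -f -2146498170   # -> 0x800F0986 (PSFX_E_APPLY_FORWARD_DELTA_FAILED)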

Understanding the Windows Component Store Architecture

Before diving into our solution, it's essential to understand the Windows Component Store (WinSxS) architecture that underlies the entire update mechanism.

The Windows Component Store is an intricate system residing in the C:\Windows\WinSxS directory. Unlike traditional software installations, Windows doesn't simply replace files during updates. Instead, it maintains multiple versions of components in this directory, with complex pointers and manifests determining which version is active. This architecture allows for:

  1. Side-by-side versioning (hence "SxS")
  2. Reliable servicing operations
  3. Component rollback capabilities
  4. Isolation of system components

When updates fail at the component store level, conventional troubleshooting methods often fall short because they don't address the underlying architectural issues.
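
If you want a quick, read-only overview of the store's size and reported state before troubleshooting, DISM has a built-in analysis switch that is safe to run on any machine:

dism /online /cleanup-image /analyzecomponentstore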

Step 1: Establishing a Reliable Servicing Foundation

The first approach was to ensure the servicing mechanism itself was functional. Servicing Stack Updates (SSUs) are special updates that modify the very components responsible for installing other updates - essentially, they update the updater.

# First attempt with the latest SSU
dism /online /add-package /packagepath:SSU-17763.6763-x64.cab

Deployment Image Servicing and Management tool
Version: 10.0.17763.3406

Image Version: 10.0.17763.4010

Processing 1 of 1 - Adding package Package_for_ServicingStack_6763~
31bf3856ad364e35~amd64~~17763.6763.1.1
[==========================100.0%==========================]
The operation completed successfully.

This appeared to succeed, which was encouraging. However, when I attempted to install a Cumulative Update:

dism /online /add-package /packagepath:Windows10.0-KB5023702-x64.cab

Deployment Image Servicing and Management tool
Version: 10.0.17763.3406

Image Version: 10.0.17763.4010

Processing 1 of 1 - Adding package Multiple_Packages~~~~0.0.0.0
[=                          3.0%                           ]
Error: 0x800f0986

The operation failed with the same error code. This was the first valuable insight - the problem went deeper than just an outdated servicing stack. The component store itself contained corruptions that prevented any updates from being applied, even with the latest servicing stack in place.

Step 2: The Corruption Manifest - PossibleCorruptions.txt

During my investigation, I discovered a critical file that Windows maintains to track component store corruptions: C:\Windows\WinSxS\PossibleCorruptions.txt. This file serves as a manifest of all components that Windows has identified as corrupt or inconsistent. Let's take a look at that file:

Get-Content C:\Windows\WinSxS\PossibleCorruptions.txt

The contents were revealing:

Microsoft-Windows-Audio-DSound, Culture=neutral, Version=10.0.17763.1, 
PublicKeyToken=31bf3856ad364e35, ProcessorArchitecture=wow64, versionScope=NonSxS
dsound.dll
Microsoft-Windows-Audio-DSound, Culture=neutral, Version=10.0.17763.1, 
PublicKeyToken=31bf3856ad364e35, ProcessorArchitecture=amd64, versionScope=NonSxS
dsound.dll
Microsoft-Windows-IE-IEShims, Culture=neutral, Version=11.0.17763.1, 
PublicKeyToken=31bf3856ad364e35, ProcessorArchitecture=amd64, versionScope=NonSxS
IEShims.dll
Microsoft-Windows-IE-IEShims, Culture=neutral, Version=11.0.17763.1, 
PublicKeyToken=31bf3856ad364e35, ProcessorArchitecture=x86, versionScope=NonSxS
IEShims.dll
Microsoft-Windows-WindowsImageAcquisition-CoreServices, Culture=neutral, 
Version=10.0.17763.1, PublicKeyToken=31bf3856ad364e35, ProcessorArchitecture=amd64, 
versionScope=NonSxS
wiaservc.dll

This file provided me with precise information about which components were preventing updates from installing. Notably, all corrupted components were from the base version (10.0.17763.1), indicating that the corruption had likely been present since the original OS installation or occurred very early in the server's lifecycle.
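
For longer manifests, a small sketch like this pairs each component identity with its affected file (assuming the on-disk format of one identity line followed by one file-name line, as in the excerpt above):

# Pair identity/file lines from PossibleCorruptions.txt into objects
$lines = Get-Content C:\Windows\WinSxS\PossibleCorruptions.txt
for ($i = 0; $i -lt $lines.Count - 1; $i += 2) {
    [PSCustomObject]@{ Component = $lines[$i]; File = $lines[$i + 1] }
}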

I now had a specific list of targets to address:

  1. Microsoft-Windows-Audio-DSound (dsound.dll) - both amd64 and wow64 versions
  2. Microsoft-Windows-IE-IEShims (IEShims.dll) - both amd64 and x86 versions
  3. Microsoft-Windows-WindowsImageAcquisition-CoreServices (wiaservc.dll)

These aren't arbitrary files - they're part of critical Windows subsystems:

  • DirectSound (audio processing)
  • Internet Explorer compatibility layer
  • Windows Image Acquisition (for scanners and cameras)

The question now became: how do we repair these specific components without compromising the domain controller's stability?

Step 3: Conventional Repair Methods and Why They Failed

Before proceeding with the targeted solution, I exhausted all conventional repair methods to ensure I was not overlooking something simple:

SFC and DISM Repair Attempts

First approach was to use the standard Windows repair tools:

# System File Checker to repair corrupted system files
sfc /scannow

# DISM to repair the component store
dism /online /cleanup-image /scanhealth
dism /online /cleanup-image /checkhealth
dism /online /cleanup-image /restorehealth

The DISM /restorehealth operation failed with:

Error: 0x800f081f
The source files could not be found.

I attempted to provide source files from a mounted Windows Server 2019 ISO:

dism /online /cleanup-image /restorehealth /source:G:\sources\install.wim /limitaccess

But this also failed with access errors:

2025-05-10 20:00:41, Error DISM DISM Package Manager: PID=468 TID=668 
Failed to get the full path to source location: wim:c:\source\install.wim - 
CPackageManagerCLIHandler::Private_ValidateCmdLine(hr:0x80070057)

I even tried mounting the WIM file to provide better access:

# Create a mount directory
New-Item -Path "C:\Mount" -ItemType Directory -Force

# Mount the WIM file
Dism /Mount-Wim /WimFile:"c:\source\install.wim" /Index:1 /MountDir:"C:\Mount"

# Try the restore
DISM /Online /Cleanup-Image /RestoreHealth /Source:C:\Mount /LimitAccess

But this also seemed to fail with the same error (more on why later in this guide):

Error: 0x800f081f
The source files could not be found.

This failure of conventional methods confirmed my suspicion that I was dealing with a more fundamental issue, one that required direct intervention at the component store level.

Step 4: Understanding WinSxS Folder Naming and Structure

To fix the corruption, I needed to understand how the Windows Component Store organizes files. WinSxS folder names follow a specific pattern:

[architecture]_[component-name]_[public-key-token]_[version]_[locale]_[hash]

For example:

amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.1007_none_d8f217c19565d9f7

Breaking this down:

  • amd64: Architecture (64-bit)
  • microsoft-windows-ie-ieshims: Component name
  • 31bf3856ad364e35: Public key token (Microsoft's signing key)
  • 11.0.17763.1007: Version (Windows build 17763, component revision 1007)
  • none: Locale (language-neutral)
  • d8f217c19565d9f7: Hash (unique identifier for this specific component)

Inside each component folder, you'll typically find:

  • The component binary (e.g., IEShims.dll)
  • Subdirectories named "f" and "r" containing "forward" and "revert" versions for updates
  • Manifest files defining the component's properties and dependencies

This understanding was crucial for my approach to fixing the component store.
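
With the naming scheme understood, enumerating every version of a component in the store becomes trivial. For example, all IEShims component folders:

# List all versions of the ie-ieshims component present in the store
Get-ChildItem C:\Windows\WinSxS -Directory -Filter "*ie-ieshims*" |
    Select-Object -ExpandProperty Name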

Step 5: Surgical Repair of IEShims.dll Components

With conventional methods exhausted, I developed a surgical approach to repair the corrupted components by borrowing healthy files from a working server:

First Target: IEShims.dll

I started by examining available versions on a functional server:

# Search for IEShims.dll on a working server
Get-ChildItem -Path "\\<working_server>\c$\Windows\WinSxS" -Recurse -Filter 
"IEShims.dll" | Select-Object FullName

# Results revealed several versions
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
1007_none_d8f217c19565d9f7\IEShims.dll
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
1007_none_d8f217c19565d9f7\f\IEShims.dll
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
1007_none_d8f217c19565d9f7\r\IEShims.dll
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
771_none_ffe9677749736902\IEShims.dll
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
771_none_ffe9677749736902\f\IEShims.dll
C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.
771_none_ffe9677749736902\r\IEShims.dll

This revealed two critical insights:

  1. Multiple versions of the component existed (771 and 1007)
  2. Each version had subdirectories f and r containing identical copies of the DLL
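
A quick way to confirm the second insight is to hash all three copies; on a healthy system they should be identical (a sketch using one of the folder names above):

# The DLL and its f/r copies should hash identically on a healthy system
$base = 'C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.1007_none_d8f217c19565d9f7'
Get-FileHash "$base\IEShims.dll", "$base\f\IEShims.dll", "$base\r\IEShims.dll" -Algorithm SHA256 |
    Format-Table Hash, Path -AutoSize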

I prepared a network share to facilitate the transfer:

# On the working server
New-Item -Path "C:\IEShimsCopy" -ItemType Directory -Force
Copy-Item -Path "C:\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims_
31bf3856ad364e35_11.0.17763.771_none_ffe9677749736902" 
-Destination "C:\IEShimsCopy\" -Recurse

Copy-Item -Path "C:\Windows\WinSxS\x86_microsoft-windows-ie-ieshims_
31bf3856ad364e35_11.0.17763.771_none_a3cacbf39115f7cc"
 -Destination "C:\IEShimsCopy\" -Recurse

# Share the folder
New-SmbShare -Name "IEShimsCopy" -Path "C:\IEShimsCopy" -FullAccess "Everyone"

On the problematic server, I attempted to copy the files directly, but immediately hit permission issues:

2025/05/10 18:51:46 ERROR 5 (0x00000005) Accessing Source Directory 
\\<working_server>\c$\Windows\WinSxS\x86_microsoft-windows-ie-ieshims_31bf3856ad364e35
_11.0.17763.771_none_a3cacbf39115f7cc\
Access is denied.

This highlighted a key challenge - WinSxS files are owned by TrustedInstaller and highly protected. I needed elevated permissions beyond even Administrator rights.

Step 6: Working with TrustedInstaller Privileges

I discovered that even Administrator rights were insufficient to modify the WinSxS directory. All files and folders in this directory are owned by the "NT SERVICE\TrustedInstaller" account, which has special privileges in Windows.
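
You can confirm this ownership yourself before attempting anything:

# The owner should come back as NT SERVICE\TrustedInstaller
(Get-Acl -Path 'C:\Windows\WinSxS').Owner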

To work at this level, I needed to run operations with TrustedInstaller privileges:

# First, we needed a tool to run as TrustedInstaller
# We used AdvancedRun, a specialized tool for this purpose
# (https://www.nirsoft.net/utils/advanced_run.html)
# Once AdvancedRun was downloaded, we executed PowerShell with TrustedInstaller privileges
AdvancedRun.exe /EXEFilename "powershell.exe" /RunAs 8 /Run

With these elevated privileges, I could now copy the files correctly:

# Copy the IEShims folders from our prepared share
Copy-Item -Path "\\<working_server>\IEShimsCopy\amd64_microsoft-windows-ie-ieshims_
31bf3856ad364e35_11.0.17763.771_none_ffe9677749736902" 
-Destination "C:\Windows\WinSxS\" -Recurse -Force

Copy-Item -Path "\\<working_server>\IEShimsCopy\x86_microsoft-windows-ie-ieshims_
31bf3856ad364e35_11.0.17763.771_none_a3cacbf39115f7cc" 
-Destination "C:\Windows\WinSxS\" -Recurse -Force

However, I noticed that just copying the version 771 folders wasn't sufficient - I also needed to fix the corrupted base version (17763.1) mentioned in PossibleCorruptions.txt. The base version folders were:

amd64_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.1_none_6c4494b35d3c4ca6
x86_microsoft-windows-ie-ieshims_31bf3856ad364e35_11.0.17763.1_none_1025f92fa4dedb70

I copied these base version folders from a healthy server to ensure the corruption was completely addressed:

Copy-Item -Path "\\bearpaws\c$\Windows\WinSxS\amd64_microsoft-windows-ie-ieshims
_31bf3856ad364e35_11.0.17763.1_none_*" -Destination "C:\Windows\WinSxS\" -Recurse -Force

Copy-Item -Path "\\bearpaws\c$\Windows\WinSxS\x86_microsoft-windows-ie-ieshims
_31bf3856ad364e35_11.0.17763.1_none_*" -Destination "C:\Windows\WinSxS\" -Recurse -Force

Step 7: Surgical Repair of dsound.dll Components

With the IEShims.dll components successfully repaired, I turned my attention to the dsound.dll corruption:

# Search for dsound.dll on a working server
Get-ChildItem -Path "\\bearpaws\c$\Windows\WinSxS" -Recurse -Filter "dsound.dll" 
| Select-Object Directory | Format-Table -Wrap

# Results showed the available versions
C:\Windows\WinSxS\amd64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_cd674c18c78a1638
C:\Windows\WinSxS\amd64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_cd674c18c78a1638\f
C:\Windows\WinSxS\amd64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_cd674c18c78a1638\r
C:\Windows\WinSxS\wow64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_d7bbf66afbead833
C:\Windows\WinSxS\wow64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_d7bbf66afbead833\f
C:\Windows\WinSxS\wow64_microsoft-windows-audio-dsound_31bf3856ad364e35_10.0.17763.348
_none_d7bbf66afbead833\r

We copied both the amd64 and wow64 versions to our problematic server:

# Copy dsound.dll folders with TrustedInstaller privileges
Copy-Item -Path "\\working-server\IEShimsCopy\amd64_microsoft-windows-audio-dsound
_31bf3856ad364e35_10.0.17763.348_none_cd674c18c78a1638" 
-Destination "C:\Windows\WinSxS\" -Recurse -Force

Copy-Item -Path "\\working-server\IEShimsCopy\wow64_microsoft-windows-audio-dsound
_31bf3856ad364e35_10.0.17763.348_none_d7bbf66afbead833" 
-Destination "C:\Windows\WinSxS\" -Recurse -Force

Similar to IEShims.dll, we also needed to ensure the base version (17763.1) was fixed:

Copy-Item -Path "\\bearpaws\c$\Windows\WinSxS\amd64_microsoft-windows-audio-dsound
_31bf3856ad364e35_10.0.17763.1_none_*" -Destination "C:\Windows\WinSxS\" -Recurse -Force

Copy-Item -Path "\\bearpaws\c$\Windows\WinSxS\wow64_microsoft-windows-audio-dsound
_31bf3856ad364e35_10.0.17763.1_none_*" -Destination "C:\Windows\WinSxS\" -Recurse -Force

After both sets of components were copied, we ran a component store cleanup to register the changes:

dism /online /cleanup-image /startcomponentcleanup

Then, critically, we performed a reboot to ensure all changes were properly integrated:

Restart-Computer

Step 8: Breakthrough - First Successful Update!

After repairing both the IEShims.dll and dsound.dll components, I then attempted to install KB5023702 (March 2023):

dism /online /add-package /packagepath:Windows10.0-KB5023702-x64.cab

The installation proceeded through 100% and completed successfully:

2025-05-11 16:53:49, Info CBS Perf: InstallUninstallChain complete.
2025-05-11 16:53:49, Info CBS Exec: Package successfully staged for FOD Reservicing
2025-05-11 16:53:49, Info CBS Exec: TransientManifestCache disabled in config.
2025-05-11 16:53:49, Info CBS FinalCommitPackagesState: Started persisting state of packages
2025-05-11 16:54:08, Info CBS Reporting package change for package: 
Package_for_RollupFix~31bf3856ad364e35~amd64~~17763.4131.1.10, current: 
Resolved, pending: Default, start: Resolved, applicable: Installed, target: Installed,
limit: Installed, status: 0x0, failure source: Execute, reboot required: True, 
client id: DISM Package Manager Provider, initiated offline: False, 
execution sequence: 778, first merged sequence: 778, reboot reason: DRIVERSPRESENT, 
RM App session: -1, RM App name: N/A, FileName in use: N/A, release type: Security Update,
OC operation: False, download source: 0, download time (secs): 4294967295, 
download status: 0x0 (S_OK), Express download: False, Download Size: 0


Step 9: Slow and Steady with the Updates

Rather than trying to jump straight to the latest updates, I took a gradual approach: first KB5025229 (the April 2023 cumulative update), then KB5027222 (the June 2023 cumulative update). These installed without issue; however, July 2023 changed the tide and introduced more problems.

The process so far was slow but it did work:
  1. KB4465477 (2018-10 Update for Windows Server 2019)
  2. KB5022554 (2022-12 Cumulative Update for Windows Server 2019)
  3. KB5022286 (January 2023 Cumulative Update)
  4. KB5005112 (2021-08 Servicing Stack Update for Windows Server 2019)
  5. KB5022840 (February 2023 Cumulative Update)
  6. KB5025229 (Internal SSU-17763.4121-x64.cab with DISM)
  7. KB5023702 (March 2023 Cumulative Update)
  8. KB5025229 (April 2023 Cumulative Update)
  9. KB5027222 (June 2023 Cumulative Update)

Step 10: Microsoft-Windows-WindowsImageAcquisition-CoreServices

When we tried to install KB5028168 (July 2023), we encountered new hydration errors:

Hydration failed for component Microsoft-Windows-WindowsImageAcquisition-CoreServices,
version 10.0.17763.4644, arch amd64, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file 
wiaservc.dll with NTSTATUS -2146498170
Hydration failed for component Microsoft-Windows-WindowsImageAcquisition-CoreServices, 
version 10.0.17763.4644, arch amd64, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file 
wiaservc.dll with NTSTATUS -2146498170

Comparing the two servers, the wiaservc.dll on a working server has a later timestamp than the one on the "offline" server. That means for the %windir%\System32 directory we can simply replace the file - simples - but we also need to get this DLL into WinSxS so the update sees the repaired copy there as well.
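
A quick side-by-side check makes the difference obvious (a sketch; <working_server> is a placeholder, as elsewhere in this post):

# Compare wiaservc.dll timestamps on the working and broken servers
Get-Item "\\<working_server>\c$\Windows\System32\wiaservc.dll",
         "C:\Windows\System32\wiaservc.dll" |
    Select-Object FullName, LastWriteTime, Length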

This will require TrustedInstaller access, which means running this command once again:

AdvancedRun.exe /EXEFilename "powershell.exe" /RunAs 8 /Run

# Create a variable for the base folder name
$baseFolderName = "amd64_microsoft-windows-windowsimageacquisition-coreservices_
31bf3856ad364e35_10.0.17763.4644_none"

From there, we need this script to copy the files into the correct locations:

# Try to find similar folders to determine the hash suffix
$similarFolders = Get-ChildItem -Path "C:\Windows\WinSxS" -Directory | 
Where-Object { $_.Name -like "$baseFolderName*" }

if ($similarFolders) {
    # Use the exact folder name if found
    $targetFolder = $similarFolders[0].FullName
    Write-Host "Found existing folder: $targetFolder" -ForegroundColor Green
} else {
    # Create a folder with an approximated hash if not found
    $targetFolder = "C:\Windows\WinSxS\${baseFolderName}_a5a8b8a1ae7bd6e2"
    Write-Host "Creating new folder: $targetFolder" -ForegroundColor Yellow
    New-Item -Path $targetFolder -ItemType Directory -Force
}

# Create the f and r subfolders
Write-Host "Creating subfolders..." -ForegroundColor Cyan
New-Item -Path "$targetFolder\f" -ItemType Directory -Force -ErrorAction SilentlyContinue
New-Item -Path "$targetFolder\r" -ItemType Directory -Force -ErrorAction SilentlyContinue

# Copy the DLL file to all necessary locations
Write-Host "Copying wiaservc.dll to the folder and subfolders..." -ForegroundColor Cyan
Copy-Item -Path "C:\Windows\System32\wiaservc.dll" -Destination 
"$targetFolder\wiaservc.dll" -Force
Copy-Item -Path "C:\Windows\System32\wiaservc.dll" -Destination 
"$targetFolder\f\wiaservc.dll" -Force
Copy-Item -Path "C:\Windows\System32\wiaservc.dll" -Destination 
"$targetFolder\r\wiaservc.dll" -Force

Write-Host "Files copied successfully!" -ForegroundColor Green
Write-Host "Now run: dism /online /cleanup-image /startcomponentcleanup" 
-ForegroundColor Cyan

This gets the correct files into the correct locations; from there we need to run these two commands:

dism /online /cleanup-image /startcomponentcleanup
sfc /scannow

Then we need to retry the KB5028168 update (July 2023 Cumulative Update) and see what has changed on that run by looking at the CBS.log file:

2025-05-12 13:25:24, Error CSI 0000000f (F) Hydration failed with original error NTSTATUS_FROM_WIN32(ERROR_INVALID_DATA) . Delta Type: 1 , IntegrityState Valid: true , RetrievedChecksum: 1432864936 , ComputedChecksum: 1432864936[gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000010@2025/5/12:12:25:24.195 (F) onecore\base\wcp\deltahydrator\deltahydrator.cpp(64): Error 800f0986 [Warning,Facility=15 (0x000f),Code=2438 (0x0986)] originated in function DeltaHydrator::`anonymous-namespace'::GetPsfxSpecificError expression: ((SCODE) (((unsigned long)(1)<<31) | ((unsigned long)(15)<<16) | ((unsigned long)(0x986))) ) [gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000011 (F) Hydration failed for component Microsoft-Windows-WindowsImageAcquisition-CoreServices, version 10.0.17763.4644, arch amd64, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file wiaservc.dll with NTSTATUS -2146498170[gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000012@2025/5/12:12:25:24.198 (F) onecore\base\wcp\rtllib\win32lib\delta_library.cpp(287): Error NTSTATUS_FROM_WIN32(ERROR_INVALID_DATA) originated in function Windows::Rtl::DeltaDecompressBuffer expression: g_pfnApplyDeltaB(( (DELTA_FLAG_TYPE)0x00000000 ), ReferenceInput, CompressedInput, &UncompressedOutput) [gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000013 (F) Hydration failed with original error NTSTATUS_FROM_WIN32(ERROR_INVALID_DATA) . Delta Type: 1 , IntegrityState Valid: true , RetrievedChecksum: 1432864936 , ComputedChecksum: 1432864936[gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000014@2025/5/12:12:25:24.198 (F) onecore\base\wcp\deltahydrator\deltahydrator.cpp(64): Error 800f0986 [Warning,Facility=15 (0x000f),Code=2438 (0x0986)] originated in function DeltaHydrator::`anonymous-namespace'::GetPsfxSpecificError expression: ((SCODE) (((unsigned long)(1)<<31) | ((unsigned long)(15)<<16) | ((unsigned long)(0x986))) ) [gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000015 (F) Hydration failed for component Microsoft-Windows-WindowsImageAcquisition-CoreServices, version 10.0.17763.4644, arch amd64, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file wiaservc.dll with NTSTATUS -2146498170[gle=0x80004005]
2025-05-12 13:25:24, Error CSI 00000016@2025/5/12:12:25:24.199 (F) Attempting to mark store corrupt with category [l:18 ml:19]'CorruptPayloadFile'[gle=0x80004005]
2025-05-12 13:25:24, Info CSI 00000017 Hashes for file member [l:12]'wiaservc.dll' do not match. Expected: {l:32 ml:4096 b:52c51705add389342d89ad4e872f9f844ebb5efe4a29a1824aa534edbdec7724}. Actual: {l:32 b:b5c7b5442cf23af690d722540ea201e3c5660de09a800fecb1a78957948ac098}.
2025-05-12 13:25:27, Error CSI 00000018 (F) STATUS_DELETE_PENDING #6703996# from Windows::Rtl::SystemImplementation::DirectFileSystemProvider::SysCreateFile(flags = (AllowSharingViolation), handle = {provider=NULL, handle=0, name= ("null")}, da = (DELETE|SYNCHRONIZE|FILE_READ_ATTRIBUTES|FILE_WRITE_ATTRIBUTES), oa = @0xb29a97cb98->OBJECT_ATTRIBUTES {s:48; rd:NULL; on:[110]'\SystemRoot\WinSxS\Temp\InFlight\f69760ed38c3db017831000090253c1a\029f82ed38c3db01fb31000090253c1a_lapspsh.dll'; a:(OBJ_CASE_INSENSITIVE)}, iosb = @0xb29a97cc00, as = [gle=0xd0000056]
2025-05-12 13:25:27, Error CSI (null), fa = (FILE_ATTRIBUTE_NORMAL), sa = (FILE_SHARE_READ|FILE_SHARE_WRITE|FILE_SHARE_DELETE), cd = FILE_OPEN, co = (FILE_NON_DIRECTORY_FILE|FILE_SYNCHRONOUS_IO_NONALERT|0x00004000), eab = NULL, eal = 0, disp = Invalid) [gle=0xd0000056]
2025-05-12 13:25:27, Error CSI 00000019@2025/5/12:12:25:27.063 (F) onecore\base\wcp\sil\ntsystem.cpp(2987): Error STATUS_DELETE_PENDING originated in function Windows::Rtl::SystemImplementation::DirectFileSystemProvider::SysCreateFile expression: (null) [gle=0x80004005]
2025-05-12 13:25:27, Error CSI 0000001a (F) STATUS_DELETE_PENDING #6703992# from Windows::Rtl::SystemImplementation::CDirectory::DeleteRecursively(...)[gle=0xd0000056]
2025-05-12 13:25:27, Error CSI 0000001b (F) 800f0986 [Error,Facility=(000f),Code=2438 (0x0986)] #6640401# from Windows::COM::CComponentStore::InternalTransact(...)[gle=0x800f0986]
2025-05-12 13:25:27, Error CSI 0000001c (F) 800f0986 [Error,Facility=(000f),Code=2438 (0x0986)] #6625091# from Windows::ServicingAPI::CCSITransaction::ICSITransaction2_AddFiles(Flags = 0, a = @0x246832c8040, fn = @0x246832c8840, fp = @0x246832c9040, disp = 0, op = 0)[gle=0x800f0986]
2025-05-12 13:25:27, Info CBS Failed to add to transaction package: Package_9138_for_KB5028168~31bf3856ad364e35~amd64~~10.0.1.14 [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:27, Error CBS Failed to stage execution package: Package_9138_for_KB5028168~31bf3856ad364e35~amd64~~10.0.1.14 [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:27, Info CBS CommitPackagesState: Started persisting state of packages
2025-05-12 13:25:28, Info CBS CommitPackagesState: Completed persisting state of packages
2025-05-12 13:25:28, Info CSI 0000001d@2025/5/12:12:25:28.116 CSI Transaction @0x246fcc51fa0 destroyed
2025-05-12 13:25:28, Info CBS Perf: Stage chain complete.
2025-05-12 13:25:28, Info CBS Failed to stage execution chain. [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:28, Error CBS Failed to process single phase execution. [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:28, Info CBS WER: Generating failure report for package: Package_for_ServicingStack_4640~31bf3856ad364e35~amd64~~17763.4640.1.3, status: 0x800f0986, failure source: Stage, start state: Installed, target state: Installed, client id: DISM Package Manager Provider
2025-05-12 13:25:28, Info CBS Not able to query DisableWerReporting flag. Assuming not set... [HRESULT = 0x80070002 - ERROR_FILE_NOT_FOUND]
2025-05-12 13:25:29, Info CBS Stage time performance datapoint is invalid. [HRESULT = 0x80070490 - ERROR_NOT_FOUND]
2025-05-12 13:25:29, Info CBS Execute time performance datapoint is invalid. [HRESULT = 0x80070490 - ERROR_NOT_FOUND]
2025-05-12 13:25:29, Info CBS FinalCommitPackagesState: Completed persisting state of packages
2025-05-12 13:25:29, Info CBS Enabling LKG boot option
2025-05-12 13:25:29, Info CBS Exec: Automatic corruption repair has already been attempted today, skip it.
2025-05-12 13:25:29, Info CBS Exec: Processing complete. Session: 31179575_330867861, Package: Package_for_ServicingStack_4640~31bf3856ad364e35~amd64~~17763.4640.1.3, Identifier: KB5028316 [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:29, Info CBS Exec: Servicing Stack Upgrade operation will complete after TiWorker is recycled.
2025-05-12 13:25:29, Error CBS Failed to perform operation. [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:29, Info CBS Session: 31179575_330867861 finalized. Reboot required: no [HRESULT = 0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED]
2025-05-12 13:25:29, Info CBS Failed to FinalizeEx using worker session [HRESULT = 0x800f0986]
This does not really move us forward: the result of installing an update is the same, just with different errors.

Step 11: Single re-corrupting package causes disasters

This problem all seems to come from a single package: Windows Image Acquisition (WIA). The package seems to corrupt itself on install, as outlined earlier, and the system cannot confirm the hash value of a single DLL file.

Package_for_RollupFix~31bf3856ad364e35~amd64~~17763.4645.1.14

The problem seems to be occurring because the server was unfortunately built with the Desktop Experience rather than as Server Core, and that package was later removed. When the Desktop Experience package was removed, the system still seemed to think it needed WIA, even though checking for that role/feature shows it is no longer installed.

Step 12: Check for corrupt packages

Corrupt packages, as found in Step 11 (and as you will see again shortly), can wreak havoc on applying cumulative updates to your servers, while servicing stack updates (SSUs) and other non-cumulative updates install absolutely fine.

You need to check for this corruption, which can be found in the CBS.log; however, it will be buried in there among lots of other information you don't need to view. You can use the TSS tool to extract it, or you can use the script I've created below, which focuses only on package corruption. Remember to run the commands below first to give the script the best chance of accuracy:

sfc /scannow
DISM /Online /Cleanup-Image /ScanHealth

When those commands have completed successfully, you can then run the script below:

# Paths to check
$logFile = "C:\Windows\Logs\CBS\CBS.log"

# Check if the file exists
if (!(Test-Path $logFile)) {
    Write-Host "CBS.log not found at $logFile"
    exit
}

# Read and match lines indicating corruption
# (avoid naming this $matches - that collides with PowerShell's automatic $Matches variable)
$corruptionLines = Select-String -Path $logFile -Pattern "Marking package as corrupt|Failed to resolve package"

# Process matches into a structured report
$report = foreach ($line in $corruptionLines) {
    $logText = $line.Line.Trim()
    $timestamp = $logText.Substring(0,23)
    $type = if ($logText -like "*Marking package*") { "Marked Corrupt" } else { "Failed to Resolve" }
    if ($logText -match "Package_for_KB(\d{7})") {
        # $Matches is the automatic variable populated by the -match operator
        $kb = $Matches[1]
    } else {
        $kb = "Unknown"
    }

    [PSCustomObject]@{
        Timestamp = $timestamp
        Type      = $type
        KB        = $kb
        RawLine   = $logText
    }
}

# Output file paths
$csvPath = "$env:USERPROFILE\Desktop\Corrupt_Packages_Report.csv"
$htmlPath = "$env:USERPROFILE\Desktop\Corrupt_Packages_Report.html"

# Export to CSV
$report | Export-Csv -Path $csvPath -NoTypeInformation -Encoding UTF8

# Export to HTML
$report | ConvertTo-Html -Title "Component Corruption Report" -PreContent "<h2>Corrupt Rollup Packages Detected in CBS.log</h2>" |
    Out-File -FilePath $htmlPath -Encoding UTF8

Write-Host "Report generated:"
Write-Host "CSV: $csvPath"
Write-Host "HTML: $htmlPath"

This will generate both a CSV and HTML file for review.

Step 13: Groundhog Day - the WinSxS loop from hell

This puts us in a bit of a quandary because the process seems to flow something like this:
  1. Uninstall the corrupt package
  2. Reboot the server
  3. Install any cumulative update from July 2023 onwards
  4. The update process tries to reinstall Package_for_RollupFix~31bf3856ad364e35~amd64~~17763.4645.1.14
  5. DISM detects that the hash file for the DLL does not match
  6. The Package from step 4 re-corrupts itself
  7. Cumulative update fails
  8. Uninstall the corrupt package - go to Step 1
This means we have got ourselves into a very nasty cycle: remove the current package, try to update the server with a cumulative update, watch that update try to reinstall the package, which then corrupts and causes setup to fail.
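
For reference, step 1 of that loop is a straightforward package removal; the exact package name comes from dism /online /get-packages:

dism /online /remove-package /packagename:Package_for_RollupFix~31bf3856ad364e35~amd64~~17763.4645.1.14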

Step 14: Attempting newer cumulative updates

I was originally trying July 2023 as the cumulative update. There were no obvious known issues in the Microsoft support documentation, so I wondered what would happen if I tried to jump forward a couple of releases.

I got exactly the same problem with the January 2024 update, so this corruption is not unique to a single cumulative update. It seems to be a problem from July 2023 onwards.

Step 15: Refine the DISM commands to recover the component store and state

This component corruption is a bit of a pain, so let's run a system file check on our domain controller, given the concerns about system integrity. I started with the standard approach of using the DISM (Deployment Image Servicing and Management) tool to check and repair the Windows component store:

DISM /Online /Cleanup-Image /RestoreHealth /Source:wim:D:\Sources\install.wim:4 /LimitAccess

However, this command failed with an error indicating that the source couldn't be found. After checking the Windows image information, I discovered I was using the wrong index:

DISM /Get-WimInfo /WimFile:D:\Sources\install.wim

The output showed that index 3 was the correct one for Windows Server 2019 Datacenter (without Desktop Experience), so I adjusted my command:

DISM /Online /Cleanup-Image /RestoreHealth /Source:wim:D:\Sources\install.wim:3 /LimitAccess

This is where things got interesting. The DISM process started, but then appeared to get stuck in an endless loop. The CBS.log showed the same message repeating for hours:

2025-05-17 15:38:07, Info                  CBS    Enumerating Foundation package: 
Microsoft-Windows-Foundation-Package~31bf3856ad364e35~amd64~~10.0.17763.1, 
this could be slow

I monitored this for over 24 hours, and the process was still showing the same message, occasionally completing sessions and starting new ones, but never making real progress.
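
One convenient way to watch for real progress is to tail the log live (Get-Content -Wait keeps streaming new lines as CBS writes them):

# Follow CBS.log in real time; Ctrl+C to stop
Get-Content -Path C:\Windows\Logs\CBS\CBS.log -Tail 20 -Wait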

After waiting patiently for 46 hours, I decided that something was fundamentally wrong. I tried pressing Ctrl+C to cancel the operation, but the process was unresponsive.

Given the critical nature of a domain controller, I was hesitant to forcibly terminate the process. However, after consulting with our team, I used PsKill from the Sysinternals suite to end the stuck DISM process:

pskill \\bearclaws 8540

I then checked if any related processes were still running:

pslist \\bearclaws dism

This showed that DismHost was still running. I tried a simpler DISM approach:

DISM /Online /Cleanup-Image /RestoreHealth

However, this command also showed signs of getting stuck. At this point, I realized we had a more serious issue with the servicing stack.

I attempted to reboot the server using PowerShell:

Restart-Computer

To my surprise, the server entered a "preparing to shutdown" state but never actually rebooted. The event logs showed:

2025-05-18 15:19:41, Info                  CBS    Starting shutdown processing in 
TrustedInstaller service.
2025-05-18 15:19:41, Info                  CBS    Failed to get reserve manager. 
[HRESULT = 0x800f0970]

But the shutdown never completed. When trying to force a restart with the shutdown command:

shutdown /r /t 0 /f /m \\bearclaws

I received the error:

bearclaws: A system shutdown is in progress.(1115)

This left the server in an unusual state - basic domain controller functions like ADDS were still operational, but the server was in a perpetual shutdown state. After careful consideration, a physical reboot was performed during a scheduled maintenance window.

After the reboot, I ran System File Checker to verify the integrity of system files:

sfc /scannow

To my relief, this completed successfully with no integrity violations detected:

Beginning system scan.  This process will take some time.
Beginning verification phase of system scan.
Verification 100% complete.
Windows Resource Protection did not find any integrity violations.

With the system files now verified as intact, I ran the component store repair once more before attempting the January 2024 Cumulative Update:

DISM /Online /Cleanup-Image /RestoreHealth

However, this failed because the Windows Update service was stopped. I checked and started the service:

Get-Service wuauserv | Select Status, StartType
Start-Service wuauserv

Despite this, update attempts continued to fail. Looking at the log files, I noticed that multiple files were corrupted, particularly wiaservc.dll.

I compared the file versions on our healthy domain controller with the problematic one:

Directory of C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_
31bf3856ad364e35_10.0.17763.6414_none_80171d5a852ac93d
27/10/2024  00:17           653,312 wiaservc.dll

Directory of C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_
31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803
20/02/2023  19:46            54,092 wiaservc.dll

The file size discrepancy (54,092 bytes vs 653,312 bytes) suggested corruption.
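
Before treating a size difference alone as proof of corruption, it is worth comparing the version metadata of each copy on each server (a sketch using the folder names from the listings above):

# Inspect size, timestamp and embedded file version (run against each copy)
$file = 'C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll'
Get-Item $file | Select-Object Length, LastWriteTime,
    @{ n = 'FileVersion'; e = { $_.VersionInfo.FileVersion } }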

I tried using our healthy domain controller as a repair source:

DISM /Online /Cleanup-Image /RestoreHealth /Source:\\bearclaws\C$\Windows\WinSxS /LimitAccess

This partially succeeded, repairing 6 files but still encountering issues:

CSI Payload Repaired:   6

As a final step, I ran the Component Store Cleanup to optimize the component store:

DISM /Online /Cleanup-Image /StartComponentCleanup

And verified once more with SFC:

sfc /scannow

This completed successfully, leaving the server in a stable operating state, which was a relief.

Despite all these repairs, I was still unable to apply the January 2024 update. The update consistently failed with the error:

0x800f0986 - PSFX_E_APPLY_FORWARD_DELTA_FAILED

This error indicates a problem with the delta patching mechanism, likely due to inconsistencies in the underlying component store.

However, the server remained resistant to applying new updates due to underlying component store inconsistencies, even after all of these steps (including everything earlier in this article).

Step 16: DISM targeted at custom "repair" folders

It would appear that a "general" repair with install.wim is not working, so we need to extract the files from install.wim and add them to a new directory, for example:

In this example we have the two component store packages that are failing in the CBS.log (more on that later):

Dism /online /cleanup-image /restorehealth /source:c:\source /limitaccess

This will then target only the folder specified, and if you review the CBS.log in C:\Windows\Logs\CBS you will notice that it identifies the folder and the file that has been repaired (the first line below):

2025-05-19 19:49:27, Info                  CBS    (p) CSI Payload Corrupt (l) (Fixed) amd64_microsoft-windows-security-negoexts_31bf3856ad364e35_10.0.17763.5122_none_05e46b31cb2e92f5\r\negoexts.dll
2025-05-19 19:49:27, Info                  CBS    
2025-05-19 19:49:27, Info                  CBS    Summary:
2025-05-19 19:49:27, Info                  CBS    Operation: Detect and Repair 
2025-05-19 19:49:27, Info                  CBS    Operation result: 0x0
2025-05-19 19:49:27, Info                  CBS    Last Successful Step: Entire operation completes.
2025-05-19 19:49:27, Info                  CBS    Total Detected Corruption: 1
2025-05-19 19:49:27, Info                  CBS    CBS Manifest Corruption: 0
2025-05-19 19:49:27, Info                  CBS    CBS Metadata Corruption: 0
2025-05-19 19:49:27, Info                  CBS    CSI Manifest Corruption: 0
2025-05-19 19:49:27, Info                  CBS    CSI Metadata Corruption: 0
2025-05-19 19:49:27, Info                  CBS    CSI Payload Corruption: 1
2025-05-19 19:49:27, Info                  CBS    Total Repaired Corruption: 1
2025-05-19 19:49:27, Info                  CBS    CBS Manifest Repaired: 0
2025-05-19 19:49:27, Info                  CBS    CSI Manifest Repaired: 0
2025-05-19 19:49:27, Info                  CBS    CSI Payload Repaired: 1
2025-05-19 19:49:27, Info                  CBS    CSI Store Metadata refreshed: True
2025-05-19 19:49:27, Info                  CBS    
2025-05-19 19:49:27, Info                  CBS    Total Operation Time: 139 seconds.
2025-05-19 19:49:27, Info                  CBS    Ensure CBS corruption flag is clear
2025-05-19 19:49:27, Info                  CBS    All WCP store corruptions were fixed
2025-05-19 19:49:27, Info                  CBS    Ensure WCP corruption flag is clear
2025-05-19 19:49:27, Info                  CBS    All CSI corruption was fixed, ensure CorruptionDetectedDuringAcr is clear
2025-05-19 19:49:27, Info                  CBS    CheckSur: hrStatus: 0x0 [S_OK], download Result: 0x0 [S_OK]
2025-05-19 19:49:27, Info                  CBS    Count of times corruption detected: 1
2025-05-19 19:49:27, Info                  CBS    Seconds between initial corruption detections: -1
2025-05-19 19:49:27, Info                  CBS    Seconds between corruption and repair: 68812449
2025-05-19 19:49:27, Info                  CBS    Reboot mark cleared
2025-05-19 19:49:27, Info                  CBS    Winlogon: Simplifying Winlogon CreateSession notifications
2025-05-19 19:49:27, Info                  CBS    Winlogon: Deregistering for CreateSession notifications
2025-05-19 19:49:27, Info                  CBS    Exec: Processing complete, session(Corruption Repairing): 31181038_1817545646 [HRESULT = 0x00000000 - S_OK]
2025-05-19 19:49:27, Info                  CBS    Session: 31181038_1817545646 finalized. Reboot required: no [HRESULT = 0x00000000 - S_OK]
2025-05-19 19:49:27, Info                  CBS    Session: 31181038_3213612455 initialized by client DISM Package Manager Provider, external staging directory: (null), external registry directory: (null

That, however, does not fix the problem with wiaservc.dll - it is still an issue, so if we try the January 2024 update again as before, it fails again.

However, this time if we look at the CBS.log and search for the words "hashes for file member", we can see all the files we still have a problem with - and in this case there is only one left now.

If you wish to see which packages have been installed, this command will show you all those packages:

dism /online /get-packages /format:Table

Back in the CBS.log, the error we have is that the hash of the file does not match what should be installed:

2025-05-19 20:05:06, Info                  CSI    0000001a Hashes for file member [l:12]'wiaservc.dll' do not match.
 Expected: {l:32 ml:4096 b:52c51705add389342d89ad4e872f9f844ebb5efe4a29a1824aa534edbdec7724}.
 Actual: {l:32 b:b5c7b5442cf23af690d722540ea201e3c5660de09a800fecb1a78957948ac098}.

Right underneath that is where the file should be located, so from that you can get the folder and the name of the DLL:

2025-05-19 20:07:12, Info                  CBS    Exec: Not able to find amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll from directory local source
2025-05-19 20:07:12, Info                  CBS    Exec: Not able to find WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll from directory local source
2025-05-19 20:07:12, Info                  CBS    Repr: Not able to find replacement file for component amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803, file wiaservc.dll from any local source
2025-05-19 20:07:12, Info                  CBS    Repr: Add missing payload:amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll

For this example, that points us to this folder in C:\Windows\WinSxS:

amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll

This file should be provided by KB5028168, the July 2023 cumulative update - ironically the cumulative update where we first had a hint of this issue - so let's look for this file on another server with the same folder path name.

Note: we need the source server to be running the same operating system, in this case Server 2019. This command will check another domain controller for the correct folder name:

$server = "<working-server>" 
$folder = "amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803"
$path = "\\$server\C$\Windows\WinSxS\$folder"

if (Test-Path $path) {
    Write-Host "Folder found: $path"
} else {
    Write-Host "Folder not found on $server"
}

When this is run against the working domain controller, the folder is found, so now we need to copy it to a temporary directory on the non-functional server. We run this command locally on the broken domain controller, copying into a folder called "Source":

$server = "<working-server>"
$remoteFolder = "\\$server\C$\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803"
$localFolder = "C:\Source"

Copy-Item -Path $remoteFolder -Destination $localFolder -Recurse -Force

We now need this file restored to the WinSxS folder, so we run /RestoreHealth pointing at the folder containing the files we copied earlier:

Dism /Online /Cleanup-Image /RestoreHealth /source:C:\Source /LimitAccess

We know the expected hash value from the error earlier, so we now need to check the actual hash of the copied file and compare it to the expected one:

$expectedHash = "52c51705add389342d89ad4e872f9f844ebb5efe4a29a1824aa534edbdec7724"
$localFile = "C:\Source\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803\wiaservc.dll"

if (Test-Path $localFile) {
    $actualHash = (Get-FileHash -Path $localFile -Algorithm SHA256).Hash.ToLower()
    if ($actualHash -eq $expectedHash.ToLower()) {
        Write-Host "✅ Hash matches the expected value."
    } else {
        Write-Host "❌ Hash mismatch!"
        Write-Host "Expected: $expectedHash"
        Write-Host "Actual:   $actualHash"
    }
} else {
    Write-Host "File not found: $localFile"
}

It looks like we have a winner - we have found a copy of the file whose hash matches the required value.

This means that on the broken server we copy that folder to C:\Source. However, we cannot copy these files into the WinSxS folder directly as they are protected by TrustedInstaller, so first we need to take ownership of the folder:

takeown /f "C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803" /r /d y

Then we need to give ourselves permissions to the folder:

icacls "C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803" /grant administrators:F /t

Then we need to copy the files from the C:\Source folder to the actual folder with this command:

copy "C:\Temp2\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803" "C:\Windows\WinSxS\amd64_microsoft-windows-w..sition-coreservices_31bf3856ad364e35_10.0.17763.1_none_235a24f70b7cc803"

Then, when we rerun the update with the correct files (with the correct hash) in place, we get a successful install of the cumulative update, as you can see below in the output of the Get-Hotfix command:

grizzly    Security Update  KB5034127     BEAR\adm.patchuser   20/05/2025 00:00:00

This means we have now successfully installed the January 2024 update (KB5034127) on this server.
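
To confirm a specific KB directly rather than scanning the full list:

# Verify the January 2024 CU is now present
Get-HotFix -Id KB5034127 | Select-Object HotFixID, Description, InstalledBy, InstalledOn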

Step 17: Corrupt packages and hydration failures - again!

Surely this means it's smooth sailing now - but no. When we try to install February 2025 we hit another hydration failure, this time with P2PGraph.dll, which is part of the Microsoft-Windows-PeerToPeerGraphing package, and we have a hash problem all over again:

2025-05-20 20:51:08, Info                  CSI    0000054d Hashes for file member [l:12]'P2PGraph.dll' do not match.
Expected: {l:32 ml:4096 b:f76818894844be5c7851c8ade7a0b4382b1f4928896753e48b7208b9ddb5508c}.
Actual: {l:32 b:088c0e96907a474d5aa03610648bc272d795d4a84eed7ea78615959d3cb76cf2}.

The full log is this:

2025-05-20 20:51:08, Error                 CSI    00000548@2025/5/20:19:51:08.219 (F) onecore\base\wcp\rtllib\win32lib\delta_librar
y.cpp(287): Error NTSTATUS_FROM_WIN32(ERROR_INVALID_DATA) originated in function Windows::Rtl::DeltaDecompressBuffer expression: g_
pfnApplyDeltaB(( (DELTA_FLAG_TYPE)0x00000000 ), ReferenceInput, CompressedInput, &UncompressedOutput)
[gle=0x80004005]
2025-05-20 20:51:08, Error                 CSI    00000549 (F) Hydration failed with original error NTSTATUS_FROM_WIN32(ERROR_INVAL
ID_DATA) . Delta Type: 1 , IntegrityState Valid: true , RetrievedChecksum: 654885633 , ComputedChecksum: 654885633[gle=0x80004005]
2025-05-20 20:51:08, Error                 CSI    0000054a@2025/5/20:19:51:08.219 (F) onecore\base\wcp\deltahydrator\deltahydrator.
cpp(64): Error 800f0986 [Warning,Facility=15 (0x000f),Code=2438 (0x0986)] originated in function DeltaHydrator::`anonymous-namespac
e'::GetPsfxSpecificError expression: ((SCODE) (((unsigned long)(1)<<31) | ((unsigned long)(15)<<16) | ((unsigned long)(0x986))) )
[gle=0x80004005]
2025-05-20 20:51:08, Error                 CSI    0000054b (F) Hydration failed for component Microsoft-Windows-PeerToPeerGraphing,
version 10.0.17763.5830, arch amd64, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file P2PGraph.dll with NTSTATUS -2146498170[gle=0x800
04005]
2025-05-20 20:51:08, Error                 CSI    0000054c@2025/5/20:19:51:08.220 (F) Attempting to mark store corrupt with categor
y [l:18 ml:19]'CorruptPayloadFile'[gle=0x80004005]
2025-05-20 20:51:08, Info                  CSI    0000054d Hashes for file member [l:12]'P2PGraph.dll' do not match.
Expected: {l:32 ml:4096 b:f76818894844be5c7851c8ade7a0b4382b1f4928896753e48b7208b9ddb5508c}.
Actual: {l:32 b:088c0e96907a474d5aa03610648bc272d795d4a84eed7ea78615959d3cb76cf2}.

This file then needs to be obtained from a working server as before, or from the original Server 2019 ISO. As this version has been superseded, I needed to work with Microsoft to get this version of the file. Once that was copied over, it was added to WinSxS with these commands:

takeown /f "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.1_none_5388c4ba41e7f605" /r /d y
 
icacls "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.1_none_5388c4ba41e7f605" /grant administrators:F /t 

copy "C:\T\emp2\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.1_none_5388c4ba41e7f605" "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.1_none_5388c4ba41e7f605"

The update to February 2025 then failed on the same file but with a different hash, as it was expecting a different build number, as you can see below:

amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.1
amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.5830

The next error was with the same component and filename but from a different build, which means a different hash value as well:

2025-05-22 07:17:19, Error                 CSI    00000020 (F) Hydration failed for component Microsoft-Windows-PeerToPeerGraphing, version 10.0.17763.5830, arch Host= amd64 Guest= x86, nonSxS, pkt {l:8 b:31bf3856ad364e35} on file P2PGraph.dll with NTSTATUS -2146498170[gle=0x80004005]
2025-05-22 07:17:19, Error                 CSI    00000021@2025/5/22:06:17:19.316 (F) Attempting to mark store corrupt with category [l:18 ml:19]'CorruptPayloadFile'[gle=0x80004005]
2025-05-22 07:17:19, Info                  CSI    00000022 Hashes for file member [l:12]'P2PGraph.dll' do not match.
Expected: {l:32 ml:4096 b:853281def4e0b48284f42cb4a8f7887f958a66b6121a07322ec8e2d9732d5325}.
Actual: {l:32 b:15b3f5a61b704f2b188bc918ffc882d51191a7940853c837809bda6306ee088e}.

This now needs to be updated as well, so let's get that fixed too. Once we get the DLL from another healthy server or the correct ISO version, we take ownership, set permissions, then copy it over:

takeown /f "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.5830_none_b01cd73bbbb43606" /r /d y
 
icacls "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.5830_none_b01cd73bbbb43606" /grant administrators:F /t

copy "C:\T\emp2\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.5830_none_b01cd73bbbb43606" "C:\Windows\WinSxS\amd64_microsoft-windows-peertopeergraphing_31bf3856ad364e35_10.0.17763.5830_none_b01cd73bbbb43606"

When I ran the last command, something was wrong: even though we had granted ourselves permissions on the folder, we were getting access denied.

I also noticed, when trying to view permissions on the folder in WinSxS, that the Security tab was missing - almost as if we were on a FAT32 file system rather than NTFS.
I also noticed that we were currently only on the amd64_ part of the component store journey, which means we could hit this same issue across the rest of the chain:
  1. amd64_* folders (64-bit components)
  2. x86_* folders (32-bit components)
  3. wow64_* folders (32-bit on 64-bit compatibility)
  4. msil_* folders (.NET managed code assemblies)
  5. msi_* folders (Windows Installer components)
  6. cab_* folders (Cabinet file components)
  7. policy_* folders (Assembly binding policies)
  8. catalogs (Code signing catalogs)
  9. Various microsoft-windows-* component folders
  10. Security catalogs and manifests
This means I could be here for months fixing this issue and running into more corrupt packages - and even if I did get to February 2025 (the target), this could cause more issues further down the line with future updates.

Step 18: Upgrade Path Taken - until Abort

This is the path taken through the cumulative updates; the final entry marks where the updates prevented me from moving forward, due to the anomaly found in the file system structure.
  1. KB4465477 (2018-10 Update for Windows Server 2019)
  2. KB5022554 (2022-12 Cumulative Update for Windows Server 2019)
  3. KB5022286 (January 2023 Cumulative Update)
  4. KB5005112 (2021-08 Servicing Stack Update for Windows Server 2019)
  5. KB5022840 (February 2023 Cumulative Update)
  6. KB5025229 (Internal SSU-17763.4121-x64.cab with DISM)
  7. KB5023702 (March 2023 Cumulative Update)
  8. KB5025229 (April 2023 Cumulative Update)
  9. KB5027222 (June 2023 Cumulative Update)
  10. KB5025229 (Internal SSU-17763.4640-x64.cab with DISM)
  11. KB5034127 (Internal ssu-17763.5084-x64.cab with DISM)
  12. KB5028168 (July 2023 Cumulative Update)  
  13. KB5029247 (August 2023 Cumulative Update)
  14. KB5034127 (Internal ssu-17763.5084-x64.cab with DISM)
  15. KB5034127 (January 2024 Cumulative Update)
  16. KB5052000 (February 2025 Cumulative Update)
Step 19: Clean Build with a "stable" file system

Not that I wanted to do this, but you reach a point where a clean build makes logical sense. The next plan was to rebuild the OS disk for the domain controller; however, being in Azure adds a layer of complication, as there is no hypervisor access. This is the plan:
Prepare New OS Disk

  1. Create a new disk in Azure named: fuzzybear-server2022core-system
  2. Leave the new disk unattached from fuzzybear

Build OS on Temporary VM

  3. Attach the new disk to the build server hyperv-recoveryvm (a Hyper-V server or temporary VM).
  4. Mount the installation ISO from: \\internalshare.bear.local\ISO Images\en-us_windows_server_2022_updated_aug_2021_x64_dvd_257ad90f.iso
  5. Install Windows Server 2022 Core onto the new disk.
  6. Assign a temporary computer name (e.g., fuzzybear-cleanbuild).
  7. Configure the NIC inside the OS to use DHCP only (important - do not assign a static IP manually).
  8. Confirm the server has internet access for patching.

Update the OS

  9. Bring the OS to the 2025-02 patch level using offline update packages (not Windows Update).

Demote Existing Domain Controller

  10. On the current fuzzybear.bear.local, gracefully demote the domain controller role using Uninstall-ADDSDomainController (see the sketch after this list).
  11. Remove any lingering metadata from AD, DNS, and Sites & Services (if applicable).
  12. Power off the VM once demotion is confirmed.

Swap OS Disk

  13. Detach the existing (corrupt) OS disk from fuzzybear.bear.local.
  14. Attach the newly built fuzzybear-server2022core-system disk to fuzzybear.bear.local.

Boot with New Disk

  15. Boot the VM; since the OS is DHCP-configured, Azure will inject the static IP and DNS settings from the Azure NIC associated with fuzzybear.
  16. The OS will automatically adopt the correct IP on startup.
  17. Validate IP assignment using:

    Get-NetIPAddress | Where-Object {$_.AddressFamily -eq "IPv4"}
    Get-DnsClientServerAddress

  18. Join the server to the domain as a member.
  19. Rename the computer account back to fuzzybear.bear.local.
  20. Reboot the server.
  21. Ensure the new name is valid and correct.
  22. Validate domain connectivity, DNS resolution, and network stability.
  23. Promote the server to Domain Controller.
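
For the demotion phase, a minimal sketch of the core command (assuming the ADDSDeployment module is present on the DC; review the switches against your environment before running anything):

# Gracefully demote the domain controller, keeping it as a domain member
Import-Module ADDSDeployment
Uninstall-ADDSDomainController `
    -DemoteOperationMasterRole `
    -RemoveApplicationPartitions `
    -LocalAdministratorPassword (Read-Host -AsSecureString "New local Administrator password")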

Lessons Learned and Best Practices

  1. Know when to stop: there is a point where you stop chasing component corruption and consider a clean build. I do not like in-place upgrades; in this scenario one would not have helped either, and they are not recommended on domain controllers.
  2. Install Servicing Stack Updates (SSUs) first: Always install SSUs before attempting to install cumulative updates. They update the components that handle the updating process itself.
  3. Check PossibleCorruptions.txt: This file is a goldmine of information about component store corruptions.
  4. Take a gradual approach: Don't try to jump straight to the latest updates. Work your way up chronologically.
  5. Copy from working systems: Having access to a properly updated system of the same version is invaluable.
  6. Run as TrustedInstaller: When modifying files in WinSxS, always use TrustedInstaller privileges.
  7. Reboot after significant changes: Even when not prompted, rebooting helps ensure changes take effect.
  8. Keep your servers updated: Regular updates prevent the buildup of these kinds of issues.

Conclusion

Fixing Windows Server update issues can be complex, but with a methodical approach and patience, even severely corrupted systems can be brought back to a healthy state without resorting to reinstallation or in-place upgrades.

The key is to understand the component store architecture, identify corrupted files, and carefully replace them while maintaining proper permissions and directory structures.

While we're still working on the final updates to bring this server fully up to date, we've made significant progress and learned valuable lessons along the way.
