ℹ️ Many blog posts do not include full scripts. If you require a complete version, please use the Support section in the menu.
Disclaimer: I do not accept responsibility for any issues arising from scripts being run without adequate understanding. It is the user's responsibility to review and assess any code before execution.

Automating DNS TXT Record Updates: A Deep Technical Dive into API Authentication, Pagination, and DNS Record Management


When I set out to update Domain Control Validation (DCV) TXT records across multiple domains managed through Comlaude's DNS platform, I expected a straightforward API automation project. What followed was a multi-day journey through authentication mysteries, undocumented API behaviors, and PowerShell compatibility challenges. This is the complete technical account of building a production-ready automation solution.

The Initial Requirements

The organization manages multiple domains through Comlaude. Each domain needed a DCV TXT record for SSL certificate validation. The requirements were:

  1. Update existing DCV TXT records (values starting with underscore)
  2. Create new DCV records where none exist
  3. Never modify SPF, DKIM, DMARC, or other critical TXT records
  4. Process domains from a CSV file for batch operations
  5. Provide clear logging and confirmation prompts
  6. Handle API pagination for multiple domains

Visual Walkthrough

Below is the DNS for the domain both before and after the script runs. Let's start with the pre-update DNS records; notice we have 2 x redirect records and 1 x TXT record, as below:


The script then runs as below:


If we now check the DNS portal, we can see the new record appear:



The Authentication Journey

First Attempt: Direct API Key Usage

My initial assumption was that the API key could be used directly as a Bearer token:

$API_KEY = "<api-key>"
$GROUP_ID = "<group-id>"

$Headers = @{
    "Authorization" = "Bearer $API_KEY"
    "X-Access-Group-Id" = $GROUP_ID
    "Content-Type" = "application/json"
}

Invoke-RestMethod -Uri "https://api.comlaude.com/api/resource-records" -Headers $Headers

Result: 401 Unauthorized

Second Attempt: Alternative Authorization Headers

I tested multiple authorization header formats based on common API patterns:

# Test 1: X-API-Key header
$Headers = @{"X-API-Key" = $API_KEY}

# Test 2: Plain Authorization header
$Headers = @{"Authorization" = $API_KEY}

# Test 3: ApiKey prefix (found this in various API docs)
$Headers = @{"Authorization" = "ApiKey $API_KEY"}

# Test 4: Basic Authentication
$EncodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("$API_KEY:"))
$Headers = @{"Authorization" = "Basic $EncodedCreds"}

Result: All returned 401 Unauthorized

The Authentication Breakthrough

After examining the Comlaude documentation more carefully, I discovered the authentication is a two-step process:

# Step 1: Login with credentials AND API key
$loginBody = @{
    username = $USERNAME
    password = $PASSWORD
    api_key = $API_KEY
} | ConvertTo-Json

$response = Invoke-RestMethod -Uri "https://api.comlaude.com/api_login" `
                             -Method POST `
                             -Body $loginBody `
                             -ContentType "application/json"

# Step 2: Extract the bearer token from response
# The token was buried in $response.data.access_token
$token = $response.data.access_token

# Step 3: Use this token for all subsequent requests
$Headers = @{
    "Authorization" = "Bearer $token"
    "Content-Type" = "application/json"
}

The response structure was:

{
    "data": {
        "token_type": "Bearer",
        "expires_in": 7200,
        "access_token": "eyJ0eXAiOiJKV1QiLCJhbGc...",
        "refresh_token": "def502002cec375fe708d21..."
    },
    "status_code": 200
}

Result: The API key alone doesn't provide access - it's part of the authentication payload along with username and password to obtain a JWT bearer token.
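
Since every later call depends on that token, it helps to wrap the login in a small reusable function. The sketch below is my own wrapper (Get-ComlaudeToken and its parameter names are not part of the Comlaude API); it simply performs the login shown above and records when the token will expire:

function Get-ComlaudeToken {
    param(
        [string]$Username,
        [string]$Password,
        [string]$ApiKey,
        [string]$BaseUrl = "https://api.comlaude.com"
    )

    $loginBody = @{
        username = $Username
        password = $Password
        api_key  = $ApiKey
    } | ConvertTo-Json

    $response = Invoke-RestMethod -Uri "$BaseUrl/api_login" `
                                  -Method POST `
                                  -Body $loginBody `
                                  -ContentType "application/json"

    # Return the bearer token together with a calculated expiry time
    [PSCustomObject]@{
        AccessToken = $response.data.access_token
        Expires     = (Get-Date).AddSeconds($response.data.expires_in)
    }
}

# Usage
$auth = Get-ComlaudeToken -Username $USERNAME -Password $PASSWORD -ApiKey $API_KEY
$headers = @{
    "Authorization" = "Bearer $($auth.AccessToken)"
    "Content-Type"  = "application/json"
}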

The Base URL and Endpoint Mystery

The Portal vs API Confusion

Initial attempts used various URL patterns that all failed:

# Attempt 1: Using portal URL with /api paths
"https://portal.comlaude.com/api/resource-records"  # Returns HTML, not JSON

# Attempt 2: Including group ID in base URL
"https://api.comlaude.com/groups/$GROUP_ID/domains"  # 404 Not Found

# Attempt 3: Using /api prefix on api subdomain
"https://api.comlaude.com/api/domains"  # 404 Not Found

Discovering the Correct Structure

Through systematic testing, I discovered the correct endpoint structure:

$BASE_URL = "https://api.comlaude.com"  # No trailing path

# Correct endpoints with leading slash and full path
"/groups/{groupId}/domains"                        # List all domains
"/groups/{groupId}/domains/{domainId}/zones"       # Get zones for a domain
"/groups/{groupId}/zones/{zoneId}/records"         # Get/Update records

The critical insights were:

  1. Use api.comlaude.com, not portal.comlaude.com
  2. Don't include /api prefix - the subdomain IS the API
  3. Always use leading slashes in endpoint paths
  4. Group ID goes in the path, not as a base URL component
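
A quick smoke test ties the base URL, endpoint structure, and bearer token together. This is just a sanity-check snippet using the headers built during authentication:

$BASE_URL = "https://api.comlaude.com"

try {
    $test = Invoke-RestMethod -Uri "$BASE_URL/groups/$GROUP_ID/domains" -Headers $Headers -Method GET
    Write-Host "API reachable - first domain returned: $($test.data[0].name)"
}
catch {
    Write-Host "Smoke test failed: $($_.Exception.Message)" -ForegroundColor Red
}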

The Pagination Challenge

Initial Problem: Only 25 Domains Returned

Despite having hundreds of domains, the API only returned 25:

$response = Invoke-RestMethod -Uri "$BASE_URL/groups/$GROUP_ID/domains" -Headers $headers
$response.data.Count  # Returns: 25

Debugging Pagination Structure

I added extensive debugging to understand the pagination:

# Check for pagination metadata
if ($response.meta) {
    Write-Host "Meta found: current_page=$($response.meta.current_page), 
                last_page=$($response.meta.last_page), 
                total=$($response.meta.total)"
}

The API returned pagination info in the meta object:

{
    "data": [...],
    "meta": {
        "current_page": 1,
        "last_page": 24,
        "per_page": 25,
        "total": 587
    }
}

Implementing Robust Pagination

The final pagination solution handles multiple scenarios:

$allDomains = @()
$page = 1
$hasMore = $true

while ($hasMore) {
    $url = "$BASE_URL/groups/$GROUP_ID/domains?page=$page&per_page=100"
    $response = Invoke-RestMethod -Uri $url -Headers $headers -Method GET
    
    $pageDomains = $response.data
    $allDomains += $pageDomains
    
    # Multiple ways to detect more pages
    if ($response.meta -and $response.meta.last_page) {
        $hasMore = $page -lt $response.meta.last_page
    }
    elseif ($response.links -and $response.links.next) {
        $hasMore = $true
    }
    elseif ($pageDomains.Count -eq 100) {
        # Full page returned, probably more available
        $hasMore = $true
    }
    else {
        $hasMore = $false
    }
    
    $page++
    if ($page -gt 50) { $hasMore = $false }  # Safety limit
}
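
Once the loop finishes, a quick check confirms the walk actually collected everything, assuming the meta.total field shown earlier is present:

if ($response.meta -and $response.meta.total) {
    Write-Host "Collected $($allDomains.Count) of $($response.meta.total) domains"
    if ($allDomains.Count -ne $response.meta.total) {
        Write-Warning "Domain count mismatch - review the pagination logic"
    }
}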

The DNS Record Structure Discovery

Understanding the Hierarchy

Comlaude's DNS structure follows a three-tier hierarchy:

Domain (e.g. croucher.cloud)
  └── Zone (usually one per domain)
       └── Records (A, CNAME, TXT, etc.)

Each tier requires a separate API call:

# 1. Find the domain
$domain = $allDomains | Where-Object { $_.name -eq "croucher.cloud" }
$domainId = $domain.id  # "1a5ec746-not-actual-data-b172-a1e0eb1c9f14"

# 2. Get the zone
$zones = Invoke-RestMethod -Uri "$BASE_URL/groups/$GROUP_ID/domains/$domainId/zones" -Headers $headers
$zoneId = $zones.data[0].id  # "c7398555-cf2a-no-notlivedata-4da79b741200"

# 3. Get the records
$records = Invoke-RestMethod -Uri "$BASE_URL/groups/$GROUP_ID/zones/$zoneId/records" -Headers $headers

The Field Naming Catastrophe

The 400 Bad Request Mystery

Updates consistently failed with validation errors:

$updateBody = @{
    type = "TXT"
    name = "croucher.cloud"
    content = "_1386rf319q7prrunotrealdatab8nn3"  # This seems right?
    ttl = 3600
}

# Response: 400 Bad Request
# {"errors":[{"message":"Validation has failed","details":{"value":["The value field is required."]}}]}

The Solution: Field Name Differences

After extensive debugging, I discovered Comlaude uses value not content:

# WRONG - What most DNS APIs use
$updateBody = @{
    type = "TXT"
    content = "_1386rf319q7prrunotrealdatab8nn3"
}

# CORRECT - What Comlaude expects
$updateBody = @{
    type = "TXT"
    value = "_1386rf319q7prrunotrealdatab8nn3"
}

The Complete Record Update Pattern

The API also required sending ALL original fields back, not just changed ones:

# Copy ALL fields from the original record
$updateBody = @{}
foreach ($prop in $record.PSObject.Properties) {
    $updateBody[$prop.Name] = $prop.Value
}

# Then update the specific field
$updateBody["value"] = $newTxtValue

# The final payload includes everything:
{
    "zone": { "id": "...", "domain": {...}, "networks": [...] },
    "id": "fcee1124-8722-4490-b236-d45009411333",
    "type": "TXT",
    "locked": 0,
    "ttl": 3600,
    "value": "_1386rf319q7prrunotrealdatab8nn3",
    "name": "croucher.cloud"
}

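For completeness, the call that sends the rebuilt payload back looks roughly like this. Treat it as a sketch that assumes the API accepts a PUT to the individual record's ID under the zone records endpoint:

$updateUrl = "$BASE_URL/groups/$GROUP_ID/zones/$zoneId/records/$($record.id)"

try {
    # -Depth 10 matters: ConvertTo-Json defaults to a depth of 2, which would
    # flatten the nested zone object copied from the original record
    $result = Invoke-RestMethod -Uri $updateUrl `
                                -Headers $headers `
                                -Method PUT `
                                -Body ($updateBody | ConvertTo-Json -Depth 10)
    Write-Host "Record $($record.id) updated" -ForegroundColor Green
}
catch {
    Write-Host "Update failed: $($_.Exception.Message)" -ForegroundColor Red
}
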
Protecting Critical DNS Records

Implementing Record Protection

The solution filters records by content pattern:

$dcvRecords = @()
$otherRecords = @()

foreach ($txt in $txtRecords) {
    $cleanValue = $txt.value -replace '^"', '' -replace '"$', ''
    
    if ($cleanValue -like "_*") {
        # DCV record - safe to update
        $dcvRecords += $txt
        Write-Host "DCV Record: $cleanValue" -ForegroundColor Cyan
    } 
    elseif ($cleanValue -like "v=spf1*") {
        Write-Host "SPF Record: PROTECTED" -ForegroundColor Red
        $otherRecords += $txt
    }
    elseif ($cleanValue -like "*._domainkey*") {
        Write-Host "DKIM Record: PROTECTED" -ForegroundColor Red
        $otherRecords += $txt
    }
    elseif ($cleanValue -like "v=DMARC1*") {
        Write-Host "DMARC Record: PROTECTED" -ForegroundColor Red
        $otherRecords += $txt
    }
    else {
        $preview = $cleanValue.Substring(0, [Math]::Min(50, $cleanValue.Length))
        Write-Host "Other Record: $preview..." -ForegroundColor Gray
        $otherRecords += $txt
    }
}

TLS 1.2 Enforcement

PowerShell 5.1 defaults to older TLS versions, causing connection failures:

# Error: The underlying connection was closed: An unexpected error occurred on a send

# Solution - Force TLS 1.2
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
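
If the same script might also run under PowerShell 7, where the newer .NET defaults generally negotiate TLS 1.2 or higher on their own, a small version guard keeps the override limited to Windows PowerShell 5.x. This is an optional sketch:

if ($PSVersionTable.PSVersion.Major -lt 6) {
    # Only Windows PowerShell 5.x (and earlier) needs TLS 1.2 forced explicitly
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
}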

CSV Handling Edge Cases

The CSV import had issues with empty or malformed files:

# This fails with "The property 'Count' cannot be found"
$domains = Import-Csv -Path $CsvPath
Write-Host "Loaded $($domains.Count) domains"

# Robust solution
$domains = @(Import-Csv -Path $CsvPath)  # Force array
if ($null -eq $domains -or $domains.Count -eq 0) {
    throw "No data found in CSV"
}
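
For reference, the input file itself is just a flat list with one row per domain. The exact column names depend on your script; below is a hypothetical layout with a domain column (which the processing loop later reads as $row.domain) and a column for the new DCV value:

domain,new_dcv_value
croucher.cloud,_a5d3rf319q7prnotrealdata26k0o9b8nn3
example-domain.com,_replace-with-your-dcv-value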

Creating New Records

When no DCV record exists, the script creates one:

$createBody = @{
    type = "TXT"
    name = "@"  # Root domain
    value = "_1386rf319q7prrunotrealdatab8nn3"
    ttl = 3600
}

$createUrl = "$BASE_URL/groups/$GROUP_ID/zones/$zoneId/records"

try {
    $response = Invoke-RestMethod -Uri $createUrl `
                                  -Headers $headers `
                                  -Method POST `
                                  -Body ($createBody | ConvertTo-Json)
}
catch {
    # If @ doesn't work, try with full domain name
    $createBody["name"] = "croucher.cloud"
    # Retry...
}

Error Handling and User Experience

Confirmation Prompts

Every update requires explicit confirmation:

Write-Host "Current DCV: _1386rf319q7prrunotrealdatab8nn3" -ForegroundColor Yellow
Write-Host "New DCV: _a5d3rf319q7prnotrealdata26k0o9b8nn3" -ForegroundColor Cyan

$confirm = Read-Host "Update this DCV record? (yes/no)"
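
The answer then gates the update inside the per-domain loop; anything other than an explicit "yes" skips the record. A small sketch of that guard:

# Inside the per-domain foreach loop
if ($confirm -ne "yes") {
    Write-Host "Skipping this record" -ForegroundColor Yellow
    continue
}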

Progress Tracking

With multiple domains, progress indication is crucial:

$successCount = 0
$failCount = 0

foreach ($row in $domains) {
    Write-Host "`n[$($domains.IndexOf($row) + 1)/$($domains.Count)] Processing: $($row.domain)"
    # ... processing ...
}

Write-Host "`nSummary: $successCount successful, $failCount failed"

Lessons from Production Deployment

API Timeout Handling

With multiple domains and pagination, timeouts became an issue:

# Add timeout and retry logic
$maxRetries = 3
$retryCount = 0

while ($retryCount -lt $maxRetries) {
    try {
        $response = Invoke-RestMethod -Uri $url -Headers $headers -TimeoutSec 30
        break
    }
    catch {
        $retryCount++
        Start-Sleep -Seconds 2
    }
}

# If every attempt failed, stop rather than carrying on with an empty response
if ($retryCount -ge $maxRetries) {
    throw "Request to $url failed after $maxRetries attempts"
}

Token Expiration

The 2-hour token expiration requires handling for long runs:

$tokenExpiry = (Get-Date).AddHours(2)

# Before each API call
if ((Get-Date) -gt $tokenExpiry) {
    # Re-authenticate
    Get-NewToken
}
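
Get-NewToken is not a built-in cmdlet; it is simply a re-authentication wrapper. A minimal sketch, reusing the hypothetical Get-ComlaudeToken helper from the authentication section:

function Get-NewToken {
    # Re-run the two-step login and refresh the shared headers and expiry
    $script:auth = Get-ComlaudeToken -Username $USERNAME -Password $PASSWORD -ApiKey $API_KEY
    $script:headers["Authorization"] = "Bearer $($script:auth.AccessToken)"
    $script:tokenExpiry = $script:auth.Expires
}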

Conclusion

What appeared to be a simple DNS automation task revealed multiple layers of complexity: undocumented authentication flows, non-standard field naming, critical infrastructure protection requirements, and pagination challenges at scale. The journey from initial 401 errors to a production-ready solution required methodical debugging, careful API response analysis, and robust error handling.

The key technical takeaways:

  1. Modern APIs often require multi-step authentication despite having API keys
  2. Field naming conventions vary significantly between APIs (value vs. content)
  3. Pagination is essential and often poorly documented
  4. DNS automation requires extreme care to protect critical records
  5. PowerShell compatibility across versions requires attention to detail
  6. API response structures need thorough exploration, not assumption

For anyone automating DNS operations at scale, the investment in robust error handling, comprehensive logging, and safety checks pays dividends when managing critical infrastructure.
