PowerShell: Analyze Entra ID Sign-in Logs by IP Address


This technical blog post details my journey creating a PowerShell script to extract and analyze successful sign-ins from specific IP addresses in Microsoft Entra ID (formerly Azure AD). 

The Challenge

I needed to identify all users who had successfully signed in from a specific IP address within the last 24 hours. While the Entra portal exposes this information, I needed a programmatic way to:

  1. Query sign-in logs for a specific IP address
  2. Filter for only successful, interactive sign-ins
  3. Export results to CSV for analysis
  4. Generate a summary of unique users and their sign-in counts
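
Throughout the post the script refers to a target IP and two output paths. A minimal parameter block covering those inputs might look like this (the IP and file names below are placeholders, not values from the original script):

# Script inputs (placeholder values - adjust for your environment)
$targetIP    = "203.0.113.25"                        # IP address to investigate
$csvFilePath = ".\SignIns_$($targetIP).csv"          # Detailed per-sign-in export
$logFilePath = ".\SignInSummary_$($targetIP).log"    # Per-user summary log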

Understanding the Microsoft Graph API Approach

Microsoft Graph API provides access to Entra ID sign-in logs, but there are several potential pitfalls:

  1. Authentication complexity: Obtaining and using the correct token format
  2. Filter syntax: The API is sensitive to malformed filter strings
  3. Pagination: Results are limited per page and require handling multiple requests
  4. Schema variations: The properties returned can differ between tenants and sign-in record types

Step 1: Setting Up Authentication

The first task was establishing proper authentication with the Microsoft Graph API. This requires:

# Configuration
$tenantId = "<tenant_id>"
$clientId = "<application_id>"
$clientSecret = "<application_secret>"

# Get authentication token
$tokenUrl = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
$tokenBody = @{
    client_id     = $clientId
    scope         = "https://graph.microsoft.com/.default"
    client_secret = $clientSecret
    grant_type    = "client_credentials"
}

$tokenResponse = Invoke-RestMethod -Method Post -Uri $tokenUrl -ContentType "application/x-www-form-urlencoded" -Body $tokenBody
$accessToken = $tokenResponse.access_token

# Set up headers
$headers = @{
    Authorization = "Bearer $accessToken"
    ConsistencyLevel = "eventual"  # Required for advanced query options such as $count and $orderby
}

Pitfall: I initially tried using different scope values, but found that https://graph.microsoft.com/.default is the most reliable option. Using other scopes can result in authorization failures even when the app registration permissions are correct.
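
If the credentials, tenant ID, or admin consent are wrong, the token request fails with an HTTP error instead of returning a token. A hedged variant of the call above, wrapped in try/catch so the script stops with a readable message, could look like this:

# Fail fast if the token request is rejected (bad secret, wrong tenant,
# or missing admin consent are the usual causes)
try {
    $tokenResponse = Invoke-RestMethod -Method Post -Uri $tokenUrl -ContentType "application/x-www-form-urlencoded" -Body $tokenBody -ErrorAction Stop
    $accessToken = $tokenResponse.access_token
}
catch {
    Write-Error "Token request failed: $($_.Exception.Message)"
    return
}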

Step 2: Building the Right Query

After several failed attempts with complex client-side filtering approaches, I discovered that a direct, properly formatted query works best:

# Build a direct query that matches portal filters
$date24HoursAgo = (Get-Date).ToUniversalTime().AddHours(-24).ToString("yyyy-MM-ddTHH:mm:ssZ")  # UTC, ISO 8601
$filter = "createdDateTime ge $date24HoursAgo and ipAddress eq '$targetIP' and status/errorCode eq 0"

$url = "https://graph.microsoft.com/v1.0/auditLogs/signIns?`$filter=$filter&`$top=100"

Pitfall: Date formatting is crucial. Converting to UTC and using ISO 8601 format with the 'Z' suffix (yyyy-MM-ddTHH:mm:ssZ) works consistently. Other formats can cause 400 Bad Request errors, and stamping a local time with 'Z' silently shifts the query window.
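
Because the filter contains spaces and quotes, it can also help to URL-encode it before building the request URL. This is a defensive sketch using the .NET EscapeDataString method, not something the original script required:

# Optional: URL-encode the filter value so spaces and quotes survive transport
$encodedFilter = [System.Uri]::EscapeDataString($filter)
$url = "https://graph.microsoft.com/v1.0/auditLogs/signIns?`$filter=$encodedFilter&`$top=100"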

Step 3: Handling Pagination

With $top=100 in the query, the Graph API returns at most 100 records per page, so I implemented pagination handling:

# Get the data
$results = @()
$response = Invoke-RestMethod -Method Get -Uri $url -Headers $headers
$results += $response.value

# Get additional pages if needed
$nextLink = $response.'@odata.nextLink'
while ($nextLink) {
    $response = Invoke-RestMethod -Method Get -Uri $nextLink -Headers $headers
    $results += $response.value
    $nextLink = $response.'@odata.nextLink'
}

Pitfall: Early versions of my script limited pagination to a fixed number of pages, potentially missing results. The approach above continues until all pages are retrieved.
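
Graph also throttles heavy callers with HTTP 429 and a Retry-After header. A variant of the loop above that retries on throttling is sketched below; it assumes Windows PowerShell 5.1, where the thrown exception exposes the underlying web response:

# Pagination with basic throttling handling (sketch)
$results = @()
$nextUrl = $url
while ($nextUrl) {
    try {
        $response = Invoke-RestMethod -Method Get -Uri $nextUrl -Headers $headers -ErrorAction Stop
        $results += $response.value
        $nextUrl  = $response.'@odata.nextLink'
    }
    catch {
        if ($_.Exception.Response -and $_.Exception.Response.StatusCode.value__ -eq 429) {
            # Honor the Retry-After header if present, otherwise back off briefly
            $retryAfter = $_.Exception.Response.Headers["Retry-After"]
            if (-not $retryAfter) { $retryAfter = 10 }
            Start-Sleep -Seconds ([int]$retryAfter)
        }
        else { throw }
    }
}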

Step 4: Filtering for Interactive Sign-ins

One challenge was accurately identifying interactive sign-ins:

# Filter for interactive sign-ins only if the property exists
$interactiveResults = $results | Where-Object { -not ($_.PSObject.Properties.Name -contains "isInteractive") -or $_.isInteractive -eq $true }

Pitfall: The isInteractive property is not guaranteed to appear on every sign-in record. My script checks whether the property exists before using it to avoid errors.
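
A quick diagnostic line (purely illustrative) makes it obvious whether the interactive filter is actually removing anything:

# Compare raw vs. filtered record counts
Write-Host "Retrieved $($results.Count) sign-ins; $($interactiveResults.Count) are interactive"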

Step 5: Creating CSV Output

Exporting data to CSV required handling potential missing properties:

# Create CSV data
$csvData = @()
foreach ($signin in $interactiveResults) {
    $csvData += [PSCustomObject]@{
        UserPrincipalName = $signin.userPrincipalName
        IPAddress = $signin.ipAddress
        DateTime = $signin.createdDateTime
        Application = $signin.appDisplayName
        ClientApp = $signin.clientAppUsed
        OperatingSystem = $signin.operatingSystem
        Status = "Success"
        IsInteractive = "Yes"
        City = $signin.location.city
        Country = $signin.location.countryOrRegion
        DeviceDetail = $signin.deviceDetail.deviceId
        RequestId = $signin.id
    }
}

# Export to CSV
$csvData | Export-Csv -Path $csvFilePath -NoTypeInformation

Pitfall: Nested properties like location.city can be null, potentially causing errors. Consider adding null checks for production code.
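
One defensive pattern, sketched below rather than taken from the original script, is to resolve the nested values into locals with explicit null checks before building each row (strict mode in particular complains about property access on $null):

# Null-safe extraction of nested properties (sketch)
foreach ($signin in $interactiveResults) {
    $city    = if ($signin.location) { $signin.location.city } else { "" }
    $country = if ($signin.location) { $signin.location.countryOrRegion } else { "" }
    $device  = if ($signin.deviceDetail) { $signin.deviceDetail.deviceId } else { "" }
    # ...then build the PSCustomObject using $city, $country and $device as above
}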

Step 6: Generating User Summary

The final step was creating a meaningful summary of unique users and their sign-in counts:

# Identify unique users, then sort them by sign-in count (descending)
$uniqueUsers = $interactiveResults | Select-Object -Property userPrincipalName -Unique
$userCounts = @()
foreach ($user in $uniqueUsers) {
    $count = @($interactiveResults | Where-Object { $_.userPrincipalName -eq $user.userPrincipalName }).Count
    $userCounts += [PSCustomObject]@{
        UserPrincipalName = $user.userPrincipalName
        Count             = $count
    }
}
$sortedUsers = $userCounts | Sort-Object -Property Count -Descending

# Add each user to the log
$logContent = @()
foreach ($user in $sortedUsers) {
    $logContent += "$($user.UserPrincipalName): $($user.Count) sign-ins"
}

# Write to log file
$logContent | Out-File -FilePath $logFilePath

Pitfall: A colon placed directly after a variable reference inside a double-quoted string is parsed as a scope or drive qualifier and breaks expansion. Wrapping the expression in $() ensures the variable is expanded correctly.
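
A two-line illustration of the difference (the UPN value is hypothetical):

$upn = "user@contoso.com"
# "$upn: 5 sign-ins"       # Fails to parse: the colon is read as a scope/drive qualifier
"$($upn): 5 sign-ins"      # Works: user@contoso.com: 5 sign-ins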
