Copyright (c) 2026 MindMesh Academy. All rights reserved. This content is proprietary and may not be reproduced or distributed without permission.

4.1.4. Workspace Logging with Azure Log Analytics

💡 First Principle: Monitor Hub shows Fabric-native activity logs, but enterprise observability often requires integration with Azure's broader monitoring ecosystem. Workspace logging exports detailed telemetry to Azure Log Analytics for advanced querying, long-term retention, and cross-service correlation.

Scenario: Your organization uses Azure Monitor for all cloud infrastructure. The security team requires centralized logging of all data access patterns across Azure services, including Fabric. Monitor Hub alone cannot provide this integration.

Key Differences: Monitor Hub vs. Workspace Logging

| Aspect | Monitor Hub | Workspace Logging (Log Analytics) |
|---|---|---|
| Retention | Limited (days/weeks) | Configurable (months/years) |
| Query Language | Basic filtering | KQL (full power) |
| Integration | Fabric-only | Azure Monitor, Sentinel, third-party SIEM |
| Alerting | Basic | Advanced with Azure Monitor Alerts |
| Cost | Included | Log Analytics ingestion costs |
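The "Advanced" alerting row can be made concrete: an Azure Monitor log alert rule runs a KQL query on a schedule and fires when the results cross a threshold. The sketch below is hypothetical and assumes the same `FabricLogs` table and `Category`/`Status` columns used in the query examples later in this section; the actual exported table and column names depend on your configuration:

```kusto
// Hypothetical alert query: fire when any workspace records
// more than 5 failed refreshes in the last hour.
// FabricLogs, Category, and Status are assumed names, consistent
// with the other examples in this section.
FabricLogs
| where TimeGenerated > ago(1h)
| where Category == "Refresh" and Status == "Failed"
| summarize FailedCount = count() by WorkspaceName
| where FailedCount > 5
```

Attached to an Azure Monitor log alert rule with an hourly evaluation frequency, a query like this turns raw workspace telemetry into notifications, which is the kind of cross-service alerting Monitor Hub's built-in options cannot provide.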

Configuring Workspace Logging

  1. Prerequisites: Create an Azure Log Analytics workspace in Azure Portal
  2. Navigate to Fabric Admin Portal → Audit and usage settings
  3. Enable Azure Log Analytics integration
  4. Provide the Log Analytics workspace ID and key
  5. Select which log categories to export
Log Categories Available:

| Category | Data Included | Use Case |
|---|---|---|
| Audit Logs | User actions, permission changes | Security compliance |
| Operation Logs | Pipeline runs, refresh activities | Operational monitoring |
| Performance Logs | Query durations, resource consumption | Performance analysis |

Querying Fabric Logs in Log Analytics

```kusto
// Find all failed pipeline runs in the last 24 hours
FabricLogs
| where TimeGenerated > ago(24h)
| where Category == "PipelineRuns"
| where Status == "Failed"
| project TimeGenerated, WorkspaceName, PipelineName, ErrorMessage
| order by TimeGenerated desc
```

```kusto
// Analyze refresh patterns by user
FabricLogs
| where Category == "Refresh"
| summarize RefreshCount = count() by User, bin(TimeGenerated, 1d)
| render timechart
```

Visual: Logging Architecture
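Returning to the opening scenario, the security team's review of data access patterns could be sketched as a third query. This is a hypothetical example assuming the same `FabricLogs` schema as the queries above; the operation names are illustrative placeholders, since the actual values depend on which log categories you export:

```kusto
// Hypothetical: audit permission changes over the last 7 days, grouped by actor.
// FabricLogs and the Operation values shown here are assumed names,
// not a documented Fabric schema.
FabricLogs
| where TimeGenerated > ago(7d)
| where Category == "AuditLogs"
| where Operation in ("AddPermission", "RemovePermission", "UpdatePermission")
| summarize Changes = count() by User, Operation
| order by Changes desc
```

Because the logs live in Log Analytics, the same query can be joined against sign-in or network telemetry from other Azure services, which is exactly the cross-service correlation Monitor Hub cannot do.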

⚠️ Exam Trap: Enabling all log categories without considering cost can be expensive. Log Analytics charges based on data ingestion volume. Start with audit and operation logs, then add performance logs only if needed for specific analysis.
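Before enabling additional categories, you can measure what you are already ingesting. Log Analytics exposes a built-in `Usage` table that records billable ingestion volume per data type, so a standard (not Fabric-specific) query like the following shows where the megabytes are going:

```kusto
// Billable ingestion volume (MB) per data type over the last 30 days.
// Usage, IsBillable, Quantity, and DataType are standard Log Analytics
// columns; Quantity is reported in megabytes.
Usage
| where TimeGenerated > ago(30d)
| where IsBillable == true
| summarize IngestedMB = sum(Quantity) by DataType
| order by IngestedMB desc
```

Running this monthly makes the trade-off in the Exam Trap measurable: if performance logs dominate the ingestion volume without driving any analysis, they are a candidate for disabling.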

Key Trade-Offs:
  • Comprehensive Logging vs. Cost: More log categories increase visibility but also Log Analytics costs
  • Centralization vs. Complexity: Centralized logging enables correlation but requires Log Analytics expertise

Reflection Question: Your security team requires 2-year retention of all data access logs for compliance. Can Monitor Hub alone meet this requirement? What configuration would you recommend?

Written by Alvin Varughese, Founder (15 professional certifications)