Copyright (c) 2026 MindMesh Academy. All rights reserved. This content is proprietary and may not be reproduced or distributed without permission.

2.1.3. Fabric Items and Their Relationships

💡 First Principle: Fabric items are the building blocks of analytics solutions—like specialized tools in a workshop. Each tool has a specific purpose: you wouldn't use a hammer to measure length, and you wouldn't use a lakehouse for millisecond-latency real-time queries. Understanding which tool to reach for determines whether your solution succeeds or struggles.

Comparative Table: Fabric Item Types
| Item Type | Primary Purpose | Storage Location | Query Language |
|---|---|---|---|
| Lakehouse | Store and process big data | OneLake (Delta tables + Files) | Spark SQL, PySpark |
| Data Warehouse | Structured analytics | OneLake (Delta tables) | T-SQL |
| KQL Database | Real-time analytics | OneLake | KQL |
| Eventstream | Real-time data ingestion | Transient (routes to destinations) | Visual editor |
| Data Pipeline | Orchestration | Metadata only | Visual + expressions |
| Dataflow Gen2 | Low-code ETL | OneLake staging | Power Query (M) |
| Notebook | Code-based processing | OneLake (output) | PySpark, Spark SQL |
Visual: Item Relationships in a Typical Architecture

Decision Framework: Which Item to Use When

| If You Need To... | Use This | Why |
|---|---|---|
| Store big data with flexible schema | Lakehouse | Delta tables + raw files, Spark processing |
| Run complex T-SQL analytics | Data Warehouse | Optimized for SQL, familiar to BI teams |
| Analyze streaming data in real time | KQL Database | Sub-second queries on time-series data |
| Route and transform streaming events | Eventstream | Visual stream processing, multiple outputs |
| Orchestrate multi-step workflows | Data Pipeline | Control flow, scheduling, dependencies |
| Transform data with a low-code UI | Dataflow Gen2 | Power Query, familiar to Excel users |
| Write custom processing logic | Notebook | Full PySpark/SQL flexibility |
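To make the decision framework concrete, the table above can be sketched as a small keyword lookup. This is purely illustrative study code: the requirement phrases, the `DECISION_TABLE` mapping, and the `recommend_item` helper are invented for this example and are not part of any Fabric API.

```python
# Illustrative sketch: the decision framework above as a keyword lookup.
# The phrases and helper name are hypothetical, not a Fabric API.

DECISION_TABLE = {
    "big data with flexible schema": "Lakehouse",
    "complex T-SQL analytics": "Data Warehouse",
    "streaming data in real time": "KQL Database",
    "route and transform streaming events": "Eventstream",
    "orchestrate multi-step workflows": "Data Pipeline",
    "transform data with a low-code UI": "Dataflow Gen2",
    "write custom processing logic": "Notebook",
}

def recommend_item(requirement: str) -> str:
    """Return the Fabric item whose table row best matches the stated need."""
    requirement = requirement.lower()
    for need, item in DECISION_TABLE.items():
        # Match when any distinctive word from the need appears in the requirement.
        if any(word in requirement for word in need.lower().split() if len(word) > 5):
            return item
    return "No clear match - revisit the table above"

print(recommend_item("We need to orchestrate a multi-step workflow"))
```

A first-match lookup like this mirrors how you should read the table in an exam: scan for the most distinctive phrase in the scenario, then pick the corresponding item.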

Common Item Selection Mistakes

| Mistake | Problem | Better Choice |
|---|---|---|
| Using Warehouse for raw file storage | Warehouse requires structured data | Use Lakehouse for files |
| Using Lakehouse for real-time queries | Spark has cold-start latency | Use KQL Database |
| Using Notebook for simple transforms | Overkill, harder to maintain | Use Dataflow Gen2 |
| Using Pipeline when Dataflow suffices | Unnecessary complexity | Dataflow can schedule itself |

⚠️ Exam Trap: Questions often present a scenario and ask which item to use. Key signals: "real-time" → KQL Database/Eventstream; "T-SQL" → Warehouse; "unstructured files" → Lakehouse; "orchestration" → Pipeline.
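The keyword signals in the exam tip can also be drilled as code. The `SIGNALS` list and `exam_signal` function below are made up for practice; the keyword-to-item pairs themselves come straight from the tip above.

```python
# Illustrative drill: the exam-trap keyword signals as a simple scan.
# The list and function name are hypothetical practice aids.

SIGNALS = [
    ("real-time", "KQL Database / Eventstream"),
    ("t-sql", "Data Warehouse"),
    ("unstructured files", "Lakehouse"),
    ("orchestration", "Data Pipeline"),
]

def exam_signal(scenario: str) -> str:
    """Scan a scenario description for the first matching keyword signal."""
    scenario = scenario.lower()
    for keyword, item in SIGNALS:
        if keyword in scenario:
            return item
    return "No signal found"

print(exam_signal("Analysts need T-SQL access to curated tables"))
```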

Written by Alvin Varughese, Founder · 15 professional certifications