4.3.3. Monitoring and Optimization Questions
Question 7
Your company uses Microsoft Fabric Lakehouse to manage large datasets for analytics. Query performance on a Delta table has degraded due to an increase in small Parquet files.
Which two actions should you perform?
- A. Apply V-Order for better data sorting and compression
- B. Convert to Hive table format
- C. Use the OPTIMIZE command to consolidate files
- D. Increase the number of partitions
Answer: A and C
Explanation: OPTIMIZE consolidates small files into larger ones, directly addressing the small file problem. V-Order improves compression and read performance. Converting to Hive format is not supported for Delta tables. Increasing partitions would create more files, worsening the problem.
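Both fixes can be applied from a Fabric notebook with Spark SQL. A minimal sketch, assuming a lakehouse named `Sales` and a table named `dbo_orders` (both illustrative placeholders):

```sql
-- Bin-compaction: rewrite many small Parquet files into fewer large ones,
-- applying V-Order sorting and compression during the rewrite.
OPTIMIZE Sales.dbo_orders VORDER;

-- Optionally make V-Order the default for future writes to this table
-- (property name per Fabric documentation; verify against your runtime version).
ALTER TABLE Sales.dbo_orders
SET TBLPROPERTIES ('delta.parquet.vorder.enabled' = 'true');
```

Running OPTIMIZE with VORDER addresses both the small-file problem and the sorting/compression layout in a single pass.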
Question 8
You have a table named dbo.Customers. Users report that the following code fails to insert data:
```sql
BEGIN TRY
    INSERT INTO dbo.Customers (FirstName, LastName)
    VALUES (N'Jeff', N'Price');
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
    ROLLBACK TRANSACTION;
END CATCH;
```

What should you do to ensure the code runs successfully in a transaction while continuing to handle exceptions?
- A. Add a BEGIN TRANSACTION statement to the TRY block
- B. Move the COMMIT statement to the CATCH block
- C. Remove the COMMIT statement entirely
- D. Remove the ROLLBACK statement
Answer: A. Add a BEGIN TRANSACTION statement to the TRY block
Explanation: The code attempts to COMMIT a transaction that was never started, so the batch fails with error 3902 ("The COMMIT TRANSACTION request has no corresponding BEGIN TRANSACTION"). Adding BEGIN TRANSACTION before the INSERT creates the transaction context that COMMIT requires, while the CATCH block continues to handle errors by rolling back.
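The corrected code, with option A applied, looks like this (the `@@TRANCOUNT` guard is a defensive addition beyond what the question requires, preventing ROLLBACK from itself raising an error when no transaction is open):

```sql
BEGIN TRY
    BEGIN TRANSACTION;  -- option A: start the transaction explicitly
    INSERT INTO dbo.Customers (FirstName, LastName)
    VALUES (N'Jeff', N'Price');
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
    IF @@TRANCOUNT > 0     -- roll back only if a transaction is still open
        ROLLBACK TRANSACTION;
END CATCH;
```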
Question 9
A pipeline failed due to a script activity error. You need to ensure the pipeline execution fails with a customized error message and code.
Which two actions should you perform?
- A. Add a Fail activity
- B. Configure custom error settings in the Fail activity
- C. Use a Try-Catch activity
- D. Enable detailed logging
Answer: A and B
Explanation: The Fail activity is specifically designed to terminate pipelines with custom error messages and codes. You must add the activity and configure its error settings. There is no Try-Catch activity in Fabric pipelines. Logging doesn't address the requirement to fail with custom messages.
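As a sketch, a Fail activity attached to the script activity's Failed output might look like the following pipeline JSON fragment (activity names, message, and error code are illustrative; the shape follows the Fail activity used in Data Factory-style pipelines):

```json
{
  "name": "Fail with custom error",
  "type": "Fail",
  "dependsOn": [
    { "activity": "Run script", "dependencyConditions": [ "Failed" ] }
  ],
  "typeProperties": {
    "message": "Script activity failed: check the source query.",
    "errorCode": "ERR_SCRIPT_001"
  }
}
```

The `message` and `errorCode` properties are the "custom error settings" referenced in option B; they surface in the pipeline run's error details.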
Question 10
You discover that a Dataflow Gen2 refresh fails after running for approximately one hour. The dataflow uses an on-premises data gateway.
What should you do first?
- A. Verify the maximum parameters per pipeline
- B. Verify the version of the on-premises data gateway
- C. Check for queued refresh runs
- D. Increase the refresh timeout setting
Answer: B. Verify the version of the on-premises data gateway
Explanation: Outdated gateway versions are a known cause of Dataflow Gen2 refreshes that fail after roughly one hour; older releases have compatibility gaps with Fabric and can enforce refresh limits that newer releases remove. Verifying (and if necessary updating) the gateway version is therefore the first check before investigating other causes. Pipeline parameter limits and queued refresh runs are unrelated to Dataflow Gen2 gateway issues.