You need to set up Azure Data Factory pipelines to meet data movement requirements.
You need to set up Azure Data Factory pipelines to meet data movement requirements. Which integration runtime should you use? A. self-hosted integration runtime B. Azure-SSIS Integration Runtime C. .NET Common Language Runtime (CLR) D. Azure integration runtime Explanation: The following table describes the capabilities and network support for each of the integration runtime types: […]
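For context on how a pipeline ends up using one runtime or another: a linked service names its integration runtime through a `connectVia` reference, and omitting it falls back to the default Azure integration runtime. A minimal sketch of that definition held as a Python dict (the runtime and linked-service names are illustrative, not from the question):

```python
import json

# Hypothetical linked service that routes traffic through a named
# integration runtime via "connectVia" (all names are illustrative).
linked_service = {
    "name": "AzureSqlLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:myserver.database.windows.net;Database=mydb;"
        },
        # Omitting "connectVia" makes Data Factory use the default
        # Azure integration runtime (AutoResolveIntegrationRuntime).
        "connectVia": {
            "referenceName": "MyAzureIR",
            "type": "IntegrationRuntimeReference",
        },
    },
}

print(json.dumps(linked_service, indent=2))
```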
Which two options should you use? Each correct answer presents part of the solution.
Overview Current environment Contoso relies on an extensive partner network for marketing, sales, and distribution. Contoso uses external companies that manufacture everything from the actual pharmaceutical to the packaging. The majority of the company’s data resides in Microsoft SQL Server databases. Application databases fall into one of the following tiers: The company has a reporting […]
You need to ensure that phone-based polling data can be analyzed in the PollingData database.
You need to ensure that phone-based polling data can be analyzed in the PollingData database. How should you configure Azure Data Factory? A. Use a tumbling schedule trigger B. Use an event-based trigger C. Use a schedule trigger D. Use manual execution Explanation: When creating a schedule trigger, you specify a schedule (start date, recurrence, […]
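A schedule trigger is defined by a recurrence (start time, frequency, interval) plus the pipelines it fires. A minimal sketch of such a trigger definition as it would appear in ADF JSON, held as a Python dict (the trigger and pipeline names are made up for illustration):

```python
import json

# Hypothetical schedule trigger: runs the ingestion pipeline hourly
# starting from a fixed UTC start time (all names are illustrative).
schedule_trigger = {
    "name": "PollingDataScheduleTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Hour",  # Minute/Hour/Day/Week/Month are valid
                "interval": 1,
                "startTime": "2024-01-01T00:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "IngestPollingData",
                }
            }
        ],
    },
}

print(json.dumps(schedule_trigger, indent=2))
```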
A company uses Azure SQL Database to store sales transaction data. Field sales employees need an offline copy
A company uses Azure SQL Database to store sales transaction data. Field sales employees need an offline copy of the database that includes last year’s sales on their laptops when there is no internet connection available. You need to create the offline export copy. Which three options can you use? Each correct answer presents a […]
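One common way to produce an offline copy like this is a BACPAC export with the SqlPackage utility, which writes schema plus data to a `.bacpac` file that can be imported into a local SQL Server instance. A sketch of assembling that command from Python (server, database, and credentials are placeholders):

```python
# Sketch: build a SqlPackage export command for a BACPAC file.
# All connection values below are placeholders, not from the question.
sqlpackage_cmd = [
    "SqlPackage",
    "/Action:Export",
    "/SourceServerName:myserver.database.windows.net",
    "/SourceDatabaseName:SalesDb",
    "/SourceUser:exportuser",
    "/SourcePassword:<password>",
    "/TargetFile:SalesDb.bacpac",
]

# subprocess.run(sqlpackage_cmd, check=True)  # would invoke the real tool
print(" ".join(sqlpackage_cmd))
```

Filtering to last year’s sales would still need a query-level step (for example, exporting a pre-filtered copy of the data), since a plain export takes the whole database.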
Each day, the company plans to store hundreds of files in Azure Blob Storage and Azure Data Lake Storage. The comp
Each day, the company plans to store hundreds of files in Azure Blob Storage and Azure Data Lake Storage. The company uses the Parquet format. You must develop a pipeline that meets the following requirements: • Process data every six hours • Offer interactive data analysis capabilities • Offer the ability to process data using solid-state […]
You need to ingest and visualize real-time Twitter data by using Microsoft Azure.
You develop data engineering solutions for a company. You need to ingest and visualize real-time Twitter data by using Microsoft Azure. Which three technologies should you use? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point. A. Event Grid topic B. Azure Stream Analytics Job that queries Twitter […]
Which tool should you use to configure a pipeline to copy data?
You develop data engineering solutions for a company. You must integrate the company’s on-premises Microsoft SQL Server data with Microsoft Azure SQL Database. Data must be transformed incrementally. You need to implement the data integration solution. Which tool should you use to configure a pipeline to copy data? A. Use the Copy Data tool with […]
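Incremental (delta) copying is usually implemented with a watermark: each run copies only rows whose modification timestamp exceeds the last recorded high-water mark, then advances the mark. A self-contained sketch of that pattern, using in-memory sqlite3 databases in place of the real SQL Server source and Azure SQL sink:

```python
import sqlite3

# In-memory stand-ins for the on-premises source and the Azure sink.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (id INTEGER, amount REAL, modified TEXT)")
src.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-02-01"), (3, 30.0, "2024-03-01")],
)

sink = sqlite3.connect(":memory:")
sink.execute("CREATE TABLE sales (id INTEGER, amount REAL, modified TEXT)")

def incremental_copy(src, sink, watermark):
    """Copy rows modified after `watermark`; return the new watermark."""
    rows = src.execute(
        "SELECT id, amount, modified FROM sales WHERE modified > ? ORDER BY modified",
        (watermark,),
    ).fetchall()
    sink.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return rows[-1][2] if rows else watermark

# First run copies the two rows after the initial mark and advances it;
# a second run with the new mark copies nothing.
mark = incremental_copy(src, sink, "2024-01-15")
count = sink.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(mark, count)  # → 2024-03-01 2
```

The Copy Data tool scaffolds the same idea declaratively; the watermark column and its storage location are the design decisions that carry over.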
You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL Data Warehouse.
You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse. The data to be ingested resides in parquet files stored in an Azure Data Lake Gen 2 storage account. You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL […]
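The usual high-throughput path for this scenario is PolyBase: define an external data source over the Data Lake account, a Parquet file format, and an external table over the files, then load in parallel with CTAS. A sketch of those T-SQL statements held as Python strings (storage account, container, paths, and column list are placeholders; a database-scoped credential may also be required depending on the authentication method):

```python
# Hedged sketch of the PolyBase load sequence for Azure SQL Data Warehouse.
# Account, container, and table names are placeholders; columns abbreviated.
polybase_steps = [
    # 1. External data source pointing at the ADLS Gen2 account.
    """CREATE EXTERNAL DATA SOURCE LakeSource
       WITH (TYPE = HADOOP,
             LOCATION = 'abfss://data@mylake.dfs.core.windows.net')""",
    # 2. File format describing the Parquet files.
    """CREATE EXTERNAL FILE FORMAT ParquetFormat
       WITH (FORMAT_TYPE = PARQUET)""",
    # 3. External table over the files (schema must match the Parquet columns).
    """CREATE EXTERNAL TABLE ext.Sales (SaleId INT, Amount DECIMAL(18, 2))
       WITH (LOCATION = '/sales/', DATA_SOURCE = LakeSource,
             FILE_FORMAT = ParquetFormat)""",
    # 4. CTAS performs the parallel load into an internal table.
    """CREATE TABLE dbo.Sales
       WITH (DISTRIBUTION = HASH(SaleId))
       AS SELECT * FROM ext.Sales""",
]

for step in polybase_steps:
    print(step.splitlines()[0])
```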