Microsoft Azure Multiple Choice Questions and Answers


Microsoft Azure Multiple Choice Questions and Answers (MCQs) are practical, lab-based objective questions with answers on Azure cloud services. Most of the following questions describe a scenario and ask you to suggest the best possible solution; the rest are theory- and fact-based questions to check your knowledge. We are planning to add a Microsoft Azure MCQ PDF download link for more practice.

Q.1. Your company uses several Azure HDInsight clusters.
The data engineering team reports several errors with some applications that use these clusters.
You need to recommend a solution to review the health of the clusters.
What should you include in your recommendation?

a. Log Analytics
b. Azure Automation
c. Application Insights

Log Analytics

Q.2. You develop data engineering solutions for a company.
A project requires the deployment of resources to Microsoft Azure for batch data processing on Azure HDInsight.
Batch processing will run daily and must:
– Scale to minimize costs
– Be monitored for cluster performance
You need to recommend a tool that will monitor the cluster and provide information to suggest how to scale.
Solution: Monitor cluster load using the Ambari Web UI. Does the solution meet the goal?

a. No
b. Yes

No

Q.3. You develop data engineering solutions for a company.
A project requires the deployment of resources to Microsoft Azure for batch data processing on Azure HDInsight.
Batch processing will run daily and must:
– Scale to minimize costs
– Be monitored for cluster performance
You need to recommend a tool that will monitor the cluster and provide information to suggest how to scale.
Solution: Monitor the cluster by using Azure Log Analytics and the HDInsight cluster management solution.
Does the solution meet the goal?

a. Yes
b. No

Yes


Q.4. A company has a Microsoft Azure HDInsight solution that uses different cluster sizes to process and analyze data. Operations are continuous.
Reports indicate a slowdown during a specific time window.
You need to determine a monitoring solution to track down the issue in the least amount of time.
What should you use?

a. HDInsight .NET SDK
b. Ambari REST API
c. Azure Monitor metrics
d. Azure Log Analytics query

Ambari REST API

Q.5. You are working as a data engineer and you have to query the data, filter it, and send it to Power BI.
Which technology would you use to receive the data?

a. WebJob
b. HDInsight
c. Function app
d. Stream Analytics

HDInsight

Azure Blob Storage: 2 Questions
Q.1. You have been asked to optimize blob storage. What is the best solution to recommend?
a. Implement lifecycle management
b. Manually create a policy to review and delete blobs at the end of their lifecycle
c. Increase DWUs to optimize performance
d. Move all blobs to the hot tier to increase performance

Implement lifecycle management

Q.2. Which of the following is not an access tier in blob storage?
a. Archive
b. Hot
c. Cool
d. Glacier

Glacier

Azure SQL: 5 Questions
Q.1. Which of the following is the right option when it comes to relational data?
a. Relational data is stored in a file system as unstructured data.
b. Relational data is stored in a hierarchical folder structure.
c. Relational data is stored in a tabular form of rows and columns.
d. Relational data is stored in a comma-separated values file.

Relational data is stored in a tabular form of rows and columns.
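
To make this concrete, here is a minimal T-SQL sketch of relational (tabular) data; the table and column names (dbo.Products, ProductID, Name, Quantity) are purely illustrative, not part of the question.

-- Columns define the structure; each INSERT adds a row.
CREATE TABLE dbo.Products
(
    ProductID INT          NOT NULL PRIMARY KEY,
    Name      NVARCHAR(50) NOT NULL,
    Quantity  INT          NOT NULL
);

INSERT INTO dbo.Products (ProductID, Name, Quantity)
VALUES (1, 'Widget', 250);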

Q.2. Your company has a transactional application that stores data in an Azure SQL Managed Instance.
In which of the following circumstances would you need to implement a read-only database replica?

a. You need to generate reports without affecting the transactional workload.
b. You need to implement high availability in the event of a regional outage.
c. You need to audit the transactional application.

You need to generate reports without affecting the transactional workload.

Q.3. A relational database is appropriate for which of the following scenarios?
a. For those scenarios where there is a high volume of transactional writes.
b. For those scenarios where there is a high volume of writes that have varying data structures.
c. For those scenarios where there is a high volume of geographically distributed writes.
d. For those scenarios where there is a high volume of changes to the relationships between entities.

For those scenarios where there is a high volume of transactional writes.

Q.4. You have a product table in place.
Which statement should you use in a SQL query to change the quantity of a product to 400?

a. INSERT
b. UPDATE
c. MERGE
d. CREATE

UPDATE
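
As an illustration, using the hypothetical dbo.Products table sketched earlier, an UPDATE statement changes an existing value in place:

-- Change the quantity of an existing product to 400 (names are illustrative).
UPDATE dbo.Products
SET Quantity = 400
WHERE ProductID = 1;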

Q.5. You have to map the right term to its description.
Which of the following would you map to the following description?
“A database object that holds data”

a. Index
b. View
c. Table

Table

Azure Data Lake Storage Gen2: 5 Questions
Q.1. You develop data engineering solutions for a company. A project requires the deployment of data to Azure Data Lake Storage.
You need to implement role-based access control (RBAC) so that project members can manage the Azure Data Lake Storage resources.
Which three actions should you perform? Each correct answer presents part of the solution.
A. Configure access control lists (ACLs) for the Azure Data Lake Storage account.
B. Configure service-to-service authentication for the Data Lake Storage account.
C. Create security groups in Azure Active Directory (Azure AD) and add project members.
D. Configure end-user authentication for the Azure Data Lake Storage account.
E. Assign Azure AD security groups to Azure Data Lake Storage.

a. A, B, and E
b. A, C, and E
c. B, C, and E

A, C, and E

Q.2. You develop a data ingestion process that will import data to a Microsoft Azure SQL data warehouse. The data to be ingested resides in Parquet files stored in an Azure Data Lake Storage Gen2 account.
You need to load the data from the Azure Data Lake Storage Gen2 account into the Azure SQL data warehouse.
Solution:
1. Create an external data source pointing to the Azure storage account.
2. Create a workload group using the Azure storage account name as the pool name.
3. Load the data using the INSERT…SELECT statement.
Does the solution meet the goal?

a. No
b. Yes

No
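
The proposed solution does not meet the goal because the workload group step is not part of loading data. A typical load from Data Lake Storage Gen2 into a dedicated SQL pool uses PolyBase external objects followed by CREATE TABLE AS SELECT (CTAS); the sketch below is only an outline of that pattern, and the storage URL, credential handling, and all object names are hypothetical.

-- Illustrative outline only; the storage location and object names are placeholders,
-- and a database scoped credential (or managed identity) would normally be configured first.
CREATE EXTERNAL DATA SOURCE MyDataLake
WITH
(
    TYPE = HADOOP,
    LOCATION = 'abfss://data@mystorageaccount.dfs.core.windows.net'
);

CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

CREATE EXTERNAL TABLE dbo.ExternalSales
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH
(
    LOCATION = '/sales/',
    DATA_SOURCE = MyDataLake,
    FILE_FORMAT = ParquetFormat
);

-- Load into the warehouse with CTAS.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT SaleId, Amount
FROM dbo.ExternalSales;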

Q.3. You have an Azure SQL database that has masked columns. You need to identify when a user attempts to infer data from the masked columns.
What should you use?

a. Transparent data encryption
b. Auditing
c. Azure Advanced Threat Protection
d. Custom masking rules

Auditing
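
For context, the masked columns in this scenario would be defined with dynamic data masking, roughly as in the sketch below (the table and column names are hypothetical); Auditing then records the queries users run against those columns so you can spot inference attempts.

-- Hypothetical table and column: mask the email column with the built-in email() function.
ALTER TABLE dbo.Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');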

Q.4. You are working as a data engineer for your company and you have been given a task to create an Azure Data Lake Storage Gen2 account. Your company wants to ingest data into the storage account from various data sources.
What should you use to ingest data from local workstations?

a. Azure Data Factory
b. AzCopy Tool
c. Azure Event Hubs
d. Azure Event Grid

AzCopy Tool

Q.5. A company has created an Azure Data Lake Storage Gen2 account. They want to ingest data into the storage account from various data sources.
Which of the following can they use to ingest log data stored on web servers?

a. Azure Data Factory
b. AzCopy Tool
c. Azure Event Hubs
d. Azure Event Grid

Azure Data Factory

Azure Analysis Services: 5 Questions
Q.1. What can be the maximum size of a dimension?
a. 8 GB
b. 4 GB
c. 500 GB

4 GB

Q.2. What is a fact table in Azure Analysis Services?
a. A fact table contains the basic information that the user needs to summarize.
b. A fact table contains the attributes.
c. Factless fact tables are used for tracking and recording events.

A fact table contains the basic information that the user needs to summarize.

Q.3. The HR department in your company needs to send employee data monthly.
Which type of processing should you use?

a. Stream Analytics
b. OLAP
c. OLTP
d. Batch Processing

Batch Processing

Q.4. A data analyst prepares a line chart that tracks sales over the past year and also shows projected sales for the coming quarter.
What type of analysis is this an example of?

a. Prescriptive
b. Diagnostic
c. Predictive
d. Descriptive

Predictive

Q.5. Which Azure Stream Analytics window produces an output only when an event occurs?
a. Session window
b. Sliding window
c. Tumbling window
d. Hopping window

Sliding window
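
As an illustration, a sliding window in the Stream Analytics query language (a SQL-like dialect) emits output only when an event enters or exits the window; the input, output, and column names below are hypothetical placeholders.

-- Count events per device over a 10-second sliding window; output is produced only when events occur.
SELECT
    DeviceId,
    COUNT(*) AS EventCount
INTO [powerbi-output]
FROM [event-hub-input] TIMESTAMP BY EventTime
GROUP BY DeviceId, SlidingWindow(second, 10)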

Azure Synapse Analytics: 6 Questions
Q.1. In a modern database architecture, which service is used the most to ingest data?
a. Azure Data Factory
b. Azure Synapse Analytics
c. Azure Databricks
d. Azure Analysis Services

Azure Data Factory

Q.2. In a cloud data warehouse, which service is most commonly used for storage?
a. Data Factory
b. Data Lake
c. Data warehouse
d. Azure Analysis Services

Data Lake

Q.3. Which of the following terms refers to the unit of compute scale used in a data warehouse in Azure Synapse Analytics?
a. RTU
b. DWU
c. DTU

DWU
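
For reference, the DWU (Data Warehouse Unit) level of a dedicated SQL pool can be scaled with T-SQL; the database name and service objective below are illustrative placeholders.

-- Scale the pool to a different DWU level (run while connected to the master database; names are illustrative).
ALTER DATABASE MySqlPool
MODIFY (SERVICE_OBJECTIVE = 'DW300c');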

Q.4. You need to choose a sharding pattern for a SQL data warehouse that offers the highest query performance for large tables. Which choice offers the best solution?
a. Round robin
b. Hash
c. Replicate

Hash
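
A hash-distributed table spreads rows across distributions by hashing a chosen column, which is what gives large fact tables their query performance; the sketch below uses hypothetical table and column names.

-- Illustrative fact table distributed by hashing SaleId.
CREATE TABLE dbo.FactSales
(
    SaleId  INT            NOT NULL,
    StoreId INT            NOT NULL,
    Amount  DECIMAL(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(SaleId),
    CLUSTERED COLUMNSTORE INDEX
);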


Q.5. Acme is launching a new product. They need to implement an online storage solution that leverages a scale-out processing architecture. Acme needs to process highly complex queries against massive amounts of data. Which solution would you recommend?
a. SQL Database
b. Azure Data Lake Gen2
c. Azure Blob Storage
d. SQL Data Warehouse

SQL Data Warehouse

Q.6. You have an Azure Synapse Analytics database. Within the database, you have a dimension table named Stores that contains store information. You have a total of 263 stores nationwide. Store information is retrieved in more than half of the queries that are issued against this database. These queries include staff information per store, sales information per store, and finance information. You want to improve the performance of these queries by configuring the table geometry of the Stores table. Which is the appropriate table geometry to select for the Stores table?
a. Round robin
b. Non-clustered
c. Replicated tables

Replicated tables

Azure Data Factory: 12 Questions

Q.1. Which term isn't associated with Azure Data Factory?
a. Linked Services
b. Pipeline
c. Cell
d. Activity

Cell

Q.2. How long is monitoring data stored in Data Factory?
a. 45 days
b. 60 days
c. 15 days
d. 365 days

45 days

Q.3. You have been asked to implement an ELT process involving Data Factory, Azure Data Lake, SQL Data Warehouse, Databricks, PolyBase, and Power BI. Which order is most correct?
a. Pull data from the source system and land the data in Azure Data Lake with Data Factory. Then load it into SQL Data Warehouse with PolyBase. Next, move data with Data Factory to Databricks and then to Power BI for visualization.
b. Pull data from the source system and land the data in Data Factory with PolyBase. Then load it into SQL Data Warehouse with PolyBase. Next, move data with PolyBase to Databricks and then to Power BI for visualization.
c. Pull data from the source system and land the data in Azure Data Lake with PolyBase. Then load it into SQL Data Warehouse with Data Factory. Next, move data with PolyBase to Databricks and then to Power BI for visualization.
d. Pull data from the source system and land the data in Azure Data Lake with Data Factory. Then load it into Databricks with PolyBase. Next, move data with Data Factory to SQL Data Warehouse and then to Power BI for visualization.

Pull data from the source system and land the data in Azure Data Lake with Data Factory. Then load it into SQL Data Warehouse with PolyBase. Next, move data with Data Factory to Databricks and then to Power BI for visualization.

Q.4. Which of the following statements about batch processing is true?
a. Batch processing is the movement of blocks of data over time.
b. Data Factory is an example of a batch processing service.
c. Data format and encoding can pose a challenge to batch processing.
d. All of these statements are true.

All of these statements are true.

Q.5. What connects an Azure Data Factory activity to a dataset?
a. Linked Service
b. NIC
c. Data link
d. Pipeline

Linked Service

windows azure mcq questions

Q.6. You are working as a data engineer for your company. You have been given the task to migrate a corporate research analytical solution from an internal datacenter to Azure.
About 200 TB of research data is currently stored in an on-premises Hadoop cluster, and you have been given the task to copy it to Azure Storage. Your company's internal datacenter is connected to your Azure Virtual Network (VNet) with ExpressRoute peering, and the Azure Storage service endpoint is accessible from the same VNet.
Your company's policy is that the research data cannot be transferred over the public internet.
You have to securely migrate the research data online.
What should you do?

a. Transfer the data using Azure Data Box Heavy devices.
b. Transfer the data using Azure Data Box Disk devices.
c. Transfer the data using Azure Data Factory in distributed copy (DistCopy) mode, with an Azure Data Factory self-hosted integration runtime (IR) machine installed in the on-premises datacenter.
d. Transfer the data using Azure Data Factory in native integration runtime (IR) mode, with an Azure Data Factory self-hosted IR machine installed on the Azure VNet.

c, d

Q.7. You are working as a data engineer in your company, and your company has an Azure Synapse Analytics SQL pool. You have been given the task to create two Azure Data Factory (ADF) pipelines to load data into the SQL pool, as mentioned:
– A pipeline to migrate data from SQL Server Analysis Services (SSAS)
– A pipeline for a daily incremental load from an Azure SQL database
For each correct statement, select Yes, and vice versa.
A. You can use the SSAS data source in an ADF copy activity.
B. You can implement the incremental load from the Azure SQL database by using the change tracking feature combined with an ADF copy activity.
C. An ADF copy activity can invoke the PolyBase feature to load the Azure Synapse Analytics SQL pool.

a. A and B
b. B and C
c. A and C

B and C
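
Statement B relies on SQL change tracking; a minimal T-SQL sketch of what an incremental copy builds on is shown below, assuming a hypothetical SalesDb database and a dbo.Orders table with OrderId as its primary key.

-- Enable change tracking at the database level (names are hypothetical).
ALTER DATABASE SalesDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Enable change tracking on the table to be loaded incrementally.
ALTER TABLE dbo.Orders
ENABLE CHANGE_TRACKING;

-- The incremental pipeline then queries only the rows changed since the last synced version.
DECLARE @last_sync_version BIGINT = 0;

SELECT o.*
FROM dbo.Orders AS o
JOIN CHANGETABLE(CHANGES dbo.Orders, @last_sync_version) AS ct
    ON o.OrderId = ct.OrderId;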

Q.8. You are working with Azure Data Factory.
Sales data from two regions must be imported into an Azure Synapse Analytics SQL pool. The data is stored in two CSV files.
You have the following requirements:
– All data from the CSV files must be stored in a single destination table.
– Duplicate records must be inserted into the destination table.
You need to implement a mapping data flow to import the data.
Which data flow transformation should you use?

a. Join
b. Union
c. Aggregate
d. Lookup

Union

Q.9. You are working as a data engineer for your company, which wants to integrate their on-premises Microsoft SQL Server data with an Azure SQL database. The data should be transformed incrementally. What can be used to configure a pipeline to copy the data?
a. Make use of the AzCopy tool with blob storage as the linked service in the source.
b. Make use of Azure PowerShell with SQL Server as the linked service in the source.
c. Make use of the Azure Data Factory UI with blob storage as the linked service in the source.
d. Make use of the .NET Data Factory API with blob storage as the linked service in the source.

Make use of the Azure Data Factory UI with blob storage as the linked service in the source.

Q.10. You are working as a data engineer for a company that has an Azure Data Lake Storage account. They have decided to implement role-based access control (RBAC) so that project members can manage the Azure Data Lake Storage resources. What actions should you take?
Select 3 options from the options given below.

a. Ensure to assign Azure AD security groups to Azure Data Lake Storage.
b. Make sure to configure end-user authentication to the Azure Data Lake Storage account.
c. Make sure to configure service-to-service authentication to the Azure Data Lake Storage account.
d. Create security groups in Azure AD and then add the project members.
e. Configure access control lists for the Azure Data Lake Storage.

a, d, e

Q.11. You are working as a data engineer for your company and you have been given a task to create an Azure Data Lake Storage Gen2 account. Your company wants to ingest data into the storage account from various data sources.
What should you use to ingest data from a relational data store?

a. Azure Data Factory
b. AzCopy Tool
c. Azure Event Hubs
d. Azure Event Grid

Azure Data Factory

Q.12. You are working as a data engineer for your company and you have been given the task to pull data from an on-premises SQL Server and migrate the data to Azure Blob storage. Your company is planning to use Azure Data Factory.
What steps should you take to implement this solution?

a. Create a database master key
b. Backup the database
c. Create a new Azure Data Factory resource.

Create a new Azure Data Factory resource.
