Microsoft Azure Certification - Testprep Training Tutorials

Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric FAQs

Microsoft DP-600 Exam Basic FAQ

What is the Microsoft DP-600 Exam?

The Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric measures your ability to design, create, and deploy large-scale data analytics solutions. As a candidate, you are expected to bring the following expertise:

  • In this role, your responsibilities involve converting data into reusable analytics assets using Microsoft Fabric components like Lakehouses, Data Warehouses, Notebooks, Dataflows, Data Pipelines, Semantic Models, and Reports.
  • You’ll be implementing analytics best practices within Fabric, which includes incorporating version control and ensuring proper deployment.
  • To excel as a Fabric analytics engineer, collaboration with other roles is essential. This includes working closely with Solution Architects, Data Engineers, Data Scientists, AI Engineers, and Database Administrators, as well as Power BI Data Analysts.
  • Aside from mastering the Fabric platform, hands-on experience in data modeling, data transformation, Git-based source control, exploratory analytics, and proficiency in languages such as SQL, DAX, and PySpark are also required for success in this role.
How many questions are there on the DP-600 Exam?

In the Microsoft DP-600 exam, there will be 40-60 questions.

What is the course outline for the DP-600 Exam?

The Microsoft DP-600 exam covers the following topics:

  • Plan, implement, and manage a solution for data analytics (10–15%)
  • Prepare and serve data (40–45%)
  • Implement and manage semantic models (20–25%)
  • Explore and analyze data (20–25%)

How much does the DP-600 Exam cost?

The Microsoft DP-600 exam costs $165 USD, plus applicable taxes.

In which language is the DP-600 Exam available?

This exam is available in the English language.

What accommodations are available for candidates with disabilities?

Microsoft is committed to making its exams accessible to everyone, including people with disabilities. Candidates who need accommodations can request them when registering for and scheduling the exam.

Microsoft DP-600 Exam Specifics FAQ

What types of questions are there on the Microsoft Certification exams?

Microsoft continually introduces innovative testing technologies and question types, so it does not publish the specific item types that will appear on a given exam.

Why is the case study exam format used?

The case study exam format uses complex scenarios that more accurately simulate what professionals do on the job. The scenario-based questions included in case studies are designed to test your ability to identify the critical information needed to solve a problem, and then to analyze and synthesize that information to make decisions.

Can we review the questions after the completion of the case study?

Yes, but only while you are still within that case study. After completing a case study and its associated questions, a review screen appears so that you can review and change your answers before moving on; once you move to the next case study or section of the exam, you cannot return.

What is a short answer question in the Microsoft exam?

Short answer questions ask you to solve a problem by entering a few lines of code in a text-entry field. You can choose from a list of keyword options to use in the code you write, and you can check your syntax after entering it.

Is there any negative marking for the wrong answer?

No, you are not penalized for answering incorrectly; you simply do not earn points for the incorrect parts of your response. For single-point items, you must answer completely and correctly to be awarded the point.

Can I review all of my answers before leaving a section or completing the exam?

Yes, you can review your answers to most questions. The exception is the series of yes/no questions that each describe a problem and a possible solution: you cannot review your answers to these questions, and after you move to the next question in the set, you cannot change your answer. These questions are preceded by an overview screen that explains this, and each question includes a reminder that you cannot return to it or change your answer after leaving it.

Microsoft Exam Scoring and Results FAQ

When and how will I get my exam results?

Within a few minutes of finishing the exam, you will be notified if you passed or failed. You will also receive a printed report that includes your exam score as well as feedback on your performance in the skill areas assessed.

What does the score report look like?

The score report provides a numeric score for overall exam performance, pass/fail status, a bar chart showing performance on each skill area assessed on the exam, and details on how to interpret your results and next steps. Using this information, candidates can determine areas of strength and weakness.

Does the score report show a numerical score for each section?

No. The score report shows an overall numerical score and a pass/fail status, but it does not give a numerical score for each section. Instead, score bars illustrate your areas of strength and weakness in each topic area.

How are exam scores calculated?

After you complete your exam, the points you earn on each question are summed and compared with the cut score to determine whether you pass or fail.

If I receive the same score every time I retake the same exam, does this imply an error in the computation of the results?

No. Receiving the same score on multiple attempts does not indicate that the program computing the results is in error. It is not uncommon for candidates to obtain similar or identical scores on multiple attempts of an exam. This consistent result demonstrates the reliability of the exam in evaluating skills in this content domain. If this happens on multiple attempts, you may want to reconsider how you’re preparing for the exam and seek other opportunities to learn and practice the skills measured by the exam.

I passed my first Microsoft Certification exam (at Pearson VUE). Now what do I do?

To explore the next steps and available benefits, see your benefits and exams dashboard. Sign in using the same Microsoft account you used to register for your exam.

If I do not pass, what can I do?

Prioritize the skills that you should practice by focusing on the content areas where your exam performance was the weakest and in the content areas that have the highest percentage of questions. Additionally, you may want to review the resources on the exam details page and our Study Groups. For this, check the bottom of the individual exam details page. When you are ready to retake the exam, schedule an appointment as you normally would. Note that you must pay for each exam you retake and follow Microsoft’s retake policy.

If I do not pass an exam, can I have a refund?

No. Microsoft does not offer refunds for exams you do not pass or exam appointments you miss.

For more information, check the Microsoft Exam Policies.


Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric

As a candidate preparing for the Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric, it’s crucial to possess expertise in designing, creating, and deploying large-scale data analytics solutions. In this role, your responsibilities involve converting data into reusable analytics assets using Microsoft Fabric components like Lakehouses, Data Warehouses, Notebooks, Dataflows, Data Pipelines, Semantic Models, and Reports. You’ll be implementing analytics best practices within Fabric, which includes incorporating version control and ensuring proper deployment.

Knowledge required:

  • To excel as a Fabric analytics engineer, collaboration with other roles is essential. This includes working closely with Solution Architects, Data Engineers, Data Scientists, AI Engineers, and Database Administrators, as well as Power BI Data Analysts.
  • Aside from mastering the Fabric platform, hands-on experience in data modeling, data transformation, Git-based source control, exploratory analytics, and proficiency in languages such as SQL, DAX, and PySpark are also required for success in this role.

Exam Details


Successfully passing the Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric qualifies you for the Microsoft Certified: Fabric Analytics Engineer Associate certification. The exam is conducted in English, comprises 40-60 questions, requires a passing score of 700 (on a scale of 1-1000), and the registration fee is $165 USD.

Course Outline

Preparing for the exam requires a solid understanding of the course outline, which serves as a guide to the essential skills and knowledge. Reviewing the exam curriculum helps ensure a comprehensive grasp of the subjects at hand. Let's now examine the key areas covered in the DP-600 exam.


1. Understand planning, implementing, and managing a solution for data analytics (10–15%)

Planning a data analytics environment

Implementing and managing a data analytics environment

Managing the analytics development lifecycle

2. Learn how to prepare and serve data (40–45%)

Creating objects in a lakehouse or warehouse

Copying data

Transforming data

Optimizing performance


3. Understand implementing and managing semantic models (20–25%)

Designing and building semantic models

Optimizing enterprise-scale semantic models

4. Understand how to explore and analyze data (20–25%)

Performing exploratory analytics

  • Implementing descriptive and diagnostic analytics
  • Integrating prescriptive and predictive analytics into a visual or report
  • Profiling data (Microsoft Documentation: Using the data profiling tools)

Querying data by using SQL

  • Querying a lakehouse in Fabric by using SQL queries or the visual query editor (Microsoft Documentation: Query using the visual query editor)
  • Querying a warehouse in Fabric by using SQL queries or the visual query editor (Microsoft Documentation: Query using the SQL query editor)
  • Connecting to and querying datasets by using the XMLA endpoint
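To make the first bullet above concrete, the following is a minimal, hypothetical sketch of running a T-SQL query against the SQL analytics endpoint of a Fabric lakehouse or warehouse from PowerShell. The endpoint, warehouse name, and table are placeholders rather than values from this tutorial, and the sketch assumes the SqlServer and Az.Accounts PowerShell modules are installed and that Azure AD token authentication is acceptable in your environment.

```powershell
# Minimal sketch: run a T-SQL query against a Fabric SQL analytics endpoint.
# Assumes the SqlServer module (Install-Module SqlServer) and Az.Accounts are installed.
Import-Module SqlServer

$endpoint  = "<your-workspace>.datawarehouse.fabric.microsoft.com"   # placeholder SQL connection string
$warehouse = "SalesWarehouse"                                        # placeholder warehouse or lakehouse name

# Sign in and acquire an Azure AD access token accepted by the SQL endpoint.
Connect-AzAccount | Out-Null
$token = (Get-AzAccessToken -ResourceUrl "https://database.windows.net/").Token

# Run a simple exploratory query; the table name is illustrative only.
Invoke-Sqlcmd -ServerInstance $endpoint `
              -Database $warehouse `
              -AccessToken $token `
              -Query "SELECT TOP (10) * FROM dbo.FactSales;"
```

In practice, and on the exam, you are just as likely to write the T-SQL directly in the Fabric SQL query editor or to use the visual query editor, as noted in the bullets above.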

Microsoft DP-600 Exam FAQs

Check FAQs here!


Exam Policies

All the details about the exam, including its procedures, can be found in the Microsoft Certification exam policies. It’s crucial to follow these guidelines both during the exam and when you’re at the test center. Let’s take a closer look at some of these rules:

Retaking the Exam: If you don't pass on your first attempt, you must wait 24 hours before retaking the exam; during this time, you can choose a new exam date on the certification dashboard. A 14-day waiting period applies before the third, fourth, and fifth attempts. You are allowed up to five attempts per 12-month period, which begins with your first attempt.

Changing Exam Date or Cancelling: If you need to reschedule or cancel your exam, do so at least 24 hours before your scheduled time; changes made within 24 hours forfeit the exam fee. Additionally, if your company provided a voucher for the exam, it may incur penalties if you make changes or cancellations with less than 24 hours' notice.

Study Guide for Microsoft DP-600 Exam


1. Understanding Exam Goals

To initiate your preparation for the Microsoft DP-600 exam, it’s crucial to comprehend the exam objectives. These goals delve into fundamental topics that form the core of what you need to know. The exam assesses your technical skills in accomplishing specific tasks:

  • Planning, implementing, and managing a solution for data analytics
  • Preparing and serving data
  • Implementing and managing semantic models
  • Exploring and analyzing data

2. Microsoft Learning Paths

Microsoft offers distinct learning paths equipped with study modules to prepare you for your exams. For a comprehensive guide and study resources for the DP-600 test, visit the official Microsoft website. The modules in this course not only enhance your understanding of the subjects but also ensure your success in the exams. Here’s what the learning path for the test includes:

– Ingest data with Microsoft Fabric

For more: https://learn.microsoft.com/en-us/training/paths/ingest-data-with-microsoft-fabric/

Discover how Microsoft Fabric empowers you to gather and organize data from different sources, including files, databases, or web services, using dataflows, notebooks, and pipelines.

Modules in this learning path:

  • Ingesting Data with Dataflows Gen2 in Microsoft Fabric
  • Ingesting data with Spark and Microsoft Fabric notebooks
  • Using Data Factory pipelines in Microsoft Fabric

– Implementing a Lakehouse with Microsoft Fabric

For more: https://learn.microsoft.com/en-us/training/paths/implement-lakehouse-microsoft-fabric/

This learning path helps you understand the basic components of implementing a data lakehouse with Microsoft Fabric.

Modules in this learning path:

  • End-to-end analytics using Microsoft Fabric
  • Lakehouses in Microsoft Fabric
  • Using Apache Spark in Microsoft Fabric
  • Working with Delta Lake tables in Microsoft Fabric
  • Ingesting Data with Dataflows Gen2 in Microsoft Fabric
  • Using Data Factory pipelines in Microsoft Fabric
  • Organizing a Fabric lakehouse using medallion architecture design

– Working with data warehouses using Microsoft Fabric

For more: https://learn.microsoft.com/en-us/training/paths/work-with-data-warehouses-using-microsoft-fabric/

Get familiar with the data warehousing process and learn how to load, monitor, and query a warehouse in Microsoft Fabric.

Modules in this learning path:

  • Data warehouses in Microsoft Fabric
  • Loading data into a Microsoft Fabric data warehouse
  • Querying a data warehouse in Microsoft Fabric
  • Monitoring a Microsoft Fabric data warehouse

– Working with semantic models in Microsoft Fabric

For more: https://learn.microsoft.com/en-us/training/paths/work-semantic-models-microsoft-fabric/

Creating reports for large-scale businesses involves more than just linking to data. Success in enterprise-level implementation requires a grasp of semantic models and effective strategies for scalability and optimization.

Modules in this learning path:

  • Understanding scalability in Power BI
  • Creating Power BI model relationships
  • Using tools to optimize Power BI performance
  • Enforcing Power BI model security

– Designing and building tabular models

For more: https://learn.microsoft.com/en-us/training/paths/design-build-tabular-models/

This learning path helps you get familiar with the foundational components of designing scalable tabular models using Power BI.

Modules in this learning path:

  • Creating Power BI model relationships
  • Using DAX time intelligence functions in Power BI Desktop models
  • Creating calculation groups
  • Enforcing Power BI model security
  • Using tools to optimize Power BI performance

– Managing the analytics development lifecycle

For more: https://learn.microsoft.com/en-us/training/paths/manage-analytics-development-lifecycle/

This learning path provides an understanding of the basic components of implementing lifecycle management techniques for Power BI assets.

Modules in this learning path:

  • Designing a Power BI application lifecycle management strategy
  • Creating and managing a Power BI deployment pipeline
  • Creating and managing Power BI assets

3. Participate in Study Communities

Making exam preparations becomes much smoother when you become part of online study groups. These communities connect you with experienced individuals who have faced similar challenges. It’s a chance to discuss any concerns you may have about the test and get ready for the DP-600 exam. So, it’s more than just studying; it’s learning from those who have already walked the path.

4. Use Practice Tests

Practice tests play a vital role in reinforcing your understanding of the study material. When you engage with Microsoft DP-600 practice exams, you can pinpoint your strengths and areas that require more attention. It’s like getting a sneak peek into your study progress. Moreover, these tests improve your speed in answering questions, providing a significant advantage on the actual exam day. Once you’ve covered a substantial amount of material, incorporating these practice tests for the exam is a wise decision. It’s not just about practicing; it’s about maximizing the effectiveness of your study time.


Microsoft Configuring Windows Server Hybrid Advanced Services: AZ-801 Sample Questions

Candidates who can configure advanced Windows Server services on-premises, in hybrid settings, and in the cloud should take the Microsoft AZ-801 test. On-premises and hybrid solutions should be implemented and managed by these professionals, who should also be able to perform tasks like security, migration, monitoring, high availability, troubleshooting, and disaster recovery. Some of the administrative tools and technologies they use include Windows Admin Center, PowerShell, Azure Arc, Azure Automation Update Management, Microsoft Defender for Identity, Azure Security Center, Azure Migrate, and Azure Monitor. The article provides a list of Microsoft Configuring Windows Server Hybrid Advanced Services: AZ-801 Sample Questions that cover core exam topics including –

  • Secure Windows Server on-premises and hybrid infrastructures (25–30%)
  • Implement and manage Windows Server high availability (10–15%)
  •  Implement disaster recovery (10–15%)
  • Migrate servers and workloads (20–25%)
  • Monitor and troubleshoot Windows Server environments (20–25%)

Q1) You have a server named Server1 that runs Windows Server. You must ensure that only specific apps on Server1 can access the data in protected folders. Solution: You configure Controlled folder access under Virus & threat protection. Does this meet the objective?

  • A. Yes
  • B. No

Correct Answer: A
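For context on why the solution in Q1 meets the objective: Controlled folder access, found under Virus & threat protection, blocks untrusted apps from changing files in protected folders and lets you allow specific apps. A minimal sketch using the built-in Defender cmdlets follows; the folder and application paths are placeholders, not part of the original question.

```powershell
# Turn on Controlled folder access (part of Virus & threat protection).
Set-MpPreference -EnableControlledFolderAccess Enabled

# Optionally protect an additional folder and allow a specific app to modify it.
# Both paths below are illustrative placeholders.
Add-MpPreference -ControlledFolderAccessProtectedFolders "D:\ProtectedData"
Add-MpPreference -ControlledFolderAccessAllowedApplications "C:\Apps\LobApp.exe"
```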

Q2) You have a server named Server1 that runs Windows Server. You must ensure that only specific apps on Server1 can access the data in protected folders. Solution: You configure Tamper Protection under Virus & threat protection. Does this meet the objective?

  • A. Yes
  • B. No

Correct Answer: B

Q3) You have a server named Server1 that runs Windows Server. You must ensure that only specific apps on Server1 can access the data in protected folders. Solution: You configure the Exploit protection settings under App & browser control. Does this meet the objective?

  • A. Yes
  • B. No

Correct Answer: B

Q4) You have an Azure virtual machine named VM1 that runs Windows Server. You intend to install a new line-of-business (LOB) application on VM1. You must ensure that the application can create child processes. Which setting should you configure on VM1?

  • A. Microsoft Defender Credential Guard
  • B. Microsoft Defender Application Control
  • C. Microsoft Defender SmartScreen
  • D. Exploit protection

Correct Answer: D

Q5) You have 100 Azure virtual machines that run Windows Server and are protected by Microsoft Defender for Cloud. If Microsoft Defender for Cloud raises the alert "Antimalware deactivated in the virtual machine," the affected virtual machine must be shut down immediately. Which Microsoft Defender for Cloud capability should you use?

  • A. a logic app
  • B. a workbook
  • C. a security policy
  • D. adaptive network hardening

Correct Answer: A

Q6) You have a Microsoft Sentinel deployment and 100 on-premises servers that are enabled for Azure Arc. All of the Azure Arc-enabled resources are in the same resource group. You must onboard the servers to Microsoft Sentinel, and the solution must minimize administrative effort. What should you use to onboard the servers to Microsoft Sentinel?

  • A. Azure Automation
  • B. Azure Policy
  • C. Azure virtual machine extensions
  • D. Microsoft Defender for Cloud

Correct Answer: B

Q7) You have an Azure Active Directory (Azure AD) tenant that is synced with an on-premises Active Directory Domain Services (AD DS) domain by using password hash synchronization. You have a Microsoft 365 subscription. All devices are hybrid Azure AD joined. Users complain that they must enter their password manually to access Microsoft 365 applications. You need to reduce the number of times users must enter their password when accessing Microsoft 365 and Azure services. What should you do?

  • A. Create a Conditional Access policy in Azure AD for the Office 365 applications.
  • B. Add an autodiscover record to the AD DS domain’s DNS zone.
  • C. Turn on single sign-on in Azure AD Connect (SSO).
  • D. Set up pass-through authentication with Azure AD Connect.

Correct Answer: C

Q8) You have an active Microsoft Defender for Cloud subscription and 50 Azure virtual machines that run Windows Server. You must ensure that Defender for Cloud is notified of any security vulnerabilities found on the virtual machines. Which extension should you enable on the virtual machines?

  • A. Vulnerability assessment for machines
  • B. Microsoft Dependency agent
  • C. Log Analytics agent for Azure VMs
  • D. Guest Configuration agent

Correct Answer: A

Q9) You have 10 servers that run Windows Server in a workgroup. You must configure the servers to encrypt all network traffic between them, and the solution must be as secure as possible. Which authentication method should you configure in a connection security rule?

  • A. NTLMv2
  • B. pre-shared key
  • C. Kerberos V5
  • D. computer certificate

Correct Answer: D

Q10) You have an Azure virtual machine named VM1 that runs Windows Server. You must encrypt the data on the disks attached to VM1 by using Azure Disk Encryption. What is a prerequisite for implementing Azure Disk Encryption?

  • A. Customer Lockbox for Microsoft Azure
  • B. an Azure key vault
  • C. a BitLocker recovery key
  • D. data-link layer encryption in Azure

Correct Answer: B
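To see why a key vault (option B) is the prerequisite: Azure Disk Encryption stores its encryption keys and secrets in an Azure key vault that has been enabled for disk encryption. The following hedged sketch uses placeholder resource names.

```powershell
# Placeholder names; the key vault must be in the same subscription and region as VM1.
$rg = "rg-demo"
$kv = New-AzKeyVault -Name "kv-ade-demo" -ResourceGroupName $rg -Location "eastus" -EnabledForDiskEncryption

# Enable Azure Disk Encryption on VM1, storing keys and secrets in the key vault.
Set-AzVMDiskEncryptionExtension -ResourceGroupName $rg `
                                -VMName "VM1" `
                                -DiskEncryptionKeyVaultUrl $kv.VaultUri `
                                -DiskEncryptionKeyVaultId $kv.ResourceId
```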

Q11) Your network contains an Active Directory Domain Services (AD DS) domain. The domain contains two servers named Server1 and Server2 that run Windows Server. You must ensure that Server2 can be managed remotely by using the Computer Management console, and the solution must follow the principle of least privilege. Which two Windows Defender Firewall with Advanced Security rules should you enable on Server2? Each correct answer presents part of the solution.

  • A. the COM+ Network Access (DCOM-In) rule
  • B. all the rules in the Remote Event Log Management group
  • C. the Windows Management Instrumentation (WMI-In) rule
  • D. the COM+ Remote Administration (DCOM-In) rule
  • E. the Windows Management Instrumentation (DCOM-In) rule

Correct Answer: A and B

Q12) You have a server named Server1 that runs Windows Server. Server1 is configured to encrypt all incoming traffic by using a connection security rule. You must ensure that Server1 can respond to unencrypted tracert requests sent by devices on the same network. What should you do from Windows Defender Firewall with Advanced Security?

  • A. From the IPsec Settings, configure IPsec defaults.
  • B. Create a new custom outbound rule that allows ICMPv4 protocol connections for all profiles.
  • C. Change the Firewall state of the Private profile to Off.
  • D. From the IPsec Settings, configure IPsec exemptions.

Correct Answer: D

Q13) You have an Azure virtual machine named VM1. You enable Microsoft Defender SmartScreen on VM1. You must ensure that the SmartScreen messages shown to users are logged. What should you do?

  • A. From a command prompt, run WinRM quickconfig.
  • B. Change the Advanced Audit Policy Configuration settings from the local Group Policy.
  • C. In Event Viewer, switch on the Debug log feature.
  • D. Modify the Virus & threat protection settings in the Windows Security app.

Correct Answer: C

Q14) You have a failover cluster named Cluster1 that is configured as follows: it has six nodes, dynamic quorum management is enabled, and it uses a file share witness with dynamic witness enabled. What is the maximum number of nodes that can fail simultaneously while the cluster maintains quorum?

  • A. 1
  • B. 2
  • C. 3
  • D. 4
  • E. 5

Correct Answer: C
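The arithmetic behind answer C: six nodes plus a file share witness give seven votes, a majority requires four votes, so up to three nodes can fail at the same time while quorum is maintained. If you want to inspect how votes are currently assigned on a cluster, the FailoverClusters PowerShell module can show the quorum configuration and per-node weights; this is a general-purpose sketch, not something the question requires.

```powershell
# Show the current quorum configuration (witness type and witness resource).
Get-ClusterQuorum

# Show per-node votes; DynamicWeight reflects dynamic quorum adjustments.
Get-ClusterNode | Select-Object Name, State, NodeWeight, DynamicWeight
```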

Q15) Your company uses Storage Spaces Direct. You must view the available storage in a Storage Spaces Direct storage pool. What should you use?

  • A. System Configuration
  • B. File Server Resource Manager (FSRM)
  • C. the Get-StorageFileServer cmdlet
  • D. Failover Cluster Manager

Correct Answer: D

Q16) You have two Azure virtual machines that run Windows Server. You plan to create a failover cluster that will contain the two virtual machines. You must configure an Azure Storage account that will serve as the cloud witness for the cluster, and the solution must maximize resiliency. Which redundancy option should you use for the storage account?

  • A. Geo-zone-redundant storage (GZRS)
  • B. Locally-redundant storage (LRS)
  • C. Zone-redundant storage (ZRS)
  • D. Geo-redundant storage (GRS)

Correct Answer: C
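Once the storage account exists, configuring it as the cluster's cloud witness takes a single cmdlet. A minimal sketch follows; the account name is a placeholder and the access key must come from your own storage account.

```powershell
# Configure a cloud witness for the failover cluster.
# Replace the account name and access key with values from your own storage account.
Set-ClusterQuorum -CloudWitness -AccountName "witnessstorage001" -AccessKey "<storage-account-key>"
```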

Q17) You have a three-node failover cluster. You must run pre-update and post-update scripts before and after Cluster-Aware Updating (CAU) runs, and the solution must minimize administrative effort. What should you use?

  • A. Azure Functions
  • B. Run profiles
  • C. Windows Server Update Services (WSUS)
  • D. Scheduled tasks

Correct Answer: B

Q18) You have two servers named Server1 and Server2 that run Windows Server and have the Hyper-V server role installed. Server1 hosts three virtual machines named VM1, VM2, and VM3, which replicate to Server2. Server1 experiences a hardware failure. You must bring VM1, VM2, and VM3 back online as quickly as possible. Which action should you perform for each virtual machine from the Hyper-V Manager console on Server2?

  • A. Start
  • B. Move
  • C. Unplanned Failover
  • D. Planned Failover

Correct Answer: C
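The same unplanned failover can be scripted from PowerShell on Server2 (the Replica server) instead of using Hyper-V Manager. A hedged sketch, assuming the replica virtual machines carry the names from the question:

```powershell
# Run on Server2 (the Replica server) after Server1 has failed.
foreach ($vm in "VM1", "VM2", "VM3") {
    # Fail over to the most recent replicated recovery point.
    Start-VMFailover -VMName $vm
    # Bring the failed-over virtual machine online.
    Start-VM -Name $vm
}
```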

Q19) You have three Azure virtual machines named VM1, VM2, and VM3 that host a multitier application. You plan to use Azure Site Recovery. You must ensure that VM1, VM2, and VM3 fail over together. What should you configure?

  • A. an availability zone
  • B. a recovery plan
  • C. an availability set

Correct Answer: B

Q20) You have an on-premises server named Server1 that runs Windows Server and has the Hyper-V server role installed. You have an Azure subscription. You intend to use Azure Backup to back up Server1 to the cloud. Which two Azure Backup options require Microsoft Azure Backup Server (MABS) to be installed? Each correct answer presents a complete solution.

  • A. Bare Metal Recovery
  • B. Files and folders
  • C. System State
  • D. Hyper-V Virtual Machines

Correct Answer: A and C


Microsoft Azure: AZ-140 Sample Questions

Microsoft provides the exam AZ-140: Configuring and Operating Windows Virtual Desktop on Microsoft Azure. The AZ-140 Exam assesses a candidate’s ability to perform technical tasks such as planning a Windows Virtual Desktop architecture, managing access and security, managing user environments and apps, implementing a Windows Virtual Desktop infrastructure, and monitoring and maintaining a Windows Virtual Desktop infrastructure. Candidates for this exam have subject matter expertise in planning, delivering, and managing virtual desktop experiences and remote apps on Azure for any device.

Exam AZ-140: Configuring and Operating Windows Virtual Desktop on Microsoft Azure consists of 40 to 60 multiple-choice and multi-response questions. Candidates have 120 minutes to finish the exam. The exam is available only in English, and a passing score of 700 (on a scale of 1-1000) is required to obtain the certification.


Advanced Sample Questions

What is the main purpose of the Azure Active Directory (AD) in Microsoft Azure?

  • a) To provide a central location for managing users and applications
  • b) To provide a secure and scalable infrastructure for running Windows-based applications
  • c) To provide a platform for developing and deploying web-based applications
  • d) To provide a centralized storage for all your files and documents

Answer: a

Explanation: Azure AD provides a central location for managing users and applications in Microsoft Azure. It helps to secure access to resources and provides a single sign-on experience for users.

What is the Azure Storage Account used for in Microsoft Azure?

  • a) To host virtual machines
  • b) To store backups of your virtual machines
  • c) To store unstructured data such as blobs, files, queues, and tables
  • d) To store structured data in a relational database

Answer: c

Explanation: Azure Storage Account is used to store unstructured data such as blobs, files, queues, and tables in Microsoft Azure. It provides scalable and highly available storage solutions that can be used for a variety of purposes.

What is the Azure Resource Manager used for in Microsoft Azure?

  • a) To manage and monitor resources within your Azure subscription
  • b) To manage and monitor virtual machines
  • c) To manage and monitor storage accounts
  • d) To manage and monitor network security

Answer: a

Explanation: Azure Resource Manager is used to manage and monitor resources within your Azure subscription. It helps you to deploy, manage, and monitor resources in a consistent and predictable manner, making it easier to manage your resources in Microsoft Azure.
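As a small, hedged illustration of working with Azure Resource Manager from PowerShell: resources are deployed into resource groups, optionally from an ARM template. The resource group name, region, and template path below are placeholders.

```powershell
# Create a resource group, then deploy resources into it from an ARM template.
New-AzResourceGroup -Name "rg-demo" -Location "eastus"

New-AzResourceGroupDeployment -ResourceGroupName "rg-demo" `
                              -TemplateFile ".\azuredeploy.json"   # placeholder template path
```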

What is the Azure Virtual Network used for in Microsoft Azure?

  • a) To host virtual machines
  • b) To store backups of your virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your files and documents

Answer: c

Explanation: Azure Virtual Network is used to provide a secure and scalable infrastructure for running applications in Microsoft Azure. It enables you to isolate your applications from the public internet and communicate between virtual machines securely.

What is the Azure Load Balancer used for in Microsoft Azure?

  • a) To distribute incoming traffic across multiple virtual machines
  • b) To store backups of your virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your files and documents

Answer: a

Explanation: Azure Load Balancer is used to distribute incoming traffic across multiple virtual machines in Microsoft Azure. It helps to ensure high availability and performance for your applications by distributing incoming traffic evenly across multiple virtual machines.

What is the Azure ExpressRoute used for in Microsoft Azure?

  • a) To connect your on-premises infrastructure to Microsoft Azure
  • b) To store backups of your virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your files and documents

Answer: a

Explanation: Azure ExpressRoute is used to connect your on-premises infrastructure to Microsoft Azure. It provides a dedicated and secure connection between your on-premises infrastructure and Microsoft Azure, enabling you to use Microsoft Azure as an extension of your own data center.

What is the Azure Storage Account used for in Microsoft Azure?

  • a) To store backups of your virtual machines
  • b) To host virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your data and files

Answer: d

Explanation: Azure Storage Account is used to provide a centralized storage for all your data and files in Microsoft Azure. It offers various storage options, including blobs, files, queues, and tables, to meet your storage needs for your applications and data.

What is the Azure Automation Account used for in Microsoft Azure?

  • a) To automate repetitive tasks and processes in Microsoft Azure
  • b) To store backups of your virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your data and files

Answer: a

Explanation: Azure Automation Account is used to automate repetitive tasks and processes in Microsoft Azure. It helps to simplify and streamline your IT operations by automating routine tasks and processes, such as deployments, configuration management, and monitoring.

What is the Azure Backup used for in Microsoft Azure?

  • a) To store backups of your virtual machines
  • b) To host virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your data and files

Answer: a

Explanation: Azure Backup is used to store backups of your virtual machines in Microsoft Azure. It provides a secure and scalable solution for backing up your virtual machines, ensuring the protection of your data and applications in the event of a disaster or other data loss scenarios.

What is the Azure DevOps used for in Microsoft Azure?

  • a) To manage software development life cycle
  • b) To store backups of your virtual machines
  • c) To provide a secure and scalable infrastructure for running applications
  • d) To provide a centralized storage for all your data and files

Answer: a

Explanation: Azure DevOps is used to manage the software development life cycle in Microsoft Azure. It provides a suite of tools and services that help developers to plan, develop, test, and deliver software efficiently and effectively. Azure DevOps helps teams to collaborate and streamline their development process, resulting in faster delivery of high-quality software.

Basic Sample Questions


Question 1 –

You have a contoso.com Azure Active Directory (Azure AD) tenant and a VNET1 Azure virtual network.
Further, you add an Azure Active Directory Domain Services (Azure AD DS) managed domain named litwareinc.com to VNET1.
You intend to deploy a Pool1 Azure Virtual Desktop host pool to VNET1.
You must ensure that Windows 10 Enterprise host pools can be deployed to Pool1. What should you start with?

  • A. Modify the settings of the litwareinc.com DNS zone.
  • B. Modify the DNS settings of VNET1.
  • C. Add a custom domain name to contoso.com.
  • D. Implement Azure AD Connect cloud sync.

Correct Answer – B 
Reference:
https://docs.microsoft.com/en-us/azure/active-directory-domain-services/tutorial-create-instance

Question 2 –

You intend to implement Azure Virtual Desktop. Existing virtual machines will be used in the deployment.
You create a host pool for Azure Virtual Desktop.
You must ensure that the virtual machines can be added to the host pool.
What should you start with?

  • A. Register the Microsoft.DesktopVirtualization provider.
  • B. Generate a registration key.
  • C. Run the Invoke-AzVMRunCommand cmdlet.
  • D. Create a role assignment.

Correct Answer –B 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/create-host-pools-azure-marketplace
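For reference, generating the registration key (option B) can also be done with the Az.DesktopVirtualization PowerShell module; the existing virtual machines then present this token when the Azure Virtual Desktop agent is installed on them. This is a hedged sketch with placeholder resource names.

```powershell
# Generate a registration token for the host pool, valid for 24 hours.
New-AzWvdRegistrationInfo -ResourceGroupName "rg-avd" `
                          -HostPoolName "HostPool1" `
                          -ExpirationTime (Get-Date).ToUniversalTime().AddHours(24).ToString("yyyy-MM-ddTHH:mm:ssZ")

# Retrieve the token later if it is needed again.
(Get-AzWvdRegistrationInfo -ResourceGroupName "rg-avd" -HostPoolName "HostPool1").Token
```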

Question 3 –

You’re putting together an Azure Virtual Desktop deployment.
You determine the network latency between the user locations and the planned deployment.
What method should you use to determine the best Azure region in which to deploy the host pool?

  • A. Azure Traffic Manager
  • B. Azure Virtual Desktop Experience Estimator
  • C. Azure Monitor for Azure Virtual Desktop
  • D. Azure Advisor

Correct Answer –B 
Reference:
https://azure.microsoft.com/en-gb/services/virtual-desktop/assessment/

Question 4 –

Your business has 60,000 customers.
You intend to implement Azure Virtual Desktop.
You must suggest a storage solution for FSLogix profile containers. The solution must provide the highest IOPS and lowest latency desktop experience possible.
What would you suggest?

  • A. Azure Data Lake Storage
  • B. Azure NetApp Files
  • C. Azure Blob Storage Premium
  • D. Azure Files Standard

Correct Answer –B 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/store-fslogix-profile

Question 5 –

You have five session hosts in your Azure Virtual Desktop host pool. Windows 10 Enterprise multi-session is used by the session hosts. You must prevent users from accessing the internet while using Azure Virtual Desktop. All required Microsoft services must be accessible to the session hosts.
Solution: You configure the host pool’s RDP Properties.
Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –B

Question 6 –

Pool1 is the name of the Azure Virtual Desktop host pool that you deploy.
You have a store1 Azure Storage account that stores FSLogix profile containers in a share called profiles.
The path to the storage containers for the session hosts must be configured.
Which route should you take?

  • A. \\store1.blob.core.windows.net\profiles
  • B. https://store1.file.core.windows.net/profiles
  • C. \\store1.file.core.windows.net\profiles
  • D. https://store1.blob.core.windows.net/profiles

Correct Answer –C 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/create-profile-container-adds
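For context, the UNC path in answer C is the value you would point the FSLogix VHDLocations setting at on each session host. The following is a minimal registry-based sketch (the same settings can be pushed through Group Policy or Intune); the storage account and share names come from the question, everything else is a placeholder.

```powershell
# Configure FSLogix profile containers on a session host (run elevated).
$regPath = "HKLM:\SOFTWARE\FSLogix\Profiles"
New-Item -Path $regPath -Force | Out-Null

# Enable profile containers and point them at the Azure Files share.
New-ItemProperty -Path $regPath -Name "Enabled" -PropertyType DWord -Value 1 -Force | Out-Null
New-ItemProperty -Path $regPath -Name "VHDLocations" -PropertyType MultiString `
                 -Value @("\\store1.file.core.windows.net\profiles") -Force | Out-Null
```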

Question 7 –

You intend to deploy Azure Virtual Desktop session host virtual machines based on a master image that has already been configured. The master image will be kept in a public image gallery.
To serve as the master image, you create a virtual machine called Image1. You configure Image1 by installing applications and making configuration changes.
You must ensure that the new session host virtual machines created using Image1 have distinct names and security identifiers.

What should you do with Image1 before uploading it to the shared image gallery?

  • A. At a command prompt, run the set computername command.
  • B. At a command prompt, run the sysprep command.
  • C. From PowerShell, run the rename-computer cmdlet.
  • D. From the lock screen of the Windows device, perform a Windows Autopilot Reset.

 Correct Answer –B 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/prepare-for-upload-vhd-image#determine-when-to-use-sysprep
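As a reminder of what generalizing with Sysprep (option B) looks like in practice, you would run something like the following on Image1 before capturing it; the switches shown are the ones commonly used when preparing a Windows image for upload.

```powershell
# Generalize the VM so machines created from the image receive unique names and
# security identifiers (SIDs); the VM shuts down when Sysprep completes.
& "$env:SystemRoot\System32\Sysprep\Sysprep.exe" /generalize /oobe /shutdown
```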

Question 8 –

You will NOT be able to return to this section once you have answered a question in it. As a result, these questions will be missing from the review screen.
You have five session hosts in your Azure Virtual Desktop host pool. Windows 10 Enterprise multi-session is used by the session hosts.
You must prevent users from accessing the internet while using Azure Virtual Desktop. All required Microsoft services must be accessible to the session hosts.
Solution: You configure rules in the network security group (NSG) that is linked to the session hosts’ subnet.
Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –A 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-network/tutorial-filter-network-traffic
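To make the NSG-based solution concrete, here is a hedged sketch of outbound rules that allow traffic to the Azure Virtual Desktop service tag while denying general internet access. The NSG and resource group names are placeholders, and a production deployment would also need rules for the other destinations that session hosts require (for example Azure Monitor and the documented required URLs).

```powershell
$nsg = Get-AzNetworkSecurityGroup -Name "nsg-avd-hosts" -ResourceGroupName "rg-avd"

# Allow outbound traffic to the Azure Virtual Desktop service tag.
$nsg = $nsg | Add-AzNetworkSecurityRuleConfig -Name "Allow-AVD-Service" -Priority 100 `
        -Direction Outbound -Access Allow -Protocol Tcp `
        -SourceAddressPrefix "*" -SourcePortRange "*" `
        -DestinationAddressPrefix "WindowsVirtualDesktop" -DestinationPortRange 443

# Deny all other outbound internet traffic.
$nsg = $nsg | Add-AzNetworkSecurityRuleConfig -Name "Deny-Internet" -Priority 4000 `
        -Direction Outbound -Access Deny -Protocol "*" `
        -SourceAddressPrefix "*" -SourcePortRange "*" `
        -DestinationAddressPrefix "Internet" -DestinationPortRange "*"

$nsg | Set-AzNetworkSecurityGroup
```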

Question 9 –

You will NOT be able to return to this section once you have answered a question in it. As a result, these questions will be missing from the review screen.
You have five session hosts in your Azure Virtual Desktop host pool. Windows 10 Enterprise multi-session is used by the session hosts.
You must prevent users from accessing the internet while using Azure Virtual Desktop. All required Microsoft services must be accessible to the session hosts.
Solution: The Address space settings of the virtual network that contains the session hosts are configured.
Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –B

Question 10 –

You have five session hosts in your Azure Virtual Desktop host pool. Windows 10 Enterprise multi-session is used by the session hosts.
You must prevent users from accessing the internet while using Azure Virtual Desktop. All required Microsoft services must be accessible to the session hosts.
Solution: You change the IP address of each session host.
Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –B

Question 11 –

You have an Azure Virtual Desktop host pool named Pool1 that contains session hosts running Windows 10 Enterprise multi-session.
When you connect to a Remote Desktop session on Pool1, you notice a problem with the frequency of screen updates.
You must determine whether the problem is caused by insufficient server, network, or client resources. The solution must minimize the time needed to identify the resource type.
What should you do?

  • A. From within the current session, use the Azure Virtual Desktop Experience Estimator.
  • B. From Azure Cloud Shell, run the Get-AzOperationalInsightsWorkspaceUsage cmdlet and specify the DefaultProfile parameter.
  • C. From Azure Cloud Shell, run the Get-AzWvdUserSession cmdlet and specify the UserSessionId parameter.
  • D. From within the current session, use Performance Monitor to display the values of all the RemoteFX Graphics(*)\Frames Skipped/Second counters.

Correct Answer –D 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/remotefx-graphics-performance-counters
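The counters referenced in answer D can also be sampled from PowerShell inside the session, which makes it quick to see whether skipped frames are attributed to server, network, or client resources. A hedged sketch using the counter names from the referenced documentation:

```powershell
# Sample the RemoteFX Graphics frame-skip counters once for all instances.
$counters = @(
    "\RemoteFX Graphics(*)\Frames Skipped/Second - Insufficient Server Resources",
    "\RemoteFX Graphics(*)\Frames Skipped/Second - Insufficient Network Resources",
    "\RemoteFX Graphics(*)\Frames Skipped/Second - Insufficient Client Resources"
)
Get-Counter -Counter $counters |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object Path, CookedValue
```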

Question 12 –

You have a tenant named contoso.com in Azure Active Directory (Azure AD).
Further, you deploy an Azure Active Directory Domain Services (Azure AD DS) managed domain named aaddscontoso.com to a virtual network named VNET1 using a user account named Admin1.
You intend to deploy Pool1 as an Azure Virtual Desktop host pool to VNET1.
You must ensure that you can deploy Windows 10 Enterprise session hosts to Pool1 using the Admin1 user account.
What should you start with?

  • A. Add Admin1 to the AAD DC Administrators group of contoso.com.
  • B. Assign the Cloud device administrator role to Admin1.
  • C. Assign a Microsoft 365 Enterprise E3 license to Admin1.
  • D. Change the password of Admin1.

Correct Answer –A 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/create-host-pools-azure-marketplace?tabs=azure-portal

Question 13 –

You have an Azure Virtual Desktop host pool named Pool1 that contains the following:
  • A linked workspace named Workspace1
  • An application group named Default Desktop
  • A session host named Host1
You must add a new data disk.
What should you modify?

  • A. Host1
  • B. Workspace1
  • C. Pool1
  • D. Default Desktop

Correct Answer –A

Question 14 –

You have an Azure Virtual Desktop installation.
You must first create a host pool. The solution must ensure that credits can be accumulated during periods of low CPU usage and then used to boost performance above the purchased baseline during periods of high CPU usage.
When you create the pool, which virtual machine series should you specify?

  • A. A-series
  • B. D-series
  • C. H-series
  • D. B-series

Correct Answer –D 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/sizes-b-series-burstable
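If you want to confirm which burstable sizes are available before creating the host pool, the B-series SKUs for a region can be listed from PowerShell; the region below is a placeholder.

```powershell
# List burstable (B-series) VM sizes available in a region.
Get-AzVMSize -Location "eastus" |
    Where-Object { $_.Name -like "Standard_B*" } |
    Select-Object Name, NumberOfCores, MemoryInMB
```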

Question 15 –

You have a domain named contoso.com in Azure Active Directory Domain Services (Azure AD DS).
You have a storage1 Azure Storage account. Storage1 hosts a file share named share1 that is configured with share and file system permissions. For authentication, Share1 is set to use contoso.com.
Further, Pool1 is the name you give to your Azure Virtual Desktop host pool. Pool1 has two session hosts, both of which run the Windows 10 multi-session + Microsoft 365 Apps image.

Pool1 requires the configuration of an FSLogix profile container.
So, what should you do now?

  • A. Install the FSLogix agent on the session hosts of Pool1.
  • B. From storage1, set Allow shared key access to Disabled.
  • C. Configure the Profiles setting for the session hosts of Pool1.
  • D. Generate a shared access signature (SAS) key for storage1.

Correct Answer –A
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/create-host-pools-user-profile

Question 16 –

Pool1 is the name of your Azure Virtual Desktop host pool, and Storage1 is the name of your Azure Storage account. Storage1 keeps FSLogix profile containers in the share1 folder.
You form a new group called Group1. Also, you give Group1 permission to log in to Pool1.
You must ensure that Group1 members have access to the FSLogix profile containers in share1. The principle of least privilege must be applied to the solution.

Which two privileges should Group1 be granted? Each correct response represents a portion of the solution.

  • A. the Storage Blob Data Contributor role for storage1
  • B. the List folder / read data NTFS permissions for share1
  • C. the Modify NTFS permissions for share1
  • D. the Storage File Data SMB Share Reader role for storage1
  • E. the Storage File Data SMB Share Elevated Contributor role for storage1
  • F. the Storage File Data SMB Share Contributor role for storage1

Correct Answer –CF 
Reference:
https://docs.microsoft.com/en-us/azure/virtual-desktop/create-file-share
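As a rough, hedged sketch of how the two permissions from the answer are typically granted: the Azure RBAC role is assigned to Group1 at the storage account scope, and the Modify NTFS permission is granted on the share itself. The resource group, domain, and account names are placeholders.

```powershell
# Assign the share-level Azure RBAC role to Group1 at the storage account scope.
$storage = Get-AzStorageAccount -ResourceGroupName "rg-avd" -Name "storage1"
$group   = Get-AzADGroup -DisplayName "Group1"
New-AzRoleAssignment -ObjectId $group.Id `
                     -RoleDefinitionName "Storage File Data SMB Share Contributor" `
                     -Scope $storage.Id

# Grant Modify NTFS permissions on the share from a domain-joined machine
# that can reach it; the domain name is a placeholder.
icacls "\\storage1.file.core.windows.net\share1" /grant "CONTOSO\Group1:(OI)(CI)(M)"
```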

Question 17 –

You have a host pool for Azure Virtual Desktop.
Microsoft Antimalware for Azure must be installed on the session hosts.
What are your options?

  • A. Add an extension to each session host.
  • B. From a Group Policy Object (GPO), enable Windows 10 security features.
  • C. Configure the RDP Properties of the host pool.
  • D. Sign in to each session host and install a Windows feature.

Correct Answer –A 
Reference:
https://docs.microsoft.com/en-us/azure/security/fundamentals/antimalware
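Adding the extension (option A) can also be scripted. The sketch below uses the publisher and extension type documented for the Microsoft Antimalware extension; the session host name, resource group, region, and settings are placeholders and would normally come from your own environment.

```powershell
# Minimal antimalware settings: enable the engine and real-time protection.
$settings = '{ "AntimalwareEnabled": true, "RealtimeProtectionEnabled": true }'

Set-AzVMExtension -ResourceGroupName "rg-avd" `
                  -VMName "sessionhost-0" `
                  -Name "IaaSAntimalware" `
                  -Publisher "Microsoft.Azure.Security" `
                  -ExtensionType "IaaSAntimalware" `
                  -TypeHandlerVersion "1.3" `
                  -SettingString $settings `
                  -Location "eastus"
```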

Question 18 –

Pool1 is an Azure Virtual Desktop host pool that is part of an Azure Active Directory Domain Services (Azure AD DS) managed domain.
You must configure idle session timeout settings for users who connect to Pool1’s session hosts.
Solution: You modify the AADDC Users GPO settings from an Azure AD DS-connected computer.
Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –B

Question 19 –

Pool1 is an Azure Virtual Desktop host pool that is part of an Azure Active Directory Domain Services (Azure AD DS) managed domain.
You must configure idle session timeout settings for users who connect to Pool1’s session hosts.
Solution: You can change the AADDC Computers GPO settings from an Azure AD DS-connected computer.

Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –A

Question 20 –

Pool1 is an Azure Virtual Desktop host pool that is part of an Azure Active Directory Domain Services (Azure AD DS) managed domain.
You must configure idle session timeout settings for users who connect to Pool1’s session hosts.
Solution: You can change the Session behavior settings in Pool1’s RDP Properties from the Azure portal.

Is this satisfactory?

  • A. Yes
  • B. No

Correct Answer –B


Microsoft Azure Stack Hub: AZ-600

The Configuring and Operating a Hybrid Cloud with Microsoft Azure Stack Hub (AZ-600) exam is intended for Azure administrators and Azure Stack operators. They are accountable for:

  • To begin, providing cloud services that end users or customers can access from within their own data center by using Azure Stack Hub.
  • Second, planning, deploying, packaging, updating, and maintaining the Azure Stack Hub infrastructure.
  • Finally, providing hybrid cloud resources and managing infrastructure as a service (IaaS) and platform as a service (PaaS).
  • In addition, serving as part of a larger team dedicated to cloud-based management and security, or to hybrid environments as part of an end-to-end infrastructure.

Advanced Sample Questions

What is the purpose of Azure Stack Hub?

  • a. To provide a hybrid cloud solution for on-premise deployment
  • b. To manage resource utilization in a public cloud
  • c. To manage virtual machines in a private cloud
  • d. To provide a disaster recovery solution

Answer: a. To provide a hybrid cloud solution for on-premise deployment

Explanation: Azure Stack Hub is a hybrid cloud solution that provides a consistent cloud experience for on-premise deployment. It enables organizations to run Azure services in their own data centers, providing a unified experience across public and private clouds.

What are the components of Azure Stack Hub?

  • a. Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS)
  • b. Networking, Compute, and Storage
  • c. Virtual Machines, Load Balancers, and Storage Accounts
  • d. Active Directory, Backup, and Recovery

Answer: b. Networking, Compute, and Storage

Explanation: Azure Stack Hub is made up of three main components: Networking, Compute, and Storage. These components provide the core functionality of the platform and enable organizations to run and manage virtual machines, applications, and data in a hybrid cloud environment.

What are the deployment options for Azure Stack Hub?

  • a. Azure Stack Hub can be deployed as a standalone solution or integrated with an existing infrastructure
  • b. Azure Stack Hub can only be deployed as a standalone solution
  • c. Azure Stack Hub can only be deployed as part of an existing infrastructure
  • d. Azure Stack Hub can only be deployed in a public cloud environment

Answer: a. Azure Stack Hub can be deployed as a standalone solution or integrated with an existing infrastructure

Explanation: Azure Stack Hub provides two deployment options: standalone and integrated. The standalone deployment option provides a complete solution, including hardware, operating system, and Azure Stack Hub software. The integrated deployment option allows organizations to integrate Azure Stack Hub with their existing infrastructure and hardware, providing a more seamless and cost-effective solution.

What are the benefits of using Azure Stack Hub?

  • a. Provides a unified experience across public and private clouds
  • b. Offers enhanced security and control over data and applications
  • c. Enables organizations to run Azure services in their own data centers
  • d. All of the above

Answer: d. All of the above

Explanation: Azure Stack Hub provides a range of benefits for organizations, including a unified experience across public and private clouds, enhanced security and control over data and applications, and the ability to run Azure services in their own data centers. These benefits enable organizations to benefit from the power of the cloud while maintaining control and security over their data and applications.

What are the roles and responsibilities of an Azure Stack Hub administrator?

  • a. Configuring and managing Azure Stack Hub infrastructure
  • b. Monitoring resource utilization and performance
  • c. Managing virtual machines and applications
  • d. All of the above

Answer: d. All of the above

Explanation: An Azure Stack Hub administrator is responsible for configuring and managing the Azure Stack Hub infrastructure, monitoring resource utilization and performance, and managing virtual machines and applications. The administrator must have a thorough understanding of Azure Stack Hub and its capabilities, as well as the ability to configure and manage the platform to meet the needs of their organization.
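As a small illustration of the administrator role, operators typically manage the integrated system through the administrator endpoints, for example by registering the admin Azure Resource Manager endpoint in PowerShell and signing in as an operator. The endpoint shown below follows the ASDK naming pattern and is a placeholder; integrated systems use a region- and domain-specific admin endpoint instead.

```powershell
# Register the Azure Stack Hub administrator environment (endpoint is a placeholder).
Add-AzEnvironment -Name "AzureStackAdmin" `
                  -ARMEndpoint "https://adminmanagement.local.azurestack.external"

# Sign in to the operator (administrator) endpoint.
Connect-AzAccount -Environment "AzureStackAdmin"
```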

How does Azure Stack Hub provide a consistent cloud experience?

  • a. By using the same Azure services and APIs across public and private clouds
  • b. By using different Azure services and APIs across public and private clouds
  • c. By using a different set of tools and technologies across public and private clouds
  • d. By using a different set of resources across public and private clouds

Answer: a. By using the same Azure services and APIs across public and private clouds

Explanation: Azure Stack Hub provides a consistent cloud experience by using the same Azure services and APIs across public and private clouds. This enables organizations to easily move applications and data between their on-premise data centers and the public cloud, providing a seamless experience for developers and administrators.

How does Azure Stack Hub provide enhanced security and control over data and applications?

  • a. By providing a secure, isolated environment for sensitive data and applications
  • b. By providing an insecure, public environment for sensitive data and applications
  • c. By providing a secure, shared environment for sensitive data and applications
  • d. By providing an insecure, private environment for sensitive data and applications

Answer: a. By providing a secure, isolated environment for sensitive data and applications

Explanation: Azure Stack Hub provides enhanced security and control over data and applications by providing a secure, isolated environment for sensitive data and applications. This enables organizations to run sensitive applications and store sensitive data in their own data centers, providing a higher level of control and security than is possible in a public cloud environment.

How does Azure Stack Hub enable organizations to run Azure services in their own data centers?

  • a. By providing a platform for running Azure services in a hybrid cloud environment
  • b. By providing a platform for running non-Azure services in a hybrid cloud environment
  • c. By providing a platform for running non-Azure services in a public cloud environment
  • d. By providing a platform for running Azure services in a public cloud environment

Answer: a. By providing a platform for running Azure services in a hybrid cloud environment

Explanation: Azure Stack Hub enables organizations to run Azure services in their own data centers by providing a platform for running Azure services in a hybrid cloud environment. This enables organizations to benefit from the power of the cloud while maintaining control and security over their data and applications.

How does Azure Stack Hub support application and data mobility?

  • a. By providing a unified experience across public and private clouds
  • b. By providing a different experience across public and private clouds
  • c. By providing a limited experience across public and private clouds
  • d. By providing no experience across public and private clouds

Answer: a. By providing a unified experience across public and private clouds

Explanation: Azure Stack Hub supports application and data mobility by providing a unified experience across public and private clouds. This enables organizations to move applications and data easily between their on-premises data centers and the public cloud, providing a seamless experience for developers and administrators.

How does Azure Stack Hub support the development and deployment of cloud-native applications?

  • a. By providing a platform for developing and deploying cloud-native applications
  • b. By providing a platform for developing and deploying non-cloud-native applications
  • c. By providing a limited platform for developing and deploying cloud-native applications
  • d. By providing no platform for developing and deploying cloud-native applications

Answer: a. By providing a platform for developing and deploying cloud-native applications

Explanation: Azure Stack Hub supports the development and deployment of cloud-native applications by exposing the same Azure services, APIs, and tooling on-premises, so applications built around containers, microservices, and Azure platform services can be developed and deployed consistently across the hybrid environment.

Basic Sample Questions

Question 1 –

You have an Azure Stack Hub-integrated system that is online. An Azure Active Directory (Azure AD) identity provider is used by the integrated system. The Azure App Service resource provider must be updated. Which of the following two actions should you take? Each correct response represents a portion of the solution.

  • A. Download the App Service installer to a computer that can connect to the Azure Stack Hub endpoints
  • B. Run appservice.exe as a local administrator
  • C. From the Updates blade of the administrator portal, select the infrastructure section
  • D. From the Updates blade of the administrator portal, select the Resource providers section
  • E. From the administrator portal, select the update, download the update, and then install the update

Correct Answer – AB 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-app-service-update?view=azs-2008&pivots=state-connected

Question 2 –

Your business is a Cloud Solution Provider (CSP) that offers Azure Stack Hub services to a number of customers in a multitenant environment. For billing reconciliation, user subscriptions are linked to Azure CSP subscriptions. You must review the usage of all customers for the current day and the previous seven days. What should you do?

  • A. Query the Azure Stack Hub usage API
  • B. Query the Partner Center Usage API
  • C. From Partner Center, download the daily-rated usage reconciliation CSV
  • D. From Partner Center, view the usage associated with the Azure Partner Shared Services (APSS) subscription

Correct Answer – B 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-provider-resource-api?view=azs-2008

Question 3 –

You have a system that is integrated with Azure Stack Hub. To provision user subscriptions, you must create a new offer that can only be used by an operator. The solution must make it easier for users to create resources. What kind of offer should you make?

  • A. public with a base plan only
  • B. delegated with a base plan only
  • C. private with a base plan only
  • D. decommissioned

Correct Answer – B 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-delegated-provider?view=azs-2008

Question 4 –

You have an Azure Stack Hub-integrated system that is online. Multitenant billing must be enabled. What should you start with?

  • A. Generate a service principal to connect the Azure Billing API to Azure Commerce
  • B. Run the New-AzResource cmdlet
  • C. Create a blob storage account named AzSCommerceStaging that stores CSV usage metadata before you send the metadata to Azure Commerce
  • D. Email the registration subscription ID, resource group name, and registration name to azstcsp@microsoft.com

Correct Answer – D 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-csp-howto-register-tenants?view=azs-2008&tabs=az

Question 5 –

You intend to connect an Azure Stack Hub integrated system to a highly available Azure App Service resource provider. You must ensure that the App Service resource meets all of the requirements. The solution must be cost-effective. What two resources should you use? Each correct response represents a portion of the solution.

  • A. a highly available Microsoft SQL Server 2019 instance in the default provider subscription
  • B. a highly available file server in a dedicated user subscription
  • C. a highly available file server in the default provider subscription
  • D. a highly available Microsoft SQL Server 2019 instance in a dedicated user subscription
  • E. a single file server in the default provider subscription
  • F. a single Microsoft SQL Server 2019 instance in the default provider subscription

Correct Answer – AC 
Reference:
https://blog.apps.id.au/adventures-cloud-operator-highly-available-app-service-1-4-azure-stack-step-2-deployment/

Question 6 –

You have an Azure Stack Hub-integrated system that is online.
The Azure Event Hubs service must be updated.

Solution: Run the Update-AzureRmManagementGroup cmdlet from an internet-connected computer.

Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer – B

Question 7 –

You have an Azure Stack Hub-integrated system that is online. The Azure Event Hubs service must be updated. Solution: You run the Install-AzsUpdate cmdlet from a privileged endpoint (PEP) session. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer – A 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-update-monitor?view=azs-2008

Question 8 –

You have an Azure Stack Hub-integrated system that is online. The Azure Event Hubs service must be updated.
Solution: You select the most recent infrastructure update from the administrator portal's Updates blade. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer – B 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/resource-provider-apply-updates?view=azs-2008
Implement Data Center Integration

Question 9 –

In which of the following three cases should you update the registration of an Azure Stack Hub integrated system? Each correct response provides a complete solution.

  • A. when you add or remove nodes for capacity-based billing
  • B. when you change the billing model
  • C. when you add or remove nodes for consumption-based billing
  • D. when you renew an annual capacity subscription
  • E. when you enable Azure Stack Hub for multitenancy
  • F. when you update the Azure Active Directory (Azure AD) home directory

Correct Answer – ABD 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-registration?view=azs-2008&tabs=az1%2Caz2%2Caz3%2Caz4&pivots=state-connected#renew-or-change-registration

Question 10 –

You have an Azure Stack Hub-integrated system with the following settings:
AzS is the deployment prefix.
AzP is the physical prefix.

The certificates for the integrated system must be renewed. Which virtual machine should you connect to with PowerShell?

  • A. AzS-ERCS02
  • B. AzS-CA01
  • C. AzS-WAS01
  • D. AzP-S1-N01

Correct Answer – A

Question 11 –

You intend to deploy an Azure Stack Hub integrated system with Internet access. The public VIP pool must be defined. What is the smallest possible subnet mask for the public VIP pool?

  • A. /22
  • B. /25
  • C. /26
  • D. /27

Correct Answer – C 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-network?view=azs-2008

Question 12 –

Your organization is a Cloud Solution Provider (CSP). You intend to deploy a multitenant Azure Stack Hub integrated system to host internal company workloads as well as customer workloads. The integrated system must be registered. Which Azure subscription type should you use for registration?

  • A. Azure Partner Shared Services (APSS)
  • B. Enterprise Agreement (EA)
  • C. Pay-As-You-Go (PAYG)
  • D. CSP

Correct Answer – A 
Reference:
https://docs.microsoft.com/en-us/azure-stack/operator/azure-stack-add-manage-billing-as-a-csp?view=azs-2008

Microsoft Azure Stack Hub: AZ-600

Microsoft Azure Infrastructure Solutions: AZ-305 Sample Questions

Candidates should study AZ-305: Designing Microsoft Azure Infrastructure Solutions if they have expertise in creating cloud and hybrid solutions that use Microsoft Azure, including computation, network, storage, monitoring, and security. Among other responsibilities, this position entails consulting stakeholders and translating business requirements into designs for secure, scalable, and reliable Azure solutions. An Azure Solutions Architect additionally works with administrators, developers, and other roles involved in the deployment of Azure solutions. The article provides a list of Microsoft Azure Infrastructure Solutions: AZ-305 Sample Questions that cover core exam topics including –

  • Design Identity, Governance, and Monitoring Solutions
  • Design Data Storage Solutions
  • Design Business Continuity Solutions
  • Design Infrastructure Solutions

Advanced Sample Questions

Which Azure service is used to provide a scalable, fully managed NoSQL database?

  • a. Azure Cosmos DB
  • b. Azure SQL Database
  • c. Azure Database for MySQL
  • d. Azure Database for PostgreSQL

Answer: a. Azure Cosmos DB

Explanation: Azure Cosmos DB is a fully managed NoSQL database service that provides scalability, high availability, and global distribution. It supports a variety of data models, including document, key-value, graph, and column-family.

Which Azure service is used to monitor and diagnose issues across applications and infrastructure?

  • a. Azure Monitor
  • b. Azure Log Analytics
  • c. Azure Application Insights
  • d. Azure Service Health

Answer: a. Azure Monitor

Explanation: Azure Monitor is a platform for monitoring and diagnosing issues across applications and infrastructure. It provides a centralized location for collecting and analyzing telemetry data from a variety of sources, including applications, infrastructure, and Azure services.

Which Azure service is used to manage the configuration and deployment of virtual machines?

  • a. Azure Resource Manager
  • b. Azure Virtual Machines
  • c. Azure Backup
  • d. Azure Site Recovery

Answer: a. Azure Resource Manager

Explanation: Azure Resource Manager is a service for managing the configuration and deployment of resources in Azure. It provides a way to organize resources into resource groups, apply tags for easy searching, and create templates for deploying resources in a repeatable way.
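
To make the idea concrete, here is a minimal sketch of driving Azure Resource Manager from code. It assumes the azure-identity and azure-mgmt-resource Python packages; the subscription ID, resource group name, region, and tag values are placeholders rather than values from this article.

# Minimal ARM sketch: create (or update) a resource group, the basic unit of
# organization that Azure Resource Manager works with.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
subscription_id = "<subscription-id>"  # placeholder

client = ResourceManagementClient(credential, subscription_id)

rg = client.resource_groups.create_or_update(
    "rg-demo",  # hypothetical resource group name
    {"location": "eastus", "tags": {"environment": "test"}},
)
print(rg.name, rg.location)

The same client can list, tag, and delete resources, which is what makes resource groups a convenient unit for repeatable deployments.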

Which Azure service is used to create and manage virtual networks?

  • a. Azure Virtual Machines
  • b. Azure Site Recovery
  • c. Azure Backup
  • d. Azure Virtual Network

Answer: d. Azure Virtual Network

Explanation: Azure Virtual Network is a service for creating and managing virtual networks in Azure. It provides a way to securely connect virtual machines, applications, and other services within a single virtual network, or across multiple virtual networks.

Which Azure service is used to provide a fully managed platform for running and scaling containerized applications?

  • a. Azure Kubernetes Service
  • b. Azure Container Instances
  • c. Azure Container Registry
  • d. Azure Batch

Answer: a. Azure Kubernetes Service

Explanation: Azure Kubernetes Service is a fully managed platform for running and scaling containerized applications. It provides a way to deploy and manage containerized applications using Kubernetes, an open-source system for automating deployment, scaling, and management of containerized applications.

Which Azure service is used to provide an identity and access management solution for applications and services?

  • a. Azure Active Directory
  • b. Azure Key Vault
  • c. Azure Security Center
  • d. Azure Information Protection

Answer: a. Azure Active Directory

Explanation: Azure Active Directory is a cloud-based identity and access management solution that provides a way to authenticate and authorize users for applications and services. It provides a centralized location for managing users and groups, enforcing access policies, and enabling single sign-on.
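
As a brief illustration of how an application authenticates against Azure Active Directory, the sketch below uses the client-credentials flow in MSAL for Python. The tenant ID, client ID, client secret, and the Microsoft Graph scope are placeholders or illustrative values, not details taken from the exam question.

# Sketch of the OAuth 2.0 client-credentials flow against Azure AD using MSAL.
import msal

app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",                                 # placeholder
    authority="https://login.microsoftonline.com/<tenant-id>",   # placeholder
    client_credential="<client-secret>",                         # placeholder
)

# Request an app-only token for Microsoft Graph using the .default scope.
result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
if "access_token" in result:
    print("token acquired")
else:
    print(result.get("error"), result.get("error_description"))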

Which Azure service is used to automate the deployment and management of infrastructure resources?

  • a. Azure DevOps
  • b. Azure Resource Manager
  • c. Azure Automation
  • d. Azure Logic Apps

Answer: b. Azure Resource Manager

Explanation: Azure Resource Manager is a service for managing the configuration and deployment of resources in Azure. It provides a way to organize resources into resource groups, apply tags for easy searching, and create templates for deploying resources in a repeatable way.

Which Azure service is used to provide a scalable, fully managed relational database service?

  • a. Azure Cosmos DB
  • b. Azure SQL Database
  • c. Azure Database for MySQL
  • d. Azure Database for PostgreSQL

Answer: b. Azure SQL Database

Explanation: Azure SQL Database is a fully managed relational database service that provides scalability, high availability, and automatic backup and recovery. It supports SQL Server functionality and is compatible with a variety of tools and frameworks.
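
For a sense of how applications connect to Azure SQL Database, here is a minimal sketch using pyodbc. The server, database, and credentials are placeholders, and it assumes the "ODBC Driver 18 for SQL Server" driver is installed locally.

# Minimal sketch: connect to Azure SQL Database and run a simple query.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<server-name>.database.windows.net,1433;"
    "Database=<database-name>;"
    "Uid=<user>;Pwd=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")
for row in cursor.fetchall():
    print(row.name)
conn.close()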

Which Azure service is used to manage secrets and keys used for authentication and encryption?

  • a. Azure Active Directory
  • b. Azure Key Vault
  • c. Azure Security Center
  • d. Azure Information Protection

Answer: b. Azure Key Vault

Explanation: Azure Key Vault is a service for managing secrets and keys used for authentication and encryption. It provides a way to securely store and manage cryptographic keys, certificates, and secrets, and enables the use of keys and secrets in applications and services.
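
The following sketch shows the typical read/write pattern against Key Vault with the azure-keyvault-secrets Python package; the vault URL, secret name, and secret value are placeholders.

# Minimal sketch: store and retrieve a secret in Azure Key Vault.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://<vault-name>.vault.azure.net",  # placeholder vault URL
    credential=DefaultAzureCredential(),
)

client.set_secret("app-db-password", "placeholder-value")  # store a secret
secret = client.get_secret("app-db-password")              # retrieve it later
print(secret.name, secret.value)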

Which Azure service is used to provide a fully managed messaging service for asynchronous communication between applications and services?

  • a. Azure Service Bus
  • b. Azure Event Hubs
  • c. Azure Notification Hubs
  • d. Azure Relay

Answer: a. Azure Service Bus

Explanation: Azure Service Bus is a fully managed messaging service that provides a way to decouple applications and services for asynchronous communication. It supports a variety of messaging patterns, including point-to-point, publish-subscribe, and request-response.
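
As a small illustration of the point-to-point pattern, the sketch below sends and receives a queue message with the azure-servicebus Python package. The connection string, queue name, and message body are placeholders.

# Minimal sketch of point-to-point messaging with Azure Service Bus.
from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = "<service-bus-connection-string>"  # placeholder

with ServiceBusClient.from_connection_string(conn_str) as client:
    # Sender side: enqueue a message for asynchronous processing.
    with client.get_queue_sender("orders") as sender:
        sender.send_messages(ServiceBusMessage("order-12345 is ready to ship"))

    # Receiver side: pull and settle messages from the same queue.
    with client.get_queue_receiver("orders", max_wait_time=5) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.complete_message(msg)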

Basic Sample Questions

Q1) You have a custom application named Application1 in your Azure subscription. Application1 was developed by Fabrikam, Ltd., a third-party company. Fabrikam developers were given role-based access control (RBAC) permissions to the Application1 components. All users are licensed for Microsoft 365 E5. You must recommend a solution to verify whether the Fabrikam developers still need access to Application1. The solution must meet the following requirements:

Send the manager of the developers a monthly email that lists the access permissions to Application1. Automatically revoke an access permission if the manager does not confirm it. Minimize development effort. What should you recommend?

  • A. Create an application1 access review in Azure Active Directory (Azure AD).
  • B. Develop a runbook for Azure Automation that executes the Get-AzRoleAssignment cmdlet.
  • C. Create a unique role assignment for the Application1 resources in Azure Active Directory (Azure AD) Privileged Identity Management.
  • D. Develop a runbook for Azure Automation that executes the Get-AzureADUserAppRoleAssignment cmdlet.

Correct Answer: A

Q2) You have an Azure subscription that contains a blob container with several blobs. During the month of April, ten users from your company's finance department need to access the data. You must recommend a solution that allows access to the blobs only during the month of April. Which security solution should you recommend?

  • A. shared access signatures (SAS)
  • B. Conditional Access policies
  • C. certificates
  • D. access keys

Correct Answer: A
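
To connect the answer to practice, here is a hedged sketch of generating a read-only, time-limited shared access signature for a single blob with the azure-storage-blob Python package. The storage account, key, container, blob names, and the year used for the April window are all placeholders.

# Sketch: a SAS that allows read access to one blob only during April.
from datetime import datetime
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

sas_token = generate_blob_sas(
    account_name="<storage-account>",          # placeholder
    container_name="finance",                  # placeholder
    blob_name="report.csv",                    # placeholder
    account_key="<account-key>",               # placeholder
    permission=BlobSasPermissions(read=True),
    start=datetime(2024, 4, 1),                # illustrative year
    expiry=datetime(2024, 5, 1),
)

url = "https://<storage-account>.blob.core.windows.net/finance/report.csv?" + sas_token
print(url)

Once the expiry time passes, the signature stops working without any further administrative action, which is what makes SAS a good fit for temporary access.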

Q3) You have an on-premises Active Directory domain that is synchronized with an Azure Active Directory (Azure AD) tenant. WebApp1 is an internal web application hosted on-premises that uses Integrated Windows authentication. Some users work remotely and do not have VPN access to the on-premises network. You must grant the remote users single sign-on (SSO) access to WebApp1. Which two features should the solution include? Each correct answer presents part of the solution.

  • A. Azure AD Application Proxy
  • B. Azure AD Privileged Identity Management (PIM)
  • C. Conditional Access policies
  • D. Azure Arc
  • E. Azure AD enterprise applications
  • F. Azure Application Gateway

Correct Answer: A and C

Q4) Your company deploys several virtual machines both on-premises and to Azure. ExpressRoute is being set up and deployed for connectivity from on-premises to Azure. Several of the virtual machines have network connectivity issues. You must analyze the network traffic to determine whether packets are being allowed or denied to the virtual machines. Solution: You use Azure Traffic Analytics in Azure Network Watcher to analyze the network traffic. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer: B

Q5) Your company deploys several virtual machines both on-premises and to Azure. ExpressRoute is installed and configured for connectivity from on-premises to Azure. Several of the virtual machines have network connectivity issues. You must analyze the network traffic to determine whether packets are being allowed or denied to the virtual machines. Solution: You use Azure Advisor to analyze the network traffic. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer: B

Q6) You are designing a large Azure environment that will contain many subscriptions. You plan to use Azure Policy as part of a governance solution. To which three scopes can you assign Azure Policy definitions? Each correct answer presents a complete solution.

  • A. Azure Active Directory (Azure AD) administrative units
  • B. Azure Active Directory (Azure AD) tenants
  • C. subscriptions
  • D. compute resources
  • E. resource groups
  • F. management groups

Correct Answer: ACF

Q7) You must recommend a solution for creating a monthly report of all new Azure Resource Manager (ARM) resource deployments in your Azure subscription. What should the recommendation include?

  • A. Azure Activity Log
  • B. Azure Advisor
  • C. Azure Analysis Services
  • D. Azure Monitor action groups

Correct Answer: A

Q8) You must recommend a solution for creating a monthly report of all new Azure Resource Manager (ARM) resource deployments in your Azure subscription. What should the recommendation include?

  • A. an Azure Logic Apps integration account
  • B. an Azure Import/Export job
  • C. Azure Data Factory
  • D. an Azure Analysis services On-premises data gateway
  • E. an Azure Batch account

Correct Answer: B and C

Q9) You have an Azure subscription that includes two applications named App1 and App2. App1 is a sales-processing application. When a transaction in App1 needs to be shipped, a message is added to a queue in an Azure Storage account, and App2 scans the queue for the relevant transactions. Additional applications will be implemented in the future to handle some shipping requests based on the details of the transactions. You must recommend a replacement for the storage account queue so that each additional application can read the relevant transactions. What should you recommend?

  • A. one Azure Data Factory pipeline
  • B. multiple storage account queues
  • C. one Azure Service Bus queue
  • D. one Azure Service Bus topic

Correct Answer: D

Q10) You are creating an application that will run on Azure. The application will store video files ranging in size from 50 MB to 12 GB. The application will be accessible online and will use certificate-based authentication. You must recommend a storage location for the video files. The solution must minimize storage costs while providing the fastest possible read performance. What should you recommend?

  • A. Azure Files
  • B. Azure Data Lake Storage Gen2
  • C. Azure Blob Storage
  • D. Azure SQL Database

Correct Answer: C

Q11) You are designing an Azure IoT Hub solution that will include 50,000 IoT devices. Each device will stream temperature, device ID, and time data. On average, 50,000 records will be written every second. The data will be visualized in near real time. You must recommend a service to store and query the data. Which two services should you recommend? Each correct answer presents a complete solution.

  • A. Azure Table Storage
  • B. Azure Event Grid
  • C. Azure Cosmos DB SQL API
  • D. Azure Time Series Insights

Correct Answer: C and D

Q12) You must deploy resources to host a stateless web application in an Azure subscription. The solution must meet the following requirements:
Provide access to the full .NET Framework. Provide redundancy if an Azure region fails. Grant administrators access to the operating system so they can install custom application dependencies. Solution: You deploy two Azure virtual machines to two different Azure regions, together with an Azure Application Gateway. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer: B

Q13) You must deploy resources to host a stateless web application in an Azure subscription. The solution must meet the following requirements:
Provide access to the full .NET Framework. Provide redundancy if an Azure region fails. Grant administrators access to the operating system so they can install custom application dependencies. Solution: You create an Azure Traffic Manager profile and deploy two Azure virtual machines to two different Azure regions. Does this meet the goal?

  • A. Yes
  • B. No

Correct Answer: A

Q14) You have SQL Server on an Azure virtual machine. A nightly batch process writes data to the databases. You must recommend a disaster recovery solution for the data. The solution must meet the following requirements:

  • Provide the ability to recover in the event of a local outage.
  • Support automated recovery.
  • Support a recovery point objective (RPO) of 24 hours and a recovery time objective (RTO) of 15 minutes.
  • Minimize costs.

What should the recommendation include?
  • A. Azure virtual machine availability sets
  • B. Azure Disk Backup
  • C. an Always On availability group
  • D. Azure Site Recovery

Correct Answer: D

Q15) You are designing a SQL database solution. The solution will include twenty 20-GB databases with varied usage patterns. You must recommend a database hosting platform for the databases. The solution must meet the following requirements: meet a 99.99% uptime Service Level Agreement (SLA), dynamically scale the compute resources allocated to the databases, support reserved capacity, and minimize compute costs. What should the recommendation include?

  • A. 20 Azure SQL databases in an elastic pool
  • B. An availability set of 20 databases on a Microsoft SQL server running on an Azure virtual machine
  • C. A Microsoft SQL server with 20 databases that is running on an Azure virtual machine
  • D. 20 serverless Azure SQL Database instances.

Correct Answer: A

Microsoft Azure Infrastructure Solutions: AZ-305 Free Practice Test

Microsoft Azure Cosmos DB (DP-420) Sample Questions

Advanced Sample Questions

What is Azure Cosmos DB?

  • A) A relational database management system
  • B) A NoSQL database service
  • C) A cloud-based document database
  • D) An in-memory data store

Answer: B) A NoSQL database service

Explanation: Azure Cosmos DB is a globally-distributed, multi-model database service provided by Microsoft Azure. It is a NoSQL database, which means that it is designed to handle non-relational data, such as documents, key-value pairs, graph data, and columnar data.

What are the benefits of using Azure Cosmos DB?

  • A) Scalability, high availability, and low latency
  • B) Advanced security features and data privacy
  • C) Both A and B
  • D) None of the above

Answer: C) Both A and B

Explanation: Azure Cosmos DB provides a number of benefits to users, including scalability, high availability, and low latency. It also provides advanced security features and data privacy, ensuring that sensitive data is protected and secure. These benefits make Azure Cosmos DB an ideal choice for a wide range of use cases, such as web, mobile, gaming, and IoT applications.

What data models does Azure Cosmos DB support?

  • A) Document
  • B) Key-value
  • C) Graph
  • D) All of the above

Answer: D) All of the above

Explanation: Azure Cosmos DB is a multi-model database, which means that it supports multiple data models, including document, key-value, graph, and columnar. This enables users to choose the data model that is best suited to their specific use case, and to easily switch between models as their needs evolve.

What is the purpose of the Azure Cosmos DB query language?

  • A) To retrieve data from the database
  • B) To update data in the database
  • C) To delete data from the database
  • D) All of the above

Answer: A) To retrieve data from the database

Explanation: The Azure Cosmos DB query language is used to retrieve data from the database. It provides a flexible and powerful way for users to query and retrieve data, and to filter and aggregate data based on specific criteria. The query language supports a variety of programming languages, including SQL, JavaScript, and MongoDB, making it easy for developers to work with the data in the database.
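
The sketch below, using the azure-cosmos Python package, shows a parameterized SQL query against a container. The account endpoint, key, database, container, and device ID are placeholders, and the document fields are illustrative.

# Minimal sketch: run a SQL query against a Cosmos DB (SQL API) container.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>")
container = client.get_database_client("telemetry").get_container_client("readings")

query = "SELECT c.deviceid, c.timestamp FROM c WHERE c.deviceid = @id"
items = container.query_items(
    query=query,
    parameters=[{"name": "@id", "value": "device-001"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item)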

What is the consistency model in Azure Cosmos DB?

  • A) Eventual consistency
  • B) Strong consistency
  • C) Bounded staleness consistency
  • D) All of the above

Answer: D) All of the above

Explanation: Azure Cosmos DB provides a number of consistency options, including eventual consistency, strong consistency, and bounded staleness consistency. This allows users to choose the level of consistency that is appropriate for their specific use case, and to balance consistency, performance, and availability. For example, applications that require low latency and high throughput may choose eventual consistency, while applications that require strong data consistency may choose strong consistency.

What is the role of the Azure Cosmos DB emulator in development and testing?

  • A) To allow developers to test their applications locally
  • B) To provide a live environment for testing applications
  • C) To provide a development environment for building applications
  • D) All of the above

Answer: A) To allow developers to test their applications locally

Explanation: The Azure Cosmos DB emulator provides developers with a local environment for testing their applications, without the need for a live connection to the Azure Cosmos DB service. This enables developers to test their applications in a controlled and isolated environment, and to easily simulate different scenarios and test cases. The emulator supports all the features of the Azure Cosmos DB service, making it an ideal tool for development and testing.

What is the purpose of the Azure Cosmos DB partitioning model?

  • A) To distribute data across multiple nodes
  • B) To improve performance by reducing the amount of data stored on a single node
  • C) Both A and B
  • D) None of the above

Answer: C) Both A and B

Explanation: The Azure Cosmos DB partitioning model is designed to distribute data across multiple nodes, and to improve performance by reducing the amount of data stored on a single node. This enables the database to scale horizontally and to handle large amounts of data and traffic, while still providing fast and reliable performance. The partitioning model is based on the concept of a partition key, which is used to distribute data across the nodes in the database.
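
A short sketch may help show where the partition key appears in practice. Assuming the azure-cosmos Python package and placeholder names, the container below is created with /deviceid as its partition key, so items are distributed across physical partitions by device.

# Minimal sketch: create a container with a partition key.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>")
database = client.create_database_if_not_exists("telemetry")

container = database.create_container_if_not_exists(
    id="readings",
    partition_key=PartitionKey(path="/deviceid"),  # distributes items by device
    offer_throughput=400,
)
print(container.id)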

What is the role of the Azure Cosmos DB data migration tool in migrating data to Azure Cosmos DB?

  • A) To simplify the process of migrating data from other sources to Azure Cosmos DB
  • B) To provide a graphical interface for migrating data to Azure Cosmos DB
  • C) Both A and B
  • D) None of the above

Answer: C) Both A and B

Explanation: The Azure Cosmos DB data migration tool is designed to simplify the process of migrating data from other sources to Azure Cosmos DB. It provides a graphical interface that makes it easy to select the data to be migrated, and to specify the target database and collection. The tool supports a wide range of data sources, including JSON, MongoDB, Cassandra, and SQL Server, making it easy to migrate data from a variety of sources to Azure Cosmos DB.

What is the purpose of the Azure Cosmos DB global distribution feature?

  • A) To replicate data across multiple regions for improved data durability and availability
  • B) To improve performance by reducing the amount of data stored on a single node
  • C) Both A and B
  • D) None of the above

Answer: A) To replicate data across multiple regions for improved data durability and availability Explanation: The Azure Cosmos DB global distribution feature enables users to replicate their data across multiple regions, for improved data durability and availability. This enables users to keep their data close to their users, for fast and reliable access, and to ensure that their data is available even in the event of a regional outage. The global distribution feature provides multi-homing and active-active replication, and enables users to easily configure and manage their global distribution settings.

Basic Sample Questions

Question 1
In your Azure Cosmos DB Core (SQL) API account, you have a container named container1 whose contents you wish to make available as reference data for Azure Stream Analytics.
Solution: Use Azure Cosmos DB Core (SQL API) as input and Azure Blob Storage as output to create an Azure Data Factory pipeline. Will this meet the goal?
  • A. Yes
  • B. No

Answer : B

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/changefeed-ecommerce-solution

Question 2
In your Azure Cosmos DB Core (SQL) API account, you have a container named container1 whose contents you wish to make available as reference data for Azure Stream Analytics.
Solution: Build an Azure function that uses Azure Cosmos DB Core (SQL) API change feeds as triggers and Azure event hubs as outputs. Will this meet the goal?
  • A. Yes
  • B. No

Answer : A

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/changefeed-ecommerce-solution

Question 3
You have an application named App1 that reads data from an Azure Cosmos DB Core (SQL) API account every minute. App1 runs the same read queries every minute, using eventual consistency. The queries consume request units (RUs) instead of being served from the cache, and you verify that the IntegratedCacheItemHitRate and IntegratedCacheQueryHitRate metrics both have values of 0. You verify that the dedicated gateway cluster has been provisioned and is used in the connection string. You are required to ensure that App1 uses the Azure Cosmos DB integrated cache. What must you configure?
  • A. indexing policy of the Azure Cosmos DB container
  • B. consistency level of the requests from App1
  • C. connectivity mode of the App1 CosmosClient
  • D. default consistency level of the Azure Cosmos DB account

Answer : C

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/integrated-cache-faq

Question 4
In your Azure Cosmos DB Core (SQL) API account, you have a container named container1 that receives updates every three seconds, and you have an Azure Functions app named function1 that should run whenever an item is inserted or replaced. You discover that function1 does not run on each upsert, and you need to ensure that function1 processes each upsert within one second. Which of the given properties should you change in the function.json file of function1?
  • A. checkpointInterval
  • B. leaseCollectionsThroughput
  • C. maxItemsPerInvocation
  • D. feedPollDelay

Answer : D

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-trigger

Question 5
You have the following query.
SELECT * FROM c
WHERE c.sensor = "TEMP1"
AND c.value < 22
AND c.timestamp >= 1619146031231
You must recommend a composite index strategy that minimizes the request units (RUs) consumed by the query. What should you recommend?
  • A. a composite index for (sensor ASC, value ASC) and a composite index for (sensor ASC, timestamp ASC)
  • B. a composite index for (sensor ASC, value ASC, timestamp ASC) and a composite index for (sensor DESC, value DESC, timestamp DESC)
  • C. a composite index for (value ASC, sensor ASC) and a composite index for (timestamp ASC, sensor ASC)
  • D. a composite index for (sensor ASC, value ASC, timestamp ASC)

Answer : A

Reference: https://azure.microsoft.com/en-us/blog/three-ways-to-leverage-composite-indexes-in-azure-cosmos-db/
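
As a hedged sketch of what answer A looks like in practice, the indexing policy below defines the two composite indexes, (sensor ASC, value ASC) and (sensor ASC, timestamp ASC), and passes them when creating a container with the azure-cosmos Python package. The account details, database, container, and partition key are placeholders.

# Sketch: container whose indexing policy contains the two composite indexes.
from azure.cosmos import CosmosClient, PartitionKey

indexing_policy = {
    "compositeIndexes": [
        [
            {"path": "/sensor", "order": "ascending"},
            {"path": "/value", "order": "ascending"},
        ],
        [
            {"path": "/sensor", "order": "ascending"},
            {"path": "/timestamp", "order": "ascending"},
        ],
    ]
}

client = CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>")
database = client.create_database_if_not_exists("iot")
container = database.create_container_if_not_exists(
    id="sensorReadings",
    partition_key=PartitionKey(path="/sensor"),
    indexing_policy=indexing_policy,
)
print(container.id)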

Question 6
A Cosmos DB Core (SQL) API account will be created that uses customer-managed keys stored in Azure Key Vault, and you need to configure an Azure Key Vault access policy to allow Azure Cosmos DB to access those keys. Which three of the following permissions will you enable in the access policy?
  • A. Wrap Key
  • B. Get
  • C. List
  • D. Update
  • E. Sign
  • F. Verify
  • G. Unwrap Key

Answer : ABG

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/how-to-setup-cmk

Question 7
Apache Kafka must be configured to ingest data from an Azure Cosmos DB Core (SQL) API account. Data from telemetry containers must be added to the Kafka topic IoT, and the data must be stored in compact binary form. Which three of the following configuration items will you include in the solution?
  • A. “connector.class”: “com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector”
  • B. “key.converter”: “org.apache.kafka.connect.json.JsonConverter”
  • C. “key.converter”: “io.confluent.connect.avro.AvroConverter”
  • D. “connect.cosmos.containers.topicmap”: “iot#telemetry”
  • E. “connect.cosmos.containers.topicmap”: “iot”
  • F. “connector.class”: “com.azure.cosmos.kafka.connect.source.CosmosDBSinkConnector”

Answer : CDF

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/kafka-connector-sink

https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/

Question 8
To write a dataset, you will use an Azure Cosmos DB (SQL API) sink in an Azure Data Factory data flow. In order to optimise throughput, you need to ensure that 2,000 Apache Spark partitions are used to ingest the data. Which sink setting must be configured?
  • A. Throughput
  • B. Write throughput budget
  • C. Batch size
  • D. Collection action

Answer : C

Reference: https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-cosmos-db

Question 9
There is a container named container1 in an Azure Cosmos DB Core (SQL) API account, and a user named User1 needs to be allowed to insert items into container1. The solution must make use of the principle of least privilege. Which of the following roles will you assign to User1?
  • A. CosmosDB Operator only
  • B. DocumentDB Account Contributor and Cosmos DB Built-in Data Contributor
  • C. DocumentDB Account Contributor only
  • D. Cosmos DB Built-in Data Contributor only

Answer : A

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/role-based-access-control

Question 10
You configure the diagnostic settings of your Azure Cosmos DB Core (SQL API) account to send all log information to a Log Analytics workspace. You need to identify when the provisioned request units per second (RU/s) for resources within the account were modified. You write the following query.
AzureDiagnostics
| where Category == "ControlPlaneRequests"
What should you include in the query?
  • A. | where OperationName startswith “AccountUpdateStart”
  • B. | where OperationName startswith “SqlContainersDelete”
  • C. | where OperationName startswith “MongoCollectionsThroughputUpdate”
  • D. | where OperationName startswith “SqlContainersThroughputUpdate”

Answer : A

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/audit-control-plane-logs

Question 11
An Azure Cosmos DB Core (SQL API) account is used to run this query on a container within the account.
SELECT
IS_NUMBER("1234") AS A,
IS_NUMBER(1234) AS B,
IS_NUMBER({prop: 1234}) AS C
What will be the output of the query?
  • A. [{“A”: false, “B”: true, “C”: false}]
  • B. [{“A”: true, “B”: false, “C”: true}]
  • C. [{“A”: true, “B”: true, “C”: false}]
  • D. [{“A”: true, “B”: true, “C”: true}]

Answer : A

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/sql-query-is-number

Question 12
Before an item is inserted into a container, you need to implement a trigger in Azure Cosmos DB Core (SQL) API. Which two of the following actions must be performed for ensuring that the trigger runs?
  • A. Append pre to the name of the JavaScript function trigger.
  • B. For each create request, set the access condition in RequestOptions.
  • C. Register the trigger as a pre-trigger.
  • D. For each create request, set the consistency level to session in RequestOptions.
  • E. For each create request, set the trigger name in RequestOptions.

Answer : C

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/how-to-use-stored-procedures-triggers-udfs
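
The point of the answer is that a pre-trigger never runs automatically: it must be registered on the container and then named on each request. The sketch below, using the azure-cosmos Python package, assumes a pre-trigger named validateOrder has already been registered on the container; the account details, database, container, and item are placeholders.

# Sketch: opt in to a previously registered pre-trigger on a create request.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>")
container = client.get_database_client("store").get_container_client("orders")

container.create_item(
    body={"id": "order-1", "total": 42},
    pre_trigger_include="validateOrder",  # the registered pre-trigger to run
)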

Question 13
You have an Azure Cosmos DB Core (SQL) API account named account1 that uses autoscale throughput. You need to run an Azure function when the normalized request units per second for a container in account1 exceeds a specific value.
Solution: You configure an Azure Monitor alert to trigger the function.
Will this meet the goal?
  • A. Yes
  • B. No

Answer : A

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/create-alerts

Question 14
You have an Azure Cosmos DB Core (SQL) API account named account1 that uses autoscale throughput. You need to run an Azure function when the normalized request units per second for a container in account1 exceeds a specific value.
Solution: You configure the function to have an Azure Cosmos DB trigger.
Will this meet the goal?
  • A. Yes
  • B. No

Answer : B

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/create-alerts

Question 15
You have an Azure Cosmos DB Core (SQL) API account named account1 that uses autoscale throughput. You need to run an Azure function when the normalized request units per second for a container in account1 exceeds a specific value.
Solution: You configure an application to use the change feed processor to read the change feed, and you configure the application to trigger the function.
Will this meet the goal?
  • A. Yes
  • B. No

Answer : B

Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/create-alerts

Question 16 
HOTSPOT – You have a database named telemetry in an Azure Cosmos DB Core (SQL) API account. The database stores IoT data in two containers named readings and devices.
Documents in readings have the following structure.
  • id
  • deviceid
  • timestamp
  • ownerid
  • measures (array)
    – type
    – value
    – metricid
Documents in devices have the following structure.
  • id
  • deviceid
  • owner
    – ownerid
    – emailaddress
    – name
  • brand
  • model
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
Hot Area:
Statements (select Yes or No for each):
  • To return all the devices owned by a specific email address, multiple queries must be performed
  • To return deviceid, ownerid, timestamp, and value for a specific metricid, a join must be performed
  • To return deviceid, ownerid, emailaddress, and model, a join must be performed

Answer :

  • To return all the devices owned by a specific email address, multiple queries must be performed – Yes
  • To return deviceid, ownerid, timestamp, and value for a specific metricid, a join must be performed – No
  • To return deviceid, ownerid, emailaddress, and model, a join must be performed – No
Question 17
DRAG DROP – In your Azure Cosmos DB Core (SQL API) account, you have two containers named container1 and container2, which are configured for multi-region writes.
The following is a sample of a document in container1:
{
“customerId”: 1234,
“firstName”: “John”,
“lastName”: “Smith”,
“policyYear”: 2021
}
The following is a sample of a document in container2:
{
“gpsId”: 1234,
“latitude”: 38.8951,
“longitude”: -77.0364
}
You are required to configure conflict resolution for meeting the following requirements:
  • For container1, you must resolve conflicts by using the highest value for policyYear.
  • For container2, you must resolve conflicts by accepting the distance closest to latitude 40.730610 and longitude -73.935242.
  • Administrative effort must be minimized when implementing the solution.
What should you configure for each container?
Select and Place:
Configurations:
  • Last write wins (default) mode
  • Merge procedures (custom) mode
  • An application that reads from the conflicts feed
Answer Area:
  • Container1:
  • Container2:

Answer : 

  • Container1: Last write wins (default) mode
  • Container2: Merge procedures (custom) mode
Question 18
DRAG DROP – You have an app that uses an Azure Cosmos DB Core (SQL API) account to store data. The app performs queries that return large result sets, and you need to paginate the results so that each page returns 80 items. Which three actions should you perform in sequence?
Select and Place:
Actions:
  • Configure MaxItemCount in QueryRequestOptions
  • Run the query and provide a continuation token
  • Configure MaxBufferedItemCount in QueryRequestOptions
  • Append the results to a variable
  • Run the query and increment MaxItemCount

Answer : 

Answer Area (in sequence):
  1. Configure MaxItemCount in QueryRequestOptions
  2. Run the query and provide a continuation token
  3. Append the results to a variable
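
The question targets the .NET SDK's QueryRequestOptions, but the same idea (a fixed page size plus a continuation token) can be sketched with the azure-cosmos Python package. Account details, database, and container names below are placeholders.

# Sketch: read query results page by page, 80 items per page.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>")
container = client.get_database_client("shop").get_container_client("items")

pager = container.query_items(
    query="SELECT * FROM c",
    enable_cross_partition_query=True,
    max_item_count=80,             # page size, analogous to MaxItemCount
).by_page()                        # iterate page by page

first_page = list(next(pager))
token = pager.continuation_token   # supply this later to resume from the next page
print(len(first_page), token)
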
Question 19
You maintain a relational database for a book publisher containing the following tables.
Table: Author – columns: authorId (primary key), fullname, address, contactinfo
Table: Book – columns: bookId (primary key), isbn, title, genre
Table: Bookauthorlnk – columns: authorId (foreign key), bookId (foreign key)
In most cases, a query will list the books for an authorId. In order to replace the relational database with Azure Cosmos DB Core (SQL) API, you must develop a non-relational data model. It is essential that the solution minimizes latency and read operation costs. What must be included in the solution?
  • A. Creating a container for Author and for a Book. In each Author document, embedding a bookId for each book by the author. In each Book document embedding an authorId of each author.
  • B. Creating Author, Book, and Bookauthorlnk documents in the same container.
  • C. Creating a container containing a document for each Author and a document for each Book. In each Book document, embedding an authorId.
  • D. Creating a container for Author and for a Book. In each Author document and Book document embedding the data from Bookauthorlnk.

Answer : A

Question 20
HOTSPOT – Your Azure Cosmos DB Core (SQL) API account contains a container. You need to use the Azure Cosmos DB SDK to replace a document by using optimistic concurrency. What should you include in the code?
Hot Area:
Request Options property to set:
  • AccessCondition
  • ConsistencyLevel
  • SessionToken
Document property that will be compared:
  • _etag
  • _id
  • _rid

Answer :

Request Options property to set: AccessCondition
Document property that will be compared: _etag

Microsoft Azure AZ-204 Sample Questions

The Microsoft Azure AZ-204 exam is designed to assess your ability to perform the following technical tasks: developing Azure compute solutions, developing for Azure storage, implementing Azure security, monitoring, troubleshooting, and optimizing Azure solutions, and connecting to and consuming Azure services and third-party services. Candidates should also have subject-matter experience in designing, implementing, testing, and supporting cloud applications and services on Microsoft Azure before taking this exam.

An Azure Developer participates in all phases of cloud development, from gathering requirements and design through development, deployment, maintenance, performance tuning, and monitoring. The article provides a list of Microsoft Azure AZ-204 Sample Questions that cover core exam topics including –

  • Develop Azure compute solutions (25-30%)
  • Develop for Azure storage (15-20%)
  • Implement Azure security (15-20%)
  • Monitor, troubleshoot, and optimize Azure solutions (10-15%)
  • Connect to and consume Azure services and third-party services (25-30%)

Advanced Sample Questions

What is Azure Virtual Machines used for?

  • A) To run virtual machines in the cloud.
  • B) To store and manage data in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To run virtual machines in the cloud.

Explanation: Azure Virtual Machines is a service that enables organizations to run virtual machines in the cloud. It provides a fast and simple way to create and manage virtual machines, and enables organizations to run a variety of operating systems and applications in the cloud. Azure Virtual Machines supports a range of Windows and Linux operating systems and can be easily integrated with other Azure services, such as Azure App Service and Azure Functions. By using Azure Virtual Machines, organizations can reduce the cost and complexity of managing virtual machines in the cloud, and simplify the deployment and management of their applications and services.

What is Azure Resource Manager (ARM)?

  • A) A deployment and management tool for Microsoft Azure resources.
  • B) A virtual network in Azure.
  • C) An Azure service that provides data storage and retrieval.

Answer: A) A deployment and management tool for Microsoft Azure resources.

Explanation: Azure Resource Manager (ARM) is a deployment and management tool for Microsoft Azure resources. It provides a single management plane to deploy, manage, and monitor all the resources in an Azure solution. ARM templates are JSON files that describe the resources, configuration, and deployment for an Azure solution. By using ARM, organizations can manage their resources in a consistent and predictable manner, automate the deployment and management of their solutions, and monitor their resources in real-time.

What is the purpose of an Azure App Service?

  • A) To host web and mobile applications in the cloud.
  • B) To store and manage data in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To host web and mobile applications in the cloud.

Explanation: Azure App Service is a platform for hosting web and mobile applications in the cloud. It provides a scalable and reliable environment for deploying and managing web and mobile applications, and offers a range of features and services to support the development and deployment of these applications. Azure App Service provides a scalable, secure, and highly available environment for deploying and running applications, and makes it easy to manage and monitor the performance of these applications.

What is Azure Blob Storage used for?

  • A) To store and manage data in the cloud.
  • B) To host web and mobile applications in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To store and manage data in the cloud.

Explanation: Azure Blob Storage is used to store and manage unstructured data, such as text and binary data, in the cloud. It is a scalable and highly available storage solution that provides organizations with a secure and reliable way to store and manage large amounts of data. Azure Blob Storage can be used for a variety of data scenarios, including the storage of documents, images, audio, and video files. By using Azure Blob Storage, organizations can reduce the cost and complexity of managing data storage and retrieval, and improve the performance and scalability of their data storage solutions.
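
A minimal sketch, assuming the azure-storage-blob Python package and placeholder connection string, container, blob, and file names, shows the basic upload and download round trip.

# Minimal sketch: upload a local file as a block blob, then read it back.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="media", blob="clip.mp4")

with open("clip.mp4", "rb") as data:
    blob.upload_blob(data, overwrite=True)   # store unstructured data

downloaded = blob.download_blob().readall()  # read it back
print(len(downloaded), "bytes")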

What is the purpose of Azure Functions?

  • A) To run code in response to events.
  • B) To store and manage data in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To run code in response to events.

Explanation: Azure Functions is a serverless compute service that enables organizations to run code in response to events. It provides a way to run event-driven, scalable, and highly available code without having to manage the underlying infrastructure. Azure Functions can be triggered by a wide range of events, including changes in data, message queues, and HTTP requests, and can run code written in a variety of programming languages. By using Azure Functions, organizations can simplify the development and deployment of event-driven applications, and reduce the cost and complexity of managing infrastructure.
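
As an illustration, here is a minimal HTTP-triggered function in the Azure Functions Python v1 programming model. It assumes the function lives in __init__.py next to a function.json binding file (not shown); the greeting logic is purely illustrative.

# Minimal sketch of an HTTP-triggered Azure Function (Python v1 model).
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)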

What is Azure Cosmos DB used for?

  • A) To store and manage globally distributed data.
  • B) To host web and mobile applications in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To store and manage globally distributed data.

Explanation: Azure Cosmos DB is a globally distributed, multi-model database service that is used to store and manage data. It provides organizations with a highly scalable, highly available, and low-latency data storage solution that supports multiple data models, including document, graph, key-value, and columnar data. Azure Cosmos DB provides a variety of consistency options, including strong, eventual, and session consistency, and enables organizations to easily replicate data to any number of regions to provide low-latency access to data for global users. By using Azure Cosmos DB, organizations can build highly scalable and globally distributed applications with a high degree of confidence in the performance and reliability of their data storage solutions.

What is Azure Virtual Network used for?

  • A) To host web and mobile applications in the cloud.
  • B) To store and manage data in the cloud.
  • C) To securely connect Azure resources to each other.

Answer: C) To securely connect Azure resources to each other.

Explanation: Azure Virtual Network (VNet) is used to securely connect Azure resources to each other. It provides organizations with a way to create a private network in the cloud and control the flow of inbound and outbound network traffic. Azure VNet enables organizations to create secure connections between resources in the cloud, and to connect to on-premises resources through site-to-site or point-to-site VPN connections. By using Azure VNet, organizations can create a secure and highly available network environment in the cloud, and simplify the deployment and management of their network infrastructure.

What is Azure App Service used for?

  • A) To host web and mobile applications in the cloud.
  • B) To store and manage data in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To host web and mobile applications in the cloud.

Explanation: Azure App Service is a fully managed platform for building, deploying, and scaling web and mobile applications in the cloud. It provides organizations with a way to quickly and easily build, deploy, and manage web and mobile applications, and enables developers to focus on writing code instead of managing infrastructure. Azure App Service supports a variety of programming languages, including .NET, Java, Node.js, PHP, and Python, and provides a highly scalable, highly available, and secure environment for running applications. By using Azure App Service, organizations can simplify the development and deployment of their applications, and reduce the cost and complexity of managing infrastructure.

What is Azure Container Instances used for?

  • A) To run containers in the cloud without managing infrastructure.
  • B) To store and manage data in the cloud.
  • C) To manage and monitor resources in Azure.

Answer: A) To run containers in the cloud without managing infrastructure.

Explanation: Azure Container Instances is a service that enables organizations to run containers in the cloud without having to manage infrastructure. It provides a fast and simple way to run containers, and enables organizations to run containers on demand, without having to manage a container orchestration service. Azure Container Instances provides organizations with a highly scalable, highly available, and secure environment for running containers, and can be easily integrated with other Azure services, such as Azure Functions and Azure App Service. By using Azure Container Instances, organizations can reduce the cost and complexity of running containers in the cloud, and simplify the deployment and management of their containerized applications.

What is Azure Monitor used for?

  • A) To store and manage data in the cloud.
  • B) To manage and monitor resources in Azure.
  • C) To host web and mobile applications in the cloud.

Answer: B) To manage and monitor resources in Azure.

Explanation: Azure Monitor is a service that enables organizations to manage and monitor resources in Azure. It provides organizations with a centralized view of their Azure resources, and enables them to monitor the performance and health of their applications and services. Azure Monitor provides a variety of features, including log analytics, performance monitoring, and alerting, and can be used to monitor resources across a variety of services, including Azure VMs, Azure Functions, and Azure App Service. By using Azure Monitor, organizations can gain a deeper understanding of the performance and health of their applications and services, and take proactive measures to address issues and improve performance.

Basic Sample Questions

Q1) You are in charge of creating a website that will be hosted in Azure. After the website launches, you anticipate a large volume of traffic. You must keep the website available and responsive while keeping costs low. You need to deploy the website. What should you do?

  1. Set up a virtual machine to host the website. Configure the virtual machine to automatically scale when CPU demand is high.
  2. Deploy the website to an App Service that uses the Shared service tier. Configure the App Service plan to automatically scale when CPU demand is high.
  3. Set up a virtual machine to host the website. Use a Scale Set to increase the virtual machine instance count when CPU load is high.
  4. Deploy the website to an App Service that uses the Standard service tier. Configure the App Service plan to automatically scale when CPU demand is high.

Correct Answer: Deploy the website to an App Service that uses the Standard service tier. Configure the App Service plan to automatically scale when CPU demand is high.

Explanation: WAWS (Windows Azure Web Sites) comes in three modes: Standard, Free, and Shared. Standard mode carries an enterprise-grade SLA (Service Level Agreement) of 99.9% monthly uptime, even for sites with only one instance. Standard mode also differs from the other ways to buy Windows Azure Web Sites in that it runs on dedicated instances.

Refer: Best Practices: Windows Azure Websites (WAWS)

Q2) You create an HTTP-triggered Azure Function app to process Azure Storage blob data. The app is started by an output binding on the blob. The app continues to time out after four minutes. The blob data must be processed by the app. You must ensure that the app does not time out and that the blob data is processed. Solution: Use the Durable Functions async pattern to process the blob data. Does the solution achieve the goal?

  • Yes
  • No

Correct Answer: No

Explanation: Instead, pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function, and return an immediate HTTP success response. Large, long-running functions can cause unexpected timeouts. As a general best practice, refactor large functions into smaller function sets that work together and return responses quickly whenever possible. For example, a webhook or HTTP trigger function might need an acknowledgment response within a certain time limit, and webhooks often require an immediate response. Placing the HTTP trigger payload in a queue to be processed by a queue trigger function lets you defer the actual work and respond quickly.

Refer: Best practices for reliable Azure Functions

Q4) You create an HTTP-triggered Azure Function app to process Azure Storage blob data. The app is started by an output binding on the blob.
After four minutes, the app continues to time out. The blob data must be processed by the app. You must ensure that the app does not time out and that the blob data is processed. Solution: Return an immediate HTTP success response and pass the HTTP trigger payload into an Azure Service Bus queue to be handled by a queue trigger function. Does the solution achieve the goal?

  • Yes
  • No

Correct Answer: Yes

Explanation: Large, long-running functions can cause unexpected timeouts. As a general best practice, refactor large functions into smaller function sets that work together and return responses quickly whenever possible. For example, a webhook or HTTP trigger function might need an acknowledgment response within a certain time limit, and webhooks often require an immediate response. Placing the HTTP trigger payload in a queue to be processed by a queue trigger function lets you defer the actual work and respond quickly.

Refer: Best practices for reliable Azure Functions
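
A minimal sketch of the pattern described above, using the Azure Functions Python v2 programming model: the HTTP-triggered function drops the payload onto a Service Bus queue and responds immediately, and a separate queue-triggered function does the long-running work. The queue name and connection setting name are assumptions for illustration only.

```python
import azure.functions as func

app = func.FunctionApp()

# Accept the request, queue the payload, and respond right away.
@app.route(route="process", auth_level=func.AuthLevel.FUNCTION)
@app.service_bus_queue_output(arg_name="msg", queue_name="blob-jobs", connection="ServiceBusConnection")
def accept(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    msg.set(req.get_body().decode("utf-8"))
    return func.HttpResponse("Accepted", status_code=202)

# Do the long-running blob processing from the queue, outside the HTTP request.
@app.service_bus_queue_trigger(arg_name="job", queue_name="blob-jobs", connection="ServiceBusConnection")
def process(job: func.ServiceBusMessage):
    payload = job.get_body().decode("utf-8")
    print("Processing", payload)
```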

Q5) You create an HTTP-triggered Azure Function app to process Azure Storage blob data. The app is started by an output binding on the blob. After four minutes, the app continues to time out. The blob data must be processed by the app. You must ensure that the app does not time out and that the blob data is processed. Solution: Configure the app to use an App Service hosting plan and enable the Always On setting. Does the solution achieve the goal?

  • Yes
  • No

Correct Answer: No

Explanation: Instead, pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function, and return an immediate HTTP success response. Large, long-running functions can cause unexpected timeouts. As a general best practice, refactor large functions into smaller function sets that work together and return responses quickly whenever possible. For example, a webhook or HTTP trigger function might need an acknowledgment response within a certain time limit, and webhooks often require an immediate response. Placing the HTTP trigger payload in a queue to be processed by a queue trigger function lets you defer the actual work and respond quickly.

Refer: Best practices for reliable Azure Functions

Q6) You develop a software-as-a-service (SaaS) application for managing photographs. The photographs are uploaded to a web service, which then stores them in Azure Blob storage. The storage account type is General-purpose V2. When photographs are uploaded, they must be processed to produce and save a mobile-friendly version of the image. The process of creating the mobile-friendly version must begin in less than one minute. You need to design the process that starts the photo processing.
Solution: Move photo processing to an Azure Function triggered by the blob upload. Does the solution achieve the goal?

  • Yes
  • No

Correct Answer: Yes

Explanation: Applications can react to events using Azure Storage events. Image or video processing, search indexing, or any file-oriented workflow are examples of common Blob storage event scenarios. Azure Event Grid pushes events to subscribers like Azure Functions, Azure Logic Apps, and even your own HTTP listener.

Refer: Reacting to Blob storage events
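
As a sketch of the push model described above, an Event Grid subscription on the storage account can deliver BlobCreated events to a function such as the one below (Python v2 programming model). The function name is an assumption, and the resizing work is only indicated by a comment.

```python
import azure.functions as func

app = func.FunctionApp()

# Event Grid pushes BlobCreated events to this function within seconds of the upload.
@app.event_grid_trigger(arg_name="event")
def make_mobile_version(event: func.EventGridEvent):
    data = event.get_json()
    print("New photo uploaded:", data.get("url"))
    # ... download the blob, create the mobile-friendly rendition, and save it back ...
```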

Q7) For auditing purposes, an application must access the transaction logs of all changes to the blobs and blob metadata in a storage account. The changes must include only create, update, delete, and copy operations and must be retained in the order in which they occurred for compliance reasons. The transaction logs must be processed asynchronously. What should you do?

  1.  Process all Azure Blob storage events by using Azure Event Grid with a subscriber Azure Function app.
  2.  Enable the change feed on the storage account and process all changes for available events.
  3.  Process all Azure Storage Analytics logs for successful blob events.
  4.  Use the Azure Monitor HTTP Data Collector API and scan the request body for successful blob events.

Correct Answer: Enable the change feed on the storage account and process all changes for available events.

Explanation: The purpose of the change feed is to provide transaction logs of all the changes that occur to the blobs and blob metadata in your storage account. The change feed provides an ordered, guaranteed, durable, immutable, read-only log of these changes. Client applications can read these logs at any time, in streaming or batch mode. The change feed enables you to build cost-effective and scalable solutions that process change events in your Blob Storage account.

Refer: Change feed support in Azure Blob Storage
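
A rough sketch of reading the change feed with the azure-storage-blob-changefeed Python package; the connection string is a placeholder, and the exact event fields shown are an assumption based on the typical event schema.

```python
from azure.storage.blob.changefeed import ChangeFeedClient

cf_client = ChangeFeedClient.from_connection_string("<storage-connection-string>")

# Events are returned in the order in which they occurred and can be read in batches.
for event in cf_client.list_changes():
    print(event["eventType"], event["subject"], event["eventTime"])
```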

Q8)You’re working on an Azure Function App that processes photos uploaded to an Azure Blob storage container. After images are submitted, they must be processed as rapidly as possible, with the solution minimising latency. When the Function App is triggered, you write code to process photos. The Function App must be configured. So, what are your options?

  1. Use an App Service plan. Configure the Function App to use an Azure Blob Storage input trigger.
  2.  Use a Consumption plan. Configure the Function App to use an Azure Blob Storage trigger.
  3. Use a Consumption plan. Configure the Function App to use a Timer trigger.
  4. Use an App Service plan. Configure the Function App to use an Azure Blob Storage trigger.
  5. Use a Consumption plan. Configure the Function App to use an Azure Blob Storage input trigger.

Correct Answer: Use a Consumption plan. Configure the Function App to use an Azure Blob Storage trigger.

Explanation: When a new or updated blob is discovered, the Blob storage trigger starts a function. The function receives the contents of the blob as input. A function app on a single virtual machine (VM) is limited to 1.5 GB of memory on the Consumption plan.

Refer: Azure Blob storage trigger for Azure Functions
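
A minimal sketch of a blob-triggered function in the Python v2 programming model; the container name and connection setting are assumptions.

```python
import azure.functions as func

app = func.FunctionApp()

# Fires when a new or updated blob lands in the "images" container;
# the blob content arrives as the input stream.
@app.blob_trigger(arg_name="image", path="images/{name}", connection="AzureWebJobsStorage")
def process_image(image: func.InputStream):
    print(f"Processing {image.name}, {image.length} bytes")
```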

Q9)You’re getting ready to publish a website from a GitHub repository to an Azure Web App. A script generates static material for the webpage. You intend to use the continuous deployment functionality of Azure Web Apps. Before the website starts delivering traffic, you must run the static generating script. What are two options for achieving this goal? Each accurate response provides a comprehensive solution. NOTE: One point is awarded for each correct answer.

  1. Add the path to the static content generation tool to the WEBSITE_RUN_FROM_PACKAGE setting in the host.json file.
  2. Add a PreBuild target in the website's csproj project file that runs the static content generation script.
  3. Create a file named run.cmd in the folder /run that calls a script which generates the static content and deploys the website.
  4. Create a file named .deployment in the root of the repository that calls a script which generates the static content and deploys the website.

Correct Answer: Add the path to the static content generation tool to the WEBSITE_RUN_FROM_PACKAGE setting in the host.json file, and create a file named .deployment in the root of the repository that calls a script which generates the static content and deploys the website.

Explanation: In Azure, your functions can run directly from a deployment package file in your function app. To enable your function app to run from a package, add a WEBSITE_RUN_FROM_PACKAGE setting to your function app settings. To customize your deployment, include a .deployment file in the root of your repository with the following content:

[config]
command = YOUR_COMMAND_TO_RUN_FOR_DEPLOYMENT

This command can simply run a script (for example, a batch file) that contains everything needed for your deployment, such as moving files from the repository to the web root directory.

Refer: Run your functions from a package file in Azure

Q10)You’re working on a web application that’s being secure by the Azure Web Application Firewall (WAF). The web app’s traffic is route through an Azure Application Gateway instance that is share by several web apps.  Contoso.azurewebsites.net is the URL for the web app. SSL must be use to secure all traffic. Multiple web apps use the Azure Application Gateway instance.For the web app, you must configure Azure Application Gateway.Which of the two acts should you take? Each accurate response reveals a piece of the solution.

  1. In the Azure Application Gateway’s HTTP setting, enable the Use for App service setting.
  2.  Convert the web app to run in an Azure App service environment (ASE).
  3. Add an authentication certificate for contoso.azurewebsites.net to the Azure Application Gateway.
  4. In the Azure Application Gateway’s HTTP setting, set the value of the Override backend path option to contoso22.azurewebsites.net.

Correct Answer: In the Azure Application Gateway's HTTP setting, enable the Use for App service setting, and set the value of the Override backend path option to contoso22.azurewebsites.net.

Explanation: The HTTP settings let you specify a host override that can be applied to any back-end pool during rule creation, and they can pick up the host name from a back-end pool member's IP address or FQDN. When configured to derive the host name from an individual back-end pool member, the HTTP settings dynamically pick the host name from the FQDN of that back-end pool member. SSL termination and end-to-end SSL are required with multi-tenant services. With end-to-end SSL, trusted Azure services such as Azure App Service web apps do not require whitelisting the backends in the application gateway, so no authentication certificates are required.

Refer: Configure App Service with Application Gateway

Q11)You’re creating a website that stores data on Azure Blob storage. After 30 days, you configure the Azure Blob storage lifecycle to migrate all blobs to the archive layer. For data older than 30 days, customers have sought a service-level agreement (SLA). The minimal service level agreement (SLA) for data recovery must be document. What type of SLA should you use?

  1.  at least two days
  2. between one and 15 hours
  3. at least one day
  4. between zero and 60 minutes

Correct Answer: between one and 15 hours

Explanation: The archive access tier has the lowest storage cost, but it has higher data retrieval costs than the hot and cool tiers. Depending on the priority of the request, retrieving data from the archive tier can take several hours. For small objects, a high-priority rehydration may retrieve the object from archive in under one hour.

Refer: Hot, Cool, and Archive access tiers for blob data
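
To show what rehydration looks like in practice, here is a sketch using the azure-storage-blob Python SDK. The connection string, container, and blob names are placeholders, and the tier and priority are passed as plain strings rather than enum members.

```python
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    "<storage-connection-string>", container_name="data", blob_name="archive/report.csv"
)

# Move the blob out of the archive tier; "High" priority can complete
# in under an hour for small objects.
blob.set_standard_blob_tier("Hot", rehydrate_priority="High")
```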

Q12) You are developing Azure solutions. A .NET application must receive a message each time an Azure virtual machine finishes processing data. The messages must not persist after they have been processed by the receiving application. You need to implement the .NET object that will receive the messages. Which object should you use?

  1. QueueClient
  2. SubscriptionClient
  3. TopicClient
  4. CloudQueueClient

Correct Answer: CloudQueueClient

Explanation: A queue allows each message to be processed by a single consumer. The CloudQueueClient class is the .NET client used to access Azure Storage queues and retrieve the messages sent by the virtual machine.

Refer: Service Bus queues, topics, and subscriptions
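
The answer above refers to the .NET CloudQueueClient class; for illustration only, here is the equivalent receive-then-delete flow with the azure-storage-queue Python SDK, which shows why a message is not retained once it has been processed. The queue name and connection string are placeholders.

```python
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<storage-connection-string>", queue_name="vm-results")

for message in queue.receive_messages():
    print("Processing:", message.content)
    # Deleting the message after processing ensures it is not kept once handled.
    queue.delete_message(message)
```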

Q13) You have an existing Azure storage account that stores large volumes of data across multiple containers. All data from the existing storage account must be copied to a new storage account. The copy process must meet the following criteria: the data movement must be automated, the amount of user input required to perform the operation must be minimized, and the data movement process must be recoverable. What should you use?

  1.  AzCopy
  2. Azure Storage Explorer
  3.  Azure portal
  4. .NET Storage Client Library

Correct Answer: AzCopy

Explanation: You can copy blobs, directories, and containers between storage accounts by using the AzCopy v10 command-line utility. The copy operation is synchronous, so when the command returns, all files have been copied.

Refer: Copy blobs between Azure storage accounts by using AzCopy

Q14)You’re utilising the Azure Cosmos DB SQL API to create an Azure Cosmos DB solution. There are millions of documents in the database. Hundreds of properties can be found in a single document. There are no distinct partitioning values in the document properties. Azure Cosmos DB must scale individual database containers to fulfil the application’s performance requirements by distributing the workload evenly across all partitions over time. You must choose a partition key. Which two partition keys are available to you? Each accurate response provides a comprehensive solution.

  1.  a single property value that does not appear frequently in the documents
  2.  a value containing the collection name
  3.  a single property value that appears frequently in the documents
  4.  a concatenation of multiple property values with a random suffix appended
  5. a hash suffix appended to a property value

Correct Answer: a concatenation of multiple property values with a random suffix appended and a hash suffix appended to a property value

Explanation: You can form a partition key by concatenating multiple property values into a single synthetic partition key property. Another way to distribute the workload more evenly is to append a random suffix, such as a random number, to the end of the partition key value. Distributing items in this way allows parallel write operations across partitions.

Refer: Create a synthetic partition key
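
A small, self-contained sketch of building a synthetic partition key as described above; the property names and the suffix range are purely illustrative.

```python
import random

def synthetic_partition_key(doc: dict) -> str:
    # Concatenate several properties and append a random suffix so writes spread across partitions.
    suffix = random.randint(0, 99)
    return f"{doc['deviceId']}-{doc['readingDate']}-{suffix}"

doc = {"id": "1", "deviceId": "sensor-7", "readingDate": "2024-01-30", "value": 21.5}
doc["partitionKey"] = synthetic_partition_key(doc)
print(doc["partitionKey"])  # e.g. sensor-7-2024-01-30-42
```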

Q15)You’ve added a new Azure subscription to your account. You are developing an internal website for employees to view sensitive data. For authentication, the website uses Azure Active Directory (Azure AD). For the website, you must use multifactor authentication. Which of the two acts should you take? Each accurate response reveals a piece of the solution.

  1. Configure the website to use Azure AD B2C.
  2. In Azure AD, create a new conditional access policy.
  3. Upgrade to Azure AD Premium.
  4. In Azure AD, enable application proxy.
  5. In Azure AD conditional access, enable the baseline policy.

Correct Answer: Upgrade to Azure AD Premium, and in Azure AD, create a new conditional access policy.

Explanation: Multifactor authentication is enabled through a conditional access policy, which is the most flexible way to enable two-step verification for your users. Conditional access is a premium feature of Azure AD and works only with Azure MFA in the cloud.

Refer: Plan an Azure Active Directory Multi-Factor Authentication deployment

Q16) You’re working on a Java application that stores key and value data in Cassandra. In the application, you intend to leverage a new Azure Cosmos DB resource and the Cassandra API. To allow provisioning of Azure Cosmos accounts, databases, and containers, you create an Azure Active Directory (Azure AD) group named Cosmos DB Creators. The Azure AD group should not have access to the keys needed to access the data. Access to the Azure AD group must be restrict. Which type of role-based access control should you implement?

  1. DocumentDB Accounts Contributor
  2. Cosmos Backup Operator
  3. Cosmos DB Operator
  4. Cosmos DB Account Reader

Correct Answer: Cosmos DB Operator

Explanation: Cosmos DB Operator is an RBAC role in Azure Cosmos DB. The role lets you create Azure Cosmos accounts, databases, and containers, but it does not grant access to the keys required to access the data. It is intended for scenarios where Azure Active Directory service principals need the ability to manage Cosmos DB deployment operations, including the account, database, and containers.

Refer: Azure Cosmos DB Operator role for role-based access control (RBAC) is now available

Q17) Your application includes an Azure Web App and several Azure Function apps. Application secrets, including connection strings and certificates, are stored in Azure Key Vault. Secrets must not be stored in the application or the runtime environment. Changes to Azure Active Directory (Azure AD) must be minimized. You need to design a method for the applications to load their secrets. What should you do?

  1. Create a single user-assigned Managed Identity with permission to access Key Vault and configure each App Service to use that Managed Identity.
  2. Create a single Azure AD Service Principal with permission to access Key Vault and use a client secret from within the App Services to access Key Vault.
  3. Create a system-assigned Managed Identity in each App Service with permission to access Key Vault.
  4. Create an Azure AD Service Principal with permissions to access Key Vault for each App Service and use a certificate from within the App Services to access Key Vault.

Correct Answer: Create a single user-assigned Managed Identity with permission to access Key Vault and configure each App Service to use that Managed Identity. 

Explanation: Use Key Vault references for App Service and Azure Functions. Key Vault references currently support only system-assigned managed identities; user-created identities cannot be used.

Refer: Use Key Vault references for App Service and Azure Functions
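
For illustration, the sketch below shows an app reading a secret from Key Vault with a managed identity via the Azure SDK for Python, so no secret is stored in the application itself; the vault URL and secret name are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# When the app runs in Azure, DefaultAzureCredential picks up the managed identity assigned to it.
credential = DefaultAzureCredential()
secrets = SecretClient(vault_url="https://<vault-name>.vault.azure.net", credential=credential)

connection_string = secrets.get_secret("SqlConnectionString").value
```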

Q18)You have an Azure Web app and many Azure Function apps in your application. Azure Key Vault stores application secrets such as connection strings and certificates. Secrets should not be kept in the application or runtime environment. Azure Active Directory (Azure AD) changes must be kept to a minimum. You must devise a method for loading application secrets. So, what are your options?

  1. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API.
  2. Create an Event Grid topic that uses the Start-AzureStorageBlobCopy cmdlet.
  3. Use AzCopy with the Snapshot switch to copy blobs to Container2.
  4. Download the blob to a virtual machine and then upload the blob to Container2.

Correct Answer: Create an Event Grid topic that uses the Start-AzureStorageBlobCopy cmdlet

Explanation: The Start-AzureStorageBlobCopy cmdlet begins copying a blob in Azure Storage.

Refer: Start-AzureStorageBlobCopy

Q19)You’re working on an ASP.NET Core site with Azure FrontDoor. Researchers can use the service to create custom weather data sets. Users can download data sets in Comma Separated Value (CSV) format. Every ten hours, the data is update. Based on the Response Header values, specific files must be removed from the FrontDoor cache. Individual assets must be remove from the Front Door cache. Which cache purge method should you use?

  1. single path 
  2.  wildcard
  3. root domain

Correct Answer: single path 

Explanation: The following formats are supported in the list of paths to purge:

  • Single path purge: purge individual assets by specifying the full path of the asset (without the protocol and domain), including the file extension.
  • Wildcard purge: an asterisk (*) may be used as a wildcard. Purge all subfolders and files under a given folder by specifying the folder followed by /*, for example, /pictures/*.
  • Root domain purge: purge the root domain of the endpoint by specifying "/" in the path.

Refer: Caching with Azure Front Door

Q20) You work as a developer for a SaaS company that offers many web services. All web services provided by the company must meet the following requirements:

  • Use API Management to access the services.
  • Use OpenID Connect for authentication.
  • Prevent anonymous usage.

A recent security audit found that several web services can be called without any authentication. Which API Management policy should you implement?

  1. jsonp
  2. authentication-certificate
  3. check-header
  4. validate-jwt

Correct Answer: validate-jwt

Explanation: To validate the OAuth token for every incoming request, add the validate-jwt policy.

Refer: Protect an API in Azure API Management using OAuth 2.0 authorization with Azure Active Directory

Microsoft Azure AZ-204 free practice test

The post Microsoft Azure AZ-204 Sample Questions appeared first on Testprep Training Tutorials.

]]>
Microsoft Azure AZ-900 Sample Questions https://www.testpreptraining.com/tutorial/microsoft-azure-az-900-sample-questions/ Tue, 31 May 2022 08:02:09 +0000 https://www.testpreptraining.com/tutorial/?page_id=55484 With the latest updates in the AZ-900: Microsoft Azure Fundamentals Exam in the English version, it is very important to focus your preparation on the revised AZ-900 study guide and practice questions. Your preparation for the AZ-900 exam should concentrate on developing the skills around Cloud Concepts, Azure architecture and services and Azure management and...

The post Microsoft Azure AZ-900 Sample Questions appeared first on Testprep Training Tutorials.

]]>

With the latest updates to the AZ-900: Microsoft Azure Fundamentals Exam (English version), it is very important to focus your preparation on the revised AZ-900 study guide and practice questions. Your preparation for the AZ-900 exam should concentrate on developing skills around cloud concepts, Azure architecture and services, and Azure management and governance. Candidates preparing for the exam are also required to exhibit foundational knowledge of cloud concepts and Microsoft Azure. The article provides a list of AZ-900 Sample Exam Questions that cover core exam topics including –

  • First, Learn about Cloud Concepts (25 – 30%)
  • Second, Understanding Azure Architecture and Services (35 – 40%)
  • Third, Overview of Azure Management and Governance (30 – 35%)

AZ-900 Sample Questions

Advanced Sample Questions

Which of the following is a primary benefit of cloud computing?

  • a. Reduced costs
  • b. Increased hardware maintenance
  • c. Increased data center footprint
  • d. Increased physical security

Answer: a. Reduced costs

Explanation: One of the primary benefits of cloud computing is reduced costs. By moving to the cloud, organizations can reduce their hardware and software costs, and pay only for what they use.

Which of the following Azure services is used to build, deploy, and manage applications?

  • a. Azure Cosmos DB
  • b. Azure Functions
  • c. Azure Virtual Machines
  • d. Azure ExpressRoute

Answer: b. Azure Functions

Explanation: Azure Functions is a serverless compute service that enables developers to build, deploy, and manage applications without having to worry about infrastructure.

What is the name of the service in Azure that provides identity and access management?

  • a. Azure Active Directory
  • b. Azure Site Recovery
  • c. Azure Backup
  • d. Azure Virtual Network

Answer: a. Azure Active Directory

Explanation: Azure Active Directory is the service in Azure that provides identity and access management, allowing users to sign in and access cloud resources.

Which of the following is a type of Azure storage that is optimized for big data analytics workloads?

  • a. Azure Blob Storage
  • b. Azure Queue Storage
  • c. Azure File Storage
  • d. Azure Data Lake Storage

Answer: d. Azure Data Lake Storage

Explanation: Azure Data Lake Storage is a type of Azure storage that is optimized for big data analytics workloads, allowing users to store and analyze large amounts of data.

Which of the following is a service in Azure that enables users to manage and secure their network traffic?

  • a. Azure Firewall
  • b. Azure Traffic Manager
  • c. Azure Load Balancer
  • d. Azure Content Delivery Network

Answer: a. Azure Firewall

Explanation: Azure Firewall is a service in Azure that enables users to manage and secure their network traffic, allowing them to control access to their applications and resources.

What is the main benefit of using Azure virtual machines (VMs)?

  • a. They provide automatic backups of data.
  • b. They allow users to scale up or down as needed.
  • c. They require less maintenance than physical servers.
  • d. They offer increased physical security.

Answer: b. They allow users to scale up or down as needed.

Explanation: One of the main benefits of using Azure VMs is the ability to scale up or down as needed to meet changing demands. This allows users to avoid over-provisioning resources and paying for more than they need.

Which Azure service provides a fully-managed NoSQL database that can be scaled globally?

  • a. Azure SQL Database
  • b. Azure Cosmos DB
  • c. Azure Database for MySQL
  • d. Azure Database for PostgreSQL

Answer: b. Azure Cosmos DB

Explanation: Azure Cosmos DB is a fully-managed NoSQL database that can be scaled globally, making it a good choice for applications that require high availability and low latency.

Which Azure service provides a fully-managed Kubernetes container orchestration service?

  • a. Azure Kubernetes Service (AKS)
  • b. Azure Container Registry
  • c. Azure Container Instances
  • d. Azure Functions

Answer: a. Azure Kubernetes Service (AKS)

Explanation: Azure Kubernetes Service (AKS) is a fully-managed Kubernetes container orchestration service that makes it easy to deploy and manage containerized applications.

Which Azure service provides a way to manage virtual networks, subnets, and network security groups?

  • a. Azure Firewall
  • b. Azure Traffic Manager
  • c. Azure Load Balancer
  • d. Azure Virtual Network

Answer: d. Azure Virtual Network

Explanation: Azure Virtual Network provides a way to manage virtual networks, subnets, and network security groups, allowing users to control traffic flow and secure their applications.

What is the main benefit of using Azure App Service to host web applications?

  • a. Automatic scaling
  • b. Lower costs than other hosting options
  • c. Greater security than other hosting options
  • d. More customization options than other hosting options

Answer: a. Automatic scaling

Explanation: One of the main benefits of using Azure App Service to host web applications is automatic scaling, which allows the application to handle increased traffic without manual intervention.

Basic Sample Questions

Question 1. A company has an on-premises network containing several servers. The company plans to migrate all the servers to Azure. John has been asked to provide a solution to make sure that some of the servers are available in case a single Azure data center goes offline for an extended period. What must John do in this case?
  1. Fault Tolerance
  2. Elasticity
  3. Scalability
  4. Low Latency

Correct Answer: Fault Tolerance

Explanation: Fault tolerance is the property that allows a system to continue operating properly in the event of the failure of (or one or more faults within) some of its components. The Availability Zones expand the level of control for maintaining the availability of the applications and data on the Virtual Machines. The physical separation of Availability Zones within a region safeguards applications and data from data center failures. Moreover, with Availability Zones, Azure offers an uptime of 99.99% for Virtual Machines SLA. Therefore, by architecting solutions to use replicated Virtual Machines in zones, we can protect the applications and data from the loss of a data center.

Refer: Availability options for Azure Virtual Machines

Question 2: When we implement a Software as a Service (SaaS) solution, we become responsible for ________________.
  1. Configure high availability
  2. Define scalability rules
  3. Install the SaaS solution
  4. Configure the SaaS Solution

Correct Answer: Configure the SaaS Solution

Explanation: When implementing a Software as a Service (SaaS) solution, you are responsible for configuring the SaaS solution. SaaS requires the least amount of management because the cloud provider manages everything else, allowing the end user to simply use the software. Software as a service (SaaS) lets users connect to and use cloud-based apps over the Internet, such as email, calendaring, and office tools.

Reference: What is SaaS? and Azure Fundamental Concepts

Question 3: A company that hosts its infrastructure in _________________________ does not require its own data center.
  1. Private Cloud
  2. Public Cloud
  3. Hybrid Cloud
  4. Hyper-V Host

Correct Answer: Public Cloud

Explanation: A company that hosts its infrastructure in a public cloud can close its own data center. The public cloud is one of the most common deployment models; there is no need to manage local hardware or keep it updated, because everything runs on the cloud provider's hardware. A private cloud, in contrast, is hosted in your own data center, so you cannot close your data center if you use a private cloud.

Reference: Different types of Cloud Models

Question 4. Which of the following are the characteristics of the public cloud?

(A) Dedicated hardware
(B) Unsecured connections
(C) Limited storage
(D) Metered pricing
(E) Self-service management

  1. Only (A) and (B)
  2. Only (B) and (C)
  3. Only (C) and (D)
  4. Only (D) and (E)

Correct Answer: Only (D) and (E)

Explanation: Azure cloud services offer metered pricing, because you pay only for the resources you use, and self-service management, because you can use the portal to add, change, and remove resources as and when needed. Hardware is shared among public cloud customers, so it is not dedicated; connections to the cloud are secured; and storage is virtually unlimited. In the public cloud you get pay-as-you-go pricing with no CapEx costs.

Refer: Types of Cloud Models

Question 5: What should you do when planning to migrate a public website to Azure?
  1. Deploying a VPN
  2. Paying monthly usage cost
  3. Paying to transfer all the website data to Azure
  4. Reducing the number of connections on the website

Correct Answer: Paying monthly usage cost

Explanation: When migrating a public website to Azure, you pay a monthly usage cost for the resources you consume. There are also some key features to consider when using Azure Websites as your hosting solution, such as global availability and a built-in load balancer.

Reference: How to plan your migration to Azure Website

Question 6: An organization intends to migrate all its data and resources to Azure. The migration plan developed requires that only Platform as a Service (PaaS) solutions must be used in Azure. John has been asked to deploy an Azure environment that meets the requirement of the migration plan. John suggests creating an Azure App Service and Azure SQL databases. Does the suggested solution meet the requirement?
  1. Yes, the solution meets the requirement
  2. No, the solution does not meet the requirement

Correct Answer: Yes, the solution meets the requirement

Explanation: Azure App Service and Azure SQL databases are examples of Azure PaaS solutions. Thus, the suggested solution meets the requirement.

Reference: SQL Database PaaS Overview

Question 7: An organization plans to host an accounting application called App1 that will be used by all the customers of the organization. It was observed that App1 has low usage during the first three weeks of each month and very high usage during the last week of each month. Which of the following advantages of Azure cloud services supports cost management for this kind of usage pattern?
  1. High availability
  2. High latency
  3. Elasticity
  4. Load balancing

Correct Answer: Elasticity

Explanation: Elasticity provides the ability to add compute resources when they are needed and release them when they are not, which reduces costs. Autoscaling is one example of elasticity. Elastic computing lets you quickly expand or decrease compute, memory, and storage resources to meet changing demands without worrying about capacity planning or engineering for peak usage. Moreover, cloud elasticity means an organization does not pay for unused capacity or idle resources.

References: About Elastic Computing

Question 8: An organization plans to migrate a web application to Azure that can be accessed by external users. Peter has been asked to suggest a cloud deployment solution for minimizing the amount of administrative effort in managing the web application. Which of the following should Peter include in the solution to meet the requirement?
  1. Software as a Service (SaaS)
  2. Platform as a Service (PaaS)
  3. Infrastructure as a Service (IaaS)
  4. Database as a Service (DaaS)

Correct Answer: Platform as a Service (PaaS)

Explanation: Azure App Service is a platform as a service (PaaS) offering that lets you create web and mobile apps for any platform or device and connect them to data anywhere, in the cloud or on-premises. App Service includes the web and mobile capabilities that were previously delivered separately as Azure Websites and Azure Mobile Services.

Reference: PaaS Application using App Service

Question 9: Which of the following cloud service models do Azure virtual machines represent?
  1. Infrastructure as a Service (IaaS)
  2. Platform as a Service (PaaS)
  3. Software as a Service (SaaS)
  4. Database as a Service (DaaS)

Correct Answer: Infrastructure as a Service (IaaS)

Explanation: Azure virtual machines are Infrastructure as a Service (IaaS), the most flexible category of cloud services. IaaS gives you complete control over the hardware that runs your application, including IT infrastructure servers, virtual machines, storage, networks, and operating systems. Rather than buying hardware, with IaaS you rent it.

Reference: Principles of Cloud Computing

Question 10: You have an on-premises network that contains 100 servers. You need to recommend a solution that provides additional resources to your users. The solution must minimize capital and operational expenditure costs. What should you include in the recommendation?
  1. Complete migration to the public cloud
  2. Additional data center
  3. A private cloud
  4. A hybrid cloud

Correct Answer: Hybrid Cloud

Explanation: A hybrid cloud is a combination of a private cloud and a public cloud. Capital expenditure involves spending money up front for infrastructure such as new servers. With a hybrid cloud, you can keep using the on-premises servers while adding new servers in the public cloud. Adding new servers in Azure reduces capital expenditure because you are not paying up front for new servers, as you would be if you deployed the new servers on-premises.

Reference: https://docs.microsoft.com/en-gb/learn/modules/principles-cloud-computing/4-cloud-deployment-models

Question 11: An organization is planning to migrate several servers from an on-premises network to Azure. Which of the following is a benefit of using a public cloud service for the servers over an on-premises network?
  1. Public cloud is owned by the public, NOT a private corporation
  2. Public cloud is a crowd-sourcing solution that provides corporations with the ability to enhance the cloud
  3. All public cloud resources can be freely accessed by every member of the public
  4. Public cloud is a shared entity whereby multiple corporations each use a portion of the resources in the cloud

Correct Answer: Public cloud is a shared entity whereby multiple corporations each use a portion of the resources in the cloud

Explanation: The public cloud is a shared entity in which multiple corporations each use a portion of the resources in the cloud. The hardware resources (servers, infrastructure, and so on) are managed by the cloud service provider, and multiple organizations create resources such as virtual machines (VMs) and virtual networks on that shared hardware.

Question 12: In which of the given kind of cloud model are all the hardware resources owned by a third party and shared between multiple tenants?
  1. Private Cloud
  2. Hybrid Cloud
  3. Public Cloud
  4. Multi-vendor Cloud

Correct Answer: Public Cloud

Explanation: Microsoft Azure, Amazon Web Services (AWS) and Google Cloud are some of the examples of public cloud service providers. Microsoft, Amazon and Google own the hardware. The tenants are the customers who use the public cloud services.

Question 13: An organization has 1,000 virtual machines (VMs) hosted on Hyper-V hosts in a data center. The organization plans to migrate all the virtual machines to an Azure pay-as-you-go subscription. John has been asked to suggest the expenditure model to use for the planned Azure solution. Which expenditure model should he choose?
  1. Operational
  2. Elastic
  3. Capital
  4. Scalable

Correct Answer: Operational

Explanation: The most significant change you face when moving from on-premises infrastructure to the public cloud is the switch from capital expenditure (buying hardware) to operating expenditure (paying for a service). This shift requires more careful management of costs and expenditures. A key advantage of the cloud is that you can positively affect the cost of a service simply by shutting it down or resizing it when it is not required.

Reference: Microsoft Cloud Adoption Framework for Azure

Question 14: State whether the following statement holds True or False. “A company deploying its own data center is an example of CapEx.”
  1. Yes, the statement is correct
  2. No, the statement is not correct

Correct Answer: Yes, the statement is correct

Explanation: Deploying your own data center is an example of CapEx, since you must purchase all the infrastructure upfront before it can be used.

Reference: Microsoft Cloud Adoption Framework for Azure

Question 15: A company plans to offer Infrastructure as a Service (IaaS) resources in Azure. Which of the following resources is an example of Infrastructure as a Service (IaaS)?
  1. An Azure web app
  2. An Azure virtual machine
  3. An Azure logic app
  4. An Azure SQL database

Correct Answer: Azure virtual machine

Explanation: An Azure virtual machine is an example of Infrastructure as a Service (IaaS). On the other hand, an Azure web app, an Azure logic app, and an Azure SQL database are all examples of Platform as a Service (PaaS).

Reference: Introduction to IaaS and What is PaaS

Question 16: On which of the following cloud models can we deploy physical servers?
  1. Private cloud and Hybrid cloud only
  2. Private cloud-only
  3. Private cloud, Hybrid cloud and Public cloud
  4. Hybrid cloud-only

Correct Answer: Private cloud and Hybrid cloud only

Explanation: Since a private cloud is hosted on-premises, we can deploy physical servers there. A hybrid cloud is a mix of on-premises and public cloud resources, so we can also deploy physical servers on the on-premises side of a hybrid cloud.

Reference: Introduction to Hybrid Cloud

Question 17: A company has 50 Virtual Machines (VMs) hosted on-premises and 50 Virtual Machines (VMs) hosted in Azure. The Azure virtual machines and on-premises virtual machines connect to each other. Which of the following type of cloud model does this represent?
  1. Hybrid Cloud
  2. Private Cloud
  3. Public Cloud

Correct Answer: Hybrid Cloud

Explanation: A hybrid cloud combines a private (on-premises) cloud with a public cloud, so an environment in which on-premises virtual machines and Azure virtual machines connect to each other represents a hybrid cloud.

References: Introduction to Hybrid Cloud

Question 18: An organization is planning to migrate all its data and resources to Azure. The migration plan of the organization indicates that only Platform as a Service (PaaS) solutions must be used in Azure. Peter has been asked to deploy an Azure environment that fulfils the company's migration plan. Peter suggests creating Azure virtual machines, Azure SQL databases, and Azure Storage accounts to meet the requirement. Does the suggested solution meet the goal?
  1. Yes, the solution meets the requirement
  2. No, the solution does not meet the requirement

Correct Answer: No, the solution does not meet the requirement

Explanation: Platform as a service (PaaS) offers a complete development and deployment environment in the cloud. PaaS provides infrastructure servers, storage, and networking features as well as middleware, development tools, business intelligence (BI) services, database management systems, and more. PaaS is designed to support the complete web application lifecycle: building, testing, deploying, managing, and updating. Virtual machines, however, are an example of Infrastructure as a Service (IaaS), an instant computing infrastructure that is provisioned and managed over the internet.

References: Introduction to PaaS and Introduction to IaaS

Question 19: An organization is planning to deploy several custom applications to Azure. These custom applications offer invoicing services to the customers of the company. Each application will have several prerequisite applications and services installed. Peter has been asked to suggest a cloud deployment solution for all the applications. Which of the following should he suggest to meet the requirement?
  1. Software as a Service (SaaS)
  2. Platform as a Service (PaaS)
  3. Infrastructure as a Service (laaS)

Correct Answer: Infrastructure as a Service (laaS)

Explanation: Infrastructure as a service (IaaS) is an instant computing infrastructure, provisioned and managed over the internet. The IaaS provider manages the infrastructure, while the organization purchases, installs, configures, and manages its own software.

References: Introduction to IaaS

Question 20: Azure Cosmos DB is an example of ___________________.
  1. Software as a Service (SaaS)
  2. Platform as a Service (PaaS)
  3. Infrastructure as a Service (laaS)
  4. Functions as a Service (FaaS)

Correct Answer: Platform as a Service (PaaS)

Explanation: Azure Cosmos DB is an example of a platform as a service (PaaS) cloud database provider.

Reference: Azure Cosmos DB resource model

Microsoft Azure Fundamentals AZ-900 Free Practice Test

The post Microsoft Azure AZ-900 Sample Questions appeared first on Testprep Training Tutorials.

]]>
Microsoft AZ-720 Exam FAQs https://www.testpreptraining.com/tutorial/microsoft-az-720-exam-faqs/ Wed, 11 May 2022 10:44:08 +0000 https://www.testpreptraining.com/tutorial/?page_id=55158 Microsoft AZ-720 Exam Basic FAQ What is the Microsoft AZ-720 Exam? Candidates who have familiarity with networking and hybrid settings, as well as an understanding of routing, permissions, and account restrictions, should take Exam AZ-720: Troubleshooting Microsoft Azure Connectivity. The exam requires the ability to detect problems with business continuity, hybrid environments, Infrastructure as a Service...

The post Microsoft AZ-720 Exam FAQs appeared first on Testprep Training Tutorials.

]]>
Microsoft AZ-720 Exam FAQs

Microsoft AZ-720 Exam Basic FAQ

What is the Microsoft AZ-720 Exam?

Candidates who are familiar with networking and hybrid environments, and who understand routing, permissions, and account restrictions, should take Exam AZ-720: Troubleshooting Microsoft Azure Connectivity. The exam requires the ability to identify problems with business continuity, hybrid environments, Infrastructure as a Service (IaaS), Platform as a Service (PaaS), access control, networking, and virtual machine connectivity by using the available tools.

What is the knowledge required for the Microsoft AZ-720 Exam?
  • Candidates for the Azure Support Engineer for Connectivity Specialty certification are support engineers with subject matter knowledge in employing advanced troubleshooting methods to fix networking and connectivity issues in Azure.
  • Secondly, professionals in this area have the skills to troubleshoot issues with Azure Virtual Machines, virtual networks, and connections between on-premises and Azure services in hybrid settings.
  • They diagnose and uncover root causes for complicated situations using a variety of methods and technology.
How many questions are there on the AZ-720 Exam?

In the Microsoft AZ-720 exam, there will be 40-60 questions.

What is the course outline for the AZ-720 Exam?

The topics in the Microsoft AZ-720 exam are –

  • Troubleshoot business continuity issues
  • Troubleshoot hybrid and cloud connectivity issues
  • Troubleshooting Platform as a Service issues
  • Troubleshoot authentication and access control issues
  • Troubleshooting network
  • Troubleshoot VM connectivity issues

How much does the AZ-720 Exam cost?

The Microsoft AZ-720 exam will cost $165 USD with additional taxes.

In what language can we give the AZ-720 Exam?

This exam is available in the English language.

What accommodations are available for candidates with disabilities?

Microsoft is ensuring that exams are accessible to everyone, including people with disabilities.

Microsoft AZ-720 Exam Specifics FAQ

What types of questions are there on the Microsoft Certification exams?

Microsoft continually introduces innovative testing technologies and question types, so it does not reveal the specific item types that will appear on a given exam.

Why is the case study exam format used?

The case study exam format uses complex scenarios that more accurately simulate what professionals do on the job. However, scenario-based questions included in the case studies are designed to test your ability to identify the critical information needed to solve a problem and then analyze and synthesize it to make decisions.

Can we review the questions after the completion of the case study?

Yes. After you complete a case study and its associated questions, a review screen appears, and you can review your answers before moving to the next case or section of the exam.

What is a short answer question in the Microsoft exam?

Short answer questions can be answered by entering a few lines of code in the available text-entry field. You can select from a list of keyword options to use in the code you write, and you can check your syntax after you have entered your code.

Is there any negative marking for the wrong answer?

No, you are not penalized for answering incorrectly. For single-point items, you must answer completely and correctly to be awarded the point, and you do not earn points for the parts of your response that are incorrect.

Can I review all of my answers before leaving a section or completing the exam?

Yes, you can review your answers to most questions. However, some yes/no questions describe a problem and a possible solution; you will not be able to review your answers to these questions, and after you move to the next question in the set, you cannot change your answer. These questions are preceded by an overview screen that provides this information, and each question includes a reminder that you cannot return to the question or change your answer after leaving it.

Microsoft Exam Scoring and Results FAQ

When and how will I get my exam results?

Within a few minutes of finishing the exam, you will be notified if you passed or failed. You will also receive a printed report that includes your exam score as well as feedback on your performance in the skill areas assessed.

How does the score report look like?

The score report provides a numeric score for overall exam performance, pass/fail status, a bar chart showing performance on each skill area assessed on the exam, and details on how to interpret your results and next steps. Using this information, candidates can determine areas of strength and weakness.

Does the score report show a numerical score for each section?

No. The score report does not include a numerical score for each section; it shows an overall numerical score and the pass/fail status. Score bars illustrate the topic areas of strength and weakness instead of a numerical score for each section.

How are exam scores calculated?

After you complete your exam, the points you earn on each question are summed and then compared with the cut score to determine whether the result is a pass or a fail.

If I receive the same score every time I retake the same exam, does this imply an error in the computation of the results?

No. Receiving the same score on multiple attempts does not indicate that the program computing the results is in error. It is not uncommon for candidates to obtain similar or identical scores on multiple attempts of an exam. This consistent result demonstrates the reliability of the exam in evaluating skills in this content domain. If this happens on multiple attempts, you may want to reconsider how you’re preparing for the exam and seek other opportunities to learn and practice the skills measured by the exam.

I passed my first Microsoft Certification exam (at Pearson VUE). Now what do I do?

To explore the next steps and available benefits, see your benefits and exams dashboard. Sign in using the same Microsoft account you used to register for your exam.

If I do not pass, what can I do?

Prioritize the skills that you should practice by focusing on the content areas where your exam performance was the weakest and in the content areas that have the highest percentage of questions. Additionally, you may want to review the resources on the exam details page and our Study Groups. For this, check the bottom of the individual exam details page. When you are ready to retake the exam, schedule an appointment as you normally would. Note that you must pay for each exam you retake and follow Microsoft’s retake policy.

If I do not pass an exam, can I have a refund?

No. Microsoft does not offer refunds for exams you do not pass or exam appointments you miss.

AZ-720 practice tests
For More Information Check Microsoft Exam Policies

Go back to the Tutorial

The post Microsoft AZ-720 Exam FAQs appeared first on Testprep Training Tutorials.

]]>