[Q22-Q43] Best Quality AI-100 Exam Questions Microsoft Test To Gain Brilliant Result!

AI-100 Exam Preparation 2022: Azure AI Engineer Associate (163 Questions)

NO.22 You are designing a solution that will analyze bank transactions in real time. The transactions will be evaluated by using an algorithm and classified into one of five groups. The transaction data will be enriched with information taken from Azure SQL Database before the transactions are sent to the classification process. The enrichment process will require custom code. Data from different banks will require different stored procedures.
You need to develop a pipeline for the solution.
Which components should you use for data ingestion and data preparation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Explanation:
References:
https://docs.microsoft.com/bs-latn-ba/azure/architecture/example-scenario/data/fraud-detection

NO.23 Your company is building a cinema chatbot by using the Bot Framework and Language Understanding (LUIS).
You are designing the intents and the entities for LUIS.
The following are utterances that customers might provide:
* Which movies are playing on December 8?
* What time is the performance of Movie1?
* I would like to purchase two adult tickets in the balcony section for Movie2.
You need to identify which entity types to use. The solution must minimize development effort.
Which entity type should you use for each entity? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

NO.24 You are designing an AI application that will use an Azure Machine Learning Studio experiment. The source data contains more than 200 TB of relational tables. The experiment will run once a month.
You need to identify a data storage solution for the application. The solution must minimize compute costs.
Which data storage solution should you identify?
A. Azure Database for MySQL
B. Azure SQL Database
C. Azure SQL Data Warehouse
Explanation:
References:
https://azure.microsoft.com/en-us/pricing/details/sql-database/single/

NO.25 You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream Analytics.
You need to identify the valid outputs of a Stream Analytics job.
What are three possible outputs? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. a Hive table in Azure HDInsight
B. Azure SQL Database
C. Azure Cosmos DB
D. Azure Blob storage
E. Azure Redis Cache
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs

NO.26 You are designing an AI solution that will provide feedback to teachers who train students over the Internet. The students will be in classrooms located in remote areas.
The solution will capture video and audio data of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
* Alert teachers if a student's facial expression indicates the student is angry or scared.
* Identify each student in the classrooms for attendance purposes.
* Allow the teachers to log voice conversations as text.
Which Cognitive Services should you recommend?
A. Face API and Text Analytics
B. Computer Vision and Text Analytics
C. QnA Maker and Computer Vision
D. Speech to Text and Face API
Explanation:
Speech-to-text from Azure Speech Services enables real-time transcription of audio streams into text that your applications, tools, or devices can consume, display, and take action on as command input.
Face detection: Detect one or more human faces in an image and get back face rectangles for where in the image the faces are, along with face attributes that contain machine learning-based predictions of facial features. The face attributes available are Age, Emotion, Gender, Pose, Smile, and Facial Hair, along with 27 landmarks for each face in the image.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text
https://azure.microsoft.com/en-us/services/cognitive-services/face/

NO.27 You plan to build an application that will perform predictive analytics. Users will be able to consume the application data by using Microsoft Power BI or a custom website.
You need to ensure that you can audit application usage.
Which auditing solution should you use?
A. Azure Storage Analytics
B. Azure Application Insights
C. Azure diagnostic logs
D. Azure Active Directory (Azure AD) reporting
Explanation:
References:
https://docs.microsoft.com/en-us/azure/active-directory/reports-monitoring/concept-audit-logs

NO.28 You are designing a solution that will use the Azure Content Moderator service to moderate user-generated content.
You need to moderate custom predefined content without repeatedly scanning the collected content.
Which two APIs should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Term List API
B. Text Moderation API
C. Image Moderation API
D. Workflow API
Explanation:
The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review.
Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists.
C: Use Content Moderator's machine-assisted image moderation and human-in-the-loop Review tool to moderate images for adult and racy content. Instead of moderating the same image multiple times, you add the offensive images to your custom list of blocked content. That way, your content moderation system compares incoming images against your custom lists and stops any further processing.
Incorrect Answers:
B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans your content for profanity, and compares the content against custom and shared blacklists.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
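As a hedged illustration of the custom-term-list approach described for NO.28, the sketch below screens a piece of user text with the Content Moderator Text Screen operation against a previously created term list. The endpoint, subscription key, and list ID are placeholders, not values from the question, and the requests library is assumed.

```python
import requests

# Placeholder values - substitute your own Content Moderator resource details.
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"
SUBSCRIPTION_KEY = "<content-moderator-key>"
CUSTOM_TERM_LIST_ID = "<custom-list-id>"  # created earlier with the Term List (List Management) API

def screen_text(text: str) -> dict:
    """Screen text for profanity and for matches against a custom term list."""
    url = f"{ENDPOINT}/contentmoderator/moderate/v1.0/ProcessText/Screen"
    params = {"listId": CUSTOM_TERM_LIST_ID, "classify": "True", "language": "eng"}
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "text/plain",
    }
    response = requests.post(url, params=params, headers=headers, data=text.encode("utf-8"))
    response.raise_for_status()
    return response.json()  # includes any Terms matched from the custom list

if __name__ == "__main__":
    result = screen_text("Sample user-generated comment mentioning a competitor name.")
    print(result.get("Terms"))
```

Because the custom list is maintained separately, new terms can be added without rescanning content that has already been collected, which is the point of combining the Term List API with the Text Moderation Screen operation.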
NO.29 Your company has a data team of Scala and R experts.
You plan to ingest data from multiple Apache Kafka streams.
You need to recommend a processing technology to broker messages at scale from Kafka streams to Azure Storage.
What should you recommend?
A. Azure Databricks
B. Azure Functions
C. Azure HDInsight with Apache Storm
D. Azure HDInsight with Microsoft Machine Learning Server
Explanation:
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-streaming-at-scale-overview?toc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Fhdinsight%2Fhadoop%2FTOC.json&bc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Fbread%2Ftoc.json

NO.30 You need to meet the greeting requirements for Butler.
Which type of card should you use?
A. AdaptiveCard
B. SigninCard
C. CardCarousel
D. HeroCard
Explanation:
Scenario: Butler must greet users by name when they first connect.
HeroCard defines a card with a large image, title, text, and action buttons.
Incorrect Answers:
B: SigninCard defines a card that lets a user sign in to a service.

NO.31 You plan to deploy Azure IoT Edge devices. Each device will store more than 10,000 images locally. Each image is approximately 5 MB.
You need to ensure that the images persist on the devices for 14 days.
What should you use?
A. Azure Stream Analytics on the IoT Edge devices
B. Azure Database for PostgreSQL
C. Azure Blob storage on the IoT Edge devices
D. Microsoft SQL Server on the IoT Edge devices
Explanation:
Azure Blob Storage on IoT Edge provides a block blob and append blob storage solution at the edge. A blob storage module on your IoT Edge device behaves like an Azure blob service, except the blobs are stored locally on your IoT Edge device.
This is useful when data needs to be stored locally until it can be processed or transferred to the cloud. This data can be videos, images, finance data, hospital data, or any other unstructured data.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob

NO.32 You have thousands of images that contain text.
You need to process the text from the images to a machine-readable character stream.
Which Azure Cognitive Services service should you use?
A. the Image Moderation API
B. Text Analytics
C. Translator Text
D. Computer Vision
Explanation:
With Computer Vision you can detect text in an image using optical character recognition (OCR) and extract the recognized words into a machine-readable character stream.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api

NO.33 You plan to deploy an application that will perform image recognition. The application will store image data in two Azure Blob storage stores named Blob1 and Blob2.
You need to recommend a security solution that meets the following requirements:
* Access to Blob1 must be controlled by using a role.
* Access to Blob2 must be time-limited and constrained to specific operations.
What should you recommend using to control access to each blob store? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Explanation:
References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth
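To make the two access models in NO.33 concrete, here is a minimal sketch: Blob2's requirement (time-limited, specific operations) maps to a shared access signature, while Blob1's requirement (controlled by a role) maps to Azure role-based access control. It assumes the azure-storage-blob Python package; the account, key, container, and blob names are placeholders, not values from the question.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values - substitute your own storage details.
ACCOUNT_NAME = "examplestorageacct"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER = "blob2"
BLOB_NAME = "images/photo-001.jpg"

# Blob2: a shared access signature that is time-limited (one hour) and
# constrained to a specific operation (read only).
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB_NAME,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)
print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB_NAME}?{sas_token}")

# Blob1, by contrast, would be secured with Azure role-based access control,
# for example by assigning a data-plane role such as "Storage Blob Data Reader"
# to an Azure AD identity at the container scope, rather than with a SAS token.
```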
NO.34 You have Azure IoT Edge devices that collect measurements every 30 seconds. You plan to send the measurements to an Azure IoT hub.
You need to ensure that every event is processed as quickly as possible.
What should you use?
A. Apache Kafka
B. Azure Stream Analytics record functions
C. Azure Stream Analytics windowing functions
D. Azure Machine Learning on the IoT Edge devices
Explanation:
References:
https://docs.microsoft.com/en-us/azure/hdinsight/kafka/apache-kafka-connector-iot-hub

NO.35 Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL database.
You need to recommend a data processing solution that uses the Transact-SQL language.
Which data processing solution should you recommend?
A. Azure Stream Analytics
B. SQL Server Integration Services (SSIS)
C. Azure Event Hubs
D. Azure Machine Learning
Explanation:
References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany

NO.36 You are designing an Azure Batch AI solution that will be used to train many different Azure Machine Learning models. The solution will perform the following:
* Image recognition
* Deep learning that uses convolutional neural networks
You need to select a compute infrastructure for each model. The solution must minimize the processing time.
What should you use for each model? To answer, drag the appropriate compute infrastructures to the correct models. Each compute infrastructure may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Explanation:
References:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-gpu

NO.37 You are designing a solution that will ingest data from an Azure IoT Edge device, preprocess the data in Azure Machine Learning, and then move the data to Azure HDInsight for further processing.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Explanation:
Box 1: Export Data
Use the Export data to Hive option in the Export Data module in Azure Machine Learning Studio. This option is useful when you are working with very large datasets, and want to save your machine learning experiment data to a Hadoop cluster or HDInsight distributed storage.
Box 2: Apache Hive
Apache Hive is a data warehouse system for Apache Hadoop. Hive enables data summarization, querying, and analysis of data. Hive queries are written in HiveQL, which is a query language similar to SQL.
Box 3: Azure Data Lake
Default storage for the HDFS file system of HDInsight clusters can be associated with either an Azure Storage account or an Azure Data Lake Storage.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/export-to-hive-query
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/hdinsight-use-hive

NO.38 You are designing a solution that will use the Azure Content Moderator service to moderate user-generated content.
You need to moderate custom predefined content without repeatedly scanning the collected content.
Which API should you use?
A. Term List API
B. Text Moderation API
C. Image Moderation API
D. Workflow API
Explanation:
The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review.
Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists.
Incorrect Answers:
B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans your content for profanity, and compares the content against custom and shared blacklists.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api

NO.39 Your company recently purchased several hundred hardware devices that contain sensors.
You need to recommend a solution to process the sensor data. The solution must provide the ability to write back configuration changes to the devices.
What should you include in the recommendation?
A. Microsoft Azure IoT Hub
B. API apps in Microsoft Azure App Service
C. Microsoft Azure Event Hubs
D. Microsoft Azure Notification Hubs
Explanation:
References:
https://azure.microsoft.com/en-us/resources/samples/functions-js-iot-hub-processing/

Integrate AI models into solutions
Testlet 2
Overview
Contoso, Ltd. has an office in New York to serve its North American customers and an office in Paris to serve its European customers.
Existing Environment
Infrastructure
Each office has a small data center that hosts Active Directory services and a few off-the-shelf software solutions used by internal users.
The network contains a single Active Directory forest that contains a single domain named contoso.com. Azure Active Directory (Azure AD) Connect is used to extend identity management to Azure.
The company has an Azure subscription. Each office has an Azure ExpressRoute connection to the subscription. The New York office connects to a virtual network hosted in the US East 2 Azure region. The Paris office connects to a virtual network hosted in the West Europe Azure region.
The New York office has an Azure Stack Development Kit (ASDK) deployment that is used for development and testing.
Current Business Model
Contoso has a web app named Bookings hosted in an App Service Environment (ASE). The ASE is in the virtual network in the East US 2 region. Contoso employees and customers use Bookings to reserve hotel rooms.
Data Environment
Bookings connects to a Microsoft SQL Server database named hotelDB in the New York office.
The database has a view named vwAvailability that consolidates columns from the tables named Hotels, Rooms, and RoomAvailability.
The database contains data that was collected during the last 20 years.
Problem Statements
Contoso identifies the following issues with its current business model:
* European users report that access to Bookings is slow, and they lose customers who must wait on the phone while they search for available rooms.
* Users report that Bookings was unavailable during an outage in the New York data center for more than 24 hours.
Requirements
Business Goals
Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable service for booking travel packages by interacting with a chatbot named Butler.
Contoso plans to move all production workloads to the cloud.
Technical Requirements
Contoso identifies the following technical requirements:
* Data scientists must test Butler by using ASDK.
* Whenever possible, solutions must minimize costs.
* Butler must greet users by name when they first connect.
* Butler must be able to handle up to 10,000 messages a day.
* Butler must recognize the users' intent based on basic utterances.
* All configurations to the Azure Bot Service must be logged centrally.
* Whenever possible, solutions must use the principle of least privilege.
* Internal users must be able to access Butler by using Microsoft Skype for Business.
* The new Bookings app must provide a user interface where users can interact with Butler.
* Users in an Azure AD group named KeyManagers must be able to manage keys for all Azure Cognitive Services.
* Butler must provide users with the ability to reserve a room, cancel a reservation, and view existing reservations.
* The new Bookings app must be available to users in North America and Europe if a single data center or Azure region fails.
* For continuous improvement, you must be able to test Butler by sending sample utterances and comparing the chatbot's responses to the actual intent.

NO.40 You need to configure security for an Azure Machine Learning service used by groups of data scientists. The groups must have access to only their own experiments and must be able to grant permissions to the members of their team.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Explanation:
References:
https://docs.microsoft.com/en-us/machine-learning-server/operationalize/configure-roles#how-are-roles-assigned
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-assign-roles
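The role-assignment approach referenced for NO.40 can be illustrated with a hedged sketch: it creates an Azure role assignment scoped to a single Machine Learning workspace through the Azure Resource Manager REST API, so that a team's Azure AD group is granted access only to its own workspace and experiments. The subscription, resource group, workspace, role, group, and token values are placeholders, not values from the question, and the requests library is assumed.

```python
import uuid
import requests

# Placeholder values - substitute details for your own environment.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
WORKSPACE_NAME = "<ml-workspace>"
GROUP_OBJECT_ID = "<azure-ad-group-object-id>"   # the data-science team's Azure AD group
ROLE_DEFINITION_ID = "<role-definition-guid>"    # a built-in or custom role GUID
ARM_TOKEN = "<arm-access-token>"                 # obtained via Azure AD (for example, azure-identity)

# Scope the assignment to one Machine Learning workspace so the group
# can work only with its own experiments.
scope = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.MachineLearningServices/workspaces/{WORKSPACE_NAME}"
)
assignment_name = str(uuid.uuid4())
url = (
    f"https://management.azure.com{scope}"
    f"/providers/Microsoft.Authorization/roleAssignments/{assignment_name}"
    "?api-version=2015-07-01"
)
body = {
    "properties": {
        "roleDefinitionId": (
            f"/subscriptions/{SUBSCRIPTION_ID}"
            f"/providers/Microsoft.Authorization/roleDefinitions/{ROLE_DEFINITION_ID}"
        ),
        "principalId": GROUP_OBJECT_ID,
    }
}
response = requests.put(url, json=body, headers={"Authorization": f"Bearer {ARM_TOKEN}"})
response.raise_for_status()
print("Role assignment created:", response.json().get("id"))
```

Granting a group a role that includes write access to role assignments at this scope would also let the group manage permissions for its own members, in line with the requirement in the question.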
NO.41 Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to create an IoT solution that performs the following tasks:
* Identifies hazards
* Provides a real-time online dashboard
* Takes images of an area every minute
* Counts the number of people in an area every minute
Solution: You configure the IoT devices to send the images to an Azure IoT hub, and then you configure an Azure Automation call to Azure Cognitive Services that sends the results to an Azure event hub. You configure Microsoft Power BI to connect to the event hub by using Azure Stream Analytics.
Does this meet the goal?
A. Yes
B. No
Explanation:
Instead use Cognitive Services containers on the IoT devices.
References:
https://azure.microsoft.com/es-es/blog/running-cognitive-services-on-iot-edge/
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-live-data-visualization-in-power-bi

NO.42 You need to build a sentiment analysis solution that will use input data from JSON documents and PDF documents. The JSON documents must be processed in batches and aggregated.
Which storage type should you use for each file type? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Explanation:
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/batch-processing
(A minimal batch-aggregation sketch appears after NO.43.)

NO.43 Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
Solution: You modify the Config.json file.
Does this meet the goal?
A. Yes
B. No
Explanation:
Instead update the manifest file.
Reference:
https://azure.github.io/learnAnalytics-UsingAzureMachineLearningforAIWorkloads/lab07-deploying_a_scoring_service_to_aks/0_README.html
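For the batch-and-aggregate pattern in NO.42, here is a minimal sketch that reads a batch of JSON documents from an Azure Blob storage container and aggregates a sentiment score. It assumes the azure-storage-blob Python package and assumes each document carries a numeric "sentiment" field written by an upstream scoring step; the connection string, container name, and field name are placeholders, not values from the question.

```python
import json
from azure.storage.blob import BlobServiceClient

# Placeholder values - substitute your own storage details.
CONNECTION_STRING = "<storage-connection-string>"
CONTAINER_NAME = "json-documents"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

scores = []
for blob in container.list_blobs():
    if not blob.name.endswith(".json"):
        continue
    raw = container.download_blob(blob.name).readall()
    document = json.loads(raw)
    # "sentiment" is an assumed field produced by an earlier processing step.
    if "sentiment" in document:
        scores.append(float(document["sentiment"]))

if scores:
    print(f"Documents processed: {len(scores)}")
    print(f"Average sentiment:   {sum(scores) / len(scores):.3f}")
else:
    print("No JSON documents with a sentiment score were found.")
```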
Focus on AI-100 All-in-One Exam Guide For Quick Preparation:
https://www.test4engine.com/AI-100_exam-latest-braindumps.html