Microsoft 70-475 Exam Questions 2021

Want to know what Microsoft 70-475 covers? Want to learn more about the Microsoft 70-475 exam experience? Study Exam 70-475. Get success with an absolute guarantee to pass the Microsoft 70-475 (Designing and Implementing Big Data Analytics Solutions) test on your first attempt.

Free demo questions for Microsoft 70-475 Exam Dumps Below:

NEW QUESTION 1
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You plan to implement a new data warehouse.
You have the following information regarding the data warehouse:
• The first data files for the data warehouse will be available in a few days.
• Most queries that will be executed against the data warehouse are ad hoc.
• The schemas of the data files that will be loaded to the data warehouse change often.
• One month after the planned implementation, the data warehouse will contain 15 TB of data.
You need to recommend a database solution to support the planned implementation.
Solution: You recommend Microsoft SQL Server on a Microsoft Azure virtual machine.
Does this meet the goal?

  • A. Yes
  • B. No

Answer: B

NEW QUESTION 2
You have an application that displays data from a Microsoft Azure SQL database. The database contains credit card numbers.
You need to ensure that the application only displays the last four digits of each credit card number when a credit card number is returned from a query. The solution must NOT require any changes to the data in the database.
What should you use?

  • A. Dynamic Data Masking
  • B. cell-level security
  • C. Transparent Data Encryption (TDE)
  • D. row-level security

Answer: A
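For context, Dynamic Data Masking applies the mask at query time, so the stored values are never modified, which satisfies the requirement that the data in the database remain unchanged. A minimal sketch, assuming a hypothetical dbo.Payments table with a CreditCardNumber column:
-- Mask everything except the last four digits for non-privileged users (illustrative names).
ALTER TABLE dbo.Payments
ALTER COLUMN CreditCardNumber ADD MASKED WITH (FUNCTION = 'partial(0, "XXXX-XXXX-XXXX-", 4)');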

NEW QUESTION 3
You need to create a query that identifies the trending topics.
How should you complete the query? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit]

    Answer:

Explanation: From the scenario: Topics are considered to be trending if they generate many mentions in a specific country during a 15-minute time frame.
Box 1: TimeStamp
Azure Stream Analytics (ASA) is a cloud service that enables real-time processing over streams of data flowing in from devices, sensors, websites, and other live systems. The stream-processing logic in ASA is expressed in a SQL-like query language with some added extensions, such as windowing for performing temporal calculations.
ASA is a temporal system, so every event that flows through it has a timestamp. A timestamp is assigned automatically based on the event's arrival time at the input source, but you can also reference a timestamp in your event payload explicitly by using TIMESTAMP BY:
SELECT * FROM SensorReadings TIMESTAMP BY time
Box 2: GROUP BY
Example: Generate an output event if the temperature is above 75 for a total of 5 seconds.
SELECT sensorId, MIN(temp) AS temp
FROM SensorReadings TIMESTAMP BY time
GROUP BY sensorId, SlidingWindow(second, 5)
HAVING MIN(temp) > 75
Box 3: SlidingWindow
Windowing is a core requirement for stream-processing applications to perform set-based operations, such as counts or aggregations, over events that arrive within a specified period of time. ASA supports three types of windows: Tumbling, Hopping, and Sliding.
With a sliding window, the system is asked to logically consider all possible windows of a given length and to output events whenever the content of the window actually changes, that is, whenever an event enters or exits the window.
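Putting the three boxes together, a trending-topics query in this style might look like the following sketch. The input name, column names, and the mention threshold are illustrative and are not part of the scenario:
SELECT Topic, Country, COUNT(*) AS Mentions
FROM TweetStream TIMESTAMP BY CreatedAt
GROUP BY Topic, Country, SlidingWindow(minute, 15)
HAVING COUNT(*) > 1000
The SlidingWindow(minute, 15) clause keeps the count scoped to the 15-minute time frame described in the scenario.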

    NEW QUESTION 4
    Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
    After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
Your company has multiple databases that contain millions of sales transactions. You plan to implement a data mining solution to identify purchasing fraud.
You need to design a solution that mines 10 terabytes (TB) of sales data. The solution must meet the following requirements:
• Run the analysis to identify fraud once per week.
• Continue to receive new sales transactions while the analysis runs.
• Be able to stop computing services when the analysis is NOT running.
Solution: You create a Microsoft Azure HDInsight cluster.
Does this meet the goal?

    • A. Yes
    • B. No

    Answer: B

    Explanation: HDInsight cluster billing starts once a cluster is created and stops when the cluster is deleted. Billing is pro-rated per minute, so you should always delete your cluster when it is no longer in use.

    NEW QUESTION 5
    You are developing a solution to ingest data in real-time from manufacturing sensors. The data will be archived. The archived data might be monitored after it is written.
    You need to recommend a solution to ingest and archive the sensor data. The solution must allow alerts to be sent to specific users as the data is ingested.
    What should you include in the recommendation?

    • A. a Microsoft Azure notification hub and an Azure function
• B. a Microsoft Azure notification hub and an Azure logic app
    • C. a Microsoft Azure Stream Analytics job that outputs data to an Apache Storm cluster in AzureHDInsight
    • D. a Microsoft Azure Stream Analytics job that outputs data to Azure Cosmos DB

    Answer: C

    NEW QUESTION 6
You have a Microsoft Azure Data Factory that loads data to an analytics solution. You receive an alert that an error occurred during the last processing of a data stream. You debug the problem and resolve the error.
    You need to process the data stream that caused the error. What should you do?

    • A. From Azure Cloud Shell, run the az dla job command.
    • B. From Azure Cloud Shell, run the az batch job enable command.
    • C. From PowerShell, run the Resume-AzureRmDataFactoryPipeline cmdlet.
    • D. From PowerShell, run the Set-AzureRmDataFactorySliceStatus cmdlet.

    Answer: D

    Explanation: ADF operates on data in batches known as slices. Slices are obtained by querying data over a date-time window—for example, a slice may contain data for a specific hour, day, or week.
    References:
    https://blogs.msdn.microsoft.com/bigdatasupport/2021/08/31/rerunning-many-slices-and-activities-in-azure-data

    NEW QUESTION 7
    Your company supports multiple Microsoft Azure subscriptions.
    You plan to deploy several virtual machines to support the services in Azure.
    You need to automate the management of all the subscriptions. The solution must minimize administrative effort.
    Which two cmdlets should you run? Each correct answer presents part of the solution.
    NOTE: Each correct selection is worth one point.

    • A. Clear-AzureProfile
    • B. Add-AzureSubscription
    • C. Add-AzureRMAccount
    • D. Import-AzurePublishSettingsFile
    • E. Get-AzurePublishSettingsFile

    Answer: DE

    NEW QUESTION 8
    You plan to implement a Microsoft Azure Data Factory pipeline. The pipeline will have custom business logic that requires a custom processing step.
    You need to implement the custom processing step by using C#.
    Which interface and method should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
[Exhibit]

      Answer:

      Explanation: References:
      https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/data-factory/v1/data-factory-use-custom-activ

      NEW QUESTION 9
      You need to implement rls_table1.
      Which code should you execute? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
      NOTE: Each correct selection is worth one point.
[Exhibit]

        Answer:

Explanation: Box 1: Security Policy
Example: After we have created the predicate function, we have to bind it to the table by using a security policy. We use the CREATE SECURITY POLICY command to put the security policy in place.
CREATE SECURITY POLICY DepartmentSecurityPolicy
ADD FILTER PREDICATE dbo.DepartmentPredicateFunction(UserDepartment) ON dbo.Department
WITH (STATE = ON)
        Box 2: Filter
        [ FILTER | BLOCK ]
        The type of security predicate for the function being bound to the target table. FILTER predicates silently filter the rows that are available to read operations. BLOCK predicates explicitly block write operations that violate the predicate function.
        Box 3: Block
        Box 4: Block
        Box 5: Filter
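For context, a filter predicate such as dbo.DepartmentPredicateFunction in the example above is an inline table-valued function that returns a row when access is allowed. A minimal sketch, with the access logic purely illustrative:
-- Returns a row (and therefore allows access) only when the row's department matches the current user name (illustrative logic).
CREATE FUNCTION dbo.DepartmentPredicateFunction(@UserDepartment AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS AccessResult
WHERE @UserDepartment = USER_NAME();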

        NEW QUESTION 10
        Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
        After you answer a question in this section, you will NOT be able to return to it. As a result, these questions
        will not appear in the review screen.
        You have a Microsoft Azure subscription that includes Azure Data Lake and Cognitive Services. An administrator plans to deploy an Azure Data Factory.
You need to ensure that the administrator can create the data factory.
Solution: You add the user to the Owner role.
        Does this meet the goal?

        • A. Yes
        • B. No

        Answer: B

        NEW QUESTION 11
        Your company has a Microsoft Azure environment that contains an Azure HDInsight Hadoop cluster and an Azure SQL data warehouse. The Hadoop cluster contains text files that are formatted by using UTF-8 character encoding.
        You need to implement a solution to ingest the data to the SQL data warehouse from the Hadoop cluster. The solution must provide optimal read performance for the data after ingestion.
        Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit]

          Answer:

          Explanation: SQL Data Warehouse supports loading data from HDInsight via PolyBase. The process is the same as loading data from Azure Blob Storage - using PolyBase to connect to HDInsight to load data.
Use PolyBase and T-SQL. Summary of the loading process and recommendations:
Create statistics on newly loaded data. Azure SQL Data Warehouse does not yet support auto create or auto update statistics. In order to get the best performance from your queries, it's important to create statistics on all columns of all tables after the first load or after any substantial changes occur in the data.
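As a rough sketch of the PolyBase loading path, assuming an external data source for the Hadoop cluster's storage and a UTF-8 text file format have already been defined (all object names here are illustrative):
-- External table over the files exposed by the HDInsight cluster's storage.
CREATE EXTERNAL TABLE ext.SalesText
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH (LOCATION = '/data/sales/', DATA_SOURCE = HadoopStorage, FILE_FORMAT = Utf8TextFormat);
-- Load into an internal table with a clustered columnstore index for optimal read performance.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId), CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM ext.SalesText;
-- Create statistics on the newly loaded data.
CREATE STATISTICS st_SaleId ON dbo.Sales (SaleId);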

          NEW QUESTION 12
          You have a Microsoft Azure subscription that contains an Azure Data Factory pipeline. You have an RSS feed that is published on a public website.
          You need to configure the RSS feed as a data source for the pipeline. Which type of linked service should you use?

          • A. web
          • B. OData
          • C. Azure Search
          • D. Azure Data Lake Store

          Answer: A

          Explanation: Reference: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-web-table-connector

          NEW QUESTION 13
You plan to deploy a Microsoft Azure Data Factory pipeline to run an end-to-end data processing workflow. You need to recommend which Azure Data Factory features must be used to meet the following requirements:
Track the run status of the historical activity.
          Enable alerts and notifications on events and metrics.
          Monitor the creation, updating, and deletion of Azure resources.
          Which features should you recommend? To answer, drag the appropriate features to the correct requirements. Each feature may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
          NOTE: Each correct selection is worth one point.
[Exhibit]

            Answer:

Explanation: Box 1: Azure HDInsight logs. Logs contain historical activities.
Box 2: Azure Data Factory alerts
Box 3: Azure Data Factory events

            NEW QUESTION 14
You are designing an Internet of Things (IoT) solution intended to identify trends. The solution requires real-time analysis of data originating from sensors. The results of the analysis will be stored in a SQL database.
            You need to recommend a data processing solution that uses the Transact-SQL language. Which data processing solution should you recommend?

            • A. Microsoft Azure Stream Analytics
            • B. Microsoft SQL Server Integration Services (SSIS)
            • C. Microsoft Azure Machine Learning
            • D. Microsoft Azure HDInsight Hadoop clusters

            Answer: A
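For context, an Azure Stream Analytics job expresses its processing logic in a T-SQL-like query that reads from a streaming input and writes results to a SQL database output. A minimal sketch, with the input and output alias names purely illustrative:
SELECT SensorId, AVG(Temperature) AS AvgTemperature
INTO SqlDatabaseOutput
FROM IoTHubInput TIMESTAMP BY EventTime
GROUP BY SensorId, TumblingWindow(minute, 1)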

            NEW QUESTION 15
You plan to analyze the execution logs of a pipeline to identify failures by using Microsoft Power BI. You need to automate the collection of monitoring data for the planned analysis.
            What should you do from Microsoft Azure?

            • A. Create a Data Factory Set
            • B. Save a Data Factory Log
            • C. Add a Log Profile
            • D. Create an Alert Rule Email

            Answer: A

            Explanation: You can import the results of a Log Analytics log search into a Power BI dataset so you can take advantage of its features such as combining data from different sources and sharing reports on the web and mobile devices.
            To import data from a Log Analytics workspace into Power BI, you create a dataset in Power BI based on a log search query in Log Analytics. The query is run each time the dataset is refreshed. You can then build Power BI reports that use data from the dataset.
            References: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/powerbi

            NEW QUESTION 16
            You are building a streaming data analysis solution that will process approximately 1 TB of data weekly. You plan to use Microsoft Azure Stream Analytics to create alerts on real-time data. The data must be preserved for deeper analysis at a later date.
            You need to recommend a storage solution for the alert data. The solution must meet the following requirements:
• Support scaling up without any downtime.
• Minimize data storage costs.
            What should you recommend using to store the data?

            • A. Azure Data Lake
            • B. Azure SQL Database
            • C. Azure SQL Data Warehouse
            • D. Apache Kafka

            Answer: A

            NEW QUESTION 17
            You are designing a solution for an Internet of Things (IoT) project.
You need to recommend a data storage solution for the project. The solution must meet the following requirements:
• Allow data to be queried in real time as it streams into the solution.
• Provide the lowest amount of latency for loading data into the solution.
What should you include in the recommendation?

            • A. a Microsoft Azure SQL database that has In-Memory OLTP enabled
            • B. a Microsoft Azure HDInsight Hadoop cluster
            • C. a Microsoft Azure HDInsight R Server cluster
            • D. a Microsoft Azure Table Storage solution

            Answer: A

            Explanation: References:
            https://azure.microsoft.com/en-gb/blog/in-memory-oltp-in-azure-sql-database/
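For context, In-Memory OLTP keeps a table's rows in memory so that inserts complete with very low latency while the data remains queryable as it streams in. A minimal sketch, with the table and column names purely hypothetical:
-- Memory-optimized, fully durable table for low-latency ingestion (illustrative schema).
CREATE TABLE dbo.SensorReadings
(
    ReadingId BIGINT IDENTITY NOT NULL PRIMARY KEY NONCLUSTERED,
    SensorId INT NOT NULL,
    Temperature FLOAT NOT NULL,
    ReadingTime DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);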

            NEW QUESTION 18
            You are developing an Apache Storm application by using Microsoft Visual Studio. You need to implement a custom topology that uses a custom bolt. Which type of object should you initialize in the main class?

            • A. Stream
            • B. TopologyBuilder
• C. StreamInfo
            • D. Logger

            Answer: A

            100% Valid and Newest Version 70-475 Questions & Answers shared by Certleader, Get Full Dumps HERE: https://www.certleader.com/70-475-dumps.html (New 102 Q&As)