This page was exported from Actual Test Materials [ http://blog.actualtests4sure.com ]
Export date: Fri Nov 15 18:25:27 2024 / +0000 GMT

Title: 2023 Realistic DP-300 Dumps are Available for Instant Access [Q116-Q134]

Download Exam DP-300 Practice Test Questions with 100% Verified Answers

QUESTION 116
You plan to create a table in an Azure Synapse Analytics dedicated SQL pool. Data in the table will be retained for five years. Once a year, data that is older than five years will be deleted.
You need to ensure that the data is distributed evenly across partitions. The solution must minimize the amount of time required to delete old data.
How should you complete the Transact-SQL statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: HASH
Box 2: OrderDateKey
In most cases, table partitions are created on a date column. A way to eliminate rollbacks is to use metadata-only operations, such as partition switching, for data management. For example, rather than executing a DELETE statement to delete all rows in a table where the order_date was in October of 2001, you can partition your data early.
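The pattern the explanation describes can be sketched in T-SQL for a dedicated SQL pool. The table name, column names, and boundary values below are illustrative assumptions, not the exam's exact statement:

```sql
-- Sketch: hash-distributed, date-partitioned fact table (names are illustrative).
CREATE TABLE dbo.FactOrders
(
    OrderKey     bigint NOT NULL,
    OrderDateKey int    NOT NULL,   -- e.g. 20011031
    Amount       decimal(18, 2)
)
WITH
(
    DISTRIBUTION = HASH (OrderKey),       -- spread rows evenly across distributions
    CLUSTERED COLUMNSTORE INDEX,
    PARTITION ( OrderDateKey RANGE RIGHT FOR VALUES
        (20200101, 20210101, 20220101, 20230101, 20240101) )  -- one boundary per year
);

-- Later, remove the oldest year with a metadata-only operation instead of DELETE.
-- dbo.FactOrders_Archive must have an identical schema and partition scheme.
ALTER TABLE dbo.FactOrders SWITCH PARTITION 1
    TO dbo.FactOrders_Archive PARTITION 1;
TRUNCATE TABLE dbo.FactOrders_Archive;
```

Because SWITCH only updates metadata rather than moving rows, dropping a year of data this way completes in seconds regardless of row count.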
Then you can switch out the partition that contains the old data for an empty partition from another table.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/best-practices-dedicated-sql-pool

QUESTION 117
You deploy a database to an Azure SQL Database managed instance.
You need to prevent read queries from blocking queries that are trying to write to the database.
Which database option should you set?

A. PARAMETERIZATION to FORCED
B. PARAMETERIZATION to SIMPLE
C. Delayed Durability to Forced
D. READ_COMMITTED_SNAPSHOT to ON

Explanation:
In SQL Server, you can minimize locking contention while protecting transactions from dirty reads of uncommitted data modifications by using either:
* The READ COMMITTED isolation level with the READ_COMMITTED_SNAPSHOT database option set to ON.
* The SNAPSHOT isolation level.
If READ_COMMITTED_SNAPSHOT is set to ON (the default in Azure SQL Database), the Database Engine uses row versioning to present each statement with a transactionally consistent snapshot of the data as it existed at the start of the statement. Locks are not used to protect the data from updates by other transactions.
Incorrect Answers:
A: You can specify that all queries in a database be parameterized by setting the PARAMETERIZATION database option to FORCED. This process is referred to as forced parameterization.
B: When the PARAMETERIZATION database option is set to SIMPLE, the SQL Server query optimizer may choose to parameterize the queries. This means that any literal values contained in a query are substituted with parameters. This process is referred to as simple parameterization. When SIMPLE parameterization is in effect, you cannot control which queries are parameterized and which are not.
C: Delayed transaction durability is accomplished using asynchronous log writes to disk.
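A minimal sketch of enabling the setting (the database name is illustrative; in Azure SQL Database the option is already ON by default, so this is mainly relevant to managed instances and on-premises SQL Server):

```sql
-- Enable row versioning for READ COMMITTED so readers do not block writers.
-- ROLLBACK IMMEDIATE terminates other sessions that would otherwise block the change.
ALTER DATABASE DB1 SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;

-- Verify the setting:
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'DB1';
```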
Transaction log records are kept in a buffer and written to disk when the buffer fills or a buffer flushing event takes place. Delayed transaction durability reduces both latency and contention within the system. Some of the cases in which you could benefit from using delayed transaction durability are:
* You can tolerate some data loss.
* You are experiencing a bottleneck on transaction log writes.
* Your workloads have a high contention rate.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/set-transaction-isolation-level-transact-sql

QUESTION 118
You have an Azure SQL database that contains a table named Employees. Employees contains a column named Salary.
You need to encrypt the Salary column. The solution must prevent database administrators from reading the data in the Salary column and must provide the most secure encryption.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Explanation:
Step 1: Create a column master key.
Create a column master key metadata entry before you create a column encryption key metadata entry in the database and before any column in the database can be encrypted using Always Encrypted.
Step 2: Create a column encryption key.
Step 3: Encrypt the Salary column by using the randomized encryption type.
Randomized encryption uses a method that encrypts data in a less predictable manner. Randomized encryption is more secure, but it prevents searching, grouping, indexing, and joining on encrypted columns.
Note: A column encryption key metadata object contains one or two encrypted values of a column encryption key that is used to encrypt data in a column. Each value is encrypted using a column master key.
Incorrect Answers:
Deterministic encryption: Deterministic encryption always generates the same encrypted value for any given plain text value.
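The three steps above can be sketched in T-SQL. The key names, key vault path, and encrypted value below are placeholders; in practice, tools such as SSMS or the SqlServer PowerShell module generate the key material, and step 3 is normally performed by the SSMS Always Encrypted wizard (in-place encryption via ALTER COLUMN requires secure enclaves):

```sql
-- Step 1: column master key metadata. The key itself lives outside the
-- database (here, Azure Key Vault); KEY_PATH is a placeholder.
CREATE COLUMN MASTER KEY CMK1
WITH (
    KEY_STORE_PROVIDER_NAME = 'AZURE_KEY_VAULT',
    KEY_PATH = 'https://myvault.vault.azure.net/keys/CMK1'
);

-- Step 2: column encryption key, wrapped by the master key.
-- ENCRYPTED_VALUE is a truncated placeholder for tool-generated ciphertext.
CREATE COLUMN ENCRYPTION KEY CEK1
WITH VALUES (
    COLUMN_MASTER_KEY = CMK1,
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x01AB
);

-- Step 3: encrypt Salary with the randomized type (most secure).
-- Requires enclave-enabled keys for in-place T-SQL encryption;
-- otherwise the SSMS wizard rewrites the column client-side.
ALTER TABLE dbo.Employees
ALTER COLUMN Salary money
    ENCRYPTED WITH (
        COLUMN_ENCRYPTION_KEY = CEK1,
        ENCRYPTION_TYPE = RANDOMIZED,
        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
    );
```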
Using deterministic encryption allows point lookups, equality joins, grouping, and indexing on encrypted columns. However, it may also allow unauthorized users to guess information about encrypted values by examining patterns in the encrypted column, especially if there is a small set of possible encrypted values, such as True/False, or North/South/East/West region.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine

QUESTION 119
You have an Azure subscription that contains the resources shown in the following table.
You need to create a read-only replica of DB1 and configure the App1 instances to use the replica.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://sqlserverguides.com/read-only-replica-azure-sql/

QUESTION 120
You have an Azure subscription that contains the resources shown in the following table.
App1 experiences transient connection errors and timeouts when it attempts to access db1 after extended periods of inactivity.
You need to modify db1 to resolve the issues experienced by App1 as soon as possible, without considering immediate costs. What should you do?

A. Increase the number of vCores allocated to db1.
B. Decrease the auto-pause delay for db1.
C. Disable auto-pause delay for db1.
D. Enable automatic tuning for db1.

QUESTION 121
You have an Azure virtual machine based on a custom image named VM1. VM1 hosts an instance of Microsoft SQL Server 2019 Standard.
You need to automate the maintenance of VM1 to meet the following requirements:
* Automate the patching of SQL Server and Windows Server.
* Automate full database backups and transaction log backups of the databases on VM1.
* Minimize administrative effort.
What should you do first?
A. Enable a system-assigned managed identity for VM1
B. Register VM1 to the Microsoft.Sql resource provider
C. Install an Azure virtual machine Desired State Configuration (DSC) extension on VM1
D. Register VM1 to the Microsoft.SqlVirtualMachine resource provider

Explanation:
Automated Patching depends on the SQL Server infrastructure as a service (IaaS) Agent Extension. The SQL Server IaaS Agent Extension (SqlIaasExtension) runs on Azure virtual machines to automate administration tasks. The SQL Server IaaS extension is installed when you register your SQL Server VM with the SQL Server VM resource provider.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-server-iaas-agent-extension-automate-management

QUESTION 122
You have an Azure data factory that has two pipelines named PipelineA and PipelineB. PipelineA has four activities as shown in the following exhibit. PipelineB has two activities as shown in the following exhibit.
You create an alert for the data factory that uses Failed pipeline runs metrics for both pipelines and all failure types. The metric has the following settings:
* Operator: Greater than
* Aggregation type: Total
* Threshold value: 2
* Aggregation granularity (Period): 5 minutes
* Frequency of evaluation: Every 5 minutes
Data Factory monitoring records the failures shown in the following table.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: No. Just one failure within the 5-minute interval.
Box 2: No. Just two failures within the 5-minute interval.
Box 3: No. Just two failures within the 5-minute interval.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-metric-overview

QUESTION 123
You have an Azure SQL database that contains a table named factSales.
FactSales contains the columns shown in the following table. FactSales has 6 billion rows and is loaded nightly by using a batch process.
Which type of compression provides the greatest space reduction for the database?

A. page compression
B. row compression
C. columnstore compression
D. columnstore archival compression

Explanation:
Columnstore tables and indexes are always stored with columnstore compression. You can further reduce the size of columnstore data by configuring an additional compression called archival compression.
Note: Columnstore – The columnstore index is also logically organized as a table with rows and columns, but the data is physically stored in a column-wise data format.
Incorrect Answers:
B: Rowstore – The rowstore index is the traditional style that has been around since the initial release of SQL Server. For rowstore tables and indexes, use the data compression feature to help reduce the size of the database.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/data-compression/data-compression

QUESTION 124
You have an Always On availability group deployed to Azure virtual machines. The availability group contains a database named DB1 and has two nodes named SQL1 and SQL2.
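A sketch of enabling the archival compression answer from Question 123; the index name is an illustrative assumption:

```sql
-- Rebuild the columnstore index with archival compression for maximum
-- space savings (index name is illustrative).
ALTER INDEX cci_factSales ON dbo.factSales
REBUILD WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);

-- To revert to standard columnstore compression later:
-- ALTER INDEX cci_factSales ON dbo.factSales
-- REBUILD WITH (DATA_COMPRESSION = COLUMNSTORE);
```

Archival compression trades higher CPU cost on read for smaller storage, which suits a nightly-loaded, rarely updated fact table.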
SQL1 is the primary replica.
You need to initiate a full backup of DB1 on SQL2.
Which statement should you run?

A. BACKUP DATABASE DB1 TO URL='https://mystorageaccount.blob.core.windows.net/mycontainer/DB1.bak' WITH (DIFFERENTIAL, STATS=5, COMPRESSION);
B. BACKUP DATABASE DB1 TO URL='https://mystorageaccount.blob.core.windows.net/mycontainer/DB1.bak' WITH (COPY_ONLY, STATS=5, COMPRESSION);
C. BACKUP DATABASE DB1 TO URL='https://mystorageaccount.blob.core.windows.net/mycontainer/DB1.bak' WITH (FILE_SNAPSHOT, STATS=5, COMPRESSION);
D. BACKUP DATABASE DB1 TO URL='https://mystorageaccount.blob.core.windows.net/mycontainer/DB1.bak' WITH (NOINIT, STATS=5, COMPRESSION);

Explanation:
BACKUP DATABASE supports only copy-only full backups of databases, files, or filegroups when it is executed on secondary replicas. Copy-only backups don't impact the log chain or clear the differential bitmap.
Incorrect Answers:
A: Differential backups are not supported on secondary replicas. The software displays an error because secondary replicas support only copy-only database backups.
Reference:
https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/active-secondaries-backup-on-secondary-replicas-always-on-availability-groups

QUESTION 125
You need to recommend the appropriate purchasing model and deployment option for the 30 new databases. The solution must meet the technical requirements and the business requirements.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-pool-overview
https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview

QUESTION 126
DRAG DROP
You need to apply 20 built-in Azure Policy definitions to all new and existing Azure SQL Database deployments in an Azure subscription. The solution must minimize administrative effort.
Which three actions should you perform in sequence?
To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

Explanation:
Step 1: Create an Azure Policy initiative.
The first step in enforcing compliance with Azure Policy is to assign a policy definition. A policy definition defines under what condition a policy is enforced and what effect to take. With an initiative definition, you can group several policy definitions to achieve one overarching goal. An initiative evaluates resources within the scope of the assignment for compliance with the included policies.
Step 2: Create an Azure Policy initiative assignment.
Assign the initiative definition you created in the previous step.
Step 3: Run Azure Policy remediation tasks to apply the policy initiative to the existing SQL databases.
Reference:
https://docs.microsoft.com/en-us/azure/governance/policy/tutorials/create-and-manage

QUESTION 127
Which audit log destination should you use to meet the monitoring requirements?

A. Azure Storage
B. Azure Event Hubs
C. Azure Log Analytics

Explanation:
Scenario: Use a single dashboard to review security and audit data for all the PaaS databases.
With dashboards, you can bring together the operational data that is most important to IT across all your Azure resources, including telemetry from Azure Log Analytics.
Note: Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/visualize/tutorial-logs-dashboards

QUESTION 128
You need to implement statistics maintenance for SalesSQLDb1. The solution must meet the technical requirements.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
1 – Create an Azure Automation account.
2 – Import the SqlServer module.
3 – Create a runbook that runs a PowerShell script.
4 – Create and configure a schedule.
Reference:
https://techcommunity.microsoft.com/t5/azure-database-support-blog/automating-azure-sql-db-index-and-statistics-maintenance-using/ba-p/368974

QUESTION 129
Which windowing function should you use to perform the streaming aggregation of the sales data?

A. Sliding
B. Hopping
C. Session
D. Tumbling

Explanation:
Scenario: The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.
Tumbling window functions are used to segment a data stream into distinct time segments and perform a function against them. The key differentiators of a tumbling window are that it repeats, does not overlap, and an event cannot belong to more than one tumbling window.
Reference:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/stream-analytics/stream-analytics-window-functions.md

QUESTION 130
You have a Microsoft SQL Server database named DB1 that contains a table named Table1.
The database role membership for a user named User1 is shown in the following exhibit.
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Explanation:
Box 1: delete a row from Table1.
Members of the db_datawriter fixed database role can add, delete, or change data in all user tables.
Box 2: db_datareader.
Members of the db_datareader fixed database role can read all data from all user tables.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/security/authentication-access/database-level-roles

QUESTION 131
You have an Azure subscription that contains the resources shown in the following table.
You need to create a read-only replica of DB1 and configure the App1 instances to use the replica.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://sqlserverguides.com/read-only-replica-azure-sql/

QUESTION 132
You manage an enterprise data warehouse in Azure Synapse Analytics.
Users report slow performance when they run commonly used queries. Users do not report performance changes for infrequently used queries.
You need to monitor resource utilization to determine the source of the performance issues.
Which metric should you monitor?

A. Local tempdb percentage
B. DWU percentage
C. Data Warehouse Units (DWU) used
D. Cache hit percentage

Explanation:
Tempdb is used to hold intermediate results during query execution. High utilization of the tempdb database can lead to slow query performance.
Note: If you have a query that is consuming a large amount of memory, or you have received an error message related to the allocation of tempdb, it could be due to a very large CREATE TABLE AS SELECT (CTAS) or INSERT SELECT statement that is failing in the final data movement operation.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-manage-monitor

QUESTION 133
You have an Azure SQL database named DB1 in the General Purpose service tier.
You need to monitor DB1 by using SQL Insights.
What should you include in the solution?
To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

QUESTION 134
You are building an Azure Stream Analytics job to retrieve game data.
You need to ensure that the job returns the highest scoring record for each five-minute time interval of each game.
How should you complete the Stream Analytics query? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/stream-analytics-query/topone-azure-stream-analytics
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/stream-analytics/stream-analytics-window-functions.md

Positive Aspects of Valid DP-300 Exam Dumps: https://www.actualtests4sure.com/DP-300-test-questions.html

Post date: 2023-09-29 09:39:35