This page was exported from Free valid test braindumps [ http://free.validbraindumps.com ] Export date: Sat Apr 5 1:25:20 2025 / +0000 GMT

Title: Online Questions - Valid Practice DP-300 Exam Dumps Test Questions [Q38-Q57]

100% Real DP-300 dumps - Brilliant DP-300 Exam Questions PDF

Q38. You have SQL Server 2019 on an Azure virtual machine that contains an SSISDB database. A recent failure causes the master database to be lost. You discover that all Microsoft SQL Server Integration Services (SSIS) packages fail to run on the virtual machine. Which four actions should you perform in sequence to resolve the issue? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Explanation
Step 1: Attach the SSISDB database.
Step 2: Turn on the TRUSTWORTHY property and the CLR property. If you are restoring the SSISDB database to a SQL Server instance where the SSISDB catalog was never created, enable the common language runtime (CLR).
Step 3: Open the master key for the SSISDB database. Restore the master key by this method if you have the original password that was used to create SSISDB:
open master key decryption by password = 'LS1Setup!' --'Password used when creating SSISDB'
Alter Master Key Add encryption by Service Master Key
Step 4: Encrypt a copy of the master key by using the service master key.
Reference: https://docs.microsoft.com/en-us/sql/integration-services/backup-restore-and-move-the-ssis-catalog

Q39. You are monitoring an Azure Stream Analytics job. You discover that the Backlogged Input Events metric is increasing slowly and is consistently non-zero. You need to ensure that the job can handle all the events. What should you do?
Remove any named consumer groups from the connection and use $default.
Change the compatibility level of the Stream Analytics job.
Create an additional output stream for the existing input stream.
Increase the number of streaming units (SUs).

Backlogged Input Events: the number of input events that are backlogged. A non-zero value for this metric implies that your job isn't able to keep up with the number of incoming events. If this value is slowly increasing or consistently non-zero, you should scale out your job by increasing the SUs.
Reference: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-monitoring

Q40. You have an Azure SQL database. You are reviewing a slow performing query as shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Explanation
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/performance/live-query-statistics?view=sql-server-ver1

Q41. You have an Azure SQL database. You are reviewing a slow performing query as shown in the following exhibit. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.
Explanation
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/performance/live-query-statistics?view=sql-server-ver1

Q42. You have an Azure subscription. You need to deploy an instance of SQL Server on Azure Virtual Machines. The solution must meet the following requirements:
* Custom performance configuration, such as IOPS, capacity, and throughput, must be supported.
* Costs must be minimized.
Which type of disk should you include in the solution?
Premium SSD v2
Premium SSD
Ultra SSD
Standard SSD

Q43.
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone. You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes a mapping data flow, and then inserts the data into the data warehouse.
Does this meet the goal?
Yes
No

Explanation
If you need to transform data in a way that is not supported by Data Factory, you can create a custom activity, not a mapping data flow, with your own data processing logic and use the activity in the pipeline. You can create a custom activity to run R scripts on your HDInsight cluster with R installed.
Reference: https://docs.microsoft.com/en-US/azure/data-factory/transform-data

Q44. You have two on-premises servers that run Windows Server 2019 and host a Microsoft SQL Server 2017 Always On availability group named AG1. AG1 contains a single database named DB1. You have an Azure subscription. The subscription contains a virtual machine named VM1 that runs Linux. You need to migrate DB1 to a SQL Server 2019 instance on VM1. The solution must minimize the downtime of DB1 during the migration. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
See the answer in the explanation below.
Explanation
Answer as in the image below.

Q45.
You have SQL Server on an Azure virtual machine that contains a database named DB1. DB1 contains a table named CustomerPII. You need to record whenever users query the CustomerPII table. Which two options should you enable? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
server audit specification
SQL Server audit
database audit specification
a server principal

An auditing policy can be defined for a specific database or as a default server policy in Azure (which hosts SQL Database or Azure Synapse):
A server policy applies to all existing and newly created databases on the server.
If server auditing is enabled, it always applies to the database. The database will be audited, regardless of the database auditing settings.
Enabling auditing on the database, in addition to enabling it on the server, does not override or change any of the settings of the server auditing. Both audits will exist side by side.
Note: The Server Audit Specification object belongs to an audit. A Database Audit Specification defines which Audit Action Groups will be audited for the specific database in which the specification is created.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview

Q46. You are creating a managed data warehouse solution on Microsoft Azure. You must use PolyBase to retrieve data from Azure Blob storage that resides in Parquet format and load the data into a large table called FactSalesOrderDetails. You need to configure Azure Synapse Analytics to receive the data. Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Explanation
To query the data in your Hadoop data source, you must define an external table to use in Transact-SQL queries.
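A minimal T-SQL sketch of this configuration follows; the object names, storage account, container, and credential secret are all illustrative placeholders, not values from the question.

```sql
-- Sketch only: replace placeholder names and secrets with real values.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword1!>';

CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-access-key>';

CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://<container>@<storageaccount>.blob.core.windows.net',
      CREDENTIAL = BlobCredential);

CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

-- External table pointing at the Parquet files; the column list is assumed.
CREATE EXTERNAL TABLE ext.FactSalesOrderDetails
(
    SalesOrderID int,
    ProductID    int,
    OrderQty     int,
    LineTotal    money
)
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureBlobStore,
      FILE_FORMAT = ParquetFormat);
```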
The following steps describe how to configure the external table.
Step 1: Create a master key on the database. The master key is required to encrypt the credential secret. (Create a database scoped credential for Azure Blob storage.)
Step 2: Create an external data source for Azure Blob storage with CREATE EXTERNAL DATA SOURCE.
Step 3: Create an external file format to map the Parquet files with CREATE EXTERNAL FILE FORMAT.
Step 4: Create an external table FactSalesOrderDetails pointing to data stored in Azure storage with CREATE EXTERNAL TABLE.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/polybase/polybase-configure-azure-blob-storage

Q47. You have an Azure SQL database. You run the following PowerShell script. For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.
Explanation
Reference: https://docs.microsoft.com/en-us/powershell/module/az.sql/set-azsqldatabasebackupshorttermretentionpolicy?vie
https://docs.microsoft.com/en-us/powershell/module/az.sql/set-azsqldatabasebackuplongtermretentionpolicy?vie

Q48. You have an Azure virtual machine named VM1 on a virtual network named VNet1. Outbound traffic from VM1 to the internet is blocked. You have an Azure SQL database named SqlDb1 on a logical server named SqlSrv1. You need to implement connectivity between VM1 and SqlDb1 to meet the following requirements:
* Ensure that VM1 cannot connect to any Azure SQL Server other than SqlSrv1.
* Restrict network connectivity to SqlSrv1.
What should you create on VNet1?
a VPN gateway
a service endpoint
a private link
an ExpressRoute gateway

Explanation
Azure Private Link enables you to access Azure PaaS services (for example, Azure Storage and SQL Database) and Azure hosted customer-owned/partner services over a private endpoint in your virtual network. Traffic between your virtual network and the service travels the Microsoft backbone network. Exposing your service to the public internet is no longer necessary.
Reference: https://docs.microsoft.com/en-us/azure/private-link/private-link-overview

Q49. You have an on-premises Microsoft SQL Server 2016 server named Server1 that contains a database named DB1. You need to perform an online migration of DB1 to an Azure SQL Database managed instance by using Azure Database Migration Service. How should you configure the backup of DB1? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Reference: https://docs.microsoft.com/en-us/azure/dms/known-issues-azure-sql-db-managed-instance-online

Q50. You are designing an enterprise data warehouse in Azure Synapse Analytics that will store website traffic analytics in a star schema. You plan to have a fact table for website visits. The table will be approximately 5 GB. You need to recommend which distribution type and index type to use for the table. The solution must provide the fastest query performance. What should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
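For reference, a hash-distributed fact table with a clustered columnstore index is declared in dedicated SQL pool T-SQL as follows; the table schema and the choice of distribution column are illustrative assumptions, not part of the question.

```sql
-- Sketch only: a star-schema fact table for website visits.
CREATE TABLE dbo.FactWebsiteVisits
(
    VisitID   bigint NOT NULL,
    VisitorID int    NOT NULL,
    VisitDate date   NOT NULL,
    PageViews int    NOT NULL
)
WITH
(
    -- Hash distribution spreads rows across distributions by VisitorID.
    DISTRIBUTION = HASH(VisitorID),
    -- Clustered columnstore gives the best compression and scan performance.
    CLUSTERED COLUMNSTORE INDEX
);
```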
Explanation
Box 1: Hash
Consider using a hash-distributed table when:
The table size on disk is more than 2 GB.
The table has frequent insert, update, and delete operations.
Box 2: Clustered columnstore
Clustered columnstore tables offer both the highest level of data compression and the best overall query performance.
Reference: https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-distribu
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-index

Q51. You have an Azure subscription. You plan to deploy an Azure SQL database by using an Azure Resource Manager template. How should you complete the template? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/single-database-create-arm-template-quickstart

Q52. You have an Azure SQL database named sqldb1. You need to minimize the amount of space used by the data and log files of sqldb1. What should you run?
DBCC SHRINKDATABASE
sp_clean_db_free_space
sp_clean_db_file_free_space
DBCC SHRINKFILE

DBCC SHRINKDATABASE shrinks the size of the data and log files in the specified database.
Reference: https://docs.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql

Q53. Which audit log destination should you use to meet the monitoring requirements?
Azure Storage
Azure Event Hubs
Azure Log Analytics

Explanation
Scenario: Use a single dashboard to review security and audit data for all the PaaS databases. With dashboards, you can bring together the operational data that is most important to IT across all your Azure resources, including telemetry from Azure Log Analytics.
Note: Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.
Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/visualize/tutorial-logs-dashboards

Question Set 3

Q54. You have an Azure subscription that contains the resources shown in the following table. You need to create a read-only replica of DB1 and configure the App1 instances to use the replica. What should you do? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Explanation
Reference: https://sqlserverguides.com/read-only-replica-azure-sql/

Q55.
Case Study 5 – ADatum Corporation

Overview
ADatum Corporation is a financial services company that has a main office in New York City.

Existing Environment

Licensing Agreement
ADatum has a Microsoft Volume Licensing agreement that includes Software Assurance.

Network Infrastructure
ADatum has an on-premises datacenter and an Azure subscription named Sub1. Sub1 contains a virtual network named Network1 in the East US Azure region. The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.

Identity Environment
The on-premises network contains an Active Directory Domain Services (AD DS) forest. The forest contains a single domain named corp.adatum.com. The corp.adatum.com domain syncs with a Microsoft Entra tenant named adatum.com.

Database Environment
The datacenter contains the servers shown in the following table. DB1 and DB2 are used for transactional and analytical workloads by an application named App1. App1 runs on Microsoft Entra hybrid joined servers that run Windows Server 2022. App1 uses Kerberos authentication. DB3 stores compliance data used by two applications named App2 and App3. DB3 performance is monitored by using Extended Events sessions, with the event_file target set to a file share on a local disk of SVR3. Resource allocation for DB3 is managed by using Resource Governor.

Requirements

Planned Changes
ADatum plans to implement the following changes:
– Deploy an Azure SQL managed instance named Instance1 to Network1.
– Migrate DB1 and DB2 to Instance1.
– Migrate DB3 to Azure SQL Database.
– Following the migration of DB1 and DB2, hand over database development to remote developers who use Microsoft Entra joined Windows 11 devices.
– Following the migration of DB3, configure the database to be part of an auto-failover group.

Availability Requirements
ADatum identifies the following post-migration availability requirements:
– For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.
– Ensure that if a regional disaster
occurs, DB1 and DB2 can be recovered from backups.
– After the migration, App1 must maintain access to DB1 and DB2.
– For DB3, manage potential performance issues caused by resource demand changes by App2 and App3.
– Ensure that DB3 will still be accessible following a planned failover.
– Ensure that DB3 can be restored if the logical server is deleted.
– Minimize downtime during the migration of DB1 and DB2.

Security Requirements
ADatum identifies the following security requirements for after the migration:
– Ensure that only designated developers who use Microsoft Entra joined Windows 11 devices can access DB1 and DB2 remotely.
– Ensure that all changes to DB3, including ones within individual transactions, are audited and recorded.

Management Requirements
ADatum identifies the following post-migration management requirements:
– Continue using Extended Events to monitor DB3.
– In Azure SQL Database, automate the management of DB3 by using elastic jobs that have database-scoped credentials.

Business Requirements
ADatum identifies the following business requirements:
– Minimize costs whenever possible, without affecting other requirements.
– Minimize administrative effort.

You need to recommend which configuration to perform twice to enable access to the primary and secondary replicas of DB3. The solution must meet the availability requirements. What should you recommend?
Enable database firewall rules.
Create database-scoped credentials.
Configure connection strings that reference the read-write listener.
Configure virtual network service endpoints.

Q56. You have an Azure SQL database named DB1 that contains two tables named Table1 and Table2. Both tables contain a column named Column1. Column1 is used for joins by an application named App1. You need to protect the contents of Column1 at rest, in transit, and in use. How should you protect the contents of Column1? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
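For context, joining on a column that is encrypted at rest, in transit, and in use requires Always Encrypted with deterministic encryption. A minimal T-SQL sketch follows; the key vault path and the ENCRYPTED_VALUE bytes are placeholders that tooling such as the SSMS Always Encrypted wizard would normally generate.

```sql
-- Sketch only: key paths and the encrypted key value are placeholders.
CREATE COLUMN MASTER KEY CMK1
WITH (KEY_STORE_PROVIDER_NAME = 'AZURE_KEY_VAULT',
      KEY_PATH = 'https://<vault>.vault.azure.net/keys/<key>/<version>');

CREATE COLUMN ENCRYPTION KEY CEK1
WITH VALUES (COLUMN_MASTER_KEY = CMK1,
             ALGORITHM = 'RSA_OAEP',
             ENCRYPTED_VALUE = 0x01AB /* placeholder bytes */);

-- Deterministic encryption is required so Column1 can still be used in joins.
CREATE TABLE dbo.Table1
(
    Column1 int ENCRYPTED WITH
        (COLUMN_ENCRYPTION_KEY = CEK1,
         ENCRYPTION_TYPE = DETERMINISTIC,
         ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
);
```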
Explanation:
Box 1: Column encryption key
Always Encrypted uses two types of keys: column encryption keys and column master keys. A column encryption key is used to encrypt data in an encrypted column. A column master key is a key-protecting key that encrypts one or more column encryption keys.
Reference: https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine

Q57. You have an Azure SQL Database managed instance named sqldbmi1 that contains a database named Sales. You need to initiate a backup of Sales. How should you complete the Transact-SQL statement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Reference: https://techcommunity.microsoft.com/t5/azure-sql-database/native-database-backup-in-azure-sql-managed-instance/ba-p/386154

DP-300 Exam PDF [2024] Tests Free Updated Today with Correct 373 Questions: https://www.validbraindumps.com/DP-300-exam-prep.html

Post date: 2024-06-25 14:11:54