Improved Performance Monitoring Comes to Azure SQL Data Warehouse

Microsoft adds Azure Monitor support, enabling customers to keep a closer eye on the big data service.

Azure SQL Data Warehouse, Microsoft's elastic, cloud-based data warehousing offering for big data workloads, is now offering its customers more detailed performance insights.

The service now supports Azure Monitor, Kevin Ngo, a program manager for SQL Engineering at Microsoft, revealed in an April 12 announcement. As its name suggests, Azure Monitor is a performance monitoring tool that lets customers track resource utilization and gauge the health of their cloud resources.

The integration "not only enables you to monitor your data warehouse within the Azure portal, but its tight integration between Azure services also enables you to monitor your entire data analytics solution within a single interface," Ngo added.

Providing up-to-the-minute metrics, Azure Monitor allows users to view CPU consumption and IO (input/output) in near real time, revealing potential bottlenecks that affect performance. Users can also track DWU (data warehouse unit) utilization statistics, including the number of DWUs used for a given workload, or analyze historical data. By default, Azure Monitor retains data warehouse metrics for 90 days.
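Because these metrics are exposed through Azure Monitor, they can also be pulled from the command line. The sketch below, assuming a recent Azure CLI with the `az monitor metrics list` command, shows how CPU and DWU consumption metrics for a data warehouse might be retrieved; the subscription, resource group, server, and database names are placeholders.

```shell
# Placeholder resource ID for an Azure SQL Data Warehouse database.
RESOURCE_ID="/subscriptions/<sub-id>/resourceGroups/myRG/providers/Microsoft.Sql/servers/myserver/databases/mydw"

# List recent CPU utilization at 5-minute granularity.
az monitor metrics list \
  --resource "$RESOURCE_ID" \
  --metric "cpu_percent" \
  --interval PT5M \
  --output table

# DWU consumption can be queried the same way with the "dwu_used" metric.
```

The same metrics back the charts shown in the Azure portal, so the CLI output should match what the portal graphs display for the chosen time window.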

Azure SQL Data Warehouse also features tighter integration with Azure Analysis Services, a cloud-based analytics engine. Using the Azure Portal management hub, users can simply click on the Model and Cache Data button in the Task view to build semantic models based on information stored in the service.

Finally, Microsoft rolled out a new feature that can help organizations trim their cloud costs during times of low activity.

Azure SQL Data Warehouse includes a pause feature that shuts down the service's compute functionality. Because pausing can interrupt end-user applications, the Azure Portal will display an alert if it detects active queries before the service is paused, Ngo explained.
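Pausing and resuming can also be scripted. A minimal sketch using the Azure CLI's `az sql dw pause` and `az sql dw resume` commands follows; the resource group, server, and database names are placeholders.

```shell
# Pause compute for a data warehouse to stop compute billing
# during periods of low activity. Check for active queries first,
# since pausing can interrupt end-user applications.
az sql dw pause \
  --resource-group myRG \
  --server myserver \
  --name mydw

# Resume compute when the workload picks back up.
az sql dw resume \
  --resource-group myRG \
  --server myserver \
  --name mydw
```

Storage charges continue while the service is paused; only the compute portion of the bill stops.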

Database Backup Flexibility, New Purchasing Model

Microsoft has steadily been bulking up its cloud data platform ecosystem, including some recent changes to Azure SQL Database's long-term backups.

The service's long-term backup retention allows customers to keep database backups for up to 10 years. Instead of requiring users to deploy and manage a Backup Service Vault for this purpose, SQL Database now uses Azure Blob storage, Microsoft's object storage solution.

"This new design will enable flexibility for your backup strategy, and overall more control over costs," noted Alexander Nosov, principal program manager of Azure SQL Database at Microsoft, in a blog. The switch to Blob Storage will allow organizations in all Azure regions to access the service's long-term retention and data protection capabilities and enable users to configure policies that support weekly, monthly, yearly and "week-within-a-year backups," he added.
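As one illustration of the weekly/monthly/yearly policy options Nosov describes, the sketch below uses the Azure CLI's `az sql db ltr-policy set` command, which reflects later CLI tooling rather than the exact mechanism available at the time of the announcement; all names are placeholders.

```shell
# Configure a long-term retention policy for an Azure SQL Database:
# keep weekly backups 4 weeks, monthly backups 12 months, and the
# backup from week 1 of each year for 10 years.
az sql db ltr-policy set \
  --resource-group myRG \
  --server myserver \
  --name mydb \
  --weekly-retention P4W \
  --monthly-retention P12M \
  --yearly-retention P10Y \
  --week-of-year 1
```

The retention values use ISO 8601 duration notation (P4W is four weeks, P10Y is ten years).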

Finally, Microsoft is giving customers a new way to configure and pay for their Azure SQL Databases. Organizations can use the company's new vCore-based purchasing model for their Azure SQL Database elastic pools or single database deployments.

Under the vCore model, users can choose between two service tiers, General Purpose and Business Critical, both of which allow organizations to independently configure the underlying compute and storage resources to strike a suitable balance between price and performance. Microsoft will continue to support the existing DTU (Database Transaction Unit) model and its preconfigured bundles.
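To make the compute/storage decoupling concrete, the sketch below provisions a single database under the vCore model with the Azure CLI's `az sql db create` command; the tier, hardware generation, and vCore count shown are illustrative choices, and the resource names are placeholders.

```shell
# Create a single database under the vCore purchasing model:
# General Purpose tier, Gen5 hardware, 2 vCores.
az sql db create \
  --resource-group myRG \
  --server myserver \
  --name mydb \
  --edition GeneralPurpose \
  --family Gen5 \
  --capacity 2
```

Here `--capacity` sets the number of vCores independently of storage, which is the key difference from the DTU model's preconfigured compute-plus-storage bundles.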

Pedro Hernandez

Pedro Hernandez is a contributor to eWEEK and the IT Business Edge Network, the network for technology professionals. Previously, he served as a managing editor for the Internet.com network of...