Seven Steps to Effective Database Automation

For companies developing applications, one step is often the most painful: managing database changes. Nearly everything involved in creating and managing databases requires manual labor to keep database changes in sync with application changes, and managing and deploying database updates is by far the slowest and riskiest part of the application-release process. This is why an IT cottage industry, led by Quest Software and others, has been so successful in providing third-party database management tools that make life easier for DB admins. In this eWEEK slide show, Pete Pickerill, co-founder of Datical, discusses the pillars of effective database automation. These data points represent what is required to bring database change management practices in line with modern app development and IT operations practices.

Keep DBA and Devs on the Same Page at All Times

Changes in the database are often made by both the application developer and the database administrator (DBA). The application developer typically has a good understanding of why a change is being made and the impact it will have on the rest of the product. That is not the case for the DBA, the person actually handling the change. DBAs are simply informed that the change is necessary and must hope they deploy everything they should and nothing they shouldn't.

STEP 1 in DB Change: Tightly Associate It With Dev Effort

The first step in packaging a database change is to tightly associate it with the development effort it supports. By tying changes to their features, DBAs can associate the activity with its driver more quickly and trace whether all of the appropriate schema components for an application change have been applied.
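One way to picture this association is a changelog in which every schema change carries the ID of the feature that drove it. This is a minimal, hypothetical sketch (the ticket IDs, table names and `SchemaChange` structure are illustrative, not Datical's actual model):

```python
from dataclasses import dataclass

@dataclass
class SchemaChange:
    change_id: str
    sql: str
    feature: str  # the ticket/story that motivated this change

def changes_for_feature(changes, feature):
    """Trace every schema change a given feature requires."""
    return [c for c in changes if c.feature == feature]

# A changelog where each entry is tied to its driving feature.
changelog = [
    SchemaChange("001", "ALTER TABLE users ADD COLUMN last_login TIMESTAMP", "APP-101"),
    SchemaChange("002", "CREATE INDEX idx_last_login ON users (last_login)", "APP-101"),
    SchemaChange("003", "CREATE TABLE audit_log (id INT, msg TEXT)", "APP-102"),
]
```

With the link recorded, a DBA can answer "did everything APP-101 needs get applied?" by filtering the changelog rather than reconstructing intent from scripts.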

STEP 2: Tie New Features to Release Vehicle

The second packaging step is tying features to the appropriate release vehicle. This lets DBAs better understand and participate in the application release process, especially as modifications to the established plan are made. When all changes are tightly associated with their feature and release, it becomes much easier to move those features in and out of releases without missing anything or grabbing too much.
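A release manifest makes the point concrete: if the manifest maps releases to features, moving a feature between releases automatically carries all of its schema changes with it. A hypothetical sketch (the feature IDs and version numbers are made up for illustration):

```python
# Each schema change is tagged with its feature; the manifest maps
# releases to the features they ship.
changelog = [
    {"id": "001", "feature": "APP-101"},
    {"id": "002", "feature": "APP-101"},
    {"id": "003", "feature": "APP-102"},
]
release_manifest = {"1.4.0": ["APP-101"], "1.5.0": ["APP-102"]}

def changes_for_release(changelog, manifest, release):
    """Everything a release needs: the changes of every feature it ships."""
    wanted = set(manifest.get(release, []))
    return [c["id"] for c in changelog if c["feature"] in wanted]

def move_feature(manifest, feature, src, dst):
    """Slipping a feature to a later release moves its changes with it."""
    manifest[src].remove(feature)
    manifest[dst].append(feature)
```

Deferring APP-101 to 1.5.0 is a one-line manifest edit; no change is forgotten in the old release or grabbed too early in the new one.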

STEP 3: Validate Automation All Around

Validation pairs automation with a human touch, speeding the process of change validation without sacrificing the safety inherent in manual reviews. Recovering from a bad database change can be difficult, if not impossible, so two levels of validation must occur to avoid making a bad change in the first place. Because databases are persistent and are often updated by hand, databases serving the same application can drift out of sync over time. To compensate for these possible differences, the first layer of validation should confirm that the database is in the appropriate state to successfully receive the changes.
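The first validation layer can be as simple as a precondition check before anything runs. A minimal sketch, assuming a hypothetical `schema_version` table (real tools typically compare checksums of previously applied changes, but the idea is the same):

```python
import sqlite3

def db_is_ready(conn, expected_version):
    """First validation layer: confirm the target database is in the
    state the pending changes were written against, before deploying."""
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] == expected_version

# Stand-in target database at schema version 3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schema_version (version INTEGER)")
conn.execute("INSERT INTO schema_version VALUES (3)")
```

If `db_is_ready` returns false, the database has drifted from the state the changes assume, and deployment halts before any damage is done.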

STEP 4: Always Stick With Best Practices

The second level of validating a change is making sure it adheres to best practices, organizational standards and compliance requirements. Automating this requires a fully customizable validation mechanism: DBAs must be able to configure the automation to find the same things they look for when reviewing a change. If a change does not violate any policies in the rule suite, it passes through the system unhindered. If it does violate a rule, the system stops it and alerts the DBA, who can then accept or reject the change, and the process starts again. By involving the DBA for manual review only when it's absolutely necessary, changes move through the system faster without sacrificing quality or security.
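The configurable rule suite can be sketched as a list of named checks run against each change's SQL; only changes that trip a rule are held for DBA review. The two rules below are hypothetical examples, not a prescribed standard:

```python
import re

# Hypothetical rule suite encoding what a DBA would look for by hand:
# each rule is a (name, predicate) pair over the change's SQL text.
RULES = [
    ("no-drop-table",
     lambda sql: re.search(r"\bDROP\s+TABLE\b", sql, re.I) is not None),
    ("delete-needs-where",
     lambda sql: re.search(r"\bDELETE\s+FROM\b(?!.*\bWHERE\b)", sql, re.I | re.S) is not None),
]

def violations(sql):
    """Names of every rule this change violates; empty means it passes
    through the system unhindered."""
    return [name for name, pred in RULES if pred(sql)]
```

Because the rules are just data, DBAs can extend or tighten the suite without touching the pipeline that runs it.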

STEP 5: Avoid Triaging Batches of Errors

The current practice of manually processing a batch of scripts and triaging the errors that arise until everything eventually succeeds must stop. The new deployment process should be simple and identical in every environment in the life cycle, ensuring the same validation and consistency that let other application components be released worry-free. This can be partially accomplished by employing the first two data points above to ensure the completeness and validity of the changes to be deployed. Once that's taken care of, deployment is simply one click.
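"Identical in every environment" can mean literally one deploy function, where only the connection differs between dev, test and production. A minimal sketch using an assumed `applied_changes` bookkeeping table (the table and change list are illustrative):

```python
import sqlite3

# An ordered list of changes; every environment receives the same list.
CHANGES = [
    ("001", "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002", "ALTER TABLE users ADD COLUMN email TEXT"),
]

def deploy(conn):
    """The single entry point for every environment: apply only the
    changes this database has not yet received, in order."""
    conn.execute("CREATE TABLE IF NOT EXISTS applied_changes (id TEXT PRIMARY KEY)")
    already = {row[0] for row in conn.execute("SELECT id FROM applied_changes")}
    newly = []
    for change_id, sql in CHANGES:
        if change_id not in already:
            conn.execute(sql)
            conn.execute("INSERT INTO applied_changes VALUES (?)", (change_id,))
            newly.append(change_id)
    return newly
```

Because `deploy` skips anything already applied, rerunning it is a no-op, which is exactly what makes one-click deployment safe to repeat in every environment.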

STEP 6: Tracking Must Be Automated

Database change tracking methods are largely manual, which introduces the possibility of human error and prevents teams from truly trusting the various artifacts they use to piece the puzzle together. Tracking must therefore happen automatically: data must be collected and stored whenever a managed environment is materially changed. Removing the reliance on human action to track these changes leads to tracking data that is high quality, consistent and complete.
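Automatic tracking falls out naturally if every materially-changing statement goes through one wrapper that writes the audit row itself, so no one has to remember to record anything. A hypothetical sketch (the `change_log` table and its columns are illustrative):

```python
import sqlite3
from datetime import datetime, timezone

def apply_tracked(conn, change_id, sql, deployed_by):
    """Execute a change and record who applied what, and when, in the
    same step -- tracking cannot be skipped or forgotten."""
    conn.execute(sql)
    conn.execute(
        "INSERT INTO change_log VALUES (?, ?, ?, ?)",
        (change_id, sql, deployed_by, datetime.now(timezone.utc).isoformat()),
    )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE change_log (id TEXT, sql TEXT, deployed_by TEXT, deployed_at TEXT)"
)
apply_tracked(conn, "001", "CREATE TABLE t (x INT)", "ci-pipeline")
```

Since the tracking row is written by the same code path that executes the change, the resulting audit trail is complete by construction rather than by discipline.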

STEP 7: Go and Do the Integration

The tool chains employed by application development groups are as varied as they are powerful: a wide variety of IDEs, source-code control solutions, infrastructure providers, build systems and deployment offerings. No matter which individual products make up a group's delivery chain, the integration between them must be seamless; your tools should integrate and collaborate as you do. When choosing a database change management solution for agile enablement, organizations should not be forced to limit their options for tools that handle other areas of the process. Information should flow freely between the tools, and the tools should be easily invoked by any orchestration platform, preserving the flexibility to implement or change any other link in the chain.
