How to Achieve Greater Application Interoperability in Your Data Center

In today's data center, application interoperability matters more than ever. There are security, performance and reliability risks associated with poor application interoperability. Here, Knowledge Center contributor Rex Black explains how to reduce the risk of nasty surprises in your data center's production environment.


In one of my previous Knowledge Center articles, I discussed how to build quality applications by integrating multiple quality assurance activities into your process. This is important, but most of us don't have the luxury of thinking about only one application. In today's data center, it's not enough to know that each application works properly on its own.

More and more, everything in a data center talks to everything else, and the trend is for greater interoperability in the future. Before we put disparate applications together in our data centers, we need to know if they'll work together.

Of course, you can and should expand the scope of the quality assurance techniques I discussed in my article about building quality applications to address issues of interoperability. When identifying requirements, you must consider the various applications that should talk to each other. When designing an application, construct simple, robust channels for data and control to flow between those interoperating applications.

Application interoperability risks

You should particularly consider the security, performance and reliability risks associated with interoperability. Connecting two or more applications magnifies both the likelihood and the impact of security vulnerabilities. For example, a previously stand-alone application with access to customer credit card information, when connected to an application that provides information to your company's Website, might create an avenue for leakage of this sensitive data.
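One common defense against that kind of leakage is an explicit allow-list at the boundary between the two applications, so only approved fields ever cross the interface. The sketch below is a hypothetical illustration (the field names and the `to_public_view` function are assumptions, not from any particular product):

```python
# Hypothetical filter at the boundary between a billing application and a
# Web-facing application: only explicitly allowed fields cross the interface,
# so sensitive data such as card numbers cannot leak by default.
ALLOWED_FIELDS = {"customer_id", "name", "order_status"}

def to_public_view(record: dict) -> dict:
    """Return only the fields the Web-facing application is allowed to see."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

print(to_public_view({"customer_id": 7, "name": "Acme",
                      "card_number": "4111111111111111"}))
# {'customer_id': 7, 'name': 'Acme'}
```

The key design choice is that the list names what may pass rather than what must be blocked, so a newly added sensitive field stays private unless someone deliberately exposes it.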

When applications communicate, each can affect the other's performance. For example, suppose an application can now request data that another application stores across two or more tables. That request could result in a database join query over large volumes of data, and the problem gets even worse if the table indices are not set up properly. I once worked with a client whose multiyear project, involving dozens of people, failed after three years of effort because of database-related performance problems.
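To make the join-and-index point concrete, here is a minimal sketch using Python's built-in sqlite3 module. The schema and table names are invented for illustration; the idea is simply that the column one application uses to look up another application's data should be indexed:

```python
import sqlite3

# Hypothetical scenario: an orders application requesting customer data
# that a second application stores across two tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 250.0), (11, 1, 75.0), (12, 2, 40.0)])

# Without an index on orders.customer_id, the join below scans the whole
# orders table for each customer; at data-center volumes that is slow.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 325.0), ('Globex', 40.0)]
```

With three rows the index is irrelevant, of course; the point is that the query plan it enables is the difference between a lookup and a full scan once the tables grow.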

Application interoperability can also affect reliability. For example, delays or losses of information can result in timeouts, unexpected voids in data records and processing with illegal default values. In addition, the sending or receiving application may fail to convert data properly. That can cause an application to stop responding or even crash, or simply produce garbage in, garbage out scenarios that cause failures later. As an example, the Mars Climate Orbiter mission was lost due to problems with data conversion between metric and English units in two interoperating NASA systems. It was found that "one team used English units (inches, feet and pounds) while the other used metric units for a key spacecraft operation."
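One way to guard against exactly that class of failure is to require that every numeric value crossing an application boundary carry an explicit unit tag, and to convert (or reject) on receipt rather than assume. The message format, function name and conversion table below are hypothetical:

```python
# Hypothetical receiving-side guard: a value without a recognized unit tag
# is rejected instead of being silently processed in the wrong unit.
CONVERSIONS_TO_NEWTONS = {
    "N": 1.0,        # already metric
    "lbf": 4.448222, # pounds-force to newtons
}

def read_thrust_newtons(message: dict) -> float:
    """Extract a thrust value from an inter-application message, in newtons."""
    value, unit = message["thrust"], message["unit"]
    try:
        return value * CONVERSIONS_TO_NEWTONS[unit]
    except KeyError:
        raise ValueError(f"unknown thrust unit: {unit!r}")

print(read_thrust_newtons({"thrust": 10.0, "unit": "lbf"}))  # about 44.48 N
```

Had the interface between the two Mars Climate Orbiter systems enforced a rule like this, the English-versus-metric mismatch would have surfaced as a loud error instead of a quiet navigation drift.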

As I mentioned, you can start to address these risks during requirements definition and design. In addition, code reviews and static analysis can locate potential problems. However, you'll also need to carry out system integration testing before putting interoperating applications into the data center.
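A system integration test exercises the interface between applications rather than each application in isolation. As a minimal sketch, suppose one application exports a customer record as JSON and another imports it (both functions here are invented stand-ins for the real applications); the integration test checks the round trip and the handling of malformed input:

```python
import json
import unittest

def export_record(customer_id: int, name: str) -> str:
    """Stand-in for the sending application's export routine."""
    return json.dumps({"id": customer_id, "name": name})

def import_record(payload: str) -> dict:
    """Stand-in for the receiving application's import routine."""
    record = json.loads(payload)
    if not isinstance(record.get("id"), int):
        raise ValueError("record id must be an integer")
    return record

class RoundTripTest(unittest.TestCase):
    def test_export_then_import(self):
        # The interesting behavior lives in the hand-off, not in either side alone.
        record = import_record(export_record(42, "Acme"))
        self.assertEqual(record, {"id": 42, "name": "Acme"})

    def test_malformed_id_is_rejected(self):
        # A string id (a unit-style mismatch) must fail loudly, not pass through.
        with self.assertRaises(ValueError):
            import_record('{"id": "42", "name": "Acme"}')
```

Real system integration tests replace these stand-ins with the deployed applications and their actual transport, but the principle is the same: test the conversation, not just the speakers.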