In one of my previous Knowledge Center articles, I discussed how to build quality applications by integrating multiple quality assurance activities into your process. This is important, but most of us don’t have the luxury of thinking about only one application. In today’s data center, it’s not enough to know that each application works properly on its own.
More and more, everything in a data center talks to everything else, and the trend is for greater interoperability in the future. Before we put disparate applications together in our data centers, we need to know if they’ll work together.
Of course, you can and should expand the scope of the quality assurance techniques I discussed in my article about building quality applications to address issues of interoperability. When identifying requirements, you must consider the various applications that should talk to each other. When designing an application, construct simple, robust channels for data and control to flow between those interoperating applications.
Application interoperability risks
You should particularly consider the security, performance and reliability risks associated with interoperability. Connecting two or more applications magnifies both the likelihood and the impact of security vulnerabilities. For example, a previously stand-alone application with access to customer credit card information, once connected to an application that provides information to your company's website, might create an avenue for leakage of that sensitive data.
When applications communicate, each can affect the other's performance. For example, suppose an application can now request data from another application that stores the requested information across two or more tables. That request could trigger a database join query over large volumes of data, a problem made worse still if the table indices are not set up properly. I once worked with a client whose multiyear project, involving dozens of people, failed after three years of effort because of database-related performance problems.
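To make the join/index point concrete, here is a minimal sketch using SQLite from Python's standard library. The `orders` table and the `customer_id` lookup are hypothetical illustrations, not from any system described in this article; the point is only to show how an index changes the query plan from a full table scan to a direct lookup.

```python
import sqlite3

# Hypothetical two-application scenario: one application writes orders,
# another now queries order history by customer through the interface.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

query = "SELECT total FROM orders WHERE customer_id = 1"

# Without an index on customer_id, every lookup scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# With the index, the database can jump straight to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # e.g. SCAN orders
print(plan_after[0][-1])   # e.g. SEARCH orders USING INDEX idx_orders_customer ...
```

On a real production-sized table, the difference between these two plans is the difference between milliseconds and minutes, which is exactly the kind of problem that first surfaces when a second application starts querying data it never touched before.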
Application interoperability can also affect reliability. If information is delayed or lost in transit, the result can be timeouts, unexpected voids in data records and processing with illegal default values. In addition, the sending or receiving application may fail to convert data properly, which can cause an application to stop responding or even crash, or simply produce garbage-in, garbage-out scenarios that cause failures later. As an example, the Mars Climate Orbiter mission was lost due to problems with data conversion between metric and English units in two interoperating NASA systems. It was found that “one team used English units (inches, feet and pounds) while the other used metric units for a key spacecraft operation.”
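A simple way to defend against this failure mode is to convert units explicitly at the interface boundary, so the assumption lives in exactly one place. The sketch below assumes a hypothetical interface in which one system reports impulse in pound-force seconds while the consumer expects newton-seconds, roughly the Mars Climate Orbiter mismatch; the function and constant names are my own illustration.

```python
# 1 pound-force expressed in newtons (exact value is 4.4482216...)
LBF_TO_NEWTON = 4.448222

def to_newton_seconds(impulse_lbf_s):
    """Convert at the interface boundary, so the unit assumption is explicit."""
    return impulse_lbf_s * LBF_TO_NEWTON

# Passing the raw number through without conversion silently understates the
# impulse by a factor of ~4.45 -- garbage in, garbage out.
raw_impulse = 100.0  # pound-force seconds, as sent by the other system
print(to_newton_seconds(raw_impulse))  # ~444.8 newton-seconds
```

The conversion itself is trivial; the discipline of naming the unit in the parameter and doing the conversion in one well-tested place is what prevents the silent factor-of-4.45 error.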
As I mentioned, you can start to address these risks during requirements definition and design. Code reviews and static analysis can also locate potential problems. However, you'll also need to carry out system integration testing before putting interoperating applications into the data center.
How to test application interoperability
When you do system integration testing, interoperability is one of the main types of testing you are typically doing, along with security, performance, reliability and end-to-end functionality. All of these types of testing require the use of realistic test data so that important test conditions are covered. Production data is the obvious place to obtain realistic test data, but there are pitfalls and surprises that can occur.
Remember that production data can include personal and confidential information. For good reason, organizations tend to place restrictions on who can access such data. This becomes particularly critical if you intend to outsource some of the system integration testing. There are tools available to anonymize production data in a way that preserves valuable test conditions while irreversibly hiding personal data. Production data sets are often very large, so be sure to allocate plenty of time to complete any test data anonymization project.
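As a sketch of how anonymization can preserve test conditions, consider deterministic pseudonymization: hashing each sensitive value with a secret salt irreversibly hides it, while determinism means the same customer always maps to the same pseudonym, so joins, duplicates and referential integrity in the test data survive. This is a hypothetical in-house approach, not a description of any particular anonymization tool; the salt, field names and record layout are all illustrative.

```python
import hashlib

# Assumed secret; in practice this would be generated per project and kept
# out of source control, then discarded so the mapping cannot be reversed.
SALT = b"rotate-this-secret-per-project"

def pseudonymize(value):
    """Map a sensitive value to a stable, irreversible pseudonym."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return "cust_" + digest[:12]

# Illustrative "production" records sharing a repeated customer.
records = [
    {"customer": "alice@example.com", "order": 1},
    {"customer": "bob@example.com",   "order": 2},
    {"customer": "alice@example.com", "order": 3},
]
anonymized = [dict(r, customer=pseudonymize(r["customer"])) for r in records]

for r in anonymized:
    print(r)
```

Note that orders 1 and 3 still share a customer after anonymization, so tests that depend on repeat-customer conditions keep working, while the original email addresses are unrecoverable without the salt.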
Software as a service and open-source applications
Keep in mind that using software as a service (SaaS) and open-source applications changes the situation, but it certainly does not make these risks go away. With either approach, if you are sharing data with such an application, you lose some of the control you would otherwise have over the interfaces between those applications and your own.
In the case of open-source software, you can at least determine your own schedule for taking updates. With SaaS, the software can change without your knowledge. If you share data with SaaS or open-source applications in typical ways, you can rely to some extent on the risks being addressed as part of the release process.
However, if your interoperations with these applications are atypical, you could end up discovering issues that no one thought to address. You should try to find out whether your usage of these applications matches the typical usage in order to understand interoperability risks and, thus, the degree of system integration testing required.
Consider interoperability throughout the life cycle
In today’s data centers, whether the systems are colocated or distributed across the cloud, interoperability matters more than ever. You should start considering interoperability during requirements and design, and continue to address key interoperability risks such as performance, security and reliability throughout the life cycle, including code reviews, static analysis and testing, especially system integration testing.
Since system integration testing involves test types that have very particular test data requirements, be sure to plan carefully for this phase of testing. And, in your system integration test plans, don’t forget to include the SaaS and open-source applications that you use in your data center.
Achieving interoperability in your data center is not a trivial matter. By addressing the main topics raised in this article, you can reduce the risk of nasty interoperability surprises in your production environments.
Rex Black is President of RBCS. Rex is also the immediate past president of the International Software Testing Qualifications Board and the American Software Testing Qualifications Board. Rex has published six books, which have sold over 50,000 copies, including Japanese, Chinese, Indian, Hebrew and Russian editions.
Rex has written over thirty articles, presented hundreds of papers, workshops and seminars, and given over fifty speeches at conferences and events around the world. Rex may be reached at email@example.com.