In today’s technological age, cloud computing is increasingly ubiquitous. Savvy IT and business decision makers who seek to leverage these technologies should evaluate the platform, mitigate risks and look for a single point of control for data integration, data governance and cost-effectiveness. This has to be done while preserving SLAs (service-level agreements) for mission-critical applications and workflows that span both cloud and on-premises environments.
However, I often find that organizations are unprepared for the vast array of options and flexibility that cloud computing offers. Advances and demands in cloud computing raise valid questions about platforms, tool sets, compliance and security.
To help address these concerns and get the most value out of the platform, you should focus on four tasks: assess targeted workflows, plan for data security, build in robust integration and connectivity for cloud and on-premises needs, and monitor and maintain workflows for both internal and external SLAs. Let’s take a closer look at each task:
1. Assess targeted workflows
Most companies want quick ROI and minimal upfront investment, so you need to be realistic and honest. Review the needs of the business and create a short list of target workflows. Remember to keep the existing workflow running in parallel for a brief period so you can compare results.
For example, you don’t want to risk overlooking the Accounts Payable/Receivable processes or the Support workflow in a process move. Involve an executive sponsor, project manager and business unit representative to help you fully understand important workflows and avoid costly remediation downstream.
2. Plan for data security
According to a recent survey of CIOs, security technologies ranked among the top 10 technology priorities for 2010. Many cloud-based platforms provide compliance with SAS 70 (Statement on Auditing Standards No. 70: Service Organizations). Consider whether PCI DSS (the Payment Card Industry Data Security Standard) and FISMA (the Federal Information Security Management Act of 2002) are also a concern for your organization. Retrofitting industry standards onto a project nearing completion can incur significant additional cost, so make sure cloud and integration platforms are aligned with key standards up front.
Once compliance requirements are identified, start planning for data movement. In cloud-to-cloud integrations, security-sensitive data should not persist on the cloud platform. While creating and using an intermediate data set might seem appealing, it adds little value and increases security risk.
When one or more applications or processes remain on-premises, data will have to be sent into the cloud. If data moves in near real time, record by record, you won’t need to persist it. But if you have periodic, batch-like processes, you will need some method to securely persist that data.
Therefore, look for an integration platform with both a lightweight “agent” for on-premises connectivity and the ability to push data into the cloud. If business needs or restrictions require a workflow to ship a “file” up to the cloud, consider both encryption and an SFTP server hosted in the company’s DMZ. If your policies allow, you can also expose an on-premises application directly to the cloud through your integration agent.
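As a rough sketch of that file-based pattern (not any particular vendor’s agent), the example below encrypts a batch extract before handing it to an SFTP server in the DMZ. The host name, credentials, file names and key handling are placeholders you would replace with your own secured configuration.

```python
# Hypothetical sketch: encrypt a batch extract, then push it to an
# SFTP server hosted in the DMZ. Host, credentials and paths are placeholders.
from cryptography.fernet import Fernet
import paramiko

def encrypt_file(plain_path: str, encrypted_path: str, key: bytes) -> None:
    """Encrypt the batch file so sensitive data never travels in the clear."""
    with open(plain_path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(encrypted_path, "wb") as f:
        f.write(ciphertext)

def upload_to_dmz(encrypted_path: str, remote_path: str) -> None:
    """Ship the encrypted file to the DMZ-hosted SFTP server."""
    transport = paramiko.Transport(("sftp.dmz.example.com", 22))  # placeholder host
    try:
        transport.connect(username="integration_agent", password="change-me")
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(encrypted_path, remote_path)
        sftp.close()
    finally:
        transport.close()

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, manage this key in a secure key store
    encrypt_file("ap_batch.csv", "ap_batch.csv.enc", key)
    upload_to_dmz("ap_batch.csv.enc", "/inbound/ap_batch.csv.enc")
```

In practice, the integration platform’s agent would perform this step for you, and the encryption key would live in a managed key store rather than in the script.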
3. Build in robust integration and connectivity
According to a recent poll, nearly 35 percent of CIOs cite integration as their number-two concern. Integration is central to your cloud strategy in many ways. Choose an integration platform that supports transactional, real-time changes spanning both cloud-based and on-premises applications.
The platform should also be scalable and flexible enough to meet three kinds of needs: today’s connectivity needs, such as e-mail, SFTP, HTTPS and WSDL (Web Services Description Language); legacy needs, such as COBOL, flat files and reports; and emerging needs, such as XML, XBRL (eXtensible Business Reporting Language) and EDI (electronic data interchange). Look for a platform that encompasses on-premises and cloud-based connectivity with the flexibility to keep innovating and evolving as new demands emerge.
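To make the legacy-to-emerging gap concrete, here is a minimal sketch, independent of any specific integration platform, that reads a legacy flat-file (CSV) extract and re-emits each record as XML. The field and file names are invented for illustration; a real platform would express this mapping as configuration rather than code.

```python
# Minimal sketch: translate a legacy flat-file (CSV) extract into XML.
# Column names and file names are invented for illustration only.
import csv
import xml.etree.ElementTree as ET

def flat_file_to_xml(csv_path: str, xml_path: str) -> None:
    """Read legacy CSV records and write them out as an XML document."""
    root = ET.Element("invoices")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):  # e.g. columns: invoice_id, customer, amount
            invoice = ET.SubElement(root, "invoice", id=row["invoice_id"])
            ET.SubElement(invoice, "customer").text = row["customer"]
            ET.SubElement(invoice, "amount").text = row["amount"]
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    flat_file_to_xml("legacy_invoices.csv", "invoices.xml")
```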
4. Monitor and maintain workflows for both internal and external SLAs
When applications are within the walls of an organization, it’s relatively easy to monitor their status, run down the hall to make a change, and add RAM here or swap a hard drive there. While cloud computing cost savings are compelling, you lose the ability to physically touch your resources. Cloud models let you expand computing capacity at will without an army of IT professionals. Depreciation and upgrade costs also go away, but the need to monitor your workflows and your internal and external SLAs does not. You don’t want the frustration of waiting for a cloud-based process to work with no indication of what the problem is, how it will be solved or how long resolution will take.
Therefore, look for cloud and integration platforms that give you visibility into the infrastructure as well as the ability to make changes at both the application level and the integration level. Monitoring dashboards can help you maintain workflows and address issues as needed. Make sure you have flexible monitoring options and that response times allow your execution engines to handle peak loads and support future growth.
For instance, proactive alerts from the infrastructure and the integration job, such as warnings about potential performance bottlenecks, give you the means to take immediate action and stay on your SLA timetable. Some cloud solutions deliver different types of alerts (such as “login failed,” “connection refused” and “file not present”) not only to the project sponsor but also to end users.
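As a simple illustration of this kind of proactive check, and not a depiction of any particular vendor’s dashboard, the sketch below compares integration job run times against SLA thresholds and emits alerts. The job names, thresholds and alert channel (plain logging) are hypothetical placeholders.

```python
# Illustrative sketch only: compare integration job run times against
# SLA thresholds and emit alerts. Job names, thresholds and the alert
# channel (plain logging here) are hypothetical placeholders.
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

@dataclass
class JobRun:
    name: str
    minutes_elapsed: float
    succeeded: bool

SLA_MINUTES = {"ap_batch_load": 30, "crm_sync": 10}  # hypothetical SLA targets

def check_sla(run: JobRun) -> None:
    """Raise a proactive alert when a job fails or overruns its SLA window."""
    limit = SLA_MINUTES.get(run.name)
    if not run.succeeded:
        logging.error("ALERT: job %s failed; open an incident", run.name)
    elif limit is not None and run.minutes_elapsed > limit:
        logging.warning("ALERT: job %s ran %.0f min (SLA %d min)",
                        run.name, run.minutes_elapsed, limit)
    else:
        logging.info("job %s within SLA", run.name)

if __name__ == "__main__":
    for run in (JobRun("ap_batch_load", 42, True), JobRun("crm_sync", 4, False)):
        check_sla(run)
```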
Cloud computing is coming, fast and furious. Leveraging it in the right way requires a bit of upfront research and process planning. IT executives who keep their focus on the right resources, processes and technology, and not on the marketing hype, will push into the cloud with ease and success.
Bill Humphrey is a PMI-certified Project Management Professional with more than 10 years of industry programming and technical experience in various languages and platforms. Bill has an in-depth understanding of multiple programming languages/interfaces and knowledge of various business cultures, practices and ethics worldwide. At Pervasive Software, Bill leads all technical client-facing teams in support of the company’s integration projects. Bill is responsible for managing cloud-based and on-premises technical solution architecture as well as designing and documenting best practices for a range of integration scenarios. Prior to Pervasive, Bill worked for HP Enterprise Business (formerly EDS), where he led the technical claims processing for several NHIC Medicaid programs. Bill holds a Bachelor’s degree in Computer Science and a Master’s degree in International Business. He can be reached at bill.humphrey@pervasive.com.