A cloud data center closes: What this tells us about cloud governance

Lately, I’ve been working on a chapter on cloud security and governance in my forthcoming book, Cloud Standards, so I have been more attuned than usual to events in the cloud governance world. Even so, I don’t think I would have missed the flurry of news about a cloud data center closing at the end of February. You can read more about it here. What I find interesting is what this scenario tells us about governance concerns in the cloud.


Cloud governance is a major obstacle to cloud computing


The undercurrent is that certain end customers – in this situation, government and commercial – feel more comfortable hosting their critical information on premises rather than remotely. I'd like to analyze this in more depth, because it is a serious issue for the cloud. The unwillingness of customers to move mission-critical resources to the cloud demotes cloud computing to a second-tier technology, unsuitable for the most important enterprise purposes, and denies mission-critical computing the cost savings and flexibility that cloud promises.


What is the barrier?


Cloud data centers surrounded by high walls and razor wire do not look insecure, and public cloud provider sites bristle with security statements. A key to the problem is that traditional IT governance and security assume the infrastructure is under the direct control of the enterprise. Cloud infrastructure is under the control of the cloud provider. If the cloud is not private, the enterprise chain of responsibility is not there to enforce controls in the cloud. But a mission-critical consumer has no choice: it must always assert adequate control to establish governance over its mission-critical resources.


Think about a typical ISO/IEC 27001 scenario for implementing an IT control. 27001 advocates a Deming Plan-Do-Check-Act cycle. In the Plan phase, risks, mitigations, costs, and benefits are evaluated and somebody decides that a control should be implemented. In the Do phase, funds are allocated and a team is designated to execute the control. A simple example might be to declare that a certain set of files is critical and that all access to physical storage and backup media, and all access to file contents, must be controlled and logged. That is a common procedure on premises in the enterprise, and not that hard to establish in a third-party cloud data center surrounded by razor wire. But where is the team that does this when the control is on a file system in a cloud? It could be the enterprise team on IaaS, or the provider's team on PaaS. All of that has to be worked out. A sketch of what such a control might look like in code follows.
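To make the example a little more concrete, here is a minimal sketch, in Python, of the kind of access-logging control the Plan phase might specify. The file names, the logger name, and the log file are all hypothetical, and this is an illustration of the idea, not anyone's actual implementation. On premises, the enterprise team would own code like this end to end; in a cloud, who deploys it and who collects its output is exactly what has to be worked out.

```python
import logging
from contextlib import contextmanager
from pathlib import Path

# Hypothetical illustration: record every access to a file that has been
# declared critical, so the Check phase has an audit trail to review.
logging.basicConfig(filename="file_access_audit.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
audit_log = logging.getLogger("critical-file-audit")

# Files the Plan phase declared critical (hypothetical paths).
CRITICAL_FILES = {Path("/data/payroll.db"), Path("/data/contracts.db")}

@contextmanager
def audited_open(path, mode="r", user="unknown"):
    """Open a file, logging who touched a critical file, how, and when."""
    path = Path(path)
    if path in CRITICAL_FILES:
        audit_log.info("user=%s mode=%s file=%s", user, mode, path)
    f = open(path, mode)
    try:
        yield f
    finally:
        f.close()

# Illustrative usage:
# with audited_open("/data/payroll.db", "rb", user="jsmith") as f:
#     data = f.read()
```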


Then comes the Check phase. Someone must check to be sure the controls are doing what they were intended to do. How do you do that? Within the enterprise, it is probably a set of reports from the operations team and some spot checking by an auditor. In a third-party data center, the answer is not so clear. The enterprise operations team is remote, not there to police for backup drives left sitting in the lunch room. The third-party personnel are not in the enterprise chain of responsibility. How can the enterprise be certain that third parties are following proper procedures? Files on disks in locked rooms on premises may not need to be encrypted, but they may have to be in the cloud. Is the third-party provider taking care of it? Or does the enterprise team have to do it? Has that been written into the control specification? The point is not that encryption is onerous, but that new procedures and responsibilities have to be developed to ensure that the right entity knows it has to be done.
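Parts of the Check phase can be automated when the provider exposes its configuration through an API. As one hedged illustration, suppose the critical files happen to live in Amazon S3 (an assumption; the situation discussed here involves no particular provider) and the enterprise, not the provider, owns this check. A scheduled job using the boto3 library could verify that the buckets still report a default encryption configuration. The bucket names below are made up.

```python
import boto3
from botocore.exceptions import ClientError

# Hypothetical Check-phase job: verify that the buckets holding critical
# files still have default server-side encryption configured.
CRITICAL_BUCKETS = ["enterprise-payroll-archive", "enterprise-contracts"]

def bucket_has_encryption(bucket_name, s3_client):
    """Return True if the bucket reports a default encryption configuration."""
    try:
        s3_client.get_bucket_encryption(Bucket=bucket_name)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            return False
        raise  # some other failure; surface it to the auditor

if __name__ == "__main__":
    s3 = boto3.client("s3")
    for bucket in CRITICAL_BUCKETS:
        status = "OK" if bucket_has_encryption(bucket, s3) else "MISSING ENCRYPTION"
        print(f"{bucket}: {status}")
```

A check like this answers only one narrow question; it does not settle who is responsible when the answer comes back "missing," which is the governance gap the rest of this post is about.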


Currently, the tool most often used to assure that controls are maintained is a contract or service-level agreement (SLA) that obligates the provider to protect the consumer. That approach can work, but such a contract is far from a click-through. Negotiating a sufficiently detailed contract is difficult, time-consuming, and expensive, and it can wipe out all the flexibility that makes the cloud attractive. Particularly in an IaaS implementation, controls can be mixes of provider and consumer implementations with a complex division of responsibility.


Provider certification is not enough


A provider can certify the services it offers to the consumer, but certifying the consumer's services implemented on the provider's cloud is not ordinarily something the provider can do. In fact, the company that decided to close this data center held every major certificate, and you could conclude that if provider certification were enough, the data center would still be open. The problem is that procedures are still unclear and the certificates focus on a single enterprise, not on an enterprise and the third parties who have taken partial responsibility for its mission-critical resources.


This problem will be solved, but it is not solved yet. Groups like the Cloud Security Alliance are making great strides toward clarifying the tangle, and I will not be surprised to see the problem put behind us soon, but for the time being, this public situation is an indication of the current state of affairs.




Marvin Waschke

Marv Waschke is a senior principal architect at CA Technologies. He has represented CA Technologies in several standards groups, including the Cloud Management Working Group and the Configuration Management Database Federation Working Group of the Distributed Management Task Force (DMTF). He is also a member of the OASIS Topology and Orchestration Specification for Cloud Applications technical committee and an author of the W3C Service Modeling Language specification. He is a member of the company's Council for Technical Excellence. He joined CA Technologies in 1994, where he worked on the design and development of CA Service Desk Manager and on expanding it to new platforms. He also contributed to the design and implementation of the CA CMDB. Marv holds a Computer Science degree from Western Washington University and a Master of Arts in the humanities from the University of Chicago.

This article has 2 comments

  1. Cloud privacy is an important issue. If you are moving important data to the cloud and accessing it from different devices, you want it to be secure. The hesitation is understandable.

  2. It's interesting this "barrier" has been put into place. I can't imagine it lasting long, and the interference will eventually stop.
