Consumer-Driven IT: Accept Imitations, but Not Limitations

There is a tremendous amount of software development going on in large enterprises today. Customers often joke that they have more developers on staff than Oracle or Microsoft.


But none of this software is developed in isolation. Ask developers, and they’ll tell you that the infrastructure they have to deal with is more complex and interdependent than ever before. In truth, much of software development is an integration effort. IT development teams are building for myriad front-end web and mobile platforms, while also bringing along everything else the company has ever done. Enterprise development today relies on an ever-growing number of continually changing apps, data sources, everything-as-a-service offerings (IaaS, PaaS, SaaS, etc.), and integration middleware, all of which must play nicely with the real world and work properly for the organization.


Getting all of these distributed systems into an available and stable state for development and testing is like waiting for the stars to align: it is nearly impossible. This lack of stable, available resources creates constraints that delay or derail successful application deployments.


As an example of what companies face today, an IT leader at a large financial institution recently told me about a process his organization goes through three times a year. He’s dubbed it “enterprise release,” and it involves hundreds of applications, all pushed live simultaneously. These applications—some homegrown, some off-the-shelf, all highly customized—might address thousands of business and technical requirements. But none of the components can be effectively tested against any other systems, because nothing is ever really finished or ready at the same time, so they simply drop functionality out of each release. A development manager at the company summed up the situation this way: “I can’t do anything until I have everything, but I never have everything.”


The company tried to create copies of production at huge expense, but that never provided enough stability for reliable results. The performance lab could not generate more than 10 percent of peak production load, so there was no way to test for scalability. And the code could not be tested against the specialized data scenarios the team needed, because developers had no DBA access to the data (e.g., a specific customer record set up to earn triple reward points after her third card purchase at a grocery store).
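To make that “specialized data scenario” concrete, here is a hypothetical sketch of the triple-reward-points case. The rule, data, and function names are invented for illustration and are not the institution’s actual logic; the point is that exercising the rule requires a customer record with at least three prior grocery purchases, exactly the kind of state the team could not seed without DBA access.

```python
# Hypothetical illustration of the data scenario described above:
# triple reward points once a customer has made three grocery-store
# card purchases. Names and values are invented for this sketch.

GROCERY = "grocery"

def reward_points(purchase_history, amount, category, points_per_dollar=1):
    """Return points earned for a new purchase, tripled once the
    customer already has three grocery purchases on record."""
    prior_grocery = sum(1 for p in purchase_history if p["category"] == GROCERY)
    multiplier = 3 if category == GROCERY and prior_grocery >= 3 else 1
    return int(amount * points_per_dollar * multiplier)

# Simulated purchase history -- the state the team could not create
# in the shared test database without DBA access.
history = [
    {"category": "grocery", "amount": 42.10},
    {"category": "fuel",    "amount": 30.00},
    {"category": "grocery", "amount": 18.75},
    {"category": "grocery", "amount": 63.20},
]

assert reward_points(history, 50.00, "grocery") == 150  # triple points
assert reward_points(history, 50.00, "fuel") == 50      # normal points
```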


Sadly, that large financial institution is not alone. I’m sure many readers are nodding in agreement, having experienced similar less-than-ideal development scenarios that arise from consumer-driven IT demands in a distributed software world. Constrained environments and the push for faster delivery drive many companies to implement software changes without forward visibility into how those changes will behave. In the end, deadlines are missed, the IT organization’s reputation is further diminished, and in some cases (especially with well-known brands and public-facing applications) the failures make headlines. How can all of us in the business of developing software fix this problem, wherever we work?
 
We need to accept imitations, not limitations.

Unlike virtually all other manufacturing disciplines, the software development industry typically doesn’t validate its products in a simulator before finalizing and shipping its designs. Can you imagine Boeing taking an experimental wing, bolting it onto an airplane in San Francisco, and seeing how well it works on the next scheduled flight to New York?


No, Boeing engineers wouldn’t dream of using the real thing. They test designs using a flight simulator and a wind tunnel, where any condition, from rainy days to high winds, can be simulated. In the same way, enterprise software should be engineered and tested using service virtualization, which simulates an application’s surrounding real-world environment, data scenarios, and workload.


Service virtualization “listens” to applications and the messages passed between systems. It then clones those underlying systems in a stable, scalable virtual service environment for software development teams to use. A service virtualization platform such as CA LISA behaves and reacts just like the actual production systems being updated, integrated, or otherwise leveraged. Virtual services can be infinitely customized and used by multiple development teams at the same time.
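For readers who want to picture the mechanics, here is a minimal sketch of the record-and-replay idea behind a virtual service, written with Python’s standard library. This is not CA LISA’s implementation or API; the endpoints, payloads, and class names are invented for illustration. The application under test is simply pointed at this stand-in instead of the unavailable real back end.

```python
# Conceptual sketch of a virtual service: replay responses that were
# "recorded" by listening to traffic between the application under
# test and a real back-end system. Illustrative only -- not CA LISA.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Request/response pairs captured from the real system.
RECORDED_RESPONSES = {
    "/accounts/12345": {"status": 200,
                        "body": {"accountId": "12345", "balance": 2500.00}},
    "/accounts/99999": {"status": 404,
                        "body": {"error": "account not found"}},
}

class VirtualServiceHandler(BaseHTTPRequestHandler):
    """Serves recorded responses so development and test teams get a
    stable, always-available stand-in for the real dependency."""

    def do_GET(self):
        recorded = RECORDED_RESPONSES.get(self.path)
        if recorded is None:
            self.send_response(501)  # behavior not yet recorded
            self.end_headers()
            return
        payload = json.dumps(recorded["body"]).encode("utf-8")
        self.send_response(recorded["status"])
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Point the application under test at http://localhost:8080
    # instead of the unavailable or unstable production dependency.
    HTTPServer(("localhost", 8080), VirtualServiceHandler).serve_forever()
```

A real service virtualization platform goes far beyond this sketch, of course: it records traffic automatically, models stateful conversations, and lets teams customize data and response timing, but the basic bargain is the same, trading a fragile real dependency for a faithful, always-on imitation.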


The result is the polar opposite of what most companies dealing with complex IT landscapes experience today: faster time to market, with rationally planned releases and few, if any, missed deadlines; lower development costs; and few or no embarrassing defects or performance issues escaping into production to vex end users and customers alike.


With the competition just a click away, isn’t it time for software development to behave more like a real engineering discipline? It’s time we embraced simulation to prove, perfect, and deliver new business and technical functionality without limitations.


John Michelsen

Chief Technology Officer at CA Technologies
As the Chief Technology Officer of CA Technologies, John is responsible for technical leadership and innovation, further developing the company’s technical community, and aligning its software strategy, architecture and partner relationships to deliver customer value. John is also responsible for delivering the company's common technology services, ensuring architectural compliance, and integrating products and solutions. John holds multiple patents including market-leading inventions delivered in database, distributed computing, virtual/cloud management, multi-channel web application portals and Service Virtualization (LISA). In 1999, John founded ITKO, and built LISA from the ground up to optimize today's heterogeneous, distributed application environments. Under his leadership, LISA’s platform for agile development grew in breadth and depth. The company was acquired by CA Technologies in 2011. CA LISA’s suite reshapes customers’ software lifecycles with dramatic results. Today, it delivers 1000%+ ROI for customers and is a lead offering in the Service Virtualization market. Prior to ITKO, John led SaaS and E-commerce transformations for global enterprises at Trilogy and Agency.com. He also founded a boutique custom software firm that focused on distributed, mission-critical application development projects for customers like American Airlines, Citibank and Xerox. John earned degrees in business and computer science from Trinity University and Columbus University. He has authored a best practices book, “Service Virtualization: Reality is Overrated,” which will be available this fall. He has contributed to dozens of leading technical journals and publications on topics ranging from hierarchical database techniques and agile development to virtualization.

