Mainframe’s Past Informs the Future of IT
As a futurist and technologist, I believe in the power of technology to drive change, and I tend to look toward the future, not the past. I bet many of you are the same. Yet, ironically, if we look at the past from just the right angle, we can see the future. Given that this year marks the 50th anniversary of the first mainframe computer, it is a great time to look back and see how we got to where we are today, and how the mainframe fits into the future.
IBM had been producing large computers since 1952, but 50 years ago, in 1964, the company introduced System/360, the mainframe that is the ancestor of the zSeries we know today.
To put 1964 in perspective, let’s take a look at some other events that had nothing to do with computers.
And the world of IT looked very different 50 years ago:
Amid all that was going on socially and technically, corporations with centralized IT departments and “glass-house” data centers got a more powerful machine in IBM’s System/360 running OS/360. Back then, the general population didn’t understand computers at all and certainly knew nothing of networks, unlike today, when the majority has at least a passing grasp of IP and MAC addresses. The mystery surrounding the power of big computers may have amplified the ’60s growing cynicism toward big, centralized government. Whether or not that was the case, by the ’70s enough people were ready to embrace personal computers, and that led to the distributed computing revolution of the ’80s.
There were no personal computers in 1964, of course. It would be another four years before semiconductor pioneers Robert Noyce and Gordon Moore started Intel, and the late ’70s before affordable microprocessors like Intel’s 8086 and Motorola’s 68000 were widely used in personal systems. By then the future was starting to become visible, and Moore had made his famous prediction about the growth in the power of chips. Even so, he underestimated the changes that increasing microprocessor miniaturization would bring about, changes that include smartphones, iPads, Google Glass and other wearable technologies.
Today both customers and IT professionals experience extreme change, change that is shaping the future. Most of us now have more compute power and storage at home than large IT departments had in 1964. While this is due to the microprocessor-fueled personal computing revolution, it is also due to the emergence of cloud computing.
The cloud heralds a new form of centralized computing. Instead of being closed and controlled by the few, it is open and democratized. Instead of being defined by hardware, it is defined by software; and instead of vendor lock-in, we have open-source alternatives and the agility of SaaS.
As the mainframe celebrates its 50th anniversary, we find the latest incarnations of the IBM mainframe uniquely suited to provide the centralized compute power needed by private and public clouds. The mainframe has been the backbone of entire industries, and although some refuse to acknowledge it, the mainframe is stronger, more evolved, and more relied upon than ever before.
Young IT professionals who have not yet joined either the mainframe or the distributed computing camp are open to using the best technologies to get the job done. The mainframe, as I discussed in a previous post, “Boomers And Beyond — The Future of Mainframe Software”, is evolving to meet this new generation of IT professionals on their terms. Is IT ready to evolve to match the changes in the mainframe?
Those of us who have been around since mainframes were the only systems should be prepared to change with the times. The past, after all, is a nice place to look back on, but you can’t live there. There are lessons to be learned from a look back, but acting on those lessons is up to you. You have 50 years to draw on and no time at all to wait.