Courtesy of: Chris Skinner
I’m always ready to see both sides of the fence. In fact, I would rather see both sides of the fence than just my side. My side is biased and tainted by my senses and inputs. I realised that recently when I blogged about Hong Kong. There are always two sides to the story. I realise this when I see the news reporting about Russia. As a guy living next door to that great nation, I know there are two sides to the story. Never judge a book by its cover.
But, bringing this closer to my comfort zone, what are the two sides to the story of banks using mainframes with COBOL code to run their business?
Most of us, as soon as we see mainframe and COBOL, think: Yeuch! Code from the 1960s? Systems from the Iron Age? Get rid of them!!!
Now, I’ve been an advocate for years of refreshing core systems and moving to cloud, but am I advocating getting rid of mainframe systems with COBOL code? Not necessarily.
It depends on fitness-for-purpose in today’s real-time world. Are these systems fit-for-purpose?
The COBOL community claim yes. In a recent survey, over 92% of respondents said their existing COBOL applications are strategic to the business – a figure that has risen since the same survey in 2017.
Most articles have raised questions about whether COBOL remains fit for purpose, secure, or even viable as a modern-day technology within an ever-changing digital world. We, the COBOL community, know that these reports aren’t necessarily grounded in fact, but they can offer an entertaining read.
In a more recent article Tom Taulli, who runs an online COBOL course, extols the virtues of COBOL:
It powers about 80% of in-person financial services transactions and 95% of ATM swipes.
On a daily basis, it processes $3 trillion in commerce.
There are over 220 billion lines of code in use, and 1.5 billion new lines are written each year.
He concludes: “COBOL is, in a lot of ways, an antiquated, bad programming language. But for its particular domain, it’s better than anything else.”
So, let’s get a reality check here.
COBOL is good for transactions even if it is sixty years old. Does the fact that it is sixty years old make it bad? Are we being ageist?
“COBOL’s natural domain is reliable high-volume data processing and it’s a perfect language for that use. COBOL is solely outdated as a general-purpose language.” – Hackaday
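One concrete reason often given for COBOL’s fit in that domain – my illustration, not the article’s – is its native fixed-point decimal arithmetic (the PICTURE clause, e.g. PIC 9(9)V99), which sidesteps the binary floating-point rounding that bites money handling in general-purpose languages. A minimal Python sketch of the pitfall, and of the exact-decimal behaviour COBOL gives you by default:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so naively
# summing a thousand 10-cent payments drifts away from 100.00.
float_total = sum(0.10 for _ in range(1000))

# Fixed-point decimal arithmetic -- what a COBOL PIC 9(9)V99 field
# gives you natively -- stays exact over the same accumulation.
dec_total = sum(Decimal("0.10") for _ in range(1000))

print(float_total)  # close to, but not exactly, 100.0
print(dec_total)    # exactly 100.00
```

This is a sketch of the general rounding issue, not of any bank’s actual code; in practice a general-purpose language needs an explicit decimal type (as above) to match what COBOL’s data division provides out of the box.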
Research found that 63% of respondents will be improving on their COBOL systems in 2020. Moreover, COBOL code bases continue to grow, with the average code base now running to 9.9 million lines, versus 8.4 million in 2017 – reflecting the ongoing investment, re-use and expansion in core business systems.
The interesting thread amongst all of these articles is a company called Micro Focus.
Micro Focus has been the market leader and provider of distributed COBOL development tooling for nearly 40 years.
Micro Focus are leaders in supporting banks and governments dependent upon ageing COBOL systems and keeping them fresh and strategic. Good for them.
But is COBOL right for today? Are systems developed for in-house transaction processing in batch appropriate for a world of companies born on the internet, open and API driven, platform-based and real-time?
There are at least 1.1 million CICS transactions run every second of every day! Surely Google searches can beat that? Nope…there are only about 59,421 Google searches every second globally. By the way, a large majority of CICS transactions are written in…you guessed it: COBOL.
I guess my answer is this: if the system is built purely for in-house batch transaction processing, then yes, it is obviously legacy technology; but if it can scale to support large-scale real-time transaction processing, then no, it is appropriate. Should it be in COBOL? I’ll let you answer that one.
My own take?
Well, if you’ve got millions or billions of lines of code that still work, it’s too hard to change; but if you were starting out today on a new digital bank, I don’t know of one that would choose COBOL for its core systems build.
Author, Speaker and Troublemaker