Evangelizing Mainframe

Capacity is King: How the IBM z14 Addresses Telecom Volume Concerns

One of the leading telecommunications providers in Australia needs to track every text message and phone call sent or received over its network, along with every byte downloaded by its customers all over the world. To do this, it runs three banks of mainframes on rotating duty: one system processes data from the previous day, a second begins crunching the transactions arriving after the last stroke of midnight, and a third waits in reserve in case either of the other two goes down.

Even with two overlapping banks of the most powerful computers in the world and a third on standby, it takes the provider 32 hours to process 24 hours' worth of data. The risk created by this sort of backlog is plain: every day, the provider falls further behind. And this leading telecommunications provider is by no means alone in its predicament.
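To see why that pattern cannot hold, a quick back-of-the-envelope sketch (using only the 32-hours-per-24-hours figure above; the continuous-processing assumption is mine) shows how the deficit compounds:

```python
# Back-of-envelope sketch: a 32-hour processing window for each
# 24 hours of data compounds into an ever-growing backlog.

HOURS_PER_DAY = 24
PROCESSING_HOURS = 32  # time needed to process one day's data

def backlog_after(days: int) -> int:
    """Hours of unprocessed work accumulated after `days` days,
    assuming processing runs continuously and never catches up."""
    deficit_per_day = PROCESSING_HOURS - HOURS_PER_DAY  # 8 hours/day
    return deficit_per_day * days

print(backlog_after(1))  # 8 hours behind after one day
print(backlog_after(7))  # 56 hours -- more than two full days -- after a week
```

Eight hours a day may sound survivable, but after a single week the shortfall already exceeds two full days of work, which is why a third bank of machines sits in reserve.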

Pumping up the Volume

Telecom companies worldwide are struggling to keep up with the sheer volume of data generated by our communications devices. And the burgeoning Internet of Things is turning that mountain of information into a whole planet of it, growing the data we produce exponentially. Unlike in the finance industry, where mainframes help mitigate the risks of latency, mainframes in the telecom space must counter the ever-growing problem of volume.

Taking 32 hours to process 24 hours’ worth of data is not a sustainable paradigm, which means communications companies need a mainframe that can crunch 24 hours’ worth of data within a 24-hour window, and maybe have time left over to provide some redundancy. Capacity is king, and the IBM z14 system seems poised to provide the sort of capacity to help the telecom industry thrive in the ever-growing data explosion.

The Z Solution

First introduced in July 2017, the IBM z14 is a monumental achievement in mainframe processing. Power, and the speed it provides, is the primary trait that makes mainframes so indispensable to high-volume computing. The z14 represents a new horizon in efficient data processing and a leap forward for industries that depend on Big Iron, telecom most definitely included.

With a whole host of processing improvements, from more efficient processors to architectural changes that help crank through much more data per second, the z14 can process up to 12 billion transactions in a 24-hour period, collating a full day’s data for analytics in a timely, actionable manner. This additional processor speed also drastically improves response time, making the z14 an attractive option for telecom firms thinking of moving to an enterprise cloud system.

The z14 also offers trailblazing encryption enhancements, and its processor is fit not only for a fully loaded telecommunications company, but for any IT shop looking to build a network able to scale into the future. Companies seeking greater capacity can use the z14 to process their workloads more efficiently, creating more time for redundant checks and widening their margin for error. Pervasive encryption, advanced machine learning capabilities and expanded ease of use make the z14 a new-generation system built specifically for workloads like those of the Australian telecom provider.

Keeping Up

Telecom is just one industry where the volume of incoming data to be processed will only grow as time passes. The z14 represents a major milestone for these companies: a platform that can process 24 hours' worth of data in less than 24 hours, creating a buffer for rechecks or an early start on the next day's work. This not only keeps companies ahead of the curve instead of struggling to keep up, it also eliminates the need for a third bank of mainframes kept solely to ensure redundancy.

As our reliance on data and digital communication expands, more processing power will be required to keep pace with the growing volume of information that telecom and other industries must process at peak efficiency. The z14 is a great step toward meeting these demands, with future generations of IBM Z still to come.

Jennifer Nelson is the managing director of R&D Database Servers and Tools at Rocket Software. After serving in the U.S. military, Jennifer attended the University of Texas while moonlighting as a Db2 database administrator. For the last 20 years, she has focused on business intelligence and analytics, and is a recognized expert on how high-powered analytics can transform organizations.

Posted: 12/5/2017 12:00:09 AM by Jennifer Nelson
