In the Middleware: A Transaction Process Monitor for Mission-Critical Systems

It’s no secret that we live in an on-demand, instant-gratification, digital world that’s swimming in data. Blogs, articles, books, and videos are full of examples. Consumers, customers, partners, and employees expect digital experiences to be available whenever and wherever they want them. Each time someone logs in to an application, the data footprint they leave behind is massive. Your company wants to use that data to grow your business as much as your customers want to interact with you on their terms.

The systems you use to run your business need to be up to the task of balancing the impact of this big data with the need for constant uptime. These mission-critical systems need mission-critical optimized transaction processing, reliability, and transaction integrity so you can keep your business going and delight your users 24/7. Online transaction processing (OLTP) middleware called a transaction processing monitor (TPM) plays a key role in delivering these key characteristics. Let’s look at how.

OLTP and TPM past

OLTP starts with transactional data, which tracks business transactions, such as payments made to your suppliers or received from your customers, inventory movement, orders, and services delivered. OLTP database systems record these business interactions as they occur during your daily operations and support querying this data.

The main purpose of a TPM is to allow resource sharing and ensure that applications use those resources optimally. It monitors each OLTP transaction from one stage to the next to make sure it completes successfully, and takes appropriate action if a stage fails or an error occurs.

A TPM is critical in multi-tier architectures. Because processes run on different platforms, a transaction can be forwarded to any one of several servers, and the TPM generally handles all load balancing. After completing each transaction, the TPM can process the next one without being affected by the prior transaction.
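To make the load-balancing idea concrete, here is a minimal sketch of a round-robin dispatcher that forwards each transaction to the next server, with no state carried between transactions. All names are hypothetical; real TP monitors use far more sophisticated routing (server load, affinity, failover), and this is not Tmax's actual API.

```python
from itertools import cycle

class TransactionRouter:
    """Toy TP-monitor router: forwards each incoming transaction to
    the next server in rotation. Each dispatch is independent, so no
    transaction is influenced by the one before it."""

    def __init__(self, servers):
        self._servers = cycle(servers)  # endless round-robin iterator

    def dispatch(self, txn):
        server = next(self._servers)
        # In a real TPM the chosen server would execute the transaction;
        # here we just record which server received it.
        return (server, txn)

router = TransactionRouter(["app-1", "app-2", "app-3"])
assignments = [router.dispatch(t) for t in ["T1", "T2", "T3", "T4"]]
# The fourth transaction wraps back around to the first server.
```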

OLTP and TPM present

When TPMs were developed, there was no such thing as streaming data or multi-cloud systems. Hadoop didn’t exist, and mission-critical systems hummed right along on the mainframe. So, it was easy to deliver critical, optimized transaction processing, reliability, and transaction integrity with a TPM.

Not so today, of course. There’s the cloud, the home of some of the world’s most successful applications and new storage types such as data lakes. Modern mission-critical applications also have many more “must-haves”: built-in analytics, cloud deployment capability, and complex logic for making informed decisions in real time. Therefore, today’s TP monitors need to offer reliability, high availability, high performance, compatibility, and convenient functions for OLTP.

Although other solutions for meeting those requirements have been proposed over the years, the TPM has stood the test of OLTP time. Its strength is that it is middleware. The very essence of middleware is adaptability; it has its origins in integrating two disparate types of software, so it is not difficult to adapt for specific environments, service architectures, various web servers, and modern front ends.

OLTP and TPM future: It’s now

The key to enabling OLTP for today’s complex mission-critical applications and workloads is building on a firm OLTP middleware foundation that has been proven over the years. The traditional functions of a TPM—process management, distributed transaction processing, load balancing, fault tolerance and failover, naming service, security, and system management—are modernized and brought up to today’s standards.

Then, new technology is added to modernize the TPM so it addresses today’s IT trends: proliferation of cloud computing, increased hardware computing power, and the higher premium placed on security. This technology includes advanced dynamic changes, automated deployment, millisecond granularity, increased name length, and more.

Underneath it all is conformance with the X/Open Distributed Transaction Processing (DTP) model, the international TP-monitor standard, which defines the functional layers for processing distributed transactions in line with the Open Systems Interconnection (OSI) reference model.
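The heart of the X/Open DTP model is a transaction manager coordinating multiple resource managers (databases, queues) through a two-phase commit: every participant is first asked to prepare, and the global transaction commits only if all of them vote yes. The sketch below illustrates that protocol under stated assumptions; the `prepare`/`commit`/`rollback` interface is a stand-in for the XA interface, not a real library API.

```python
class StubResourceManager:
    """Hypothetical resource manager (e.g. a database participating in
    a distributed transaction). Its vote is fixed at construction."""

    def __init__(self, votes_yes):
        self.votes_yes = votes_yes
        self.state = "active"

    def prepare(self):
        return self.votes_yes

    def commit(self):
        self.state = "committed"

    def rollback(self):
        self.state = "rolled back"

def two_phase_commit(resource_managers):
    """Phase 1: ask every resource manager to prepare.
    Phase 2: commit globally only if all voted yes; otherwise roll
    back every participant so no partial update survives."""
    prepared = []
    for rm in resource_managers:
        if rm.prepare():
            prepared.append(rm)
        else:
            # One "no" vote aborts the global transaction.
            for p in prepared:
                p.rollback()
            rm.rollback()
            return "rolled back"
    for rm in resource_managers:
        rm.commit()
    return "committed"

all_yes = two_phase_commit([StubResourceManager(True), StubResourceManager(True)])
dissenter = StubResourceManager(False)
mixed = two_phase_commit([StubResourceManager(True), dissenter])
```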

The benefits of a modernized TPM

A modernized TPM has impressive advantages. These include a convenient production and management environment, use of increased computing power, extended web client interfaces, improved security, and faster encryption and decryption. In addition, stream-pipe communication prevents full queues and enables fast recovery, and peer-to-peer monitoring makes immediate error handling possible.

The reliable queuing in a modernized TPM prevents data loss and guarantees data transfer. An open, flexible architecture provides interfaces for compatibility between systems and automatically monitors resources according to the system load. And all communication is high performance. Compatibility with different TPM applications and an in-memory data grid round out the major benefits.
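The usual way reliable queuing guarantees delivery is to remove a message only after the consumer acknowledges it; if the consumer fails mid-flight, the unacknowledged message is redelivered. A minimal sketch of that pattern follows, with all class and method names invented for illustration (a production TPM would also persist the queue to durable storage).

```python
from collections import deque

class ReliableQueue:
    """Toy reliable queue: a message leaves the system only after the
    consumer acknowledges it, so a crash before the ack cannot lose it."""

    def __init__(self):
        self._pending = deque()   # not yet delivered
        self._in_flight = {}      # delivered but not yet acknowledged
        self._next_id = 0

    def enqueue(self, msg):
        self._pending.append(msg)

    def dequeue(self):
        msg = self._pending.popleft()
        self._next_id += 1
        self._in_flight[self._next_id] = msg  # held until acknowledged
        return self._next_id, msg

    def ack(self, msg_id):
        del self._in_flight[msg_id]  # now the message is truly gone

    def redeliver_unacked(self):
        # Recovery step: unacknowledged messages go back on the queue.
        for msg in self._in_flight.values():
            self._pending.appendleft(msg)
        self._in_flight.clear()

q = ReliableQueue()
q.enqueue("order-42")
msg_id, msg = q.dequeue()
# Consumer crashes before calling q.ack(msg_id); on recovery,
# the message is redelivered rather than lost.
q.redeliver_unacked()
recovered_id, recovered = q.dequeue()
```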

Ready for a next-generation TPM designed for today’s mission-critical applications?

Mission-critical OLTP workloads demand an infrastructure that delivers mission-critical performance, reliability, availability, and manageability. Tmax from TmaxSoft is a TPM that delivers all this and more, enabling you to transform your infrastructure to keep up with today’s business demands, all within the bounds of your IT resources. Visit TmaxSoft to learn more.

Paul Bobak is the Vice President, Technical Field Services at TmaxSoft. He has more than 30 years of IT and ISV senior management experience with global companies. At TmaxSoft, he has responsibility for pre- and post-sales support and services. Paul has a diverse mainframe, database, distributed and SOA technology background along with in-depth experience growing and managing teams in multi-platform enterprise-wide environments. He has consistently taken a consultative approach to solving client business challenges, while strategically aligning technology to support clients’ business objectives. Paul has a successful track record for hiring, motivating and retaining performance-driven teams and building a culture of doing what’s needed to ensure customer success. His leadership experience includes senior management roles at Legent, Oracle, Tibco and Netezza. He holds two degrees in Computer Science.