Typically the IBM Z Mainframe is recognized as the de facto System Of Record (SOR) for storing Mission Critical data. For generic business applications, it follows that DB2, IMS (DB) and even VSAM can be considered database servers, while CICS and IMS (DC) are transaction servers. Extracting value from this Mission Critical data has always been desirable, initially by transferring the valuable Mainframe data to a Distributed Platform via ETL (Extract, Transform, Load) processes. A whole new software and hardware ecosystem was born for these processes, typically classified as data warehousing. This approach has proved valuable for the last 20 years or so, but more recently the IT industry has evolved, embracing Artificial Intelligence (AI) technologies and ultimately generating Machine Learning capabilities.
For some, it’s important to differentiate between Artificial Intelligence and Machine Learning, so here goes! Artificial Intelligence is an explicit Computer Science activity, endeavouring to build machines capable of intelligent behaviour. Machine Learning is a process of evolving computing platforms to act from data patterns, without being explicitly programmed. It’s a “which came first, the chicken or the egg?” scenario: you need AI scientists and engineers to build the smart computing platforms, but you need data scientists and machine learning experts to make these new computing platforms intelligent.
Conceptually, Machine Learning could be classified as:
- An automated and seamless learning ability, without being explicitly programmed
- The ability to grow, change, evolve and adapt when encountering new data
- An ability to deliver personalized and optimized outcomes from data analysed
When comparing this Machine Learning ability with the traditional ETL model, eliminating the need to move data from one platform to another eradicates both the “point in time” data timestamp inherent in that model and any associated security exposure of the data transfer process. Therefore, returning to the IBM Z Mainframe as the de facto System Of Record (SOR) for storing Mission Critical data, it’s imperative that the IBM Z Mainframe server delivers its own Machine Learning ability…
IBM Machine Learning for z/OS is an enterprise class machine learning platform solution, assisting the user to create, train and deploy machine learning models, extracting value from your mission critical data on IBM Z platforms, retaining the data in situ, within the IBM Z complex.
Machine Learning for z/OS integrates several IBM machine learning capabilities, including IBM z/OS Platform for Apache Spark. It simplifies and automates the machine learning workflow, enabling collaboration on machine learning projects across personas and disciplines (E.g. Data Scientists, Business Analysts, Application Developers, et al). By retaining your Mission Critical data in situ on your IBM Z platforms, Machine Learning for z/OS significantly reduces the cost, complexity, security risk and time of Machine Learning model creation, training and deployment.
Simplistically there are two categories of Machine Learning:
- Supervised: A model is trained from a known set of data sources, with a target output in mind; in mathematical terms, a formulaic approach.
- Unsupervised: There is no predefined input or output structure; the machine learning process must formulate results from evolving data patterns.
In theory, we have been executing supervised machine learning for some time, but unsupervised machine learning remains the utopia.
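To make the distinction concrete, here is a minimal sketch in plain Python contrasting the two categories. All data, function names and the toy models are invented for illustration; they are not part of Machine Learning for z/OS.

```python
# Supervised vs unsupervised learning, illustrated with toy models.
# Everything here is a hypothetical sketch, not a product API.

def supervised_fit(xs, ys):
    """Supervised: fit slope w for y ~ w*x from known inputs AND target outputs."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def unsupervised_cluster(points, iters=10):
    """Unsupervised: 1-D k-means with k=2; no targets, structure emerges from data."""
    c1, c2 = min(points), max(points)                      # initial centroids
    for _ in range(iters):
        a = [p for p in points if abs(p - c1) <= abs(p - c2)]
        b = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1, c2 = sum(a) / len(a), sum(b) / len(b)          # recompute centroids
    return c1, c2

# Supervised: target outputs are provided, so the "formula" is fitted directly.
w = supervised_fit([1, 2, 3, 4], [2, 4, 6, 8])             # learns w = 2.0

# Unsupervised: only raw observations; the model discovers the two groups itself.
low, high = unsupervised_cluster([1.0, 1.2, 0.8, 9.0, 9.5, 8.8])
```

The supervised function needs labelled examples up front; the unsupervised one is handed raw points and finds the grouping on its own, which is exactly the "evolving data patterns" idea above.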
Essentially Machine Learning for z/OS comprises the following functions:
- Data ingestion (From SOR data sources, DB2, IMS, VSAM)
- Data preparation
- Data training and validation
- Data evaluation
- Data analysis deployment (predict, score, act)
- Ongoing learning (monitor, ingestion, feedback)
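The functions above form a pipeline, which can be sketched end to end in plain Python. The record layout, the fraud-detection scenario and the trivial threshold "model" below are all invented placeholders; real ingestion would read from DB2, IMS or VSAM via the product's connectors.

```python
# Hypothetical end-to-end sketch of the Machine Learning workflow stages.

def ingest():
    """Data ingestion: stand-in for reading SOR records from DB2/IMS/VSAM."""
    return [{"amount": "100", "fraud": 0}, {"amount": "900", "fraud": 1},
            {"amount": "120", "fraud": 0}, {"amount": "870", "fraud": 1}]

def prepare(records):
    """Data preparation: cleanse and convert fields to numeric features."""
    return [(float(r["amount"]), r["fraud"]) for r in records]

def train(rows):
    """Training: a trivial threshold 'model' halfway between the class means."""
    pos = [a for a, f in rows if f]
    neg = [a for a, f in rows if not f]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def evaluate(model, rows):
    """Evaluation: accuracy of the trained threshold on validation rows."""
    return sum((a > model) == bool(f) for a, f in rows) / len(rows)

def score(model, amount):
    """Deployment/scoring: predict (act on) a new transaction."""
    return int(amount > model)

rows = prepare(ingest())      # ingestion + preparation
model = train(rows)           # training
acc = evaluate(model, rows)   # evaluation
pred = score(model, 950)      # deployment: predict, score, act
```

Ongoing learning would then loop: monitor `acc` in production, ingest fresh records, and retrain when the feedback indicates drift.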
For these various Machine Learning functions, several technology components are required:
- Components on z/OS (MLz scoring service, various Spark ML libraries and the CADS/HPO library)
- Linux/x86 components (Docker images for the Repository, Deployment, Training, Ingestion, Authentication and Metadata services)
The Machine Learning for z/OS solution incorporates the following added features:
- CADS: Cognitive Assistant for Data Scientists (helps select the best-fit algorithm for training)
- HPO: Hyper Parameter Optimization (provides the Data Scientist with optimal parameters)
- Brunel Visualization Tool (assists the Data Scientist in understanding data distribution)
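The idea behind HPO can be illustrated with a simple grid search: try candidate hyperparameter values and keep the one that scores best on validation data. The data, the threshold classifier and the parameter grid below are invented for this sketch and say nothing about the internals of the actual HPO library.

```python
# Illustrative grid search showing the concept behind Hyper Parameter
# Optimization (HPO). All values here are hypothetical.

val_data = [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1)]   # (feature, label) pairs

def accuracy(threshold):
    """Validation accuracy of a threshold classifier for one candidate value."""
    return sum((x > threshold) == bool(y) for x, y in val_data) / len(val_data)

grid = [0.1, 0.3, 0.5, 0.7]        # candidate hyperparameter values
best = max(grid, key=accuracy)     # keep the best performer on validation data
```

Real HPO engines search far larger spaces more cleverly than an exhaustive grid, but the contract is the same: hand the Data Scientist back the optimal parameters.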
Machine Learning for z/OS provides a simple framework to manage the entire machine learning workflow. Key functions are delivered through an intuitive web-based GUI, a RESTful API and other programming APIs:
- Ingest data from various sources including DB2, IMS, VSAM or Distributed Systems data sources.
- Transform and cleanse data for algorithm input.
- Train a model for the selected algorithm with the prepared data.
- Evaluate the results of the trained model.
- Intelligent and automated algorithm and model selection, plus model parameter optimization, based on IBM Watson Cognitive Assistant for Data Science (CADS) and Hyper Parameter Optimization (HPO) technology.
- Model management.
- Optimized model development and production deployment.
- RESTful API provision, allowing Application Developers to embed predictions from the model in their applications.
- Model status, accuracy and resource consumption monitoring.
- An intuitive GUI wizard allowing users to easily train, evaluate and deploy a model.
- z Systems authorization and authentication security.
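As a sketch of how an application might embed a prediction via a RESTful scoring API, the snippet below builds a scoring request in Python. The host, path, payload fields and bearer-token header are invented placeholders, not the documented Machine Learning for z/OS API; consult the product documentation for the real routes and authentication scheme.

```python
# Hypothetical RESTful scoring request; endpoint and payload are placeholders.
import json
import urllib.request

payload = {
    "modelId": "fraud-check",      # invented model identifier
    "fields": ["amount"],          # invented input schema
    "values": [[950.0]],           # one transaction to score
}

req = urllib.request.Request(
    "https://mlz.example.com/api/v1/scoring",        # placeholder host/path
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <token>"},     # token deliberately elided
    method="POST",
)

# A real application would now call urllib.request.urlopen(req) and parse
# the JSON response for the prediction/score; the call is omitted here.
```

The point is the shape of the integration: the model stays deployed on IBM Z, and the application simply POSTs input records and reads back scores.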
In conclusion, the Machine Learning for z/OS solution delivers the requisite framework for emerging Data Scientists to collaborate with their Business Analyst and Application Developer colleagues, delivering new business opportunities with smarter outcomes, while lowering risk and associated costs.