The Software Defined Mainframe (SDM): An Alternative Approach?

Some consider the IBM Mainframe to be the last bastion of proprietary computing platforms, for obvious reasons, namely the CPU server architecture and the single manufacturer, IBM.  The historical ability of the IBM Mainframe to transform Data Processing into Information Technology, while still participating in the Digital Era, is without doubt.  However, for many, the complicated and perceived ultra-expensive world of software pricing generates concern, largely based upon Fear, Uncertainty and Doubt (FUD), which might have generated years if not decades of under-investment for those organizations with an IBM Mainframe.

Having worked with the IBM Mainframe for 35+ years, I have gained knowledge that allows cost optimization and contemporaneous usability, which, given the importance of the IBM Mainframe platform to IBM from a revenue viewpoint, will safeguard that the IBM Mainframe has a long future.  However, the last decade or so has seen a rapid evolution in Open Source, DevOps, Enterprise Class Support for Distributed Platforms, Mobile and Cloud computing, et al, potentially generating an opportunity for the global IBM Mainframe user base to once again consider the platform’s value proposition…

Let’s consider this server platform choice from a business viewpoint.  On the one hand, there are the well-rehearsed market statements, where 80%+ of corporate data resides on or originates from IBM Mainframes, while IBM Mainframes enable 70%+ of global commercial transactions, et al.  In recent times, global businesses have leveraged the cloud or Linux Open Source technologies to run their business.  For instance, Netflix reportedly runs its media on demand business via the Amazon Web Services (AWS) cloud, while said platform is facilitating a Data Centre reduction from 34 to 4 for General Electric (GE).  There are many other such “early adopters” of this commodity infrastructure provision opportunity, including Capital One, Hertz and Juniper, naming but a few.

Quite simply, the power of Mobile processors, primarily ARM, and their supporting software ecosystem empower each and every potential consumer with a palm-sized smart computing platform, while the power and supporting software ecosystem of x86 processors generate an environment for each and every global business, mature or not even launched, to deliver an eminently usable and scalable IT Infrastructure for their business model.

Of course, the IBM Mainframe can do this; it always has been at the forefront of IT architectures and always will be, but for the “naysayers”, its perceived high acquisition and running costs are always an easy target.  As somebody much cleverer than I once said, timing is everything, and we’re now encountering a “golden sunset” for those Mainframe Baby Boomers, just like myself, who will retire in the next decade or so.  Recently I was talking with a large IBM Mainframe customer, who stated “we’re going to lose 1500 years of IBM Mainframe experience in the next 10 years; how can you replace that resource easily?”  Let’s just think about that metric; ~50 people with an average of ~30 years’ experience, all of whom will retire in a short time frame!  You must draw your own conclusions as to that conundrum; how do you replace that level of experience?

In conclusion, no matter what IBM deliver from an IBM System z viewpoint, there is no substitute for experience and skill, and no company, especially IBM, has an answer to skills provision.  In the last 10-20 years, Outsourcing or Managed Services has provided an alternative approach for some companies, but even this option has finite resources.  If we consider the CFO viewpoint, where the bottom line is the only true financial metric, it’s easy to envisage a situation where many companies consider an alternative to the IBM Mainframe platform, both from a cost and viability viewpoint.  As a lifelong IBM Mainframe champion and as previously stated, there will always be a solution for safeguarding the longevity and viability of the IBM Mainframe for any Medium to Large sized business.  However, now is the time to act; embrace the new Open Source, DevOps and Hybrid Cloud opportunities, to transition from a Baby Boomer to a Millennial Mainframe workforce!

Is there an alternative approach and what is the Software Defined Mainframe (SDM)?

Put simply, SDM is a technology from LzLabs enabling the migration of mission-critical workloads from legacy IBM Mainframe environments to x86 Linux platforms.  Put another way, LzLabs have developed a managed software container that provides enterprises with a viable way to lift and shift applications from IBM Mainframes into Red Hat Linux or Cloud environments.  At first glance, the primary keyword here is container; there was a time when the term container might have been foreign to the System z Mainframe, but with LinuxONE and z/VM, Docker and KVM are now commonplace and accepted functions.  The primary considerations for any platform migration would include:

  • Seamless Migration: The LzLabs Software Defined Mainframe (SDM) ensures the key capabilities of screen handling, transaction management, recovery and concurrency are preserved without changes to the applications. LzOnline is capable of processing thousands of online customer transactions per second using commercial off-the-shelf hardware.
  • Major Subsystem Compatibility: The LzLabs Software Defined Mainframe (SDM) safeguards 100% compatibility with existing job control syntax, and also enables job submission via network connected nodes that support conventional job entry protocols. LzBatch provides a full spool capability that enables output to be managed and routed in familiar ways. Use of conventional job submission models, with standard job control, also means existing batch scheduling can operate with minimal changes.  Other solutions include LzRelational for Relational Database Management System (RDBMS) support and LzSecure, an authentication and authorization subsystem using security rules migrated from the incumbent IBM Mainframe platform.
  • Application Code Stability: An innovative approach that avoids the requirement to recompile or rewrite legacy COBOL or PL/I application source code. Leveraging functionality delivered by Cobol-IT and Eranea, there is also a simple and straightforward process to convert and potentially modernize existing application source code to Java.
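As a purely hypothetical illustration of the COBOL-to-Java conversion concept described above (the class, field and COBOL paragraph names here are my own invention, not actual LzLabs, Cobol-IT or Eranea output), a COBOL COMPUTE paragraph might map to a Java method along these lines:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical sketch: a COBOL paragraph such as
//   COMPUTE WS-GROSS = WS-QTY * WS-UNIT-PRICE
//   COMPUTE WS-NET ROUNDED = WS-GROSS * (1 - WS-DISCOUNT)
// could convert to a Java method using BigDecimal, preserving
// COBOL's fixed-point decimal arithmetic semantics.
public class InvoiceCalc {

    // Mirrors a PIC 9(5)V99 style fixed-point field: 2 decimal places.
    public static BigDecimal netAmount(int qty, BigDecimal unitPrice, BigDecimal discount) {
        BigDecimal gross = unitPrice.multiply(BigDecimal.valueOf(qty));
        BigDecimal net = gross.multiply(BigDecimal.ONE.subtract(discount));
        // The COBOL ROUNDED clause: half-up rounding to the field's scale.
        return net.setScale(2, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        System.out.println(netAmount(3, new BigDecimal("19.99"), new BigDecimal("0.10")));
    }
}
```

The key design point is decimal fidelity; COBOL packed decimal arithmetic must not silently become binary floating point during any such conversion.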

The realm of possibility exists and there are likely to be a number of existing IBM Mainframe users that find themselves with challenges, whether a retiring workforce or a back-level application code base.  The Software Defined Mainframe (SDM) solution provides them with a potential option for simplifying a transition process, with seemingly minimal risk, while eradicating any significant dependence on another Distributed Systems platform supplier during the arduous application source and data migration process.

From my viewpoint, I hope that this innovative LzLabs approach is a wake-up call for IBM themselves, who continue to deliver a strategic Enterprise Class System z platform, with all of its long term challenges, primarily cost based and the intricate and over complicated sub-capacity software pricing structure.  Without doubt, any new workload can easily be accommodated for low cost via the recent LinuxONE offering, but somewhere along the line, IBM perhaps overlooked a number of Small to Medium sized customers, who once might have used entry level or plug-compatible platforms, including and not limited to S/390 Integrated Server, MP3000, FLEX-ES zFrame, T3 Liberty, et al.  Equally from a dispassionate viewpoint, I welcome the competition of the LzLabs Software Defined Mainframe (SDM) offering and I would encourage all CIO and indeed other CxO personnel to consider the merits of this solution.

Java: Is System z A Viable Server Platform?

As long ago as 1997, IBM integrated Java into their IBM Mainframe platform, in those days via the then flagship OS/390 Operating System. As with any new technology, perhaps the initial OS/390 Java integration offerings were not perfect, but some ~20 years later, a lot has changed…

In 2000, IBM Java SDK 1.3.1 delivered z/OS and Linux on z support, quickly followed by 64-bit z/OS support in 2003 via SDK 1.4. In 2004 Java Virtual Machine (JVM) and JIT (Just-In-Time) compiler technology support was provided, while Java code has always exploited IBM specialty engines, primarily zAAP initially and now via zIIP and the zAAP on zIIP capability. Put simply, IBM continues to invest aggressively in Java for System z, demonstrating a history of innovation and performance improvements, up to and including the latest z13 server.

So why should a 21st century business consider the System z platform for Java workloads?

Arguably the primary reason is a rapidly emerging requirement for the true 24*7*365 workload, which cannot accommodate a batch window, where Java is ideally placed to serve both batch and OLTP workloads. Put another way, the need to process batch work has not gone away, whereas a requirement to process batch work concurrently with OLTP services has emerged. Of course, traditionally the typical System z enterprise might have two sets of IT staff for OLTP and batch workloads, typically in the IT Support and Application Management teams, whereas via Java and a workload centric approach, separate batch and OLTP support personnel are not necessarily required.
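A minimal sketch of the workload-centric point above, assuming an entirely invented business rule and class name (this is not any IBM API): the same Java business logic can back both an online transaction and a batch job step, which is why separate OLTP and batch support personnel are not necessarily required.

```java
import java.util.List;

// Hedged sketch: one shared business rule, two workload entry points.
public class InterestService {

    // Core business rule, shared by both workload types.
    public static double monthlyInterest(double balance, double annualRate) {
        return balance * annualRate / 12.0;
    }

    // OLTP path: one request, one response (e.g. driven by an online
    // transaction manager such as CICS or WebSphere).
    public static double handleOnlineRequest(double balance) {
        return monthlyInterest(balance, 0.05);
    }

    // Batch path: the same rule applied across a whole input batch
    // (e.g. driven by a batch runtime job step), concurrently with OLTP.
    public static double runBatch(List<Double> balances) {
        return balances.stream()
                .mapToDouble(b -> monthlyInterest(b, 0.05))
                .sum();
    }
}
```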

For the System z platform, Java support has always been incorporated into the core architectural building blocks, namely z/OS, CICS, DB2, IMS, WebSphere, Batch Runtime, et al. Therefore there are no functional reasons why new applications or indeed existing applications cannot be engineered using the pervasive Java programming language and deployed on the System z platform.

Quite simply, Java is a critically important language for IBM System z. Java has become foundational for data serving and transaction serving, the traditional strengths of IBM System z. WebSphere applications written in Java and processing via System z, benefit from a key advantage through co-location. This delivers better response times, greater throughput and reduced system complexity when driving CICS, DB2 and IMS transactions.

Java is also critical for enabling next generation workloads in the IBM defined Cloud, Analytics, Mobile & Security (CAMS) framework. Cloud and mobile applications can access z/OS data and transactions via z/OS Connect and other WebSphere solutions, all inherently Java based. Java on System z also provides a full set of cryptographic functions to implement secure solutions. A key strength of Java applications is the ability to immediately benefit from the latest hardware performance improvements using the Just-In-Time (JIT) compiler incorporated in the latest IBM Java SDK releases.

Let’s not forget, there are many other good reasons why Java might be considered as a viable application programming language:

  • Personnel Skills Availability: Java is typically ranked in the top 3 of most widely used programming languages; therefore personnel availability is abundant and cost efficient.
  • Application Code Portability: Recognizing Java bytecode and associated JVM functionality, no matter what the platform (E.g. Wintel, x86 Linux, UNIX, z/OS, Linux on System z, et al), the Java application code should execute without modification.
  • Application Tooling Support: Application Development tools have evolved to the point of true platform independence, where Application Programmers just create their code, they don’t necessarily know or sometimes care, where that code will execute. Let’s not forget the simplification of Java code for OLTP and batch workloads, reducing associated IT lifecycle support costs.
  • TCO Efficiencies: Simplified Application Development and deployment reduces associated cost, while reducing implementation time for mission-critical workloads. Java exploitation of the zAAP (zAAP on zIIP) safeguards low software costs and optimized processing times (I.E. Sub-Capacity specialty engines run at full speed).
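The Application Code Portability point above can be shown with a trivial sketch: this class compiles once to bytecode, and the same .class file runs unchanged on any conforming JVM, whether Wintel, x86 Linux, z/OS or Linux on System z; only the value of the `os.name` system property differs per platform, never the bytecode.

```java
// Minimal "write once, run anywhere" illustration.
public class Portability {

    // Pure logic: identical result on every JVM and platform.
    public static String greet(String platform) {
        return "Same bytecode, running on " + platform;
    }

    public static void main(String[] args) {
        // os.name differs per platform (e.g. "Linux", "z/OS");
        // the compiled bytecode does not.
        System.out.println(greet(System.getProperty("os.name")));
    }
}
```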

With the announcement of the zEC12 server, notable Java enhancements included:

  • Hardware Transaction Memory (HTM) – Better concurrency for multi-threaded applications
  • Run-Time Instrumentation (RI) – Innovation of a new hardware facility designed for managed runtimes
  • 2 GB Page Frames – Improved performance targeting 64-bit heaps
  • Pageable 1 MB Large Pages (Flash Express) – Better versatility of managing memory
  • New Software Hints/Directives – Data usage intent improves cache management; Branch pre-load improves branch prediction
  • New Trap Instructions – Reduce implicit bounds/null checks overhead

In summary, System z users can expect up to 60% throughput performance improvement for Java workloads measured with zEC12 and the IBM Java 7 SR3 SDK.

IBM z13 and the IBM Java 8 SDK deliver improved Java performance, including the Single Instruction Multiple Data (SIMD) vector engine, Simultaneous Multi-Threading (SMT) and improved CP Assist for Cryptographic Function (CPACF), delivering up to 2X improvement in throughput-per-core for security-enabled applications and up to 50% improvement for other generic applications.

Other z13 Java functional and performance improvements include:

  • Secure Application Serving – Application serving with Secure Sockets Layer (SSL) will exploit the new Java 8 Clear Key CPACF and SIMD vector instructions for string manipulation. An additional 75% performance improvement for Java 8 on z13 with SMT versus Java 8 on zEC12.
  • Business Rules Processing – Business rules processing with Java 8 takes advantage of the SIMD vector instructions and SMT for zIIP specialty engines on z13 to achieve significant improvements in throughput-per-core. An additional 37% performance improvement from z13 SMT zIIPs with Java 8 versus Java 8 on zEC12.
  • Specific z/OS Java 8 Exploitation of z13 SIMD – Java 8 exploits the new z13 SIMD vector hardware instructions for Java libraries and functions. These SIMD vector hardware instructions on z13 deliver improved performance, where specific idioms/operations were improved by between 2X and 60X. Performance benefits for real life Java applications will be dependent on how frequently these idioms/operations are used.

In conclusion, the IBM commitment to Java on System z is clearly evident and the cost, performance and security proposition becomes compelling on the latest zEC12 and z13 Mainframe servers. The pervasive deployment of Java as a universal IT programming language dictates that programmer availability will never be an issue, and platform independence dictates that Java applications can be created and processed on any platform. Let’s not forget, the strong single thread performance and I/O scalability of System z are significant differentiators when comparing Java performance on any IT platform.

Moreover, as always, perhaps the business dictates what platform is the most suitable for business applications. The evolution to a combined OLTP and batch workload for the 21st Century 24*7*365 mission critical business application, ideally places Java as an eminently viable programming language. Therefore there is no requirement to reengineer any existing System z application, or to find an alternative platform for new business functions. As always, the System z Mainframe platform should never be overlooked…

CICS: The Best Enterprise Transaction Server & So Much More…

In one form or another, CICS has been available since the mid-1960s, just about as long as the IBM Mainframe server that recently celebrated its 50th anniversary, having been released in 1964.  From a deployment viewpoint, 90%+ of Fortune 500 companies deploy CICS, primarily for its robust and often unbeatable ability to deliver sub-second response times for numerous application transactions.  Whether a large or small IBM Mainframe user, CICS delivers an enterprise class solution for a myriad of business types and arguably at one time or another, nearly every committed IBM Mainframe installation has implemented CICS.

In the past few decades I have encountered many failed IBM Mainframe migration projects and more often than not, the primary reason for platform migration failure was the inability of the target platform to deliver consistent sub-second transaction response times for a plethora of mission critical business applications.  Similarly, it often follows that if a non-Mainframe environment has been heavily configured to handle the CICS transaction workload, it often fails with the subsequent batch processing, suffering from significantly elongated elapsed times.

CICS has its foundations as an enterprise class transaction server, capturing data input for subsequent storage and retrieval in Database Management Subsystems, but let’s not forget, CICS can do so much more…

Let’s not forget, in 2001 CICS Transaction Server (TS) 2.1 for z/OS introduced the foundation for web services support, and a capability for CICS transactions to be invoked via HTTP requests.  There have been numerous enhancements since, too many to mention, which have evolved CICS into a fully-rounded family of solutions, allowing for cradle to grave application design and delivery.  However, let’s just take some time to review what CICS TS Version 5 has delivered, and how this might benefit the 21st Century business.

Recognizing the IBM defined Cloud, Analytics, Mobile, Social & Security (CAMSS) initiative, CICS is integral to such a business facing initiative, primarily from a Cloud interoperability viewpoint.

The CICS V5.2 Application Server has a capability to host multiple applications and multiple versions of the same application, simultaneously, primarily due to the substantial increase of platform scalability.  Similarly, a heterogeneous code environment offers application development personnel a single environment to work seamlessly with Java and other legacy languages, such as COBOL, C/C++, PL/I, et al.

Combining this heterogeneous code environment with Cloud enablement allows for new application version deployment, without a requirement to disable or remove the previous version from Production processing.  Regardless of the underlying application source code base, end users can access an application without service interruption, as they transition to the new application version.  Similarly, user workloads can be seamlessly redirected to a previous working version of application code, should errors arise in the new version.

The CICS V5.2 application server delivers a powerful hosting environment for all business applications, new or old.  Application Development teams can provision applications for design and testing within a “real life working” infrastructure, removing the application when testing is complete, or promoting said application ASAP, for mission critical business usage.  This delivers a significant business improvement in application availability, minimizing service downtime.  Therefore, applications stay as current and relevant as possible, reducing the risk of business service outages, delivering better and consistent end user experiences.

As per the IBM zSeries Mainframe heritage, a standard resilience feature of the CICS V5.2 Application Server is an inherent capability to perform, even in the event of a problem scenario.  Cloud enablement delivers a clustering capability, which handles both system and application level failure scenarios.  Seamless and timely problem resolution dictates less down time, delivering more business availability, instilling a high sense of confidence in end users and consumers alike.

Noting the Security aspect of the IBM CAMSS initiative and the ever present cybersecurity risk to us all, CICS Application Server also delivers on this front.  Safeguarding that application enhancements can be brought online ASAP and securely, CICS V5.2 Application Server seamlessly integrates with various security software and languages, including the latest WebSphere Application Server (WAS) Liberty Profile, allowing for the portability of Java Enterprise Edition Web applications.  Enhanced security capabilities also include Java Naming and Directory Interface (JNDI), Bean Validation, JDBC Type 2 Data Sources and the Java Transaction API (JTA).  SSL support incorporated within the Liberty JVM server HTTP listener is extended to support key certificates stored in System Authorization Facility (SAF) key rings, delivering SSL server authentication.

Like it or not, Cloud computing is a rapidly evolving technology, where the Cloud is integrating increasingly more applications and associated services, delivering cost savings and scalability accordingly.  Of course, for true enterprise class scalability and cost efficiency, arguably the IBM zSeries Mainframe is an ideal platform for Cloud technologies.  Therefore with some slightly modified thinking, organizations can deploy Cloud based solutions and benefit from application promotion benefits, especially with a technology such as CICS.

There is a great presentation named Five Compelling Reasons for Creating a CICS Cloud that provides robust working examples of how to increase application availability, with real-life application development scenarios.

In conclusion, CICS continues to evolve and not only is it the best Enterprise Class Transaction Server, the family of CICS products including its Application Server deliver a 21st Century Cloud computing compatible platform, for the most demanding of business requirements.  Whether considering the IBM defined Cloud, Analytics, Mobile, Social & Security (CAMSS) initiative or the more traditional Reliability, Availability and Serviceability (RAS) attributes of the zSeries Mainframe server, the latest version of CICS facilitates:

  • Rapid Application Development: Agile methodologies for rapid development, irrespective of programming language (E.g. Java, COBOL, C/C++, PL/I, et al).
  • Seamless & Timely Application Deployment: More frequent application updates, minimizing downtime and associated cost, while leveraging from Cloud functionality, to deploy new applications, application enhancements or bug fixes, side-by-side with existing real-life Production workloads.

Data Entry – Is Windows XP & Office 2003 End Of Support An Issue?

Recently somebody called me to say “do you realize your Assembler (ASM) programs are still running, some 25 years after you implemented them?”  Ouch, the problem with leaving comments and an audit trail, even in 1989!  It was a blast-from-the-past and a welcome acknowledgement, even though secretly, I can’t really remember the code.  We then got talking about how Mainframe programs can stand the test of time, through umpteen iterations of the Operating System.  This article will consider whether you need a Mainframe to write application code that will stand the test of time.

Spoiler alert: No you don’t; nowadays a good application development environment, a competent software coder and most importantly of all, common sense, can achieve this, for Mainframe and Distributed Systems alike.  However, you might need to recompile the source code from time-to-time…


After several years of uncertainty, Microsoft have officially announced that support for Windows XP (SP3) & Office 2003 ends as of 8 April 2014.  Specifically, there will be no new security updates, non-security hotfixes, free or paid assisted support options or online technical content updates.  Furthermore, Microsoft state:

Running Windows XP SP3 and Office 2003 in your environment after their end of support date may expose your company to potential risks, such as:

  • Security & Compliance Risks: Unsupported and unpatched environments are vulnerable to security risks. This may result in an officially recognized control failure by an internal or external audit body, leading to suspension of certifications, and/or public notification of the organization’s inability to maintain its systems and customer information.
  • Lack of Independent Software Vendor (ISV) & Hardware Manufacturers Support: A recent industry report from Gartner Research suggests “many independent software vendors (ISVs) are unlikely to support new versions of applications on Windows XP in 2011; in 2012, it will become common.” And it may stifle access to hardware innovation: Gartner Research further notes that in 2012, “most PC hardware manufacturers will stop supporting Windows XP on the majority of their new PC models.”

Looking at the big picture, anybody currently deploying Windows XP might want to consider the lifecycle of other Microsoft Operating System versions, for example, Windows Vista, Windows 7 & Windows 8.  As the Microsoft Windows Lifecycle Fact Sheet states, mainstream support for Windows 7 ends in January 2015, less than one year from now, and so arguably the only viable option is Windows 8.  The jump from Windows XP to Windows 8 is massive, not necessarily in terms of usability, but certainly and undoubtedly in terms of compatibility.

Those of us that experienced the Windows Vista, Windows 7 and more latterly Windows 8 upgrades, know from experience that each of these upgrades had interoperability challenges, whether hardware (I.E. Printers, Scanners, Removable Storage, et al), software (I.E. Bespoke, COTS, Utilities, et al) or even web browser (I.E. Internet Explorer, Firefox, Chrome, Safari, et al) related.  Although many of these IT resources might be considered standalone or technology commodities, where a technology refresh is straightforward and an operational benefit, the impact on the business for user facing applications might be considerable.  Of course, the most pervasive business application for capturing and processing customer information is typically classified as data entry related…

So, why might a business still be deploying Windows XP or Office 2003 today?  One typical reason relates to data entry systems, either in-house written or packaged in a Commercial Off the Shelf (COTS) software product.  In all likelihood, one way or another, these deployments have become unsupported from a 3rd party viewpoint, either because of the Microsoft software support ethos or the COTS ISV support policy.

Looking back to when Microsoft XP was first released, it offered an environment that allowed customers to think outside of the box for alternatives to traditional development methods, or put another way, Rapid Application Development (RAD) techniques.  Such a capability dictated that businesses could deploy their own bespoke or packaged systems for capturing data and thus automating the entirety of their business processes from cradle to grave with IT systems.  For a Small to Medium sized Enterprise (SME), this was a significant benefit, allowing them to compete, or at least enter their market place, without deploying a significant IT support infrastructure.

Therefore RAD and Microsoft Software Development Kit (SDK) techniques for GUI (E.g. .NET, Visual, et al) presentation, sometimes and more latterly browser based, were supplemented with structured data processing routines, vis-à-vis spreadsheet (CSV), database (SQL) and latterly more formalized data structure layouts (I.E. XML, XHTML).  Let’s not forget Excel 2003 and Access 2003, which offered powerful respective spreadsheet and database solutions that could capture data, however crude that implementation might have been, while processing this data and delivering reports with a modicum of in-built high-level code.
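To make the structured data processing point above concrete, here is an illustrative sketch only, with an invented three-field record layout: a minimal validation routine for a CSV-based data entry record, of the kind an Excel or Access era capture system might have exported.

```java
import java.util.regex.Pattern;

// Hypothetical sketch: validate one CSV data entry record of the
// invented form customerId,entryDate,amount before onward processing.
public class CsvRecordValidator {

    private static final Pattern DATE = Pattern.compile("\\d{4}-\\d{2}-\\d{2}");

    public static boolean isValid(String csvLine) {
        // Keep trailing empty fields so a missing amount is detected.
        String[] fields = csvLine.split(",", -1);
        if (fields.length != 3) {
            return false;          // wrong field count
        }
        if (fields[0].isEmpty()) {
            return false;          // mandatory customer identifier
        }
        if (!DATE.matcher(fields[1]).matches()) {
            return false;          // entry date must be yyyy-mm-dd
        }
        try {
            Double.parseDouble(fields[2]);
        } catch (NumberFormatException e) {
            return false;          // amount must be numeric
        }
        return true;
    }
}
```

In a real data entry system such rules would be far richer, but the principle is the same: the capture format (CSV, SQL, XML) is separate from the validation logic, which survives platform upgrades intact.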

However, as technology evolves, sometimes applications need to be revisited to support the latest and greatest techniques, and perhaps the SME that embraced this brave new world of RAD techniques was left somewhat isolated, for whatever reasons, whether business or economic related (E.g. dot com or financial market events) or not.

Let’s not judge those business folks still running Windows XP or Microsoft Office 2003 today; there are probably many good reasons as to why.  When they developed their business systems using a Windows XP or Office 2003 software base, I don’t think they envisaged that the next Microsoft Operating System release might eradicate their original application development investments; requiring a significant investment to upgrade their infrastructure for subsequent Windows versions, but more notably, for interoperability resources (I.E. Web Browsers, .NET, Excel, Access, ODBC, et al).

So if you’re a business running Windows XP and maybe Office 2003 today, potentially PC (E.g. Desktop, Laptop) upgrade challenges can be separated into two distinct entities; firstly the hardware platform and operating system itself; where the “standard image” approach can simplify matters; and secondly, the business application, typically data entry and processing related.  Let’s not forget, those supported COTS software products, whether system utility (E.g. Security, Backup/Recovery/Archive, File Management, et al) or function (E.g. Accounting, ERP, SCM, et al) can be easily upgraded.  It’s just those bespoke in-house systems or unsupported systems that require a modicum of thought and effort…

We all know from our life experiences, if we only have lemons, let’s make lemonade!  It’s not that long ago that we faced the so-called Millennium Bug (Year 2000/Y2K) challenge.  So that could either be a problem or an opportunity.  The enlightened business faced up to the Year 2000 challenge, arguably overblown by media scare stories, and upgraded their IT infrastructures and systems, and perhaps for the first time, at least made an accurate inventory of their IT equipment.  So can similar attributes be applied to this Windows XP and Office 2003 challenge?

The first lesson is acceptance; so yes we have a challenge and we need to do something about it.  Even if your business has been running Windows XP or Office 2003, in an extended support mode for many years, in all likelihood, the associated business systems are no longer fit-for-purpose or could benefit from a significant face-lift to incorporate new logic and function that the business requires!

The second lesson is technology evolution; so just as RAD and SDK were the application development buzzwords of the Windows XP launch timeframe, today the term studio or application studio applies.  An application studio provides a complete, integrated and end-to-end package for the creation, including the design, test, debug and publishing activities of your business applications.  Furthermore, in the last decade or so, there has been a proliferation of modern language (E.g. XHTML, Java, C, C++, et al) programmers, whether formalized as IT professionals, or not (E.g. home coders).

The third lesson is as always, cost versus benefit; the option of paying for Windows XP or Office 2003 extended support ends as of April 2014.  So what is the cost of doing nothing?  As always, cost is never the issue, benefit is.  Investing in new systems that are fit-for-purpose, will of course deliver business benefit, and if the investment doesn’t pay for itself in Year 1, hopefully your business can build a several year business case to deliver the requisite ROI.

Finally, is remote data entry possible with a Windows XP based system?  Perhaps, but certainly not for each and every modern day device (E.g. Smartphone, Tablet, et al).  Therefore enhancing your data entry systems, with the latest presentation techniques, might deliver significant benefit, both for the business and its employees.  Remote working, whether field or home based, delivers productivity benefits, where such benefits can be measured in both business administration cost reduction and increased employee job satisfaction and associated working conditions.

So how easy can it be to replace an aging Windows XP and/or Office 2003 application?

Entrypoint is a complete application development package for creating high-performance data entry applications.  The Entrypoint software is built around a scalable client-server architecture that interfaces with SQL databases for data storage, and with standard communications products and commercial networks.

Entrypoint is a web-based data entry system that includes Application Studio, a local development tool that allows the user to easily create a data entry system based upon their specific and typically unique business requirements.  The Entrypoint thin and thick clients let users enter their data either directly via web resources or via a local workstation (E.g. PC), as per their requirements, while being connected to the same database.

Entrypoint Benefits: Today’s 21st Century business is focussed on delivering tangible business benefit via cost-efficient customer-facing solutions that can be rapidly deployed, while being secure and compliant:

  • Flexible Data Entry: Whether via Intelligent Data Capture (IDC) and/or Electronic Data Capture (EDC), Entrypoint can accommodate any business requirement, either from scratch, or perhaps via conversion from a legacy platform (E.g. DOS).
  • Rapid Application Deployment: Entrypoint can be deployed in hours, typically by non-application development personnel, allaying long-term management and associated TCO concerns.
  • Audit: The Entrypoint Audit Trail Facility (ATF) tracks all changes made to records from the time they are first entered into the case report form throughout all editing activity, regardless of the number of users working on them.  The audit facility can be enabled on an application-by-application basis for all users, groups of users or individual users.
  • Security: Entrypoint includes a variety of features that deliver the high levels of security required for activities such as Clinical Trials.  Its inbuilt security features let you create a customized and granular security policy specific to your needs.  Entrypoint uses ODBC to connect to SQL databases for data storage, which provides an additional level of security via database logins, passwords and even built-in encryption.  Optional 128-bit encryption protects all messages sent to or from the server, delivering significantly greater protection, not always available from other data entry solutions.

Entrypoint is one of the simplest but most comprehensive data entry solutions that I have encountered, providing a cost-efficient solution for both the smallest and largest of businesses.  Furthermore, in all likelihood, an entry-level employee or graduate with programming skills could rapidly develop a data entry system with Entrypoint to replace any existing Windows XP (or any other Windows OS) based solution.  This observation alone dictates that somebody who actually works for the business, not a 3rd party IT professional, can not only perform the technical work required, but more importantly, as a company employee, can easily relate to and learn about the end-to-end business.

In the IT world, change is inevitable, and sometimes change is forced upon us.  Whatever your thoughts regarding end of support for Windows XP and Office 2003, there are options for you and your business to embrace this change, move forward, and improve your processes.  You no longer have the option to pay Microsoft for extended support, so why not use these monies to invest in a system that can be easily supported, and easily adapted in the future, to provide long-term benefit for your business!

COBOL – A Viable Programming Language?

For the last twenty years or so I have encountered many scenarios where Mainframe users consider migration to a Distributed Systems (E.g. Wintel, UNIX/Linux, et al) platform, where more often than not the primary reasons seem to be “green screen” and/or “COBOL is a declining legacy language” based.

Going back to basics, COBOL is a Common Business Oriented Language, although the naysayers might say COBOL is a Completely Obsolete Business Oriented Language; we will perhaps try to be more dispassionate in this discussion…

Industry Analysts have stated that there are ~220 Billion lines of COBOL code and ~100,000 COBOL programmers, that COBOL applications process ~80% of business transactions daily, and that ~200 times more COBOL transactions are processed daily when compared with Google searches!  A lot of numbers and statistics, but seemingly COBOL is still widely used and accepted.  Even from a new development viewpoint, ~5 Billion lines of new COBOL code per annum (~15% of Annual Global Development) is stated, suggesting that COBOL is not in any way obsolete or legacy, so why is COBOL perceived by some in a dubious manner?

Maybe because COBOL was introduced in 1959 and is primarily deployed on the Mainframe, and so anything that is 50+ years old and has an association with the Mainframe just has to be dubious, doesn’t it?  Of course not, as this arguably “pioneering”, or at least one of the first “widely deployed”, programming languages allowed many significant global businesses to grow, in tandem with the IBM Mainframe platform, automating and streamlining business processes, increasing productivity and so on.  So depending on your viewpoint, COBOL was either in the right place at the right time, stimulating the Data Processing (DP) and Information Technology (IT) revolution, or COBOL just got lucky; it was “Hobson’s Choice”…

Although there have been several iterations of COBOL standards (I.E. COBOL-68, COBOL-74, COBOL-85), primarily associated with the American National Standards Institute (ANSI), and more latterly COBOL 2002 (ISO), a COBOL program that was written and compiled on an IBM Mainframe several decades ago will most likely still run on the latest generation IBM Mainframe.  Put another way, its backwards compatibility has been significant, and although there were some migration considerations associated with the Language Environment (LE), the original COBOL Application Development investment has generated a readily usable Return On Investment (ROI) over and over again.  How true is this for other programming languages and computing platforms?  For the avoidance of doubt, a COBOL program that was written for a 16-bit environment can still run today on a 64-bit platform, and with a modicum of evolution, fully exploit the latest functionality and 64-bit performance with minimal fuss.  Conversely, how many revolutionary or significant upgrades have been required for Commercial Off The Shelf (COTS) software and associated bespoke application development code to upgrade non-Mainframe platforms from 16 to 32 to 64-bit?
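To illustrate the point, consider a minimal COBOL-85 style sketch (the program and data names are purely illustrative); source code written in this form decades ago still compiles and runs unchanged with a current Enterprise COBOL compiler:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYCALC.
      * A simple COBOL-85 style calculation; decades-old source in
      * this form still compiles under today's Enterprise COBOL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-HOURS            PIC 9(3)    VALUE 40.
       01  WS-RATE             PIC 9(3)V99 VALUE 25.50.
       01  WS-GROSS-PAY        PIC 9(7)V99 VALUE 0.
       PROCEDURE DIVISION.
           COMPUTE WS-GROSS-PAY = WS-HOURS * WS-RATE
           DISPLAY 'GROSS PAY: ' WS-GROSS-PAY
           GOBACK.
```

The very same source, simply recompiled with a current compiler, immediately benefits from the latest compiler optimizations, which is precisely the backwards compatibility dividend described above.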

So, is COBOL a viable programming language of the future?  One must draw one’s own conclusions, but we can look to recent functional enhancements and statements of direction from an IBM Mainframe viewpoint.

In recent years IBM have actually increased the number of COBOL R&D personnel by ~100%, while increasing allocated investment, commitment and interest accordingly.  This observation, more than any other, suggests that at least from an IBM Mainframe viewpoint, COBOL remains strategically important.

From a technical function viewpoint, COBOL can interact with all manner of 21st century programming techniques and functions, dismissing the notion that COBOL can only be considered as a traditional/legacy option for CICS-Batch applications and associated “green screen” environments, for example:

  • Support for CICS integrated translator
  • Support for latest SQL data types in syntax via DB2
  • Support for Java interoperability via object-oriented COBOL syntax
  • Support for access to WebSphere enterprise beans
  • Support for Java SDK
  • Support for XML high speed parsing and validation (UTF-8, UTF-16 & various EBCDIC codepages)
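As a brief illustration of the XML support listed above, an Enterprise COBOL sketch (the document content and data names are purely illustrative) might process an XML message as follows:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. XMLDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  XML-DOCUMENT        PIC X(80)
           VALUE '<order id="1"><item>Widget</item></order>'.
       PROCEDURE DIVISION.
      * Parse the document, routing each parse event to XML-HANDLER.
           XML PARSE XML-DOCUMENT
               PROCESSING PROCEDURE XML-HANDLER
               ON EXCEPTION
                   DISPLAY 'XML ERROR CODE: ' XML-CODE
               NOT ON EXCEPTION
                   DISPLAY 'PARSE COMPLETE'
           END-XML
           GOBACK.
       XML-HANDLER.
      * XML-EVENT and XML-TEXT are special registers set by the parser.
           EVALUATE XML-EVENT
               WHEN 'START-OF-ELEMENT'
                   DISPLAY 'ELEMENT: ' XML-TEXT
               WHEN 'CONTENT-CHARACTERS'
                   DISPLAY 'CONTENT: ' XML-TEXT
               WHEN OTHER
                   CONTINUE
           END-EVALUATE.
```

The same event-driven approach extends to validating parses and the various UTF-8, UTF-16 and EBCDIC codepage options.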

From a strategic statement of direction viewpoint, IBM have declared the following notable activities:

  • Performance and resource utilization optimization, reducing TCO accordingly
  • Improved middleware (I.E. CICS, DB2, IMS, WebSphere) programmability and problem determination
  • Improved capabilities (E.g. XML, Java, et al) for modernizing & creating business critical applications
  • Improved programmer (E.g. Usability and Problem Determination) productivity
  • Source and load (I.E. recompile not required) compatibility (E.g. old programs can call new and vice versa)

Even for those occasions where the IBM Mainframe platform might be decommissioned, COBOL can still be processed on alternative platforms via code migration solutions such as those from Micro Focus, where such functions and services can be Cloud based.  However, once again, isn’t the IBM Mainframe the ultimate “Cloud” platform, which has arguably been the case “forever thus”?

One must draw one’s own conclusions as to why the Mainframe platform and/or COBOL applications are often considered for replacement via migration, when the Mainframe platform is both strategic and cost efficient.  As with any technology decision, there is no “one size fits all” solution, but perhaps a little education can go a long way, with at least the acceptance that seemingly “legacy” technologies can be both strategic and viable.