In February 2018 the computing world celebrated the 20th anniversary of “open source”. Or rather, it celebrated the 20th anniversary of the definition of open source being formally set down by the Open Source Initiative (OSI), a non-profit organisation dedicated to promoting open-source software.
In reality, open source has roots that stretch back decades further. The notion that software could be shared, indeed should be shared, was commonplace in the early years of computing. The commercialisation of software development and the era of enterprise computing put paid to those early ideals. In truth, proprietary development of business-critical systems by large multinationals needed to happen at the beginning for the world of software development to build scale.
And build scale it did. The era of enterprise computing transformed everything, and it continues to evolve and transform. Since the OSI set down the definition of open source in 1998, the software world and enterprise computing have made enormous strides towards openness. Companies such as Linux vendor Red Hat have built billion-dollar businesses on open source. Even the likes of Microsoft are embracing the ethos, to the degree that the firm has become the number one corporate contributor to the open-source community.
Enabling digital transformation
Collaboration is at the heart of the open-source movement; it is this ability to collaborate on a massive scale that accelerates innovation in open source. Contrast this with the situation for the many organisations that rely on legacy applications locked into a proprietary operating environment. Innovation for these applications is constrained by the resources of their IT department or solutions provider. As born-on-the-web firms disrupt the status quo and flourish, some of the world’s largest traditional organisations remain trapped: dependent on legacy applications and supporting tools, used under expensive licensing models, which provide no more functionality than open-source alternatives and are likely to progress at a slower rate.
When development capability is harnessed from around the world, rather than from within a single software vendor, innovation and integration accelerate dramatically. This in turn attracts more talent, which creates further innovation: a truly virtuous circle. For this reason, open-source software has become the pre-eminent model for technologies such as web servers, big data, analytics, mobile development, cloud infrastructure, messaging systems and database management systems.
Why rehost mainframe workloads?
There are many reasons why an organisation would want to rehost its mainframe workloads. A primary driver is the need to extricate itself from the lock-in of proprietary software. By moving into an open-source world, organisations can take full advantage of modern, transformative technologies such as containerisation.
Containers provide the perfect balance of isolation and resource usage. This isolation provides well-documented benefits for development speed, horizontal scaling, security and resource optimisation. Container technology is entirely absent from the traditional legacy mainframe operating system. Consequently, the development and scaling pressures that come with expanding digital services cannot be satisfied on mainframes with anything like the efficiency of container-enabled platforms.
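As a rough illustration of what containerising a rehosted workload can look like — every image name, path and program name below is a hypothetical assumption, not part of any specific product — a rehosted legacy batch program might be packaged along these lines:

```dockerfile
# Hypothetical sketch: packaging a rehosted legacy program as a container.
# The base image, runtime directory and program names are illustrative only.
FROM ubuntu:22.04

# Install the (hypothetical) rehosting runtime the application depends on.
COPY rehost-runtime/ /opt/rehost-runtime/

# Copy the unmodified legacy application and its data in native encoding.
COPY app/ /opt/legacy-app/

# Launch the legacy program through the rehosting runtime.
ENTRYPOINT ["/opt/rehost-runtime/bin/run", "/opt/legacy-app/BATCH001"]
```

The isolation and resource-usage benefits then come for free from the container platform: for example, `docker run --memory=512m --cpus=1 legacy-batch:latest` caps the container’s memory and CPU without any change to the application, and an orchestrator can scale identical copies horizontally.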
Rehosting also enables firms to surround legacy applications with modern DevOps toolchains and processes. By moving mainframe applications that still have a development backlog into this modern development architecture, an organisation can tap into a pool of programmers that is several orders of magnitude larger, and can deliver updates faster and more regularly.
The intersection of DevOps and containers can also enable the use of microservices. Unlike traditional service-enabled mainframe applications, microservices are independently deployable, scale horizontally and are released with automated processes. Microservices are small and fine-grained from the perspective of the service consumer. Monolithic mainframe applications can only be transformed into microservices easily, and without code changes, if they can be containerised, delivered and deployed using a modern DevOps toolchain.
However, perhaps the most obvious advantage is that mainframe applications can now be hosted in conventional cloud environments. This option is compelling for organisations hitherto locked into the contractual and cost complexity of the mainframe.
Software Defined Mainframe makes it all possible
At LzLabs our mission is to create revolutionary software solutions, leveraging the creativity of open-source innovation and the power of cloud computing to reduce the risk of legacy application modernisation. The LzLabs Software Defined Mainframe® (LzSDM) eliminates the need to modify and recompile mainframe application source code, and preserves mainframe data in its native encoding format when migrating mainframe applications to an open-source architecture.
LzLabs decided it would make no sense for LzSDM® to perpetuate the proprietary infrastructure that creates mainframe lock-in in the first place, while providing no competitive differentiation for the applications. The use of common open-source technologies makes the SDM environment look familiar to new hires. It helps organisations attract the talent to sustain operations, develop applications more rapidly and reduce their dependence on proprietary solutions. Finally, the LzSDM provides a reliable, manageable platform for enterprise-scale system-of-record applications that will continue to evolve thanks to the power of a global open-source development network.
Find out how you can rehost your mainframe and take advantage of open-source solutions by downloading the LzLabs white paper: The Power of Open.