
IBM Z mainframes revived by Red Hat, AI and security

Mainframe systems could play a significant role in cybersecurity and artificial intelligence advancements in the years to come, and IBM is investing in those areas to ensure IBM Z mainframes have a stake in those growing tech markets.

IBM mainframe sales grew some 69% during the second quarter of this year, the highest year-over-year percentage increase of any IBM business unit. Some industry observers attribute the unexpected performance to the fact that the z15, introduced a year ago, is still in its anticipated upcycle. Typically, mainframe sales level off and dip after 12 to 18 months until the release of a new system. But that might not be the case this time around.

Ross Mauri, general manager of IBM’s Z and LinuxOne mainframe business, discussed some of the factors that could contribute to sustained growth of the venerable system, including IBM’s acquisition of Red Hat, the rise of open source software and timely technical enhancements.

Mainframe revenues in the second quarter were the fastest-growing of any IBM business unit, something analysts didn’t expect to see again. Is this just the typical upcycle for the latest system or something else at work? 


Ross Mauri: A lot of it has to do with the Red Hat acquisition and the move toward hybrid clouds. Consequently, mainframes are picking up new workloads, which is why you are seeing a lot more MIPS being generated. We set a record for MIPS in last year’s fourth quarter.

How much of it has to do with the increase in Linux-based mainframes and the growing popularity of open source software?

Mauri: Yes, there is that plus all the more strategic applications [OpenShift, Ansible] going to the cloud. What also helped was our Capacity On Demand program going live in the second quarter, providing users with four times the [processor] capacity they had a year ago.

Some industries are in slumps, but online sales are up and that means credit card and banking systems are more active than normal. They liked the idea of being able to turn on ‘dark’ processors remotely.

Some analysts think mainframes are facing the same barrier Intel-based machines are with Moore’s Law. Are you running out of real estate on mainframe chips to improve performance?

Mauri: What we have done is made improvements in the instruction set. So, with things like Watson machine learning, users can work to a pretty high level of AI, taking greater advantage of the hardware. We’ve not run out of real estate on the chips, or out of performance, and I don’t think we will. If you think that, we will prove you wrong.

But with the last couple of mainframe releases, performance improvements were in the single digits, compared with the 30% to 40% performance improvements of Power systems.

Mauri: In terms of Z [series mainframes], they are running as fast as Power. We know where [mainframes] are going to be running in the future. As we move to deep learning inference engines in the future, you’ll see more AI running on the system to help with fraud analytics and real-time transactions. We haven’t played out our whole hand yet. The AI market is still nascent; we are very much at the beginning of it. For instance, we’re not anywhere near what we can do with the security of the system.


We have started to put quantum-safe encryption algorithms in the system already, to make sure security is sound given what’s going on in the world of cybersecurity. You’ll see us continue to invest more in AI in the future. We’ll build on the machine learning base we already have.

Is IBM Research investigating other technologies that would sit between existing mainframes and quantum computers in terms of improving performance?

Mauri: Our [mainframe] systems group is working closely with the quantum team as well as with IBM Research. We are still in the research phase; no one’s using them for production.

What we’re exploring with IBM Research and clients is trying to determine what algorithms run well on a quantum computer for solving business problems and business processes that now run on mainframes. For instance, we’re looking at big financial institutions where we can make use of quantum computers as closely coupled accelerators for the mainframe. We think it can greatly reduce costs and improve business processing speed. It’s actually not that complex to do. We’re doing active experiments with clients now.

What are you looking at to increase performance?

Mauri: We are looking at a whole range of options right now. We have something we do with clients called Enterprise Design Thinking where they are involved throughout an entire process to make sure we’re not putting some technology in that’s not going to work for them. We have been doing that since the z14 [mainframe].


IBM z15 mainframe secures data across multi-cloud environments

IBM today premiered the latest member of its mainframe lineup, which features improved security software that builds on IBM’s existing Pervasive Encryption offering, improved methods for building cloud-native applications and added processing power that can support 2.4 million Linux containers on a single system.

The IBM z15 mainframe features the newly minted Data Privacy Passports technology, which provides IT mainframe administrators with more control over how data is stored and shared. It also offers the ability to provision data and revoke access to that data across hybrid multi-cloud environments, no matter where that data travels.

Addressing the issue of data increasingly either constantly moving around or residing in siloed environments, IBM introduced Trusted Data Objects (TDO) features designed to provide data-centric protection by staying attached to the data whenever it travels from point to point. The offering builds on the Pervasive Encryption technology that came bundled with the z14 mainframe over two years ago.

Middleware that controls and travels with data

IBM said this constant movement of data among a user’s business partners and other third parties, along with the growing adoption of multi-cloud environments, is behind a majority of recent data breaches.

With Data Privacy Passports, users can enforce a companywide data privacy policy capable of surfacing different views of data to different sets of users on a need-to-know basis. The TDO technology can also be used to prevent collusion among data owners, which could lead to critical data falling into the hands of hackers.

“Think of it [Data Privacy Passports and TDO] as middleware that controls and travels with the data,” said Ross Mauri, general manager of IBM Z. “Our clients might need to access data and analytic insights not in minutes, but maybe in a fraction of a second, along with the ability to control the privacy of that data at a very granular level.”
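
IBM has not published the internals of Data Privacy Passports, but the need-to-know behavior Mauri describes can be sketched in a few lines. Everything below is an illustrative assumption — the policy schema, role names and `apply_policy` helper are mine, not IBM’s API:

```python
# Illustrative sketch of need-to-know data views; the policy schema and
# helper below are hypothetical, not the Data Privacy Passports API.

RECORD = {"name": "J. Doe", "ssn": "123-45-6789", "balance": 1042.50}

# Per-role policy: which fields a role may see, and how each is rendered.
POLICY = {
    "fraud_analyst": {"name": "clear", "ssn": "masked", "balance": "clear"},
    "marketing":     {"name": "clear", "ssn": "hidden", "balance": "hidden"},
}

def apply_policy(record, role):
    """Return the view of `record` permitted for `role`."""
    view = {}
    for field, mode in POLICY[role].items():
        if mode == "clear":
            view[field] = record[field]
        elif mode == "masked":
            view[field] = "***-**-" + record[field][-4:]
        # "hidden" fields are omitted entirely
    return view

print(apply_policy(RECORD, "marketing"))  # only the name survives
```

The key design point, which the sketch shares with the real product as described, is that the policy travels with the data and is evaluated at access time, so the same record yields different views to different roles.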

Most analysts believe IBM is taking a step in the right direction by offering added protection for increasingly complicated cloud-based environments.

“A week doesn’t go by when there’s yet another security disaster exposing the data of millions of people,” said Charles King, president and principal analyst at Pund-IT. “Data Privacy Passport appears to be a way to extend the security [of the z14’s Pervasive Encryption scheme] to sensitive information like intellectual property or data subject to compliance protocols and regulations.”

Reinforcing loyalty


Another analyst agreed that enhancing IBM’s existing security technology is a good, if not necessary, thing to keep long-time mainframe users loyal to the platform — particularly in a time when there are alternative technologies offered by a number of cloud-based competitors.

“The mainframe has long been a platform where security and transactional integrity has been paramount,” said Mike Chuba, managing vice president in Gartner’s Infrastructure and Operations group. “With this announcement, they continue to innovate, but is it a quantum leap ahead of the z14? No. But the message should resonate with C-level executives. It assures them they are still investing in the system.”

Capable of carrying out 1 trillion web transactions a day, the IBM z15 mainframe performs 14% faster per core and offers 25% more system capacity than the z14, Mauri noted. The system also has 25% more memory, 20% more I/O connectivity and an availability of 99.99999%, the equivalent of about three seconds of downtime per year, he added.
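
The downtime-per-year figure is straightforward arithmetic to check; a minimal sketch (the helper name is mine):

```python
# Downtime per year implied by an availability percentage.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

def downtime_seconds(availability_pct):
    """Seconds of downtime per year at the given availability percentage."""
    return SECONDS_PER_YEAR * (1 - availability_pct / 100)

# Seven nines works out to roughly three seconds of downtime per year.
print(round(downtime_seconds(99.99999), 2))  # ~3.15
```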

This combination of added raw processing power, increased reliability and the ability to handle millions of containers across multiple environments could help keep the mainframe relevant in the voraciously competitive hybrid cloud market, a key area of focus for the newly merged IBM and Red Hat.

Depending on the success the combined company has in delivering compelling cloud-based products and services over the next year, it could help stimulate mainframe sales and take share away from the dominant position Intel-based servers hold in large corporate data centers.

But Gartner’s Chuba is not optimistic that whatever success IBM-Red Hat has will result in many net-new z/OS-based mainframe sales; he expects most of the z15’s potential success to come from users interested in Linux.

“Almost all of the new accounts over the past couple of years are users running Linux,” Chuba said. “They are not attracting many z/OS users in any meaningful way. It is clearly an uphill battle to win more of those users.”

What could draw interest among both Linux and zOS-based mainframe users is Red Hat’s OpenShift, expected to be available on the new system by the end of this year. Given the improved speed and capacity of the IBM z15 mainframe, some analysts said it might serve as a showpiece for how well it can run OpenShift and other strategically important software in the hybrid cloud.

“[The z15] will be positioned as the performance platform to run the Red Hat software stack,” said Frank Dzubeck, president of Communications Network Architects Inc. “This system gives mainframe users a valid alternative to the Power series, which has grabbed a lot of attention lately because of its higher performance.”

IBM’s Summit and Sierra supercomputers, both powered by IBM’s Power9 chip, are currently the first and second fastest supercomputers in the world.

Besides improving the z15’s chip speed, IBM has come up with a new compression technology that allows corporate users to get huge amounts of data on and off the mainframe. The new compression offering, called the Integrated Accelerator for z Enterprise Data Compression, delivers 30 times lower latency and up to 28 times less processor utilization by compressing web transaction data before it is encrypted.

“It’s not unusual for significant cost to be incurred as data is moved on and off mainframes,” Pund-IT’s King said. “If you can dramatically reduce the size of the files you are sending, you can take a big bite out of the time it takes to move those huge chunks of data from one place to another.”
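
King’s point depends on ordering: data must be compressed before it is encrypted, because encrypted output looks random and no longer compresses. This can be demonstrated with the standard library alone — a toy sketch in which `zlib` stands in for the zEDC hardware accelerator and a seeded XOR keystream is a placeholder for real encryption:

```python
# Illustrative compress-then-encrypt ordering using only the standard library.
import random
import zlib

def stream_cipher(data: bytes, seed: int = 42) -> bytes:
    """Toy XOR stream cipher with a seeded pseudorandom keystream.
    A placeholder to show ordering only -- not real cryptography."""
    rng = random.Random(seed)
    return bytes(b ^ rng.randrange(256) for b in data)

payload = b'{"txn": "purchase", "amount": 19.99}' * 1000  # repetitive web data

# Compress first, then encrypt: the compressor still sees the redundancy.
good = stream_cipher(zlib.compress(payload))

# Encrypt first, then compress: the keystream hides the redundancy,
# so compression gains almost nothing.
bad = zlib.compress(stream_cipher(payload))

print(len(payload), len(good), len(bad))
```

Running this shows the compress-first path shrinking the 36,000-byte payload to a small fraction of its size, while the encrypt-first path stays close to full size — which is why the z15 compresses transaction data before its Pervasive Encryption layer encrypts it.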


IBM DS8882F converges array and mainframe in one rack

Talk about converged infrastructure — IBM just embedded an all-flash array inside mainframe server racks.

IBM today launched a rack-mounted IBM DS8882F array for IBM Z ZR1 and LinuxOne Rockhopper II “skinny” mainframes that rolled out earlier in 2018. The 16U DS8882F is the smallest of IBM’s high-end DS8880 enterprise storage family designed for mainframes. The new mainframes install in a standard 19-inch rack. The IBM DS8882F array inserts into the same rack and scales from 6.4 TB to 368.64 TB of raw capacity.

The IBM DS8882F is part of a large IBM storage rollout that features mostly software and cloud storage updates, including the following:

  • IBM Spectrum Protect 8.1.6 data protection software now supports automatic tiering to object storage and ransomware protection for hypervisor workloads. The software generates email warnings pointing to where an infection may have occurred. Spectrum Protect supports Amazon Web Services, IBM Cloud and Microsoft Azure.
  • IBM Spectrum Protect Plus 10.1.2 virtual backup now supports on-premises IBM Cloud Object Storage, IBM Cloud and AWS S3. It also supports VMware vSphere 6.7, encryption of vSnap repositories, and IBM Db2 databases.
  • IBM Spectrum Scale 5.0.2 added file audit logging, a watch folder and other security enhancements, along with a GUI and automated recovery features. Spectrum Scale on AWS now enables customers to use their own AWS license and supports a single file system across AWS images.
  • The IBM DS8880 platform supports IBM Cloud Object Storage and automatically encrypts data before sending it to the cloud.

The products are part of IBM’s third large storage rollout this year. It added an NVMe FlashSystem 9100 and Spectrum software in July, and cloud-based analytics and block-based deduplication in May.

Steve McDowell, senior technology analyst at Moor Insights & Strategy, said IBM has become the most aggressive of the large storage vendors when it comes to product delivery.

“IBM storage is marching to a cadence and putting out more new products faster than its competitors,” McDowell said. “We’re seeing announcements every quarter, and their products are extremely competitive.”

IBM ended a string of 22 straight quarters of declining storage revenue in early 2017 and put together four quarters of growth before declining again in the first quarter of 2018. IBM’s storage focus has been on its Spectrum software family and all-flash arrays.

IBM’s focus on footprint

McDowell called the IBM DS8882F “a nice piece of hardware.” “The zSeries is moving toward a more standard rack, and this fits right in there with almost 400 TB of raw capacity in a 19-inch rack,” he said. “It’s about capacity density and saving floor space. If I can put a zSeries and a rack-mounted storage unit side by side, it makes a nice footprint in my data center.”

“The days of an EMC VMAX spanning across your data center are gone. With flash, it’s how many terabytes or petabytes I can put into half a rack and then co-locate all of that with my servers.”

Eric Herzog, chief marketing officer for IBM storage, said reducing the footprint was the main driver of the array-in-the-mainframe.

“We created a mini-array that literally screws into the same 19-inch mainframe rack,” Herzog said. “This frees up rack space and floor space, and gives you a smaller, lower-cost entry point.”

Competing in a crowded market

IBM’s DS8880 series competes with the Dell EMC PowerMax — the latest version of the VMAX — and the Hitachi Vantara Virtual Storage Platform as mainframe storage platforms.

IBM storage revenue rebounded to grow in the second quarter this year, but the market remains crowded.

IBM’s Herzog said the storage market “is fiercely competitive in all areas, including software. It’s a dog-eat-dog battle out there. Software is just as dog-eat-dog as the array business now, which is unusual.”

The new products are expected to ship by the end of September.

Enterprise IT struggles with DevOps for mainframe

The mainframe is like an elephant in many large enterprise data centers: It never forgets data, but it’s a large obstacle to DevOps velocity that can’t be ignored.

For a while, though, enterprises tried to leave mainframes — often the back-end nerve center for data-driven businesses, such as financial institutions — out of the DevOps equation. But DevOps for mainframe environments has become an unavoidable problem.

“At companies with core back-end mainframe systems, there are monolithic apps — sometimes 30 to 40 years old — operated with tribal knowledge,” said Ramesh Ganapathy, assistant vice president of DevOps for Mphasis, a consulting firm in New York whose clients include large banks. “Distributed systems, where new developers work in an Agile manner, consume data from the mainframe. And, ultimately, these companies aren’t able to reduce their time to market with new applications.”

Velocity, flexibility and ephemeral apps have become the norm in distributed systems, while mainframe environments remain their polar opposite: stalwart platforms with unmatched reliability, but not designed for rapid change. The obvious answer would be a migration off the mainframe, but it’s not quite so simple.

“It depends on the client appetite for risk, and affordability also matters,” Ganapathy said. “Not all apps can be modernized — at least, not quickly; any legacy mainframe modernization will go on for years.”

Mainframes are not going away. In fact, enterprises plan to increase their spending on mainframe systems. Nearly half of enterprises with mainframes expect to see their usage increase over the next two years — an 18% increase from the previous year, according to a Forrester Research Global Business Technographics Infrastructure Survey in late 2017. Only 14% expected mainframe usage to decrease, compared to 24% in the previous survey.

Whatever their long-term decision about mainframes, large enterprises now compete with nimble, disruptive startups in every industry, and that means they must find an immediate way for mainframes to address DevOps.

Bridging DevOps for mainframe gaps

Credit bureau Experian is one enterprise company that’s stuck in limbo with DevOps for mainframe environments. Its IBM z13 mainframes play a crucial role in a process called pinning, which associates credit data with individuals as part of the company’s data ingestion operations. Pinning generates an identifier that’s more reliable than a Social Security number, and the mainframe handles the compute-intensive workload with high performance and solid reliability — the company hasn’t had an outage on any of its six mainframe instances in more than three years.

However, Experian has also embarked on a series of Agile and DevOps initiatives, and the mainframe now impedes developers who have grown accustomed to self-service and infrastructure automation in production distributed systems.

“IBM has recognized what’s happening and is making changes to [its] z/OS and z Systems,” said Barry Libenson, global CIO for Experian, based in Costa Mesa, Calif. IBM’s UrbanCode Deploy CI/CD tool, for example, supports application deployment automation on the mainframe. “But our concern is there aren’t really tools yet that allow developers to provision their own [production infrastructure], or native Chef- or Puppet-like configuration management capabilities for z/OS.”

Chef supports z Systems mainframes through integration with LinuxONE, but Experian’s most senior mainframe expert frowns on Linux in favor of z/OS, Libenson said. Puppet also offers z/OS support, but Libenson said he would prefer to get those features from native z/OS management tools.

IBM’s z Systems Development and Test Environment V11 offers some self-service capabilities for application deployment in lower environments, but Experian developers have created their own homegrown tools for production services, such as z Systems logical partitions (LPARs). The homegrown tools also monitor the utilization of LPARs, containers and VMs on the mainframe, and tools either automatically shut them off once they’re idle for a certain amount of time or alert mainframe administrators to shut them off manually.
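
Experian has not published its homegrown tooling, so the following is a hypothetical sketch of the idle-shutdown policy just described; the instance schema, threshold and `sweep` helper are my own assumptions:

```python
# Illustrative sketch of an idle-instance sweep: auto-stop containers and VMs
# idle past a threshold, and alert an admin for LPARs. The data structures
# and thresholds are hypothetical, not Experian's actual tooling.
import time

IDLE_LIMIT_SECS = 30 * 60              # assumed policy: 30 minutes idle
AUTO_STOP_TYPES = {"container", "vm"}  # LPARs get a manual alert instead

def sweep(instances, now=None):
    """Return (to_stop, to_alert) name lists for a fleet of instance dicts."""
    now = time.time() if now is None else now
    to_stop, to_alert = [], []
    for inst in instances:
        if now - inst["last_active"] < IDLE_LIMIT_SECS:
            continue  # still busy
        if inst["type"] in AUTO_STOP_TYPES:
            to_stop.append(inst["name"])
        else:
            to_alert.append(inst["name"])  # e.g. an LPAR: notify an admin
    return to_stop, to_alert

fleet = [
    {"name": "lpar01", "type": "lpar", "last_active": 0},
    {"name": "c-build", "type": "container", "last_active": 0},
    {"name": "c-test", "type": "container", "last_active": 10_000},
]
print(sweep(fleet, now=2_000))  # c-build auto-stopped, lpar01 alerted
```

The split between auto-stop and alert reflects the article’s distinction: containers and VMs can be recycled automatically, while shutting down an LPAR is left to a mainframe administrator.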

“That’s not the way these systems are designed to behave, and it’s expensive. In commodity hardware, I have lots of options, but if I run out of horsepower on the mainframe, buying additional engines from IBM is my only choice,” Libenson said. “It’s also increasingly difficult for us to find people that understand that hardware.”

Experian is fortunate to employ a mainframe expert who doesn’t fit the stereotype of a parochial back-end admin resistant to change, Libenson said. But he’s not an infinite resource and won’t be around forever.

“I tell him, ‘If you try to retire before I do, I will kill you,'” Libenson said.

Ultimately, Experian plans to migrate away from the mainframe and has ceased product development on mainframe applications, Libenson said. He estimated the mainframe migration process will take three to five years.

DevOps for mainframe methods evolve

For some companies with larger, older mainframes, even a multiyear mainframe migration is expensive.


“One insurance firm client told me it would cost his company $30 million,” said Christopher Gardner, an analyst at Forrester Research. “If you don’t have a good reason to get off the mainframe platform, there are ways to do a lot of DevOps-specific features.”

Mainframe vendors, such as CA, IBM and Compuware, have tools that push DevOps for mainframe closer to an everyday reality. IBM’s UrbanCode Deploy agents offer application deployment automation and orchestration workflows for DevOps teams that work with mainframes. The company also recently added support for code deployments to z Systems from Git repositories and offers a z/OS connector for Jenkins CI/CD, as well. In addition to Jenkins, CI/CD tools from Electric Cloud and XebiaLabs support mainframe application deployments.

CA offers mainframe AIOps support in its Mainframe Operational Intelligence tool. And in June 2018, it introduced a SaaS tool, Mainframe Resource Intelligence, which scans mainframe environments and offers optimization recommendations. Compuware has tools for faster updates and provisioning on mainframes and hopes to lead customers to mainframe modernization by example; it underwent its own DevOps transformation over the last four years.

Vendors and experts in the field agree the biggest hurdle to DevOps for mainframe environments is cultural — a replay of cultural clashes between developers and IT operations, on steroids.

Participation from mainframe experts in software development strategy is crucial, Ganapathy said. His clients have cross-functional teams that decide how to standardize DevOps practices across infrastructure platforms, from public cloud to back-end mainframe.

“That’s where mainframe knowledge has the greatest value and can play a role at the enterprise level,” Ganapathy said. “It’s important to give mainframe experts a better say than being confined to a specific business unit.”

Mainframes may never operate with the speed and agility of distributed systems, but velocity is only one metric to measure DevOps efficiency, Forrester’s Gardner said.

“Quality and culture are also part of DevOps, as are continuous feedback loops,” he said. “If you’re releasing bugs faster, or you’re overworking your team and experiencing a lot of employee turnover, you’re still not doing your job in DevOps.”

Helping customers compete and accelerate innovation with the cloud, AI and a new approach to Talent

Ever since the mainframe, customers have been dealing with business apps that were little more than forms, over siloed data, tied to monolithic CRM and ERP suites that were hard to customize. We believe we can do better. Today, at our Business Forward event in Chicago, Judson Althoff, James Phillips and I are excited to meet with business leaders from a range of industries to share how Microsoft Dynamics 365’s modern, unified, intelligent and adaptive business apps can help them innovate and compete.

Dynamics 365 is already helping more organizations than ever before – in fact, more than 60 percent of the Fortune 500 industrial companies. Last week, we highlighted how several of these organizations plan to accelerate their own digital transformation, including HP Inc., which will use Dynamics 365 AI solutions for intelligent customer care and service, along with the U.S. Department of Veterans Affairs and the Seattle Seahawks, who are using Dynamics 365 to reinvent the ways they connect with people and deliver improved experiences.

United Technologies enhances customer experiences with Dynamics 365 intelligence

Today we’re excited to welcome United Technologies Corporation (UTC) to that family of customers and announce a strategic agreement designed to help UTC leverage Dynamics 365 and Azure to optimize its sales, customer care and field service operations. For example, UTC will use Dynamics 365 to empower service technicians and sales teams in its Otis Elevator business, providing them with a unified view of the customer relationship and real-time elevator health data to enable predictive maintenance, dynamic field dispatching and a more seamless customer experience.

UTC manufactures and services millions of products for customers in the commercial aerospace and building industry that move the world forward. We are excited to partner with them to respond more quickly to sales opportunities with automation and intelligence, predict maintenance by operationalizing product data and empower thousands of service technicians in the field with mobile connectivity.

Make modern talent experiences a competitive differentiator

Extending our momentum with customers – and the incredible customer experiences we can deliver together – I’m also excited to share new innovations we are delivering to help our customers transform how they empower their people – arguably their most important resource.

Today’s human capital management systems are subject to the same challenges I see in most business applications – siloed, complicated and difficult to extend to embrace new opportunities. With Dynamics 365 for Talent, we take a modern approach helping you start where you are, allowing you to transform HR at your pace. We augment your existing systems of record with modular, intelligent cloud apps focused on critical scenarios. The result: immediate benefits, little to no disruption, and the flexibility to add new capabilities however and whenever you want.

Generally available today – Dynamics 365 for Talent: Attract and Onboard modular apps

These new modular applications can help you capture the best talent and shorten time to impact by offering a smooth and transparent experience for candidates and new hires, then streamlining the hire-to-onboarding process for employees and hiring managers.

Dynamics 365 for Talent: Attract puts LinkedIn knowledge, Office 365 integration and Microsoft AI to work to help hiring managers capture top talent and streamline a time-consuming, expensive hiring process, from initial candidate application to offer acceptance. Attract offers a simple out-of-the-box process flow, configurable for more sophisticated needs, with centralized candidate profiles, scheduling intelligence, and engaging web and mobile experiences for both interviewers and candidates alike.

Dynamics 365 for Talent: Onboard integration with Office 365 and LinkedIn helps set new employees up for success. With it, you can create personalized onboarding experiences that get new hires engaged before they even join the team. A clear onboarding checklist, important resources and a pre-built network of key contacts help accelerate their ability to deliver impact and we give you an up-to-date view of their experience through a consolidated HR profile.

Customers can purchase and deploy the modular apps on their own to rapidly augment the rest of their human resources technology. The apps are also included as part of our comprehensive human capital management platform, Dynamics 365 for Talent.

Start with your own team and evolve your company’s conversation about talent

You can even get started using Attract and Onboard for just your team or department, while continuing to work with HR and official systems of record, by signing up for a free 60-day trial of Dynamics 365 for Talent: Attract or Dynamics 365 for Talent: Onboard.

Attract and Onboard are available for purchase directly from the web.

In closing

From new modular applications that transform how you engage with new hires, to cutting-edge AI deployed against streams of data from IoT devices that quite literally move the world around us, Dynamics 365 offers our customers a platform to transform their business, one critical business process at a time. Learn how we can help by visiting our comprehensive overview of Dynamics 365 or finding out more about new capabilities we are bringing to market through our long-term Dynamics 365 Roadmap.