DevOps pros rethink cloud cost with continuous delivery tool

Enterprise DevOps pros can slash cloud resource overallocations with a new tool that shows them how specific app resources are allocated and used in the continuous delivery process.

The tool, Continuous Efficiency (CE), became generally available this week from Harness.io, a continuous delivery (CD) SaaS vendor. It can be used by itself or integrated with the company’s CD software, which enterprises use to automatically deploy and roll back application changes to Kubernetes infrastructure.

In either case, CE correlates cloud cost information with specific applications and underlying microservices without requiring manual tagging, an approach that made it easy for software engineers at beta tester companies to identify idle cloud resources.

“The teams running applications on our platform are distributed, and there are many different teams at our company,” said Jeff Green, CTO at Tyler Technologies, a government information systems software maker headquartered in Plano, Texas. “We have a team that manages the [Kubernetes] cluster and provides guidelines for teams on how to appropriately size workloads, but we did find out using CE that we were overallocating resources.”

In beta tests of CE, Tyler Technologies found that about one-third of its cloud resources were not efficiently utilized — capacity had been allocated and never used, or it was provisioned as part of Kubernetes clusters but never allocated. Developers reduced the number of Kubernetes replicas and CPU and memory allocations after this discovery. Green estimated those adjustments could yield the company some $100,000 in cloud cost savings this year.

Harness Continuous Efficiency tool correlates cloud costs to applications, services and Kubernetes clusters.

DevOps puts cloud cost on dev to-do list

Tyler Technologies has used Harness pipelines since 2017 to continuously deploy and automatically roll back greenfield applications that run on Kubernetes clusters in the AWS cloud. The full lifecycle of these applications is managed by developers, who previously didn’t have direct visibility into how their apps used cloud resources, or experience with cloud cost management. CE bridged that gap without requiring developers to manage a separate tool or manually tag resources for tracking.

This has already prompted developers at Tyler Technologies to focus more on cost efficiencies as they plan applications, Green said.

“That wasn’t something they really thought about before,” he said. “Until very recently, we followed a more traditional model where we had dedicated operations people that ran our data centers, and they were the ones that were responsible for optimizing and tuning.”

While developer visibility into apps can be helpful, a tool such as CE doesn’t replace other cloud cost management platforms used by company executives and corporate finance departments.

“It’s good for developers to be cognizant of costs and not feel like they’re being blindsided by impossible mandates from a perspective they don’t understand,” said Charles Betz, analyst at Forrester Research. “But in large enterprises, there will still be dedicated folks managing cloud costs at scale.”

The Harness CD tool deploys delegates, or software agents, to each Kubernetes cluster to carry out and monitor app deployments. CE can use those agents to identify the resources that specific apps and microservices use and compare this information to resource allocations in developers’ Kubernetes manifests, identifying idle and unallocated resources.

If users don’t have the Harness CD tool, CE draws on information from Kubernetes autoscaling data and associates it with specific microservices and applications. In either case, developers don’t have to manually tag resources, which many other cloud cost tools require.
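
To make that correlation concrete, here is a minimal sketch, not Harness code, of the kind of comparison CE automates: it uses the official Kubernetes Python client to compare the CPU each pod requests in its manifest with what the pod actually consumes, as reported by the metrics.k8s.io API. It assumes a local kubeconfig, the "default" namespace and a cluster running metrics-server; CE goes further by tying the gap back to cloud cost per application.

    # Illustrative only -- not Harness code. Compares CPU requested in pod specs
    # with live usage from the Kubernetes metrics API to surface idle capacity.
    # Assumes a local kubeconfig and that metrics-server is installed.
    from kubernetes import client, config


    def to_millicores(quantity: str) -> float:
        """Convert a Kubernetes CPU quantity ('250m', '1', '500000n') to millicores."""
        if quantity.endswith("n"):
            return float(quantity[:-1]) / 1_000_000
        if quantity.endswith("u"):
            return float(quantity[:-1]) / 1_000
        if quantity.endswith("m"):
            return float(quantity[:-1])
        return float(quantity) * 1000


    def report_idle_cpu(namespace: str = "default") -> None:
        config.load_kube_config()
        core = client.CoreV1Api()
        custom = client.CustomObjectsApi()

        # CPU requested per pod, summed across the containers in each pod spec.
        requested = {}
        for pod in core.list_namespaced_pod(namespace).items:
            total = 0.0
            for container in pod.spec.containers:
                reqs = (container.resources.requests or {}) if container.resources else {}
                cpu_req = reqs.get("cpu")
                if cpu_req:
                    total += to_millicores(cpu_req)
            requested[pod.metadata.name] = total

        # CPU actually used per pod, from the metrics.k8s.io API (metrics-server).
        usage = custom.list_namespaced_custom_object(
            "metrics.k8s.io", "v1beta1", namespace, "pods")
        for item in usage["items"]:
            name = item["metadata"]["name"]
            used = sum(to_millicores(c["usage"]["cpu"]) for c in item["containers"])
            req = requested.get(name, 0.0)
            if req > 0:
                idle = 100 * (1 - used / req)
                print(f"{name}: requested {req:.0f}m CPU, using {used:.0f}m ({idle:.0f}% idle)")


    if __name__ == "__main__":
        report_idle_cpu()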

This was a plus for Tyler Technologies, but Betz also expressed concern about the reliability of auto-discovery. 

“There’s no way to map objective tech resources to subjective business concepts without some false negatives or positives that could result in the wrong executive being charged for the wrong workload,” Betz said. “Tagging is a discipline that organizations ultimately can’t really get away from.”

Harness roadmap includes cloud cost guidance

Tyler Technologies plans to add the CE product to Harness when it renews its license this year but hasn’t yet received a specific pricing quote for the tool. Harness officials declined to disclose specific pricing numbers but said that CE will have a tiered model that charges between 1% and 5% of customers’ overall cloud spending, depending on whether the cloud infrastructure is clustered or non-clustered.

“It’s not quite free money — there is a charge for this service,” Green said. “But it will allow us to save costs we wouldn’t even be aware of otherwise.”

Harness plans to add recommendation features to CE in a late July release, which will give developer teams hints about how to improve cloud cost efficiency. In its initial release, developers must correct inefficiencies themselves, a task Tyler’s Green said would be easier with recommendations.

“We use an AWS tool that recommends savings plans and how to revise instances for cost savings,” Green said. “We’d like to see that as part of the Harness tool as well.”

Other Harness users that previewed CE, such as Choice Hotels, have said they’d also like to see the tool add proactive cloud cost analysis, but Green said his team uses CE in staging environments to generate such estimates ahead of production deployments.

Harness plans to add predictive cost estimates based on what resources are provisioned for deployments, a company spokesperson said. The Continuous Efficiency platform already forecasts cloud costs for apps and clusters, and later releases will predict usage based on seasonality and trends.

Data growth spawns enterprise data management system challenges

Organizations are creating and consuming more data than ever before, spawning enterprise data management system challenges and opportunities.

A key challenge is volume. With enterprises creating more data, they need to manage and store more data. Organizations are now also increasingly relying on the cloud to store the data managed by their enterprise data management systems because of the cloud’s scalability and low cost.

IDC’s Global DataSphere Forecast currently estimates that in 2020, enterprises will create and capture 6.4 zettabytes of new data. In terms of what types of new data are being created, productivity data — or operational, customer and sales data and embedded data — is the fastest-growing category, according to IDC.

“Productivity data encompasses most of the data we create on our PCs, in enterprise servers or on scientific computers,” said John Rydning, research vice president for IDC’s Global DataSphere.

Productivity data also includes data captured by sensors embedded in industrial devices and endpoints, which can be leveraged by an organization to reduce costs or increase revenue.

Rydning also noted that IDC is seeing growth in productivity-related metadata, which provides additional data about the captured or created data that can be used to enable deeper analysis.

Ranking organizations by data maturity, an Enterprise Strategy Group survey sponsored by Splunk found that few organizations are data innovators.

Enterprise data management system challenges in a world of data growth

Looking ahead, Rydning sees challenges for enterprise data management. 

Perhaps the biggest is dealing with the growing volume of archived data. With archival data, organizations will need to decide whether that data is best kept on relatively accessible storage systems for artificial intelligence analysis, or if it is more economical to move the data to lower-cost media such as tape, which is less readily available for analysis.

Another challenge is handling data from the edge of the network, which is expected to grow in the coming years. There, too, the question will be where organizations should store reference data for rapid analysis.

“Organizations will increasingly need to be prepared to keep up with the growth of data being generated across a wider variety of endpoint devices feeding workflows and business processes,” Rydning said.

The data management challenge in the cloud

In 2019, 34% of enterprise data was stored in the cloud. By 2024, IDC expects that 51% of enterprise data will be stored in the cloud.

While the cloud offers organizations a more scalable and often easier way to store data than on-premises approaches, not all that data has the same value.

“Companies are continuing to dump data into storage without thinking about the applications that need to consume it,” said Monte Zweben, co-founder and CEO of Splice Machine. “They just substituted cheap cloud storage, and they continue to not curate it or transform it to be useful. It is now a cloud data swamp.”

The San Francisco-based vendor develops a distributed SQL relational database management system with integrated machine learning capabilities. While simply dumping data into the cloud isn’t a good idea, that doesn’t mean Zweben is opposed to the idea of cloud storage.

Indeed, Zweben suggested that organizations use the cloud, since cloud storage is relatively cheap. The key is to make sure that instead of just dumping data, enterprises find ways to use that data effectively.

“You may later realize you need to train ML [machine learning] models on data that you previously did not think was useful,” Zweben said.

Enterprise data management system lessons from data innovators

“Without a doubt, some companies are storing a lot of low-value data in the cloud,” said Andi Mann, chief technology advocate at Splunk, a security information and event management vendor. “But it is tough to say any specific dataset is unnecessary for any given business.”

In his view, the problem isn’t necessarily storing data that isn’t needed, but rather storing data that isn’t being used effectively.

Splunk sponsored a March 2019 study conducted by Enterprise Strategy Group (ESG) about the value of data. The report, based on responses from 1,350 business and IT decision-makers, segments users by data maturity levels, with “data innovators” being the top category.

“While many organizations do have vast amounts of data — and that might put them in the data innovator category — the real difference between data innovators and the rest is not how much data they have, but how well they enable their business to access and use it,” Mann said.

Among the findings in the report is that 88% of data innovators employ highly skilled data investigators. However, even skilled people are not enough, so 85% of these innovative enterprises use best-of-breed analytics tools, and make sure to provide easy access to them.

“Instead of considering any data unnecessary, look at how to store even low-value data in a way that is both cost-effective, while allowing you to surface important insights if or when you need to,” Mann suggested. “The key is to treat data according to its potential value, while always being ready to reevaluate that value.”

Improving enterprise data governance with data trust

Enterprise data governance isn’t just about managing the data an organization possesses; it’s also key to managing the data supply chain, according to Charles Link, director of data and analytics at Covanta.

Link detailed his views on data management during a technology keynote at the Talend Connect 2020 Virtual Summit on May 27. Executives from other Talend customers, including AutoZone, also spoke at the event.

Covanta, based in Morristown, N.J., is in the waste-to-energy business, operating 41 facilities across North America and Europe. Data is at the core of Covanta’s operations as a way to help make business decisions and improve efficiency, Link said.

“We’re never just pushing data; we’re never just handing off the reports,” Link said. “The outcome is not data; it is always a business result.”

Link said he’s often observed that there can be a disconnect between decision-makers and the data that should be used to help make decisions.

To help connect data with decisions, “you really need both the data use and data management strategy to drive business outcomes,” Link said.

Charles Link, director of data and analytics at Covanta, detailed the need for thinking about data supply chain management as part of enterprise data governance.

Enterprise data governance strategy defined

Link defined data use strategy as identifying business objectives for data and quantifying goals. The process includes key performance indicators to measure the success of data initiatives.

An enterprise data management strategy, on the other hand, is more tactical, defining the methods, tools and technologies used to access, analyze, manage and share data, he said.

At Covanta, Link said enterprise data governance is essentially about the need to have what he referred to as data supply chain management.

Link defined data supply chain management as data governance that manages where data comes from and helps ensure consistent quality from a reliable supplier.

For that piece, Covanta has partnered with Talend and is using the Talend Data Fabric, a suite of data integration and management tools that includes a data catalog that helps enable data supply chain management. With Talend as the technology base, Link said that his company has deployed a central hub for users within the organization to find and use trusted data.

“There is now a shared understanding across business and IT of what our data means,” Link said. “So now we trust the quality of the data we use to operate our facilities.”

The chaos of data demands driving AutoZone

For auto parts retailer AutoZone, managing the complexity of data and overcoming data challenges is a foundation of the company’s success, said Jason Vogel, IT manager of data management at AutoZone.

AutoZone has 6,400 stores and each store carries nearly 100,000 parts. In the background, AutoZone is moving data across its disparate data hubs and stores, making it available to the company’s business analysts. Data also helps ensure that AutoZone customers can get the parts they need quickly.

“We have 20 different types of databases — not instances, types,” Vogel emphasized. “We have thousands of instances and Talend serves as the glue to connect all these systems together.”

Vogel noted that AutoZone is looking to expand its real-time data processing so that it can do more in less time, getting parts to its customers faster. The company is also looking to expand operations overall.

“The only way to accomplish that is by moving more data, having more insight into how data is used and accomplishing it all faster,” Vogel said.

Many organizations continue to struggle with data

AutoZone isn’t the only organization that is trying to deal with data coming from many different sources. In another keynote at Talend Connect, Stewart Bond, research director of data integration and data intelligence software at IDC, provided some statistics about the current state of data integration challenges.

Bond cited a 2019 IDC survey of enterprise users’ experience with data integration and integrity that found most organizations are integrating up to six different types of data.

Those data types include transaction, file, object, spatial, internet of things and social data. Further adding to the complexity, the same study found that organizations are using up to 10 different data management technologies.

While enterprises are managing a lot of data, Bond said the survey shows that not all the organizations are using the data effectively. Data workers are wasting an average of 15 hours per week on data search, preparation and governance processes, IDC found. To improve efficiency, Bond suggested that organizations better manage and measure how data is used.

“Measurements don’t need to be complex; they can be as simple as measuring how much time people spend on data-oriented activity,” Bond said. “Set a benchmark and see if you can improve over time.”

Improving enterprise data governance with data trust

During her keynote, Talend CEO Christal Bemont emphasized that data quality and trust are keys to making the most efficient use of data.

She noted that it’s important to measure the quality of data, to make sure that organizations are making decisions based on good information. Talend helps its users enable data quality with a trust score for data sources, as part of the Talend Data Fabric.

“When you think about what Talend does, you know, you think of us as an integration company,” Bemont said. “Quite frankly we put equal, and maybe even in some cases more, importance on not only just being able to have a lot of data, but also having complete data.”

SAP HANA database turns 10

The launch of SAP HANA in 2010 is seen as the Big Bang in SAP’s intelligent enterprise journey.

The intelligent enterprise is SAP’s vision of businesses building on the digital assets provided by SAP’s foundational ERP systems to make them more flexible, with faster response times to business events and customer requirements. The concept relies on marrying transactional data with analytic data to provide results in near real time, but this requires processing speed and data management that traditional databases lack.

Enter SAP HANA and the technical breakthrough of the in-memory database in 2010, which provides the storage ability and processing speed to enable the intelligent enterprise. In this Q&A, Gerrit Kazmaier, president of SAP HANA & Analytics, discusses the first 10 years of SAP HANA and explains why SAP HANA Cloud, a relatively new database-as-a-service offering, is so critical to the next 10.

After joining SAP in 2009, Kazmaier was on the team that developed SAP HANA, and was prominent in creating HANA’s high-performance spatial engine. He also built the foundations for SAP Analytics Cloud and SAP Digital Boardroom.

What were the origins of SAP HANA database?

Gerrit Kazmaier: It all started with a big realization on SAP’s side — specifically with [SAP co-founder and former CEO] Dr. Hasso Plattner — recognizing that everyone was suffering from the state of data management and how databases operated. The SAP HANA database was quite an unusual invention. It was not seen as something that the database community pushed and was cast as a fantasy by some. The only way you can really understand HANA is if you put yourself into Plattner’s shoes, who saw that, in order to build a modern business application with speed, that scales, that allows you to do the processing that will be required, you have to rethink the database concept. You had to throw out some of the prevailing dogmas.

What were some of those prevailing dogmas?

Kazmaier: If you go back to the beginning of HANA at SAP, there were transactional data systems, there were analytical systems, some customers had geospatial systems, and there was a realization that all of that stuff matters and it’s all intertwined. So, one of the founding ideas was to say that transactional processing and analytical data processing is actually a shared concern for businesses. You don’t want to have long lead times from insight to action — or vice versa, from action to understanding its consequences. The other founding idea of HANA was that databases are super slow and everyone hated that everything was so complex to build and it took so much time in preparation. The idea was to just compute answers dynamically without all of the complex maintenance and upfront modeling. Why can’t it just respond in real time?

What were some of the most important developments in HANA as an application platform?

Kazmaier: In 2012, we moved and optimized our Business Warehouse application to HANA. In 2015, our ERP system followed and the whole S/4HANA movement started. Since then, SAP has moved more and more of its applications onto the HANA database. One of the highlights was putting SuccessFactors [SAP’s HCM platform] on top of HANA, which processes on a peak day more than a billion transactions on HANA. One of the biggest shifts was the move to the cloud, and SAP becoming a cloud company powered by HANA specifically.

What were some of the biggest challenges in getting people to understand and accept the HANA value proposition?

Kazmaier: When SAP HANA started, it basically established a new baseline in database technology and data processing. When you try to shift the baseline, you can be surrounded by those who are not used to that. It took the market and academia a substantial amount of time to grasp the concept, to adopt it, to understand and embrace it. That’s why SAP did not just exchange a database, but also started to build new and different applications like S/4HANA [an ERP platform]. Simplification and speed are the essence of HANA, and ultimately, it has created a digital innovation system. Data has become so incredibly important, and everyone now understands that they have digital assets they want to turn into value. They want to apply analytics, predictive capabilities and machine learning, and they want to incorporate customer data into it.

SAP HANA has been called ‘translytical’ over the years. What does this mean and why is it important?

Kazmaier: Transactions and analytics are usually thought of like water and oil, they don’t mix well because it was thought that they were on opposing ends of optimizing data. Transactions are small amounts of data and writing, while analytics are large amounts of data and reading. What HANA did was unify both workload patterns into a common paradigm for data storage and data processing, which enabled them both to be processed at the same time. That created the notion of translytics, and suddenly application developers understood that in the translytical world, they can change how they can do transactions because they can make them analytics-based.

For instance, you can search to find the right product for a real-time sales promotion, or you can run a simulation to find the best depreciation model. On the analytics side, people realized that it’s not just about getting to an answer, it’s about triggering the right action, such as a promotion or employee engagement reach-out. That changes how you understand data processing by bringing two pieces together into one common form of data processing. The next piece will be when analytics and transactions unify with predictive analytics and machine learning, so that you can look into the future and find pattern correlations on the fly and make machine learning part of analytics processing.
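
As a rough illustration of that unified, translytical pattern, the sketch below writes transactional rows and immediately runs an analytical aggregate against the same table, with no batch extract in between. It is not SAP HANA code; SQLite stands in purely for demonstration, and the point of HANA is doing this at enterprise scale, in memory, on columnar data.

    # Illustration of the translytical idea only -- SQLite is a stand-in here and
    # has none of SAP HANA's in-memory, columnar machinery. The point is that the
    # transactional write and the analytical read hit the same store, with no ETL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, product TEXT, amount REAL)"
    )

    # Transactional side: small writes as orders arrive.
    orders = [("EMEA", "widget", 120.0), ("APJ", "widget", 95.0), ("EMEA", "gadget", 300.0)]
    conn.executemany(
        "INSERT INTO sales (region, product, amount) VALUES (?, ?, ?)", orders
    )
    conn.commit()

    # Analytical side: an aggregate over the rows that were just written.
    # A promotion decision (another transaction) could be driven directly by this result.
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    ):
        print(region, total)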

What are some of the challenges that SAP HANA has to overcome to be used more across enterprises?

Kazmaier: The innovation of HANA is now interwoven with the innovation in the cloud. HANA as such a potent enterprise technology was always [most appropriate for] a large IT operation because of the infrastructure it requires. So, as a consequence, it was not democratic in the sense that everyone could have easy access at low costs, without worrying about infrastructure. Making it more democratic also includes using it for all sorts of data, not just high-value data for which you would be willing to pay the premium of an in-memory database, but any type of data, such as low-value, low-density IoT data.

So, is cloud the democratizing force?

Kazmaier: Because of the cloud, everyone can get the value and the potential of HANA without worrying about any of the infrastructure provisioning, about any of the data integration aspects, they basically subscribe to a service. There’s hundreds of functions inside of HANA — including geospatial, predictive, machine learning, transactions, OLAP — that are all available immediately and it scales to any size of data. This is why the release of SAP HANA Cloud now is fitting for the 10th anniversary of HANA and will usher in the next 10 years of HANA innovation because it makes it more democratic.

Android 11 features zero in on security, privacy

New Android 11 features will likely not represent a major shift for the enterprise, but industry observers believe they will help IT professionals better manage mobile devices.

Google released the first developer preview of the updated OS last month, with a final release expected in the third quarter of 2020. Among the changes are a few items — including improved biometric support and limited-time permissions for applications — that experts said would affect businesses.

Eric Klein, an independent analyst, said the improvements reflect Google’s larger efforts to appeal to enterprise customers.

“The way in which they’re approaching their overall strategy as an organization — from Chrome to the cloud and G Suite [productivity applications] — they’re continuing to refine their assets for business use,” he said.

A focus on privacy and security

Android 11, per the preview, includes changes intended to bolster privacy and security. One feature offers users greater control over what applications can do; it lets users — or IT administrators — give apps one-time-only permissions to access such things as location data or a phone’s camera and microphone.

According to Google, this builds on an Android 10 feature, in which users could permit an application to access such data and features, but only while the app was in use.

Forrester analyst Andrew Hewitt said the granular data control offered by this feature is in line with modern enterprise security.

“[It] is more philosophically aligned [than before] with a zero-trust strategy — where a user only has access to what they need, and nothing more,” he said.

Klein said the feature will work as part of an overall device management strategy to help prevent bad actors from taking user data.

“There are many ways enterprises are protecting themselves that are well-known, basic security hygiene: restricting application usage, blacklisting apps — things of that nature,” he said, adding that controlling app permissions is a further step along that journey.

Android 11 will also reportedly include greater biometric support, notably by making it easier to integrate biometric authentication into apps and allowing developers to determine which biometric inputs — like fingerprints, iris scans and face scans — they consider strong or weak.

Hewitt said such a feature will interest IT professionals as they look to eliminate passwords — a frequent pain point in ensuring enterprise security.

“While passwordless authentication still remains immature in adoption, it’s certainly on the minds of many mobility management professionals,” he said.

Other effects on the enterprise

While security improvements are an integral part of Android 11, they are not the only ones set to have an impact on companies.

Holger Mueller, vice president and principal analyst at Constellation Research, said he saw changes like improved 5G support — including a feature that determines whether a device is on a metered or unmetered network and adjusts data traffic accordingly — as new and necessary steps for Android.

Mueller also took the implementation of new messaging and chat “bubbles” — notifications that float on top of other applications and thus enable text conversations while multi-tasking — as a heartening sign for productivity.

“[It’s] good to see Google not giving up on messaging,” he said. “The new messaging will likely improve [the] everyday user experience on Android.”

Hewitt said that with Android 11, Google has implemented new processes and options to ensure OS updates do not break app compatibility. Google announced methods, for example, to help developers test for compatibility by turning changes on or off — making it easier to determine which new OS behavior might pose problems.

“[Compatibility] has been a perennial issue in enterprise mobility,” Hewitt said.

Competing with iOS

Klein said the improvements in Android 11 — especially those related to privacy and security — reflect Google’s desire to compete for the enterprise. He noted Android’s reputation for security has long lagged behind that of iOS.

“There’s a perception that it’s just not secure — that hasn’t gone away yet,” he said. “Many [administrators] will say, ‘I’m not trusting an Android device. I’m not trusting my employees with Android devices.’ That perception is still there, and it’s something Google has to overcome. I think they are overcoming it.”

Google, Klein said, has historically faced criticism for the cadence of its security patches and its reliance on partners to push out those patches. The company has been working to improve that process, he said.

“In order to [compete] effectively — to ensure that peace of mind IT requires for mass rollouts — they’re going to have to … show they’re serious about security and privacy,” he said.

EG Enterprise v7 focuses on usability, user experience monitoring

Software vendor EG Innovations will release version 7 of its EG Enterprise software, its end-user experience monitoring tool, on Jan. 31.

New features and updates have been added to the IT monitoring software with the goal of making it more user-friendly. The software focuses primarily on monitoring end-user activities and responses.

“Many times, vendor tools monitor their own software stack but do not go end to end,” said Srinivas Ramanathan, CEO of EG Innovations. “Cross-tier, multi-vendor visibility is critical when it comes to monitoring and diagnosing user experience issues. After all, users care about the entire service, which cuts across vendor stacks.”

Ramanathan said IT issues are not as simple as they used to be.

“What you will see in 2020 is now that there is an ability to provide more intelligence to user experience, how do you put that into use?” said Mark Bowker, senior analyst at Enterprise Strategy Group. “EG has a challenge of when to engage with a customer. It’s of value to them if they engage with the customer sooner in an end-user kind of monitoring scenario. In many cases, they get brought in to solve a problem when it’s already happened, and it would be better for them to shift.”

New features in EG Enterprise v7 include:

  • Synthetic and real user experience monitoring: Users can create simulations and scripts of different applications that can be replayed to help diagnose problems and notify IT operations teams of impending issues (a minimal sketch of such a synthetic probe appears after this list).
  • Layered monitoring: Enables users to monitor every tier of an application stack via a central console.
  • Automated diagnosis: Lets users apply machine learning and automation to find the root causes of issues.
  • Optimization plan: Users can customize optimization plans through capacity and application overview reports.
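
As a minimal sketch of the synthetic monitoring idea referenced above, the probe below times a scripted HTTP request against a hypothetical URL and flags slow or failed responses before real users complain. The endpoint and threshold are assumptions for illustration; EG Enterprise's own scripting and replay engine is far richer than a single request.

    # Minimal synthetic-monitoring probe: time a scripted HTTP request and flag
    # slow or failed responses. The URL and threshold are hypothetical; a product
    # such as EG Enterprise replays full multi-step application scripts instead.
    import time

    import requests

    CHECK_URL = "https://example.com/login"  # hypothetical endpoint to probe
    THRESHOLD_SECONDS = 2.0                  # alert if the response is slower than this


    def probe(url: str) -> None:
        start = time.monotonic()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"ALERT: {url} unreachable: {exc}")
            return
        elapsed = time.monotonic() - start
        if response.status_code >= 400:
            print(f"ALERT: {url} returned HTTP {response.status_code}")
        elif elapsed > THRESHOLD_SECONDS:
            print(f"ALERT: {url} took {elapsed:.2f}s (threshold {THRESHOLD_SECONDS}s)")
        else:
            print(f"OK: {url} responded in {elapsed:.2f}s")


    if __name__ == "__main__":
        probe(CHECK_URL)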

“Most people look at user experience as just response time for accessing any application. We see user experience as being broader than this,” Ramanathan said. “If problems are not diagnosed correctly and they reoccur again and again, it will hurt user experience. If the time to resolve a problem is high, users will be unhappy.”

Pricing for EG Enterprise v7 begins at $2 per user per month in a digital workspace. Licensing for other workloads depends on how many operating systems are being monitored. The new version includes support for Citrix and VMware Horizon.

New Oracle Enterprise Manager release advances hybrid cloud

In a bid to meet customers’ needs for hybrid cloud deployments, Oracle has injected its Oracle Enterprise Manager system with new capabilities to ease cloud migration and hybrid cloud database management.

The software giant unveiled the new Oracle Enterprise Manager release 13.4 on Wednesday, with general availability expected by the end of the first quarter.

The release includes new analytics features for users to make the most of a single database and optimize performance. Lifecycle automation for databases gets a boost in the new release. The update also provides users with new tools to enable enterprises to migrate from an on-premises database to one in the cloud.

“Managing across hybrid on-prem and public cloud resources can be challenging in terms of planning and executing database migrations,” said Mary Johnston Turner, research vice president for cloud management at IDC. “The new Migration Workbench addresses this need by providing customers with guided support for updating and modernizing across platforms, as appropriate for the customer’s specific requirements.”

Beyond helping with migration, Turner noted that Oracle Enterprise Manager 13.4 supports customer choice by enabling consistent management across Oracle Cloud and traditional on-premises resources, which is a recognition that most enterprises are adopting multi-cloud architectures.

The other key addition in Oracle Enterprise Manager 13.4 is advanced machine learning analytics, Turner noted.

“Prior to this release the analytics capabilities were mostly limited to Oracle Management Cloud SaaS [software as a service] solutions, so adding this capability to Enterprise Manager is significant,” she said.

Oracle Enterprise Manager 13.4 features

Nearly all large Oracle customers use Enterprise Manager already, said Mughees Minhas, vice president of product management at Oracle. He said Oracle doesn’t want to force a new management tool on customers that choose to adopt the cloud, which is why the vendor is increasingly integrating cloud management features with Oracle Enterprise Manager.

As users decide to move data from on-premises deployments to the cloud, it’s rarely just an exercise in moving an application from one environment to another without stopping to redesign the workflow, Minhas said.

The migration tool in the new enterprise manager update includes a SQL performance analyzer feature to ensure that database operations are optimized as they move to the cloud. The tool also includes a compatibility checker to verify that on-premises database applications are compatible with the autonomous versions of Oracle database that runs in the cloud.

Migrating to new databases with Enterprise Manager 13.4

Helping organizations migrate to new database versions is one of the key capabilities of the latest version of Oracle Enterprise Manager.

“Normally, you would create a separate test system on-prem where you would install it and then once you’re done with the testing, then you’d upgrade the actual system,” Minhas said. “So we are promoting these use cases to Enterprise Manager through the use of real application testing tools, where we let you create a new database in the cloud to test.”

Intelligent analytics

The new Oracle Enterprise Manager release also benefits from Exadata Warehouse technology, which now enables analytics for Oracle database workloads.

“The goal of a great admin or cloud DBA [database administrator] is that they want to avoid problems before they happen, and not afterwards,” Minhas said. “So we are building analytical capabilities and some algorithms, so they can do some forecasting, so they know limits and are able to take action.”

Minhas said hybrid management will continue to be Oracle’s focus for Oracle Enterprise Manager.

“Over time, you’ll see us doing more use cases where we also let you do the same thing you’re doing on premises in the cloud, using the same APIs users are already familiar with,” Minhas said.

5G connectivity won’t be enterprise-ready in 2020

Despite considerable hype, 5G connectivity likely won’t have a big impact on the enterprise this year.

Although several 5G-ready, enterprise-focused devices were announced recently, including the Dell Latitude 9510 and the 2020 HP Elite Dragonfly G2 model, analysts said the fifth generation cellular network technology itself is not enterprise-ready.

This slower rollout may be to the benefit of IT professionals, however. With 5G connectivity looming on the horizon, experts said, there is an opportunity for IT admins to contribute to strategic plans for how the technology will be implemented when it does arrive.

Limiting factors

Bill Menezes, senior principal analyst at Gartner, said the expansion of nationwide cellular and data 5G networks is still a work in progress. Such networks, he said, have been built out in some smaller countries like South Korea, where there is limited landmass to cover and a regulatory impetus to upgrade. In the U.S. and Europe, the networks have grown, but coverage is not as ubiquitous as it needs to be for businesses to rely on it.

Device support is another limiting factor. Menezes said iOS has the dominant share of mobile devices in the business world; until an iPad or iPhone supports the technology, organizations using iOS will be without a 5G connectivity option. With 5G-capable laptops, he said, IT should weigh the benefit of having the technology available on those relatively long-lived devices against whether their employees would see any short-term gains through it.

Holger Mueller, principal analyst at Constellation Research Inc., said he did not expect much of a 5G effect on the enterprise in 2020.

“The only impact [this year] is that customers will have to shell out more money for an iPhone, if an iPhone comes out that supports 5G,” he said.

The technology has the potential to bring about some fundamental changes, whether it involves controlling thousands of devices on the factory floor or guiding self-driving cars, but substantial work must take place before that occurs, according to Mueller. He anticipates that the rollout will be slower than the upgrade from 3G to 4G, and the early benefits will be seen in the more densely populated areas rather than nationwide.

While 4G relied on cell towers covering a wide radius, 5G is expected to use a larger number of smaller stations, as the millimeter wave spectrum upon which it operates only works over short distances. Mueller said a substantial amount of buildout must take place before the service is available across the country.

Forrester Research analyst Dan Bieler described 2020 as the early days of 5G; IT, at this point, may be just starting to think about the technology and the impact it could have on operations.

“5G is one of the technologies they need to keep an eye on, in terms of the use cases they need to support,” he said.

When it does arrive, how will businesses use 5G?

The most credible interest in 5G currently, Bieler said, comes from the manufacturing sector. The technology, given its low latency, would be able to replace the data cables leading to factory machinery. In one scenario he mentioned, stationary and mobile robots could work in concert to complete tasks; given the need for perfect harmony in such an instance, a difference of a few milliseconds of latency is important.

Bieler said he saw 5G as part of a mix of technologies — including cloud computing, artificial intelligence and virtual reality/augmented reality — that are expected to see increased enterprise adoption in the coming years. The really convincing use cases for 5G, he said, would likely involve some mix of those developments.

Menezes said several industries, such as healthcare, are poised to benefit from 5G connectivity.

“There are some more compelling use cases than others, [like] anything related to large, dense data files that people need to upload or download in a mobile setting,” he said.

Remote medical care is one such use; sending high-resolution images to an off-site physician could speed up diagnoses and treatments.

Like Bieler, Menezes also suggested VR and AR could be better powered by 5G connectivity. A worker repairing or maintaining field equipment could use augmented reality to access documentation or seek real-time human assistance in resolving a problem.

As with 5G, AR may yet bring about a shift in business, but several obstacles remain in rolling out the technology. The software managing such devices, for instance, must be able to handle proprietary information securely. Some firms have begun to offer management tools for AR headsets, like Lenovo with its ThinkReality platform.

How will 5G impact IT admins?

Menezes said implementing 5G would entail a learning curve for IT professionals, although the projected slow adoption of the technology could help with that. If 5G connectivity brings about more sophisticated and demanding applications for workers, IT admins could have the chance to participate in the ramp-up.

According to Bieler, IT’s role in bringing 5G into the enterprise will reflect a general change to the profession’s character.

“I think it’s part of a broader shift in what is expected out of IT professionals,” he said. “The days are gone when someone tells them to implement a technology and they’re just responsible for the rollout.”

Bieler said companies now expect IT to take part in planning and strategy, as part of the overall effort to achieve business objectives and target priorities. Instead of merely selecting a carrier, he said, IT professionals are being asked to evaluate where mobile makes sense and what kind of processes need to be supported.

“IT managers and teams have to become much more strategically involved,” he said.

Mueller said he anticipated 5G would affect some parts of an IT admin’s job — tracking devices, for example — but did not foresee a fundamental shift in the profession. The technology, he said, has been overhyped thus far, although he would be happy to be proven wrong.

Major storage vendors map out 2020 plans

The largest enterprise storage vendors face a common set of challenges and opportunities heading into 2020. As global IT spending slows and storage gets faster and frequently handles data outside the core data center, primary storage vendors must turn to cloud, data management and newer flash technologies.

Each of the major storage vendors has its own plans for dealing with these developments. Here is a look at what the major primary storage vendors did in 2019 and what you can expect from them in 2020.

Dell EMC: Removing shadows from the clouds

2019 in review: Enterprise storage market leader Dell EMC spent most of 2019 bolstering its cloud capabilities, in many cases trying to play catch-up. New cloud products include VMware-orchestrated Dell EMC Cloud Platform arrays that integrate Unity and PowerMax storage, coupled with VxBlock converged and VxRail hyper-converged infrastructure.

The new Dell EMC Cloud gear allows customers to build and deploy on-premises private clouds with the agility and scale of the public cloud — a growing need as organizations dive deeper into AI and DevOps.

What’s on tap for 2020: Dell EMC officials have hinted at a new Power-branded midrange storage system for several years, and a formal unveiling of that product is expected in 2020. Then again, Dell initially said the next-generation system would arrive in 2019. Customers with existing Dell EMC midrange storage likely won’t be forced to upgrade, at least not for a while. The new storage platform will likely converge features from Dell EMC Unity and SC Series midrange arrays with an emphasis on containers and microservices.

Dell will enhance its tool set for containers to help companies deploy microservices, said Sudhir Srinivasan, the CTO of Dell EMC storage. He said containers are a prominent design feature in the new midrange storage.

“Software stacks that were built decades ago are giant monolithic pieces of code, and they’re not going to survive that next decade, which we call the data decade,” Srinivasan said. 

Hewlett Packard Enterprise’s eventful year

2019 in review: In terms of product launches and partnerships, Hewlett Packard Enterprise (HPE) had a busy year in 2019. HPE Primera all-flash storage arrived in late 2019,  and HPE expects customers will slowly transition from its flagship 3PAR platform. Primera supports NVMe flash, embedding custom chips in the chassis to support massively parallel data transport on PCI Express lanes. The first Primera customer, BlueShore Financial, received its new array in October.

HPE bought supercomputing giant Cray to expand its presence in high-performance computing, and made several moves to broaden its hyper-converged infrastructure options. HPE ported InfoSight analytics to HPE SimpliVity HCI, as part of the move to bring the cloud-based predictive tools picked up from Nimble Storage across all HPE hardware. HPE launched a Nimble dHCI disaggregated HCI product and partnered with Nutanix to add Nutanix HCI technology to HPE GreenLake services while allowing Nutanix to sell its software stack on HPE servers.

It capped off the year with HPE Container Platform, a bare-metal system to make it easier to spin up Kubernetes-orchestrated containers on bare metal. The Container Platform uses technology from recent HPE acquisitions MapR and BlueData.

What’s on tap for 2020: HPE vice president of storage Sandeep Singh said more analytics are coming in response to customer calls for simpler storage. “An AI-driven experience to predict and prevent issues is a big game-changer for optimizing their infrastructure. Customers are placing a much higher priority on it in the buying motion,” Singh said, adding that this demand helps shape HPE’s roadmap.

It will be worth tracking the progress of GreenLake as HPE moves towards its goal of making all of its technology available as a service by 2022.

Hitachi Vantara: Renewed focus on traditional enterprise storage

2019 in review: Hitachi Vantara renewed its focus on traditional data center storage, a segment it had largely conceded to other array vendors in recent years. Hitachi underwent a major refresh of the Hitachi Virtual Storage Platform (VSP) flash array in 2019. The VSP 5000 SAN arrays scale to 69 PB of raw storage, and capacity extends higher with hardware-based deduplication in its Flash Storage Modules. By virtualizing third-party storage behind a VSP 5000, customers can scale capacity to 278 PB.

What’s on tap for 2020: The VSP 5000 integrates Hitachi Accelerated Fabric networking technology that enables storage to scale out and scale up. Hitachi this year plans to phase in the networking to other high-performance storage products, said Colin Gallagher, a Hitachi vice president of infrastructure products.

“We had been lagging in innovation, but with the VSP5000, we got our mojo back,” Gallagher said.

Hitachi arrays support containers, and Gallagher said the vendor is considering whether it needs to evolve its support beyond a Kubernetes plugin, as other vendors have done. Hitachi plans to expand data management features in Hitachi Pentaho analytics software to address AI and DevOps deployments. Gallagher said Hitachi’s data protection and storage as a service is another area of focus for the vendor in 2020.

IBM: Hybrid cloud with cyber-resilient storage

2019 in review: IBM brought out the IBM Elastic Storage Server 3000, an NVMe-based array packaged with IBM Spectrum Scale parallel file storage. Elastic Storage Server 3000 combines NVMe flash and containerized software modules to provide faster time to deployment for AI, said Eric Herzog, IBM’s vice president of world storage channels.

In addition, IBM added PCIe-enabled NVMe flash to Versastack converged infrastructure and midrange Storwize SAN arrays.

What to expect in 2020: Like other storage vendors, IBM is trying to navigate the unpredictable waters of cloud and services. Its product development revolves around storage that can run in any cloud. IBM Cloud Services enables end users to lease infrastructure, platforms and storage hardware as a service. The program has been around for two years, and will add IBM software-defined storage to the mix this year. Customers thus can opt to purchase hardware capacity or the IBM Spectrum suite in an OpEx model. Non-IBM customers can run Spectrum storage software on qualified third-party storage.

“We are going to start by making Spectrum Protect data protection available, and we expect to add other pieces of the Spectrum software family throughout 2020 and into 2021,” Herzog said.

Another IBM development to watch in 2020 is how its $34 billion acquisition of Red Hat affects either vendor’s storage products and services.

NetApp: Looking for a rebound

2019 in review: Although spending slowed for most storage vendors in 2019, NetApp saw the biggest decline. At the start of 2019, NetApp forecast annual sales at $6 billion, but poor sales forced NetApp to slash its guidance by around 10% by the end of the year.

NetApp CEO George Kurian blamed the revenue setbacks partly on poor sales execution, a failing he hopes will improve as NetApp institutes better training and sales incentives. The vendor also said goodbye to several top executives who retired, raising questions about how it will deliver on its roadmap going forward.

What to expect in 2020: In the face of the turbulence, Kurian kept NetApp focused on the cloud. NetApp plowed ahead with its Data Fabric strategy to enable OnTap file services to be consumed, via containers, in the three big public clouds.  NetApp Cloud Data Service, available first on NetApp HCI, allows customers to consume OnTap storage locally or in the cloud, and the vendor capped off the year with NetApp Keystone, a pay-as-you-go purchasing option similar to the offerings of other storage vendors.

Although NetApp plans hardware investments, storage software will account for more revenue as companies shift data to the cloud, said Octavian Tanase, senior vice president of the NetApp OnTap software and systems group.

“More data is being created outside the traditional data center, and Kubernetes has changed the way those applications are orchestrated. Customers want to be able to rapidly build a data pipeline, with data governance and mobility, and we want to try and monetize that,” Tanase said.

Pure Storage: Flash for backup, running natively in the cloud

2019 in review: The all-flash array specialist broadened its lineup with FlashArray//C SAN arrays and denser FlashBlade NAS models. FlashArray//C extends the Pure Storage flagship with a model that supports Intel Optane DC SSD-based MemoryFlash modules and quad-level cell NAND SSDs in the same system.

Pure also took a major step on its journey to convert FlashArray into a unified storage system by acquiring Swedish file storage software company Compuverde. It marked the second acquisition in as many years for Pure, which acquired deduplication software startup StorReduce in 2018.

What to expect in 2020: The gap between disk and flash prices has narrowed enough that it’s time for customers to consider flash for backup and secondary workloads, said Matt Kixmoeller, Pure Storage vice president of strategy.

“One of the biggest challenges — and biggest opportunities — is evangelizing to customers that, ‘Hey, it’s time to look at flash for tier two applications,'” Kixmoeller said.

Flexible cloud storage options and more storage in software are other items on Pure’s roadmap. Cloud Block Store, which Pure introduced last year, is just getting started, Kixmoeller said, and is expected to generate lots of attention from customers. Most vendors support Amazon Elastic Block Storage by sticking their arrays in a colocation center and running their operating software on EBS, but Pure took a different approach. Pure reengineered the backend software layer to run natively on Amazon S3.

2019 storage mergers and acquisitions covered by clouds

Most of the enterprise storage-related mergers and acquisitions that happened or closed in 2019 had a cloud twist.

Take IBM’s $34 billion blockbuster acquisition of Red Hat. That was about “resetting the hybrid cloud landscape” with access to the “world’s largest open source community,” IBM CEO Ginni Rometty said in October 2018 of the proposed deal. The acquisition closed in July 2019.

Although storage was hardly the impetus for the acquisition, IBM now has Red Hat’s open source-based storage portfolio. That includes the Gluster file system, Ceph multiprotocol software-defined storage and OpenShift Container Storage and Hyperconverged Infrastructure products that are well suited to cloud use.

OpenText’s $1.45 billion purchase of cloud-based data protection, disaster recovery (DR) and endpoint security provider Carbonite in November heads the list of 2019 backup acquisitions. The Waterloo, Canada-based information management vendor completed the acquisition on Dec. 24. 

Earlier in the year, Carbonite factored into another one of the biggest 2019 storage-related mergers and acquisitions. The Boston-based provider bought cybersecurity firm Webroot for $618.5 million to address ransomware threats and bolster endpoint protection.

Cloud providers’ mergers and acquisitions

Public cloud providers AWS and Google each acquired multiple startups specializing in data storage or migration. Amazon purchased Israel-based startup CloudEndure, an AWS Advanced Technology Partner, to expand its capabilities in application workload and data migration, backup and DR. CloudEndure’s key technologies include continuous data replication to speed DR in the cloud.

AWS scooped up another Israeli startup, NVMe flash specialist E8 Storage, over the summer. E8’s arrays feature NVMe solid-state drives (SSDs) to target analytics and other data-intensive workloads requiring low latency. The startup’s technology includes an NVMe-over-TCP implementation integrated into the operating system. E8 also sold its software for use with various industry-standard servers.

Google also bought a pair of Israeli startups in 2019. In July, Google fortified its enterprise-class file storage with the acquisition of Elastifile. Google previously collaborated with the startup on a managed file storage service that Elastifile CEO Erwan Menard said would provide higher performance, greater scale-out capacity and more enterprise-grade features than Google’s Cloud Filestore. Google said engineers would integrate the Elastifile and Cloud Filestore technology.

Earlier in 2019, Google picked up Alooma for its enterprise data migration capabilities. The transaction happened less than a year after Google added Velostrata, another Israeli startup that specializes in cloud migration. Alooma’s tool focuses on shifting data from databases and enterprise applications to a single data warehouse, whereas Velostrata can move entire VM-based databases and applications to the cloud.  

HPE buys MapR, Cray

Hewlett Packard Enterprise’s August purchase of struggling Hadoop distributor MapR included a cloud angle. HPE said MapR’s enterprise-grade file system and cloud storage services would complement the BlueData container platform it acquired in November 2018. HPE said the combination will enable users to combine artificial intelligence (AI), machine learning and analytics data pipelines across on-premises, hybrid and multi-cloud environments.

HPE’s biggest 2019 transaction with a storage component was its $1.4 billion acquisition of supercomputing heavyweight Cray. HPE identified high-performance computing (HPC) as a key component of its strategic direction to target organizations that run AI, machine learning and big data analytics workloads.

Flash-related mergers and acquisitions

Flash played a key role in several 2019 storage-related mergers and acquisitions. Pure Storage bought Swedish file software startup Compuverde for $48 million in April to turn its flagship FlashArray into a unified storage system. Pure said the unified FlashArray would target workloads such as enterprise file sharing, databases over the NFS and SMB file protocols, and VMware over NFS.

Compuverde was Pure’s second acquisition since August 2018, when the flash pioneer bought data deduplication software startup StorReduce. Pure integrated the StorReduce technology into FlashBlade, its scale-out all-flash system that targets AI, machine learning, analytics and HPC workloads.

DataDirect Networks (DDN) continued its storage expansion with the September acquisition of Western Digital’s IntelliFlash business unit. The IntelliFlash purchase adds NVMe- and SAS-based flash hardware and accompanying software. Western Digital, a leading disk and solid-state drive (SSD) vendor, said it no longer plans to sell storage systems.

Although DDN’s roots are in storage for HPC environments, the vendor has been broadening its portfolio through acquisition. DDN bought Nexenta in May for its software-defined, hardware-agnostic file, block and object services. In September 2018, DDN completed its $60 million purchase of hybrid flash array vendor Tintri, less than three months after buying Intel’s Lustre File System business.

In mid-2019, StorCentric tacked on sagging NVMe flash system startup Vexata and small and midsize business (SMB) backup software provider Retrospect a week apart. Formed in August 2018, StorCentric is also the parent company of Drobo and Nexsan. Drobo, Nexsan, Retrospect and Vexata operate as separate divisions under StorCentric. Drobo sells direct-attached NAS and iSCSI SAN systems for SMBs, while Nexsan focuses on block and unified storage and secure archiving.

In August, Toshiba Memory (now known as Kioxia) announced plans to acquire the flash-based SSD business of Taiwan-based Lite-On Technology for $165 million. Acting president and CEO Nobuo Hayasaka said the Lite-On technology would help the company “to meet the projected growth in demand for SSDs in PCs and data centers being driven by the increased use of cloud services.”

Also in August, Virtual Instruments completed its purchase of Metricly, which was formerly called Netuitive. In October, Virtual Instruments changed its name to Virtana and introduced a new SaaS-based CloudWisdom monitoring and cost analysis tool that uses the Metricly technology.

Backup mergers and acquisitions

Data protection vendors kept busy on the mergers and acquisitions front in 2019.

OpenText’s $1.45 billion deal for Carbonite in November was the largest data protection transaction and followed months of rumors about a possible sale. Carbonite’s subscription-based cloud backup protects servers, endpoints and SaaS applications for businesses and consumers.

In September, Commvault spent $225 million on software-defined storage startup Hedvig to converge primary and secondary storage and address the problem of data fragmentation. Hedvig’s scale-out Distributed Storage Platform runs on commodity servers and supports provisioning and management of block, file and object storage across private and public clouds. Commvault plans a phased rollout of the Hedvig software on its HyperScale data protection appliance, with full integration in mid-2021.

Veritas Technologies strengthened its storage analytics and monitoring capabilities through its March acquisition of Aptare. Aptare’s IT Analytics suite includes storage, backup, capacity, fabric, replication and virtualization management components, in addition to file analytics. Aptare IT Analytics will complement the popular Veritas NetBackup and Backup Exec data protection products and InfoScale storage management software.

Other data protection-related mergers and acquisitions in 2019 included: 

  • Cohesity’s May purchase of Imanis Data to enable customers to back up and recover Hadoop and NoSQL workloads and distributed databases, such as MongoDB, Cassandra, Cloudera and Couchbase DB.
  • Druva’s July acquisition of CloudLanes, a hybrid cloud data protection and migration startup, to let customers securely ingest data from on-premises systems, move it to the cloud and restore it locally.
  • Acronis’ December buy of Microsoft Hyper-V and Azure cloud management and security provider 5nine to complement its cyber protection capabilities. Acronis plans to integrate 5nine’s technology into its Cyber Platform and offer new services.
  • Mainframe software specialist Compuware’s December purchase of Innovation Data Processing’s enterprise data protection, business continuance and storage resource management assets. The transaction was Compuware’s sixth mainframe-related software or services acquisition in the last three years.
