
Major funding, SaaS trends top data backup news in 2019

In the backup market, 2019 started with a financial bang.

On back-to-back days in January, Rubrik announced a $261 million funding round, while Veeam disclosed that Insight Venture Partners invested an additional $500 million in the data backup and management vendor.

That data backup news set the tone for a busy year of more funding rounds, acquisitions, CEO changes, new products and key trends in a market that is constantly evolving.

Backup business busy with acquisitions, funding, leadership changes

Much like recent years, backup was big business in 2019.

Carbonite had one of the busiest years of all. In March, the data protection vendor acquired cybersecurity firm Webroot for $618.5 million, with a focus on fighting ransomware. In July, CEO Mohamad Ali left to take the same job at tech media company International Data Group, with board chairman Steve Munford filling the role at Carbonite on an interim basis. Then in November, following months of rumors of a possible sale, content management provider OpenText acquired Carbonite for $1.42 billion, to help expand its cloud offerings.

Commvault also transitioned to a new leader, as longtime CEO Bob Hammer stepped down and former Puppet CEO Sanjay Mirchandani stepped in. The company made its first acquisition in September, buying software-defined storage vendor Hedvig for $225 million to help converge primary and secondary storage for better data management.

Sanjay Mirchandani, CEO of Commvault

Acronis became the latest unicorn, closing a $147 million funding round in September at a valuation of more than $1 billion. The company has shifted from a backup-focused product portfolio to a more comprehensive cyber protection platform. Like Carbonite, Acronis now has a major emphasis on cybersecurity.

Druva and its cloud-focused backup and recovery product set received a $130 million funding haul. Just a month later, the vendor acquired CloudLanes and its cloud migration technology.

Veeam Software, which is on the lookout for acquisitions, actually did the reverse this year. The vendor sold AWS data protection provider N2WS, a company it acquired two years ago, back to the original founders. Veeam is launching its own products focused on AWS and Azure backup.

In other data backup news developments:

  • Veritas Technologies acquired Aptare to improve its storage analytics and monitoring.
  • Cohesity made its first acquisition, choosing Imanis Data for NoSQL database protection.
  • Spencer Kupferman took over as CEO of AWS data protection provider Cloud Daddy, a recent entrant into the market.
  • OwnBackup secured $23.25 million, its largest funding round, for expansion of its Salesforce data protection.
  • David Bennett, previously the chief revenue officer at Webroot, became the new CEO of backup and disaster recovery vendor Axcient.

SaaS backup continues its ascent

The software-as-a-service backup market remains one of the hottest in tech. The word is out that SaaS applications such as Salesforce, Google’s G Suite and Microsoft Office 365 need backup, because those vendors typically protect their own infrastructure but not individual customer files.

Clumio came out of stealth in August with its cloud-based backup as a service. Noting that “SaaS is taking over,” Clumio CEO Poojan Kumar described his company’s founding vision as “building a data management platform on top of the public cloud.” The vendor originally offered protection for VMware on premises, VMware Cloud on AWS and native AWS services. While closing a $135 million funding round in November, Clumio pledged support for more public clouds, SaaS applications and containers, starting with Amazon Elastic Block Store protection.

Clumio backup dashboard
Clumio, which provides backup as a service, came out of stealth in August.

Commvault launched a SaaS backup subsidiary, Metallic, with an emphasis on protecting servers and VMs, Office 365 and endpoints. The data protection vendor is aiming Metallic at smaller businesses than its usual enterprise customers.


In other notable data backup news on the SaaS front:

  • Druva enhanced its SaaS backup capabilities, adding restore options to its Office 365 protection and introducing backup for Slack and Microsoft Teams conversations.
  • Odaseva, a data protection vendor focused on Salesforce, unveiled a high-availability option for the customer relationship management provider.
  • The newly launched Actifio Go SaaS platform offers direct-to-cloud backup to AWS, Azure, Google, IBM and Wasabi public clouds.
  • Arcserve updated its Unified Data Protection product to provide granular, file-level backup and recovery for Office 365.
  • Veeam enhanced its Backup for Microsoft Office 365, the fastest-growing product in the company’s history, to back up directly to the cloud, to either Azure or AWS.

Container backup takes the spotlight

One area that emerged in 2019 is backup of containers. As Kubernetes workloads in particular increase in popularity, organizations will need specifically targeted protection. Newer vendors, including Kasten, Robin and Portworx, focus on Kubernetes protection and management. Products from other vendors, including IBM Spectrum Protect, tackle Kubernetes protection in addition to other capabilities.

Container and SaaS backup will likely increase in 2020. Organizations should continue to keep an eye on data backup news, as products and businesses are evolving at a dramatic pace.

Go to Original Article

As hiring booms, HR survey raises warning for recruiters

The U.S. labor market report released Friday, which showed a gain of 266,000 jobs, called out a few areas of high growth. It cited “notable job gains” in healthcare and professional and technical services. Those are two areas that pose big challenges for recruiters, according to a new HR survey.

Professional services firm PwC surveyed 10,000 U.S. workers — all professionals — either in the workforce or looking for work. A key finding in the HR survey data was this: Almost half, 49%, of the professionals surveyed have turned down an offer due to a bad recruiting experience.

Those respondents identified themselves as prospective candidates who were made offers but still rejected the job, the HR survey reported.

There are reasons for the high rejection rate, according to Bhushan Sethi, the global people and organization leader at PwC. It could be the result of a lengthy recruiting process, inconsistencies in the attitudes of the people prospective employees met with, or repeated requests for documentation, such as work authorizations.

Sethi said that candidate experience is often thought of as a technology problem. But it’s the more human elements, such as a timely follow-up with candidates, that could be the source of the blame. Candidates, especially those who “probably have multiple choices for their future employment,” want human interaction throughout the recruiting process, Sethi said.

Professional jobs see large gains

Will there be a sector of the workforce that will trade their transparency — so you understand everything about them — for a better opportunity?
Bhushan Sethi, global people and organization leader, PwC

The government labor report said that employment in professional and technical services increased by 31,000 last month and by 278,000 over the last 12 months. The labor category comprises workers who perform professional, scientific and technical activities for others. It includes accounting, architectural and engineering services, as well as computer design services, among others.

Sethi said organizations need consistent and transparent recruiting processes.

Separately, 62% of professionals want to work at firms committed to improving workplace diversity and inclusion. Social issues, such as climate change, are part of this, and employee activism has been on the rise, publicly and internally.

Candidates are willing to sacrifice higher pay for opportunities like these, Sethi said. They are “willing to trade salary for good work experience and a place that aligns with their own values.” This includes upskilling or training opportunities.

Candidates are willing to make tradeoffs

A surprising finding in the HR survey data is the willingness of candidates to share social media data, if it helps get a good job. Just over 60% said they would give a prospective employer access to social media data now kept private.

“Will there be a sector of the workforce that will trade their transparency — so you understand everything about them — for a better opportunity?” Sethi said. The survey suggested candidates might be willing to make that tradeoff, but few firms are asking for social media access, he said.

Companies that invest in virtual reality as part of the candidate experience may see a payoff. The survey indicated that 62% “say they’re more likely to consider taking a role if they had a chance to experience the actual job through technology.”


KPMG’s digital shift fuels AI-empowered audits and more, reducing risk across every industry | Transform

Envision this: It’s another frenetic morning in the stock market as an army of traders at one company chat with their clients by phone – counseling and cautioning, buying and selling.

The outcomes of those calls and transactions carry no guarantees, of course. There will be some winners, some losers. But before the closing bell rings, the traders’ company – an advisory client of KPMG – is sure of one outcome: the engagements were analyzed and potential risks surfaced.

How can the company be so certain? It deployed KPMG’s trader-risk-analytics platform, a solution that applies Azure Cognitive Services to help reduce risk and meet rising regulatory requirements within the financial services industry.

The platform is just one example of a solution jointly developed in the KPMG and Microsoft Digital Solution Hub, and a testament to KPMG’s drive to digitize its customer offerings across advisory, tax and audit by implementing Microsoft’s intelligent cloud.

An employee walks through a hallway at KPMG.
An employee at KPMG.

To accelerate KPMG’s move to the cloud, KPMG and Microsoft have signed a five-year agreement that will allow KPMG and its clients to benefit from Microsoft innovations, including a strong focus on AI, risk and cybersecurity.

As one of the “Big Four” organizations, KPMG’s services and solutions encompass all industries – from government to banking to health care. That wide-ranging impact means KPMG also provides a potent business case for the potential of Microsoft technology to enhance and revitalize customers’ businesses across every sector, says Microsoft CEO Satya Nadella.

“Together with KPMG, we’re accelerating digital transformation across industries by bringing the latest advances in cloud, AI and security to highly regulated workloads in audit, tax and advisory,” Nadella says.

To grasp the scope and reach of KPMG’s digital evolution, take a closer look at one of the platforms it has launched for a core business line – audit. Better yet, just meet KPMG Clara.

KPMG is bolstering audit quality by infusing the process with data analytics, AI and Azure Cognitive Services, allowing audit professionals to use company data to bring more relevance to their audit findings and continue to meet increasing regulatory requirements and standards. KPMG uses Azure Cognitive Services to provide more continuous, holistic and deeper insights and value on audit-relevant data.

The company’s smart audit platform, KPMG Clara, is automated, agile, intelligent and scalable – ushering in what KPMG calls a new era for the audit. KPMG is deploying KPMG Clara globally, allowing clients access to real-time information arising from the audit process and communication with the audit team.

A KPMG building is shown from outside with grass in the foreground.
A KPMG building.

In addition, KPMG Clara will integrate with Microsoft Teams, providing a platform for audit professionals to work together on a project, centrally managing and securely sharing audit files, tracking audit-related activities and communicating using chat, voice and video meetings. This will simplify the auditors’ workflow, enabling them to stay in sync throughout the entire process and drive continuous communication with the client.

“Technology is disrupting organizations across the globe,” says Bill Thomas, global chairman of KPMG International. “Clients are turning to us like never before to help them implement, manage and optimize the digital transformation of their organizations.”

In fact, 65% of CEOs believe that AI will create more jobs than it eliminates, according to a survey of 1,300 CEOs conducted by KPMG for its 2019 “Global CEO Outlook” report.

The survey also found that 50% of CEOs expect to see a significant return on their AI investments in three to five years, while 100% have piloted or implemented AI to automate processes.

Through its tech expansion, KPMG’s clients will benefit from “consistent global service delivery, greater speed of deployment and industry-leading security standards to safeguard their data,” the company says.

At the same time, KPMG professionals will gain access to an arsenal of cloud-based tools to build business solutions and managed services that are embedded with AI and machine learning capabilities.

And with robotic process automation (RPA), they can utilize AI-infused software that completes the types of high-volume, repeatable tasks that once drained hours from their work weeks.

Two people inside a KPMG building enter a stairwell.
Two people entering a KPMG member firm.

“Technology and data-driven business models are disrupting the business landscape,” says KPMG global chairman Thomas. “Our multi-year investment in digital leadership will help us remain at the forefront of this shift and further strengthen our position as the digital transformation partner of choice for our clients.”

KPMG also is modernizing its workplace for 207,000 employees across 153 member firms, using the Microsoft 365 suite of cloud-based collaboration and productivity tools, including Microsoft Teams.

KPMG deployed Dynamics 365 for more than 30,000 of its professionals across 17 member firms. This equips them with modern customer-relationship applications to quickly and efficiently manage both client requests and client demand.

Says Nadella: “KPMG’s deep industry and process expertise, combined with the power of our trusted cloud – spanning Azure, Dynamics 365 and Microsoft 365 – will bring the best of both organizations together to help customers around the world become more agile in an increasingly complex business environment.”

Top photo: Two people sitting in a KPMG lobby. (All photos courtesy of KPMG)

Author: Microsoft News Center

AWS Outposts brings hybrid cloud support — but only for Amazon

LAS VEGAS — AWS controls nearly half of the public IaaS market today and, judging by the company’s rules against use of the term ‘multi-cloud,’ would be happy to have it all, even as rivals Microsoft and Google make incremental gains and more customers adopt multi-cloud strategies.

That’s the key takeaway from the start of this year’s massive re:Invent conference here this week, which was marked by the release of AWS Outposts for hybrid clouds and a lengthy keynote from AWS CEO Andy Jassy that began with a tongue-in-cheek invite to AWS’ big tent in the cloud.

“You have to decide what you’re going to bring,” Jassy said of customers who want to move workloads into the public cloud. “It’s a little bit like moving from a home,” he added, as a projected slide comically depicted moving boxes affixed with logos for rival vendors such as Oracle and IBM sitting on a driveway.

“It turns out when companies are making this big transformation, what we see is that all bets are off,” Jassy said. “They reconsider everything.”

For several years now, AWS has used re:Invent as a showcase for large customers in highly regulated industries that have made substantial, if not complete, migrations to its platform. One such company is Goldman Sachs, which has worked with AWS on several projects, including Marcus, a digital banking service for consumers. A transaction banking service that helps companies manage their cash in a cloud-native stack on AWS is coming next year, said Goldman Sachs CEO David Solomon, who appeared during Jassy’s talk. Goldman is also moving its Marquee market intelligence platform into production on AWS.

Along with showcasing enthusiastic customers like Goldman Sachs, Jassy took a series of shots at the competition, some veiled and others overt.

“Every industry has lots of companies with mainframes, but everyone wants to move off of them,” he claimed. The same goes for databases, he added. Customers are trying to move away from Oracle and Microsoft SQL Server due to factors such as expense and lock-in, he said. Jassy didn’t mention that similar accusations have been lodged at AWS’ native database services.

Jassy repeatedly took aim at Microsoft, which has the second most popular cloud platform after AWS, albeit with a significant lag. “People don’t want to pay the tax anymore for Windows,” he said.

But it isn’t as if AWS would actually shun Microsoft technology, since it has long been a host for many Windows Server workloads. In fact, it wants as much as it can get. This week, AWS introduced a new bring-your-own-license program for Windows Server and SQL Server designed to make it easier for customers to run those licenses on AWS, versus Azure.

AWS pushes hybrid cloud, but rejects multi-cloud

One of the more prominent, although long-expected, updates this week is the general availability of AWS Outposts. These specialized server racks provided by AWS reside in customers’ own data centers, in order to comply with regulations or meet low-latency needs. They are loaded with a range of AWS software, are fully managed by AWS and maintain continuous connections to local AWS regions.

The company is taking the AWS Outposts idea a bit further with the release of new AWS Local Zones. These will consist of Outpost machines placed in facilities very close to large cities, giving customers who don’t want or have their own data centers, but still have low-latency requirements, another option. Local Zones, the first of which is in the Los Angeles area, provide this capability and tie back to AWS’ larger regional zones, the company said.

Outposts, AWS Local Zones and the previously launched VMware Cloud on AWS constitute a hybrid cloud computing portfolio for AWS — but you won’t hear Jassy or other executives say the phrase multi-cloud, at least not in public.

In fact, partners who want to co-brand with AWS are forbidden from using that phrase and similar verbiage in marketing materials, according to an AWS co-branding document provided to SearchAWS.com.

“AWS does not allow or approve use of the terms ‘multi-cloud,’ ‘cross cloud,’ ‘any cloud,’ ‘every cloud,’ or any other language that implies designing or supporting more than one cloud provider,” the co-branding guidelines, released in August, state. “In this same vein, AWS will also not approve references to multiple cloud providers (by name, logo, or generically).”

An AWS spokesperson didn’t immediately reply to a request for comment.

The stance may not be surprising given AWS’ market lead, but it does stand in contrast to recent approaches by Google, with the Anthos multi-cloud container management platform, and Microsoft’s Azure Arc, which uses native Azure tools but has multi-cloud management aspects.

AWS customers may certainly want multi-cloud capabilities, and they can protect themselves by building on portable products and technologies such as Kubernetes at the lowest level, with the tradeoff being the manual labor involved, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif.

“To be fair, Azure and Google are only at the beginning of [multi-cloud],” he said.

Meanwhile, many AWS customers have apparently grown quite comfortable moving their IT estates onto the platform. One example is Cox Automotive, known for its digital properties such as Autotrader.com and Kelley Blue Book.

In total, Cox has more than 200 software applications, many of which it accrued through a series of acquisitions, and the company expects to move it all onto AWS, said Chris Dillon, VP of architecture, during a re:Invent presentation.

Cox is using AWS Well-Architected Framework, a best practices tool for deployments on AWS, to manage the transition.

“When you start something new and do it quickly you always run the risk of not doing it well,” said Gene Mahon, director of engineering operations. “We made a decision early on that everything would go through a Well-Architected review.”


Redis Labs eases database management with RedisInsight

The robust market of tools to help users of the Redis database manage their systems just got a new entrant.

Redis Labs disclosed the availability of its RedisInsight tool, a graphical user interface (GUI) for database management and operations.

Redis is a popular open source NoSQL database that is also increasingly used in cloud-native Kubernetes deployments as users move workloads to the cloud. Open source database use is growing quickly, according to recent reports, as flexible, open systems have become a common requirement.

Among the challenges often associated with databases of any type is ease of management, which Redis is trying to address with RedisInsight.

“Database management will never go out of fashion,” said James Governor, analyst and co-founder at RedMonk. “Anyone running a Redis cluster is going to appreciate better memory and cluster management tools.”

Governor noted that Redis is following a tested approach, by building out more tools for users that improve management. Enterprises are willing to pay for better manageability, Governor noted, and RedisInsight aims to do that.

RedisInsight based on RDBTools

The RedisInsight tool, introduced Nov. 12, is based on the RDBTools technology that Redis Labs acquired in April 2019. RDBTools is an open source GUI for users to interact with and explore data stores in a Redis database.

Database management will never go out of fashion.
James Governor, analyst and co-founder, RedMonk

Over the last seven months, Redis added more capabilities to the RDBTools GUI, expanding the product’s coverage for different applications, said Alvin Richards, chief product officer at Redis.

One of the core pieces of extensibility in Redis is the ability to introduce modules that contain new data structures or processing frameworks. So for example, a module could include time series, or graph data structures, Richards explained.

“What we have added to RedisInsight is the ability to visualize the data for those different data structures from the different modules,” he said. “So if you want to visualize the connections in your graph data for example, you can see that directly within the tool.”

RedisInsight overview dashboard

RDBTools is just one of many different third-party tools that exist for providing some form of management and data insight for Redis. There are some 30 other third-party GUI tools in the Redis ecosystem, though lack of maturity is a challenge.

“They tend to sort of come up quickly and get developed once and then are never maintained,” Richards said. “So, the key thing we wanted to do is ensure that not only is it current with the latest features, but we have the apparatus behind it to carry on maintaining it.”

How RedisInsight works

For users, getting started with the new tool is relatively straightforward. RedisInsight is a piece of software that needs to be downloaded and then connected to an existing Redis database. The tool ingests all the appropriate metadata and delivers the visual interface to users.
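To give a concrete sense of what that metadata ingestion involves, the sketch below parses the text that Redis’s INFO command returns into a dictionary, the kind of structure a dashboard can then render. This is an illustrative Python helper written for this article, not RedisInsight’s actual implementation, and the sample INFO text is abbreviated.

```python
def parse_redis_info(raw: str) -> dict:
    """Parse the text returned by Redis's INFO command into a flat dict.

    INFO output is line-oriented: "# Section" headers, blank lines, and
    "key:value" pairs. A GUI tool ingests this kind of metadata to display
    memory usage, keyspace stats and so on.
    """
    info = {}
    for line in raw.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and section headers
            continue
        key, _, value = line.partition(":")
        info[key] = value
    return info

# Abbreviated sample of what INFO returns (real output has many more fields).
sample = """# Memory
used_memory:1048576
used_memory_human:1.00M
# Keyspace
db0:keys=42,expires=0,avg_ttl=0"""

stats = parse_redis_info(sample)
print(stats["used_memory_human"])  # prints 1.00M
```

In a real deployment the raw text would come from a live connection (for example, a Redis client’s `INFO` call) rather than a hard-coded string.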

RedisInsight is available for Windows, macOS and Linux, and also as a Docker container. Redis doesn’t yet offer RedisInsight as a service.

“We have considered having RedisInsight as a service and it’s something we’re still working on in the background, as we do see demand from our customers,” Richards said. “The challenge is always going to be making sure we have the ability to ensure that there is the right segmentation, security and authorization in place to put guarantees around the usage of data.”


When to Use SCVMM (And When Not To)

Microsoft introduced Hyper-V as a challenge to the traditional hypervisor market. Rather than a specialty hallmark technology, they made it into a standardized commodity. Instead of something to purchase and then plug into, Microsoft made it ubiquitously available as something to build upon. As a side effect, administrators manage Hyper-V using markedly different approaches than other systems. In this unfamiliar territory, we have a secondary curse of little clear guidance. So, let’s take a look at the merits and drawbacks of Microsoft’s paid Hyper-V management tool, System Center Virtual Machine Manager.

What is System Center Virtual Machine Manager?

“System Center” is an umbrella name for Microsoft’s datacenter management products, much like “Office” describes Microsoft’s suite of desktop productivity applications. System Center has two editions: Standard and Datacenter. Unlike Office, the System Center editions do not vary by the number of member products that you can use. Both editions allow you to use all System Center tools. Instead, the editions differ by the number of systems that you can manage. We will not cover licensing in this article; please consult your reseller.

System Center Virtual Machine Manager, or SCVMM, or just VMM, presents a centralized tool to manage multiple Hyper-V hosts and clusters. It provides the following features:

  • Bare-metal deployment of Hyper-V hosts
  • Pre-defined host and virtual switch configuration combinations
  • Control over clusters, individual hosts, and virtual machines
  • Virtual machine templating
  • Simultaneous deployment of multiple templates for automated setup of tiered services
  • Granular access controls (control over specific hosts, VMs, deployments, etc.)
  • Role-based access
  • Self-service tools
  • Control over Microsoft load balancers
  • Organization of offline resources (ISOs, VHDXs, etc.)
  • Automatic balancing of clustered virtual machines
  • Control over network virtualization
  • Partial control over ESXi hosts

In essence, VMM allows you to manage your datacenter as a cloud.

Can I Try VMM Before Buying?

You can read the list above to get an idea of the product’s capabilities. But, you can’t distinguish much about a product from a simple bulleted list. You learn the most about a tool by using it. To that end, you can download an evaluation copy of the System Center products. I created a link to the current long-term version (2019). If you scroll below that, you will find an evaluation for the semi-annual channel releases. Because of the invasive nature of VMM, I highly recommend that you restrict it to a testbed of systems. If you don’t have a test environment, then it presents you with a fantastic opportunity to try out nested virtualization.

Why Should I Use VMM to Manage my Hyper-V Environment?

Rather than trying to take you through a world tour of features that you could more easily explore on your own, I want to take this up to a higher-level view. Let’s settle one fact upfront: not everyone needs VMM. To make a somewhat bolder judgment, very few Hyper-V deployments need it. So, let’s cover the ones that do.

VMM for Management at Scale

The primary driver of VMM use has less to do with features than with scale. Understand that VMM does almost nothing that you cannot do yourself with freely-available tools. It can make tasks easier. The more hosts you have, the more work to do. So, if you’ve got many hosts, it doesn’t hurt to have some help. Of course, the word “many” does not have a universal meaning. Where do we draw the line?

For starters, we would not draw any line at all. If you’ve gone through the evaluation, you like what VMM has to offer, and the licensing cost does not drive you away, then use VMM. If you go through the effort to configure it properly, then VMM can work for even a very small environment. We’ll dive deeper into that angle in the section that discusses the disincentives to use VMM.

Server hosting providers with dozens or hundreds of clients make an obvious case for VMM. VMM does one thing easily that nothing else can: role-based access. The traditional tools allow you to establish host administrators, but nothing more granular. If you want a simple tool to establish control for tenants, VMM can do that.

VMM solves another problem that makes the most sense in the context of hosting providers: network virtualization. The term “network virtualization” could have several meanings, so let’s disambiguate it. With network virtualization, we can use the same IP addresses in multiple locations without collision. In many contexts, we can provide that with network address translation (NAT) routers. But, for tenants, we need to separate their traffic from other networks while still using common hardware. We could do that with VLANs, but that gives us two other problems. First, we have a hard limit on the number of VLANs that can co-exist; the 12-bit VLAN ID allows only 4,094 usable IDs. Second, customers may want to stretch their networks, including their VLANs, into the hosted environment. With current versions of Hyper-V, we have the ability to manage network virtualization with PowerShell, but VMM still makes it easier.
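As a toy illustration of the core idea (not Hyper-V Network Virtualization’s actual mechanism, which encapsulates tenant packets over the provider network), the sketch below keys each customer address by a virtual subnet ID. That extra identifier is what lets two tenants reuse the same IP on shared hardware without collision:

```python
# Toy model of network virtualization: a tenant's "customer address" only
# has meaning when paired with its virtual subnet ID, so identical IPs in
# different tenant networks map to distinct provider addresses. The names
# here are invented for illustration.
provider_map = {}  # (virtual_subnet_id, customer_ip) -> provider_ip

def register_vm(vsid: int, customer_ip: str, provider_ip: str) -> None:
    """Record where a tenant VM actually lives on the provider network."""
    provider_map[(vsid, customer_ip)] = provider_ip

def lookup(vsid: int, customer_ip: str) -> str:
    """Resolve a tenant-scoped address to its provider address."""
    return provider_map[(vsid, customer_ip)]

# Two tenants use the identical customer IP on shared hardware:
register_vm(5001, "10.0.0.5", "192.168.10.21")  # tenant A
register_vm(5002, "10.0.0.5", "192.168.10.37")  # tenant B

print(lookup(5001, "10.0.0.5"))  # prints 192.168.10.21
```

On the wire, real implementations carry the virtual subnet ID in an encapsulation header such as NVGRE or VXLAN; the dictionary here only models why the extra identifier removes the collision and sidesteps the VLAN ID ceiling.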

So, if you manage very large environments that can make use of VMM’s tenant management, or if you have a complicated networking environment that can benefit from network virtualization, then VMM makes sense for you.

VMM for Cloud Management

VMM for cloud management really means much the same thing as the previous section. It simply changes the approach to thinking about it. The common joke goes, “the cloud is just someone else’s computer”. But, how does that change when it’s your cloud? Of course, that joke has always represented a misunderstanding of cloud computing.

A cloud makes computing resources available in a controlled fashion. Prior to the powers of virtualization, you would either assign physical servers or you’d slice out access to specific resources (like virtual servers in Apache). With virtualization, you can create virtual machines of particular sizes, which supplants the physical server model. With a cloud, at least the way that VMM treats it, you can quickly stand up all-new systems for clients. You can even give them the power to deploy their own.

Nothing requires the term “client” to apply only to external, paying customers. “Client” could easily mean internal teams. You can have an “accounting cloud” and a “sales cloud” and whatever else you need. Hosting providers aren’t the only entities that need to easily provide computing resources.

Granular Management Capabilities

I frequently see requests for granular control over Hyper-V resources. Administrators want to grant access to specific users to manage or connect to particular virtual machines. They want helpdesk staff to be able to reboot VMs, but not change settings. They want to allow different administrators to perform different functions based on their roles within the organization. I also think that some people just want to achieve a virtual remote desktop environment without paying the accompanying license fees.

VMM enables all of those things (except the VDI sidestep, of course). Some of these things are impossible with native tools. With difficulty, you can achieve some in other ways, such as with Constrained PowerShell Endpoints. VMM does it all, and with much greater ease.

The Quick Answer to Choosing VMM

I hope that all of this information provides a clearer picture. When you have a large or complex Hyper-V environment, especially with multiple stakeholders that need to manage their own systems, VMM can help you. If you read through all of the above and did not see how any of that could meaningfully apply to your organization, then the next section may fit you better.

Reasons NOT to Use SCVMM?

We’ve seen the upsides of VMM. Now it’s time for a look at the downsides.

VMM Does Not Come Cheap – or Alone

You can’t get VMM by itself. You must buy into the entire suite or get nothing at all. I won’t debate the merits of the other members of this suite in this article. Whether you want them or not, they all come as a set. That means that you pay for the set. If you get the quote and feel any hesitation at paying it, then that’s a sign that it might not be worth it to you.

VMM is Heavy

Hyper-V’s built-in management tools require almost nothing. The PowerShell module and MMC consoles are lightweight. They require a bit of disk space to store and a spot of memory to operate. They communicate with the WMI/CIM interfaces to do their work.

VMM shows up at the opposite end. It needs a server application install, preferably on a dedicated system. It stores all of its information in a Microsoft SQL database. It requires an agent on every managed host.

VMM Presents its Own Challenges

VMM is not easy to install, configure, or use. You will have questions during your first install that the documentation does not cover. It does not get easier. I have talked with others that have different experiences from mine; some with problems that I did not encounter, and others that have never dealt with things that routinely irritate me. I will limit this section to the things that I believe every potential VMM customer will need to prepare for.

Networking Complexity

We talked about the powers of network virtualization earlier. That technology necessitates complexity. However, VMM makes things difficult even when you have a simple Hyper-V networking design. In my opinion, it’s needlessly complicated. You have several configuration points. If you miss one, something will not work. To tell the full story, a successful network configuration can be easily duplicated to other systems, even overwriting existing configurations. However, in smaller deployments, the negatives can greatly outweigh the positives.

General Complexity

I singled out networking in its own section because I feel that VMM’s designers could have created an equally capable networking system with a substantially simpler configuration. But, I think they can justify most of the rest of the complexity. VMM was built to enable you to run your own cloud – multiple clouds, even. That requires a bit more than the handful of interfaces necessary to wrangle a couple of hosts and a handful of VMs.

Over-Eager Problem Solving

When VMM detects problems, it tries to apply fixes. That sounds good, except that the “fixes” are often worse than the disease – and sometimes there aren’t even any problems to fix. I’ve had hosts drained of their VMs, sitting idle, all because VMM suddenly decided that there was a configuration problem with the virtual switch. Worse, it wouldn’t specify what it didn’t like about that virtual switch or propose how to remedy the problem. You’ll also see unspecified problems with hosts and virtual machines that VMM refuses to ignore, forcing you to burn time on tedious housekeeping.

Convoluted Error Messaging

A point of common frustration that you’ll eventually run into: the error messages. VMM often leaves cryptic error messages in its logs. I’ve encountered numerous messages that I could not understand or find any further information about. These cost time and energy to research. The inability to uncover what triggered a message, or even to find an actual problem, eventually leads to “alarm fatigue”. You simply ignore the messages that don’t seem to matter, thereby taking a risk that you’ll miss something that does matter.

Mixed Version Limitations

With the introduction of changes in Hyper-V in the 2012 series, Microsoft directly addressed an earlier problem: simultaneous management of different versions of Hyper-V. You can currently use Hyper-V Manager and Failover Cluster Manager in the Windows 8+ and Windows Server 2012+ versions to control any version of Hyper-V that employs the v2 namespace. Officially, Microsoft says that any given built-in management tool will work with the version it was released with, any lower version that supports v2, and one version higher. It can only manage the features that it knows about, of course, but it will work.

Conversely, I have not seen any version of VMM that can control a higher-level Hyper-V version. VMM 2016 controls 2016 and lower, but not 2019. Furthermore, System Center rarely releases on the same schedule as Windows Server. VMM-reliant shops that wanted to migrate to Hyper-V in Windows Server 2019 had to wait several months for the release of VMM 2019.

The Quick Answer to Choosing Against VMM

As mentioned a few times earlier in this article, the decision against VMM will largely rest on the scale of your deployment. Whether or not the problems that I mentioned above matter to you – or even apply to you – you will need to invest time and effort specifically for managing VMM. If you do not have that time, or if that effort is simply not worth it to you, then do not use VMM.

Remember that you have several free tools available: Hyper-V Manager, Failover Cluster Manager, their PowerShell modules, and Windows Admin Center.

Addressing the Automatic Recommendation for VMM

Part of the impetus behind writing this article was the oft-repeated directive to always use VMM with Hyper-V. For some writers and forum responders, it’s simply automatic. Unfortunately, it’s bad advice. It’s true that VMM provides an integrated, all-in-one management experience. But, if you’ve only got a handful of hosts, you can get a lot of mileage out of the free management tools. Where the graphical tools prove functionally inadequate, PowerShell can pick up the slack. I know that some administrators resist using PowerShell or any other command-line tools, but they have no valid reason to.

I will close this out by repeating what I said earlier in the article: get the evaluations and try out VMM. Set up networking, configure hosts, deploy virtual machines, and build out services. You should know quickly if it’s all worth it to you. Decide for yourself. And remember to come back and tell us your experiences! Good luck!

Go to Original Article
Author: Eric Siron

SAP sees S/4HANA migration as its future, but do customers?

The first part of our 20-year SAP retrospective examined the company’s emerging dominance in the ERP market and its transition to the HANA in-memory database. Part two looks at the release of SAP S/4HANA in February 2015. The “next-generation ERP” was touted by the company as the key to SAP’s future, but it ultimately raised questions that in many cases have yet to be answered. The S/4HANA migration remains the initiative most critical to the company’s future.

Questions about SAP’s future have shifted in the past year, as the company has undergone an almost complete changeover in its leadership ranks. Most of the SAP executives who drove the strategy around S/4HANA and the intelligent enterprise have left the company, including former CEO Bill McDermott. New co-CEOs Jennifer Morgan and Christian Klein are SAP veterans, and analysts don’t think the change in leadership will make for significant changes in the company’s technology and business strategy.

But they will take over the most daunting task SAP has faced: convincing customers of the business value of the intelligent enterprise, a data-driven transformation of businesses with S/4HANA serving as the digital core. As part of the transition toward intelligence, SAP is pushing customers to move off of tried and true SAP ECC ERP systems (or the even older SAP R/3), and onto the modern “next-generation ERP” S/4HANA. SAP plans to end support for ECC by 2025.


S/4HANA is all about enabling businesses to make decisions in real time as data becomes available, said Dan Lahl, SAP vice president of product marketing and a 24-year SAP veteran.

“That’s really what S/4HANA is about,” Lahl said. “You want to analyze the data that’s in your system today. Not yesterday’s or last week’s information and data that leads you to make decisions that don’t even matter anymore, because the data’s a week out. It’s about giving customers the ability to make better decisions at their fingertips.”

S/4HANA migration a matter of when, not if

Most SAP customers see the value of an S/4HANA migration, but they are concerned about how to get there, with many citing concerns about the cost and complexity of the move. This is a conundrum that SAP acknowledges.

“We see that our customers aren’t grappling with if [they are going to move], but when,” said Lloyd Adams, managing director of the East Region at SAP America. “One of our responsibilities, then, is to provide that clarity and demonstrate the value of S/4HANA, but to do so in the context of the customers’ business and their industry. Just as important as showing them how to move, we need to do it as simply as possible, which can be a challenge.”


S/4HANA is the right platform for the intelligent enterprise because of the way it can handle all the data that the intelligent enterprise requires, said Derek Oats, CEO of Americas at SNP, an SAP partner based in Heidelberg, Germany, that provides migration services.

In order to build the intelligent enterprise, customers need to have a platform that can consume data from a variety of systems — including enterprise applications, IoT sensors and other sources — and ready it for analytics, AI and machine learning, according to Oats. S/4HANA uses SAP HANA, a columnar, in-memory database, to do that and then presents the data in an easy-to-navigate Fiori user interface, he said.

“If you don’t have that ability to push out of the way a lot of the work and the crunching that has often occurred down to the base level, you’re kind of at a standstill,” he said. “You can only get so much out of a relational database because you have to rely on the CPU at the application layer to do a lot of the crunching.”
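Oats’ point can be sketched in miniature. With a row store, the application pulls whole records across and aggregates a field itself; a columnar engine keeps each field contiguous and can aggregate it in place. The following toy Python comparison illustrates the two data layouts only; it is not SAP code and it elides the in-memory scale at which HANA operates:

```python
# Toy illustration of row-oriented vs column-oriented aggregation.

# Row store: each record is a complete tuple, so aggregating one field
# means the application walks every record and picks the field out.
rows = [
    {"order_id": 1, "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "region": "APJ",  "amount": 75.5},
    {"order_id": 3, "region": "EMEA", "amount": 200.0},
]
total_row_store = sum(r["amount"] for r in rows)

# Column store: each field lives in its own contiguous array, so an
# aggregate touches only the single column it needs.
columns = {
    "order_id": [1, 2, 3],
    "region":   ["EMEA", "APJ", "EMEA"],
    "amount":   [120.0, 75.5, 200.0],
}
total_column_store = sum(columns["amount"])

assert total_row_store == total_column_store == 395.5
```

The totals match either way; the difference is where the “crunching” happens, which is the application-layer bottleneck Oats describes.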

S/4HANA business case difficult to make

Although many SAP customers understand the benefits of S/4HANA, SAP has had a tough sell in getting its migration message across to its large customer base. The majority of customers plan to remain on SAP ECC and have only vague plans for an S/4HANA migration.


“The potential for S/4HANA hasn’t been realized to the degree that SAP would like,” said Joshua Greenbaum, principal at Enterprise Applications Consulting. “More companies are really looking at S/4HANA as the driver of genuine business change, and recognize that this is what it’s supposed to be for. But when you ask them, ‘What’s your business case for upgrading to S/4HANA?’ The answer is ‘2025.’”


One of the problems that SAP faces when convincing customers of the value of S/4HANA and the intelligent enterprise is that no simple use case drives the point home, Greenbaum said. Twenty years ago, Y2K provided an easy-to-understand reason why companies needed to overhaul their enterprise business systems, and the fear that computers wouldn’t adapt to the year 2000 led in large measure to SAP’s early growth.

“Digital transformation is a complicated problem and the real issue with S/4HANA is that the concepts behind it are relatively big and very specific to company, line of business and geography,” he said. “So the use cases are much harder to justify, or it’s much more complicated to justify than, ‘Everything is going to blow up on January 1, 2000, so we have to get our software upgraded.'”

Evolving competition faces S/4HANA

Jon Reed, analyst and co-founder of ERP news and analysis firm Diginomica.com, agrees that SAP has successfully embraced the general concept of the intelligent enterprise with S/4HANA, but struggles to present understandable use cases.


“The question of S/4HANA adoption remains central to SAP’s future prospects, but SAP customers are still trying to understand the business case,” Reed said. “That’s because agile, customer-facing projects get the attention these days, not multi-year tech platform modernizations. For those SAP customers that embrace a total transformation — and want to use SAP tech to do it — S/4HANA looks like a viable go-to product.”

SAP’s issues with driving S/4HANA adoption may not come from the traditional enterprise competitors like Oracle, Microsoft and Infor, but from cloud-based business applications like Salesforce and Workday, said Eric Kimberling, president of Third Stage Consulting, a Denver-based firm that provides advice on ERP deployments and implementations.


“They aren’t direct competitors with SAP; they don’t have the breadth of functionality and the scale that SAP does, but they have really good functionality in their best-of-breed world,” Kimberling said. “Companies like Workday and Salesforce make it easier to add a little piece of something without having to worry about a big SAP project, so there’s an indirect competition with S/4HANA.”

SAP customers are going to have to adapt to evolving enterprise business conditions regardless of whether or when they move to S/4HANA, Greenbaum said.

“Companies have to build business processes to drive the new business models. Whatever platform they settle on, they’re going to be unable to stand still,” he said. “There’s going to have to be this movement in the customer base. The question is will they build primarily on top of S/4HANA? Will they use an Amazon or an Azure hyperscaler as the platform for innovation? Will they go to their CRM or workforce automation tool for that? The ‘where’ and ‘what next’ is complicated, but certainly a lot of companies are positioning themselves to use S/4HANA for that.”

Go to Original Article

Microsoft Ignite 2019 conference coverage

Editor’s note

For Microsoft, it’s all cloud, all the time.

No matter the market — education, developers or enterprise — the technology giant continues to expand and heavily market its cloud offerings to those groups. And for good reason. Sales related to cloud products continue to go through the roof, totaling $11 billion in a recent earnings release.

It’s not all from traditional server workloads running on Azure. Microsoft says it has more than 180 million users on its Office 365 collaboration platform. The company expects its investments in AI, a good portion of which will have its roots in Azure, will pay off in the future.

This guide highlights the company’s recent moves in the marketplace. Check back during the Microsoft Ignite 2019 conference, being held Nov. 4 to 8 in Orlando, Fla., for articles about product updates and new offerings.

Go to Original Article

VMware’s Bitnami acquisition grows its development portfolio

The rise of containers and the cloud has changed the face of the IT market, and VMware must evolve with it. The vendor has moved out of its traditional data center niche and — with its purchase of software packager Bitnami — has made a push into the development community, a change that presents new challenges and potential. 

Historically, VMware delivered a suite of system infrastructure management tools. With the advent of cloud and digital disruption, IT departments’ focus expanded from monitoring systems to developing applications. VMware has extended its management suite to accommodate this shift, and its acquisition of Bitnami adds new tools that ease application development.

Building applications presents difficulties for many organizations. Developers spend much of their time on application plumbing, writing software that performs mundane tasks — such as storage allocation — and linking one API to another.

Bitnami sought to simplify that work. The company created prepackaged components called installers that automate the development process. Rather than write the code themselves, developers can now download Bitnami system images and plug them into their programs. As VMware delves further into hybrid cloud market territory, Bitnami brings simplified app development to the table.


“Bitnami’s solutions were ahead of their time,” said Torsten Volk, managing research director at Enterprise Management Associates (EMA), a computer consultancy based in Portsmouth, New Hampshire. “They enable developers to bulletproof application development infrastructure in a self-service manner.”

The value Bitnami adds to VMware

Released under the Apache License, Bitnami’s modules contain commonly coupled software applications instead of just bare-bones images. For example, a Bitnami WordPress stack might contain WordPress, a database management system (e.g., MySQL) and a web server (e.g., Apache).

Bitnami takes care of several mundane programming chores. It keeps all components up to date — so if it finds a security problem, it patches that problem — and updates those components’ associated libraries. Bitnami makes its modules available through its Application Catalogue, which functions like an app store.

The company designed its products to run on a wide variety of systems. Bitnami supports Apple OS X, Microsoft Windows and Linux OSes. Its VM features work with VMware ESX and ESXi, VirtualBox and QEMU. Bitnami stacks also are compatible with software infrastructures such as WAMP, MAMP, LAMP, Node.js, Tomcat and Ruby. It supports cloud tools from AWS, Azure, Google Cloud Platform and Oracle Cloud. The installers, too, cover a wide variety of applications, including Abante Cart, Magento, MediaWiki, PrestaShop, Redmine and WordPress.

Bitnami seeks to help companies build applications once and run them on many different configurations.

“For enterprise IT, we intend to solve for challenges related to taking a core set of application packages and making them available consistently across teams and clouds,” said Milin Desai, general manager of cloud services at VMware.

Development teams share project work among individuals, work with code from private or public repositories and deploy applications on private, hybrid and public clouds. As such, Bitnami’s flexibility made it appealing to developers — and VMware.

How Bitnami and VMware fit together


VMware wants to extend its reach from legacy, back-end data centers and appeal to more front-end and cloud developers.

“In the last few years, VMware has gone all in on trying to build out a portfolio of management solutions for application developers,” Volk said. VMware embraced Kubernetes and has acquired container startups such as Heptio to prove it.

Bitnami adds another piece to this puzzle, one that provides a curated marketplace for VMware customers who hope to emphasize rapid application development.

“Bitnami’s application packaging capabilities will help our customers to simplify the consumption of applications in hybrid cloud environments, from on-premises to VMware Cloud on AWS to VMware Cloud Provider Program partner clouds, once the deal closes,” Desai said.

Facing new challenges in a new market

However, the purchase moves VMware out of its traditional virtualized enterprise data center sweet spot. VMware has little name recognition among developers, so the company must build its brand.

“Buying companies like Bitnami and Heptio is an attempt by VMware to gain instant credibility among developers,” Volk said. “They did not pay a premium for the products, which were not generating a lot of revenue. Instead, they wanted the executives, who are all rock stars in the development community.”  

Supporting a new breed of customer poses its challenges. Although VMware’s Bitnami acquisition adds to its application development suite — an area of increasing importance — it also places new hurdles in front of the vendor. Merging the culture of a startup with that of an established supplier isn’t always a smooth process. In addition, VMware has bought several startups recently, so consolidating its variety of entities in a cohesive manner presents a major undertaking.

Go to Original Article

Oracle CDP moves beyond marketing data

SAN FRANCISCO — Oracle has entered the customer data platform market, but some observers wonder whether the category is already headed into obsolescence.

The Oracle Customer Data Platform (CDP) joins Adobe’s, released earlier this year. Salesforce and SAP have their own CDPs in development. While they attempt to solve a difficult technical problem — matching, updating and deduplicating customer records across marketing, sales, service and e-commerce systems — CDPs are difficult to explain to C-suite leaders who sign off on large IT purchases.

Moreover, for CIOs, a CDP represents another tool to support and secure in already complex cloud enterprise application stacks.

No one disputes the need for B2C and B2B companies to aggregate customer data to drive faster, more precisely personalized sales and promotions, said Paul Gaynor, a technology consulting leader for PwC, a tax and audit services firm based in London. But when selling clients on CX initiatives, he said, his team leaves the CDP discussion to the developers, instead focusing on outcomes and bottom-line potential.

“We don’t make it about the data,” Gaynor said. “The data is the currency, a really important part of the equation, for sure. But the infrastructure and how it has to pass from platform to platform to drive AI- or human-based decisions … that’s just part of workflow.”

That said, he sees potential for the Oracle CDP to derive more specific, usable insights from many more data sources than customer experience platforms, even reaching into supply-chain systems to shape personalized customer offers.

Rob Tarkoff, Oracle EVP, delivers the CX keynote at Oracle OpenWorld.

Oracle CDP goes beyond marketing

When Oracle talks to customers in advertising-heavy sectors, those users believe that CDPs are the technology answer to melding customer data from third-party advertising platforms with their own marketing data, said Oracle EVP Rob Tarkoff. In other sectors, CDPs are less important, Tarkoff said.

Yet Oracle bills its CX Unity platform as “more than a CDP,” able to reach past marketing systems and draw deeper insights from ERP and other peripheral data systems. In Tarkoff’s mind, current CDPs tend to be limited to marketing automation. Still, in conversations with some customers, CDPs “are coming up all the time,” Tarkoff said.

Whatever the platform is called, profile veracity — the ability to dedupe, normalize and resolve different data sets to real identity, at scale — is a big challenge for these data platforms.

“That, and in every industry, there’s a different schema for how you want to represent a customer profile,” Tarkoff said. “A bank has a different set of attributes than an insurance company, a communications company or a retailer.”
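Tarkoff’s two challenges, identity resolution and per-industry schemas, can be made concrete with a small sketch. The field names and the match-on-email rule below are illustrative assumptions, not any vendor’s actual schema or matching algorithm:

```python
# Sketch of the "profile veracity" problem: normalize records from
# different systems, then merge those that resolve to the same identity.

def normalize(record):
    """Lower-case and trim the fields used for identity matching."""
    return {
        "email": record.get("email", "").strip().lower(),
        "name": " ".join(record.get("name", "").split()).title(),
        "source": record["source"],
    }

def dedupe(records):
    """Merge records that share a normalized email address."""
    profiles = {}
    for rec in map(normalize, records):
        key = rec["email"]
        if key in profiles:
            profiles[key]["sources"].append(rec["source"])
        else:
            profiles[key] = {"email": key, "name": rec["name"],
                             "sources": [rec["source"]]}
    return list(profiles.values())

raw = [
    {"email": "Ada@Example.com ", "name": "ada lovelace", "source": "marketing"},
    {"email": "ada@example.com", "name": "Ada  Lovelace", "source": "sales"},
    {"email": "bob@example.com", "name": "Bob Smith", "source": "service"},
]
merged = dedupe(raw)
# Two unique profiles; the first record's merged profile lists both sources.
```

Real platforms layer fuzzy matching, survivorship rules and per-industry schemas on top of exact-key merging like this, which is where the difficulty Tarkoff describes comes from.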

Data wrangling to remain difficult

Some observers, such as Deloitte Digital Principal and CTO Sam Kapreilian, believe that despite the difficulty of easily explaining CDPs — let alone their value — customer data platforms will become bedrock technology to garner data insights and drive revenue in the years to come. Rather than headed toward obsolescence, Deloitte’s customers see the potential of new versions of the tool like Oracle CDP.


“This stuff wasn’t possible two to three years ago,” Kapreilian said. “It just wasn’t affordable.”

Michael Krigsman, analyst and founder of CXOTalk, said whatever future platforms perform the processes currently assigned to CDPs will have to solve the same problem: Figure out how to find and track revenue in data that often is far removed from the final sales process, aggregate it in a single platform and ultimately assign a value to AI-fueled data personalization.

“It’s an ongoing project; it’s going to take years,” Krigsman said. “It’s like the journey to self-improvement — it never ends.”

Go to Original Article