Tag Archives: acquisition

Veeam acquisition puts heat on rivals

Although at least one analyst warned that Insight Partners’ acquisition of Veeam Software should make its competitors wary, two of Veeam’s closest backup rivals said they will benefit from the deal.

After disclosing the deal last week, executives from Veeam and private equity firm Insight said the backup vendor will shift its strategy toward hybrid cloud data protection and expand its U.S. presence.

Christophe Bertrand, senior analyst at Enterprise Strategy Group, said Veeam’s competitors should be wary of a competitive shakeup. Unlike a scenario in which a larger tech player buys Veeam to bolster its product offerings, Insight has stepped in solely to expand Veeam’s reach and generate a return on investment. Bertrand said Veeam now has the financial backing to make acquisitions and accelerate its expansion strategy.

“Clearly they have a sponsor here who is very focused on growth. It’s a private equity firm; they want a good return out of it,” Bertrand said.

However, cloud-based backup vendor Druva and data protection and management software vendor Commvault painted the Veeam acquisition in a positive light for them. Executives from those rivals said Veeam’s acquisition validates the importance of backup — and proves their technology and business strategies are on the right track.

Druva CTO Stephen Manley said his company is already established in cloud-based backup, which is part of Veeam’s new focus.

“We’ve got a head start in this market. They have a lot of work to get to where we are,” Manley said.

Druva launched Druva Phoenix, a platform built on AWS that enables server backup to the cloud, in 2014. The company acquired CloudRanger in 2018 for its AWS backup and disaster recovery features and received a $130 million funding round in June 2019, led by Viking Global Investors.

Manley said the Veeam acquisition will help Druva by drawing attention to cloud-based backup. When large, established vendors move into an industry, they generate buzz that customers follow. Manley said this eventually leads customers to find Druva, as well.

Ranga Rajagopalan, vice president of product management at Commvault, said the Veeam acquisition underscores the importance of the backup market. Since June 2018, investors have poured more than $1 billion into cloud data protection vendors Cohesity, Rubrik, Actifio, Druva and Clumio. Insight Partners also poured $500 million into Veeam in January 2019, a year before buying out the entire company.

Rajagopalan said backup is a way to gather all the data. While protecting it remains important, Rajagopalan said that investors and industry experts envision using that data to accelerate application development, analytics and other non-backup purposes.

“It’s more than just backup — it’s data management. There’s so much more customers can do with their data if they can manage it better,” Rajagopalan said.

Commvault’s strategy has shifted toward broadening beyond backup, according to Tom Broderick, vice president of strategy and chief of staff at Commvault. He said customers wanting to do more with their backup data has been a growing trend, and the Veeam acquisition shows it’s a trend investors are keen on riding and getting a return on.

“We see it as validation for backup and the broader data management scope,” Broderick said. “It’s very much an encouraging sign.”

Rajagopalan said Commvault has an advantage in the enterprise market because its software can be used across all environments. Veeam would have to achieve a similar breadth of coverage to land enterprise deals. Protecting only VMware environments is no longer sufficient, as enterprise customers use other hypervisors as well. Rajagopalan added that merely protecting the data is similarly insufficient.


With Time on its hands, Meredith drives storage consolidation

After Meredith Corp. closed its $2.8 billion acquisition of Time Inc. in January 2018, it adopted the motto “Be Bold. Together.”

David Coffman, Meredith’s director of enterprise infrastructure, took that slogan literally. “I interpreted that as ‘Drive it like you stole it,'” said Coffman, who was given a mandate to overhaul the combined company’s data centers that held petabytes of data. He responded with an aggressive backup and primary storage consolidation.

The Meredith IT team found itself with a lot of Time data on its hands and in need of storage consolidation, because the combined company was using storage from a variety of vendors. Meredith was upgrading its own Des Moines, Iowa, data center at the time, and Coffman’s team standardized technology across legacy Time and Meredith. It dumped most of its traditional IT gear and added newer technology built around virtualization, convergence and the cloud.

Although Meredith divested some of Time’s best-known publications, it now publishes People, Better Homes and Gardens, InStyle, Southern Living and Martha Stewart Living. The company also owns 17 local television stations and other properties.

The goal is to reduce its data centers to two major sites, in New York and Des Moines, with the same storage, server and data protection technologies. The sites can serve as disaster recovery (DR) sites for each other. Meredith’s storage consolidation resulted in implementing Nutanix hyper-converged infrastructure for block storage and virtualization, Rubrik data protection, and a combination of Nasuni and NetApp for file storage.

“I’ve been working to merge two separate enterprises into one,” Coffman said. “We decided we wanted to go with cutting-edge technologies.”

At the time of the merger, Meredith used NetApp-Cisco FlexPod converged infrastructure for primary storage, while Time had Dell EMC and Hitachi Vantara storage in its New York and Weehawken, N.J., data centers. Both companies backed up with Veritas NetBackup software. Meredith had a mixture of tape and NetBackup appliances, and Time used tape and Dell EMC Data Domain disk backup.

By coincidence, both companies were doing proofs of concept with Rubrik backup software on integrated appliances and were happy with the results.

Meredith installed Rubrik clusters in its Des Moines and New York data centers, as well as in a large Birmingham, Alabama, office after the merger. The clusters protect Nutanix clusters at all of those sites.

“If we lost any of those sites, we could hook up our gear to another site and do restores,” Coffman said.

Meredith also looked at Cohesity and cloud backup vendor Druva while evaluating Rubrik Cloud Data Management. Coffman and Michael Kientoff, senior systems administrator of data protection at Meredith, said they thought Rubrik had the most features and they liked its instant restore capabilities.

Coffman said Cohesity was a close second, but he didn’t like that Cohesity includes its own file system and bills itself as secondary storage.

“We didn’t think a searchable file system would be that valuable to us,” Coffman said. “I didn’t want more storage. I thought, ‘These guys are data on premises when I’m already getting yelled at for having too much data on premises.’ I didn’t want double the amount of storage.”

Coffman swept out most of the primary storage and servers from before the merger. Meredith still has some NetApp for file storage, and Nasuni cloud NAS for 2 PB of data that is shared among staff in different offices. Nasuni stores data on AWS.

Kientoff is responsible for protecting the data across Meredith’s storage systems.

“All of a sudden, my world expanded exponentially,” he said of the Time aftermath. “I had multiple NetBackup domains all across the world to manage. I was barely keeping up on the NetBackup domain we had at Meredith.”

Coffman and Kientoff said they were happy to be rid of tape, and found Rubrik’s instant restores and migration features valuable. Instead of archiving to tape, Rubrik moves data to AWS after its retention period expires.

Rubrik’s live mount feature can recover data from a virtual machine in seconds. This comes in handy when an application running in a VM dies, but also for migrating data.

However, that same feature is missing from Nutanix. Meredith is phasing out VMware in favor of Nutanix’s AHV hypervisor to save money on VMware licenses and to have, as Coffman put it, “One hand to shake, one throat to choke. Nutanix provided the opportunity to have consolidation between the hypervisor and the hardware.”

The Meredith IT team has petitioned for Nutanix to add a similar live mount capability for AHV. Even without it, though, Kientoff said backing up data from Nutanix with Rubrik beats using tapes.

“With a tape restore, calling backup tapes from off-site, it might be a day or two before they get their data back,” he said. “Now it might take a half an hour to an hour to restore a VM instead of doing a live mount [with VMware]. Getting out of the tape handling business was a big cost savings.”

The Meredith IT team is also dealing with closing smaller sites around the country to get down to the two major data centers. “That’s going to take a lot of coordinating with people, and a lot of migrations,” Coffman said.

Meredith will back up data from remote offices locally and move it across the WAN to New York or Des Moines.

Kientoff said Rubrik’s live restore capability is a “killer feature” for the office consolidation project. “That’s where Rubrik has really shone for us,” he said. “We recently shut down a sizeable office in Tampa. We migrated most of those VMs to New York and some to Des Moines. We backed up the cluster across the WAN, from Tampa to New York. We shut down the VM in Tampa, live mounted in New York, changed the IP address and put it on the network. There you go — we instantly moved VMs from one office to another.”
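
To make that sequence concrete, here is a minimal sketch in Python of the cutover flow Kientoff describes. The REST endpoints, payloads and identifiers are hypothetical stand-ins, not Rubrik’s actual API; the point is the order of operations: replicate the backup across the WAN, power off the source VM, live mount the snapshot at the target site, then re-address the VM for the new network.

# Illustrative only: endpoint paths, payloads and IDs below are invented,
# not Rubrik's actual API. The sketch shows the order of operations.
import requests

SRC = "https://backup.tampa.example.com/api"    # hypothetical source cluster
DST = "https://backup.newyork.example.com/api"  # hypothetical target cluster
AUTH = {"Authorization": "Bearer <token>"}      # placeholder credentials

def migrate_vm(vm_id: str, new_ip: str) -> None:
    # 1. Replicate the VM's latest backup across the WAN to the target site.
    requests.post(f"{SRC}/replicate", json={"vm": vm_id, "target": DST},
                  headers=AUTH)

    # 2. Power off the VM at the source site so no new writes are lost.
    requests.post(f"{SRC}/vm/{vm_id}/power_off", headers=AUTH)

    # 3. Live mount the replicated snapshot at the target site; the VM boots
    #    from backup storage in minutes instead of waiting on a full restore.
    mount = requests.post(f"{DST}/vm/{vm_id}/live_mount", headers=AUTH).json()

    # 4. Re-address the mounted VM for the target site's network.
    requests.post(f"{DST}/vm/{mount['mounted_vm_id']}/set_ip",
                  json={"ip": new_ip}, headers=AUTH)

migrate_vm("vm-tampa-042", new_ip="10.20.30.42")  # hypothetical identifiers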


Salesforce acquisition of Tableau finally getting real

LAS VEGAS — It’s been more than five months since the Salesforce acquisition of Tableau was first revealed, and most of that time has been spent waiting.

Even after the deal closed on Aug. 1, a regulatory review in the United Kingdom about how the Salesforce acquisition of Tableau might affect competition held up the integration of the two companies.

In fact, it wasn’t until last week, on Nov. 5, after the go-ahead from the U.K. Competition and Markets Authority (CMA) — exactly a week before the start of Tableau Conference 2019, the vendor’s annual user conference — that Salesforce and Tableau were even allowed to start speaking with each other. Salesforce’s big Dreamforce 2019 conference is Nov. 19-22.

Meanwhile, Tableau didn’t just stop what it was doing. The analytics and business intelligence software vendor continued to introduce new products and update existing ones. Just before Tableau Conference 2019, it rolled out a series of new tools and product upgrades.

Perhaps most importantly, Tableau revealed an enhanced partnership agreement with Amazon Web Services, called Modern Cloud Analytics, that will help Tableau’s many on-premises users migrate to the cloud.

Andrew Beers, Tableau’s chief technology officer, discussed the recent swirl of events in a two-part Q&A.

In Part I, Beers reflected on Tableau’s product news, much of it centered on new data management capabilities and enhanced augmented intelligence powers. In Part II, he discusses the Salesforce acquisition of Tableau and what the future might look like now that the $15.7 billion purchase is no longer on hold.

Will the Salesforce acquisition of Tableau change Tableau in any way?

Andrew Beers: It would be naïve to assume that it wouldn’t. We are super excited about the acceleration that it’s going to offer us, both in terms of the customers we’re talking to and the technology that we have access to. There are a lot of opportunities for us to accelerate, and as [Salesforce CEO] Marc Benioff was saying [during the keynote speech] on Wednesday, the cultures of the two companies are really aligned, the vision about the future is really aligned, so I think overall it’s going to mean analytics inside businesses is just going to move faster.

Technologically speaking, are there any specific ways the Salesforce acquisition of Tableau might accelerate Tableau’s capabilities?


Beers: It’s hard to say right now. Just last week the CMA [order] was lifted. There was a big cheer, and then everyone said, ‘But wait, we have two conferences to put on.’

Have you had any strategic conversations with Salesforce in just the week or so since regulatory restrictions were lifted, even though Tableau Conference 2019 is this week and Salesforce Dreamforce 2019 is next week?

Beers: Oh sure, and a lot of it has been about the conferences of course, but there’s been some early planning on how to take some steps together. But it’s still super early.

Users, of course, worry that what they love about Tableau might get lost as a result of the Salesforce acquisition of Tableau. What can you say to alleviate their worries?

Beers: The community that Tableau has built, and the community that Salesforce has built, they’re both these really excited and empowered communities, and that goes back to the cultural alignment of the companies. As a member of the Tableau community, I would encourage people to be excited. To have two companies come together that have similar views on the importance of the community, the product line, the ecosystem that the company is trying to create, it’s exciting.

Is the long-term plan — the long-term expectation — for Tableau to remain autonomous under Salesforce?


Beers: We’ve gone into this saying that Tableau is going to continue to operate as Tableau, but long-term, I can’t answer that question. It’s really hard for anyone to say.

From a technological perspective, as a technology officer, what about the Salesforce acquisition of Tableau excites you — what are some things that Salesforce does that you can’t wait to get access to?

Beers: Salesforce spent the past 10 or so years changing into a different company, and I’m not sure a lot of people noticed. They went from being a CRM company to being this digital-suite-for-the-enterprise company, so they’ve got a lot of interesting technology. Just thinking of analytics, they’ve built some cool stuff with Einstein. What does that mean when you bring it into the Tableau environment? I don’t know, but I’m excited to find out. They’ve got some interesting tools that hold their whole ecosystem together, and I’m interested in what that means for analysts and for Tableau. I think there are a lot of exciting technology topics ahead of us.

What about the conversations you might have with Salesforce technology officers, learning from one another? Is that exciting?

Beers: It’s definitely exciting. They’ve been around — a lot of that team has different experience than us. They’re experienced technology leaders in this space and I’m definitely looking forward to learning from their wisdom. They have a whole research group that’s dedicated to some of their longer term ideas, so I’m looking forward to learning from them.

You mentioned Einstein Analytics — do Tableau and Einstein conflict? Are they at odds in any way, or do they meld in a good way?

Beers: It’s still early days, but I think you’re going to find that they’re going to meld in a good way.

What else can you tell the Tableau community about what the future holds after the Salesforce acquisition of Tableau?

Beers: We’re going to keep focused on what we’ve been focusing on for a long time. We’re here to bring interesting innovations to market to help people work with their data, and that’s something that’s going to continue. You heard Marc Benioff and [Tableau CEO Adam Selipsky] talk about their excitement around that [during a conference keynote]. Our identity as a product and innovation company doesn’t change, it just gets juiced by this. We’re ready to go — after the conferences are done.


SuccessFactors customers to see big Qualtrics impact

LAS VEGAS — At the SAP SuccessFactors customer conference, SAP’s $8 billion Qualtrics acquisition seemed like the tail wagging the dog. Employee experience was such a central theme that SuccessFactors may rebrand HCM as HXM — Human Experience Management.

It may have been a lot for SuccessFactors customers to take in.

Some SuccessFactors customers are measuring employee experience with deeper analysis of employee behavior, such as time to complete certain tasks. But others, who were not Qualtrics users, were still assessing its capabilities.

What SAP made clear is that Qualtrics is important to the future of SuccessFactors.

Qualtrics “allows us now to really rethink almost every transaction in every application that we’re investing in,” Greg Tomb, president of SAP SuccessFactors, said at a meeting with press and analysts at SuccessConnect 2019.

Qualtrics sells an “experience management” or XM platform. It captures and measures employee experience (EX), product experience (PX), customer experience (CX) and brand experience (BX). The platform can combine experience data with a company’s operational data. 
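
As a simple illustration of what combining the two kinds of data can look like in practice, the sketch below joins survey responses (X data) with operational records (O data) on an employee key. The column names and figures are invented for the example and do not reflect the actual Qualtrics or SAP data models.

# Illustrative only: joining "X" (experience) data with "O" (operational)
# data on an employee key. Column names and values are invented.
import pandas as pd

x_data = pd.DataFrame({  # survey responses: experience data
    "employee_id": [1, 2, 3],
    "onboarding_satisfaction": [4, 2, 5],  # 1-5 scale
})
o_data = pd.DataFrame({  # HR system records: operational data
    "employee_id": [1, 2, 3],
    "days_to_complete_onboarding": [12, 30, 9],
    "helpdesk_tickets_opened": [1, 6, 0],
})

combined = x_data.merge(o_data, on="employee_id")
# Low satisfaction paired with long completion times and many tickets
# points analysts at the specific process steps worth fixing.
print(combined.corr(numeric_only=True))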

Hernan Garcia, vice president of talent and culture at Tecnológico de Monterrey, a university in Mexico, illustrated the use of sophisticated employee experience measurement. Garcia’s team studies employee experience as well as the efficiency of a process, including the time it takes to complete something.

“We measure both how they feel, how they interact, but also how much time, how many clicks, how many people they need to touch” to complete something, Garcia said during a press and analyst meeting. The school can improve the experience of employees by directly making changes to processes that affect it, he said.

On Tuesday, the university was awarded SAP’s annual 2019 Klaus Tschira HR Innovation Award, which is named after an SAP co-founder. The university has about 31,000 employees and 160,000 students.

SuccessFactors is delivering some Qualtrics integrations, such as with employee records. It is also building capability to integrate with SAP Analytics Cloud so that companies can include both “X” or experience data and “O” or operational data in their analytics programs, said Amy Wilson, head of products and application engineering at SuccessFactors.

The SuccessFactors and Qualtrics integration work will continue into next year. For now, SuccessFactors and Qualtrics are separate applications, but “seamless,” Wilson said. SAP’s ultimate plan is to embed Qualtrics into SuccessFactors, she said.

But the employee experience discussion can’t just focus on X and O data. It must also account for the major workforce changes looming, said Vera Cuevas, a SuccessFactors user and HRIS senior manager at a technology firm she asked not to be named.

“There’s probably going to be a lot of jobs across a number of different industries that might go away, that might be automated,” Cuevas said. “It will be interesting to see how you retain that employee engagement while at the same time you are moving employees in different jobs, or in some cases eliminating industries.”

Another attendee, Catrena Hairston, a senior learning professional at a U.S. government agency, said the ability to use both experience and operational data makes sense and may be useful. But she will have to see it in action. “I’m not into vaporware, so I’ll have to see if it works with our data,” Hairston said.


CenturyLink acquires Streamroot, adding P2P CDN capabilities

CenturyLink is looking to grow its content delivery network capabilities with the acquisition of privately held Streamroot Inc. Financial terms of the deal were not disclosed.

Streamroot’s technology provides a peer-to-peer (P2P) mesh approach for video content delivery applications. The advantage of the P2P content delivery network (CDN) approach, according to Streamroot, is that it can potentially reach underserved markets and enable an alternative system for content delivery.

The deal was made public on Tuesday.

P2P CDNs are a fairly small business right now, and CenturyLink’s acquisition of Streamroot won’t change the CDN landscape, said 451 Research analyst Craig Matsumoto. That said, for CenturyLink, a P2P CDN capability is a nice, low-risk way to extend reach into different markets, especially internationally, he added.

“Think of live sports. Someone broadcasting a World Cup match is probably going to use multiple CDNs. So, if CenturyLink can claim extended reach into underserved areas, that’s a differentiator,” Matsumoto said.

Overall, he said, it’s known that P2P CDN technology can work at scale; to date, it’s been more a matter of finding use cases where the need is acute enough.

“If the CenturyLink-Streamroot deal works out, I could see the other CDNs working out partnerships or acquisitions with the other P2P startups,” he said.

P2P CDN

In the past, the term P2P was often associated with BitTorrent as a network approach that uses the power of devices in the network to share data.

Streamroot’s P2P CDN is completely unlike BitTorrent in that it gives premium content providers complete control: Only users who have accepted a provider’s terms of use can benefit from, and contribute to, the user experience improvements achieved by joining a mesh of similarly licensed users, said Bill Wohnoutka, vice president of global internet and content delivery services at CenturyLink.

“Streamroot’s data science and client heuristics enable connected consumer devices, such as smart phones, tablets, computers, set-top consoles and smart TVs, to participate in the serving of premium content through a secure and private mesh delivery,” Wohnoutka said. “Mesh servers are made from users that demonstrate performance and are created within the boundaries of carrier and enterprise networks to minimize the negative impact of the traffic on the network.”
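
As a rough illustration of how a mesh delivery client might pick peers under the constraints Wohnoutka describes, the sketch below keeps only peers that have accepted the terms of use, sit within the same carrier or enterprise network, and demonstrate adequate performance. All names, fields and thresholds are invented for the example; this is not Streamroot’s actual logic.

# Toy peer-selection heuristic for a P2P mesh CDN client. All names and
# thresholds are invented for illustration; this is not Streamroot's code.
from dataclasses import dataclass

@dataclass
class Peer:
    peer_id: str
    accepted_terms: bool    # only licensed users may join the mesh
    network_id: str         # keep traffic inside one carrier/enterprise network
    throughput_mbps: float  # measured upload performance
    latency_ms: float

def eligible_peers(peers: list[Peer], local_network: str,
                   min_mbps: float = 2.0,
                   max_latency: float = 80.0) -> list[Peer]:
    """Keep opted-in peers on the local network with adequate measured
    performance, ranked fastest first."""
    ok = [p for p in peers
          if p.accepted_terms
          and p.network_id == local_network
          and p.throughput_mbps >= min_mbps
          and p.latency_ms <= max_latency]
    return sorted(ok, key=lambda p: p.throughput_mbps, reverse=True)

# A client would fetch segments from the top-ranked peers and fall back to
# the origin CDN whenever the mesh cannot serve a segment in time.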

Streamroot and CenturyLink

While the acquisition is new, Wohnoutka noted that CenturyLink began reselling Streamroot’s mesh delivery service in April 2019. He added that, as over-the-top (OTT) content becomes more pervasive worldwide, CenturyLink felt now was the right time to accelerate innovation and acquire Streamroot.


With the P2P CDN technology, Wohnoutka said the goal is to enable customers to get the most out of CenturyLink’s CDN and other CDNs they may be using, supporting a hybrid CDN approach.

“It is a true last-mile solution that pushes edge computing all the way down to the user device to localize traffic and reduce the pressures that OTT content puts on carrier networks,” he said.

P2P CDNs will also likely benefit from the rollout of 5G access technology. Wohnoutka said, with 5G, there are inherent localization and traffic optimization algorithms embedded in the software, as well as a data science approach to ensure best performance during peak internet traffic and in hard-to-reach locations.

“The direct benefits are realized by the content customer, end user and, importantly, the ISPs [internet service providers] architecting their 5G networks for low latency, high performance and traffic efficiency,” he said.

Wohnoutka noted that CenturyLink’s fiber network already has more than 450,000 route miles of coverage. He added that the company’s CDN business is a key part of continued investment in edge computing capabilities that puts workloads closer to customers’ digital interactions.

“What we are bringing our customers with this acquisition is the advantage of data science and software to help them improve the user experience with rich media content during peak hours on the internet,” Wohnoutka said.


Carbon Black acquisition is ‘compelling’

SAN FRANCISCO — VMware’s acquisition of Carbon Black is “the most compelling security story” Steve Athanas has heard in a while.

“I don’t know any other vendor in the ecosystem that has more visibility to more business transactions happening than VMware does,” said Athanas, VMware User Group president and associate CIO at the University of Massachusetts Lowell.

At its annual user conference, VMware announced new features within Workspace One, its digital workspace product that enables IT to manage virtual desktops and applications, and talked up the enhanced security features the company will gain through its $2.1 billion Carbon Black acquisition. Like Athanas, VMworld attendees welcomed the news.

At the opening keynote for VMworld, VMware CEO Pat Gelsinger speaks about the recent Carbon Black acquisition.

In this podcast, Athanas said Carbon Black could provide endpoint security across an entire organization once the technology is integrated, a promise he said he’s still thinking through.

“Are [chief security officers] going to buy into this model of wanting security from one vendor? I’ve heard CSOs in the past say you don’t do that because if one fails, you want another application to be able to detect something,” he said. “I don’t know where the balance and benefit is between being able to see more through that single view from Carbon Black or to have multiple vendors.”

Aside from the Carbon Black acquisition, Athanas was drawn to newly unveiled Workspace One features aimed at making day-to-day processes easier for end users and for IT and HR admins. A new Employee Experience Management feature enables IT to proactively diagnose whether an end user’s device has been compromised by a harmful email or cyberattack, then block the employee from accessing more company applications to keep the attack from spreading.

Another feature is called Virtual Assistant, which can help automate some of the onboarding and device management aspects of hiring a new employee.

“The Virtual Assistant stuff is cool, but I’m going to reserve judgment on it, because there is a ton of work that needs to go into getting that AI to give you the right answer,” Athanas said.


Merger and acquisition activity is altering the BI landscape

From Qlik’s acquisition of Podium Data in July 2018 through Salesforce’s purchase of Tableau Software in June 2019, the last year in BI has been characterized by a wave of consolidation.


Capped by Salesforce’s $15.7 billion acquisition of Tableau Software on June 10, 2019, and Google’s $2.6 billion purchase of Looker just four days earlier, the BI market over the last year has been marked by a wave of merger and acquisition activity.

Qlik kicked off the surge with its acquisition of Podium Data in July 2018, and, subsequently, made two more purchases.

“To survive, you’re going to have to reach some kind of scale,” said Rick Sherman, founder and managing partner of Athena IT Solutions, in a SearchBusinessAnalytics story in July 2019. “Small vendors are going to be bought or merge with more focused niche companies to build a more complete product.”

It was a little more than a decade ago that a similar wave of merger and acquisition activity reshaped the BI landscape, highlighted by IBM buying Cognos and SAP acquiring Business Objects.

The pace of merger and acquisition activity has slowed since the start of the summer, after the spring flurry of deals that ended with the premium Salesforce paid for Tableau. But more deals could be coming soon, as vendors with a specialized purpose seek partners with complementary capabilities in an attempt to keep pace with competitors that have already filled out their analytics stacks.


Google’s Elastifile buy shows need for cloud file storage

Google’s acquisition of startup Elastifile underscored the increasing importance of enterprise-class file storage in the public cloud.

Major cloud providers have long offered block storage for applications that customers run on their compute services and focused on scale-out object storage for the massively growing volumes of colder unstructured data. Now they’re also shoring up file storage as enterprises look to shift more workloads to the cloud.

Google disclosed its intention to purchase Elastifile for an undisclosed sum after collaborating with the startup on a fully managed file storage service that launched early in 2019 on its cloud platform. At the time, Elastifile’s CEO, Erwan Menard, positioned the service as a complement to Google Cloud Filestore, saying his company’s technology would provide higher performance, greater scale-out capacity and more enterprise-grade features than the Google option.

Integration plans

In a blog post on the acquisition, Google Cloud CEO Thomas Kurian said the teams would join together to integrate the Elastifile technology with Google Cloud Filestore. Kurian wrote that Elastifile’s pioneering software-defined approach would address the challenges of file storage for enterprise-grade applications running at scale in the cloud.

“Google now has the opportunity to create hybrid cloud file services to connect the growing unstructured data at the edge or core data centers to the public cloud for processing,” said Julia Palmer, a vice president at Gartner. She said Google could have needed considerably more time to develop and perfect a scale-out file system if not for the Elastifile acquisition.

Building an enterprise-level, high-performance NFS file system from scratch is “insanely difficult,” said Scott Sinclair, a senior analyst at Enterprise Strategy Group. He said Google had several months to “put Elastifile through its paces,” see that the technology looked good, and opt to buy rather than build the sort of file system that is “essential for the modern application environments that Google wants to sell into.”

Target workloads

Kurian cited examples of companies running SAP and developers building stateful container-based applications that require natively compatible file storage. He noted customers such as Appsbroker, eSilicon and Forbes that use the Elastifile Cloud File Service on Google Cloud Platform (GCP). In the case of eSilicon, the company bursts semiconductor design workflows to Google Cloud when it needs extra compute and storage capacity during peak times, Elastifile has said.

“The combination of Elastifile and Google Cloud will support bringing traditional workloads into GCP faster and simplify the management and scaling of data and compute intensive workloads,” Kurian wrote. “Furthermore, we believe this combination will empower businesses to build industry-specific, high performance applications that need petabyte-scale file storage more quickly and easily.”

Elastifile’s Israel-based engineering team spent four years developing the distributed Elastifile Cloud File System (ECFS). They designed ECFS for hybrid and public cloud use and banked on high-speed flash hardware to prevent metadata server bottlenecks and facilitate consistent performance.

Elastifile emerged from stealth in April 2017, claiming 25 customers, including 16 service providers. Target use cases it cited for ECFS included high-performance NAS, workload consolidation in virtualized environments, big data analytics, relational and NoSQL databases, high-performance computing, and the lift and shift of data and applications to the cloud. Elastifile raised $74 million over four funding rounds, including strategic investments from Dell Technologies, Cisco and Western Digital.

One open question is the degree to which Google will support Elastifile’s existing customers, especially those with hybrid cloud deployments that do not run on GCP. Both Google and Elastifile declined to respond.

Cloud NAS competition

The competitive landscape for the Elastifile Cloud File Service on GCP has included Amazon’s Elastic File System (EFS), Dell EMC’s Isilon on GCP, Microsoft’s Azure NetApp Files, and NetApp on GCP.

“Cloud NAS and cloud file systems are the last mile for cloud storage. Everybody does block. Everybody does object. NAS and file services were kind of an afterthought,” said Henry Baltazar, research director of storage at 451 Research.

But Baltazar said as more companies are thinking about moving their NFS-based legacy applications to the cloud, they don’t want to go through the pain and the cost of rewriting them for object storage or building a virtual file service. He sees Google’s acquisition of Elastifile as “a good sign for customers that more of these services will be available” for cloud NAS.

“Google doesn’t really make infrastructure acquisitions, so it says something that Google would make a deal like this,” Baltazar said. “It just shows that there’s a need.”


Adobe acquisition of Marketo could shake up industry

The potential Adobe acquisition of Marketo could unsettle the customer experience software market and give Adobe, which is mainly known for its B2C products, a substantial network of B2B customers from Marketo.

Adobe is in negotiations to acquire marketing automation company Marketo, according to reports.

“It’s a trend that B2B customers are trying to become more consumer-based organizations,” said Sheryl Kingstone, research director for 451 Research. “Marketo is maybe throwing in the towel in being a lead marketing vendor on its own.”

But, reportedly, talks between Adobe and Marketo’s holding company may not lead to a deal.

Ray Wang, founder of Constellation Research, said the leaks could be coming from Vista Equity Partners, which bought Marketo and took the company private in 2016, in hopes of adding another bidder to the race to acquire Marketo.

“If people think Adobe would buy Marketo, maybe it would get SAP to think about it,” Wang said. “The question is, who needs marketing automation or email marketing? And when you think about the better fit at this moment, it’s SAP.”

When reached for comment, Adobe declined, adding that it does not comment on acquisition rumors or speculation.

Adobe expanding to B2B

Marketo said it had roughly 4,600 customers when it was acquired by Vista Equity. It’s unclear whether Adobe and Marketo have much overlap between customer bases, but there could be product overlap between the software vendors.


Adobe has its Marketing Cloud system, and both vendors offer basic martech features, like lead scoring, lead segmentation, web tracking, SMS marketing, personalized web content and predictive analytics. But an Adobe acquisition of Marketo would allow Adobe to expand into a wider B2B market, while allowing Marketo to offer its users the ability to market more like a B2C vendor using Adobe’s expertise.

“It’s a huge benefit for Marketo when you look at Adobe,” Kingstone said.

“Marketo has struggled in a B2B sense when its customers try to implement an ABM [account-based marketing] strategy,” she said.

Despite any potential overlap with its own products’ marketing capabilities, Adobe could find the chance to break into a pool of nearly 5,000 B2B customers compelling.

“There’s a lot of value in Marketo, and Adobe has been gun shy about entering B2B,” Wang said.

Adobe’s alliance

If the Adobe acquisition reports turn out to be accurate, it would amplify what has already been a busy year for the vendor. In May, Adobe acquired commerce platform Magento for a reported $1.7 billion.

A Reuters report about the Adobe acquisition of Marketo said the likely price would well exceed the $1.8 billion that Vista paid when it took Marketo private.

Over the past few years, industry-leading companies in the CRM and customer experience spaces have sought to build alliances with other vendors.

Adobe and Microsoft have built a substantial partnership and have even gone to market together with products, while Salesforce and Google unveiled their partnership and product integrations last year at Salesforce’s annual Dreamforce conference.

Marketo has been one of the few major martech vendors without an alliance. Combining its technologies with Adobe’s creative suite and potentially Microsoft’s B2B breadth could make a significant imprint on the industry.

“If this is real, then it means Adobe has gotten serious about B2B,” Wang said.

Editor’s note: TechTarget offers ABM and project intelligence data and tools services.

Amazon, Intel, NBCUniversal spill buying secrets at HR Tech 2018

LAS VEGAS — Amazon’s talent acquisition organization has more than 3,500 people, including 2,000 recruiters, and is very interested in testing out new technology. That is probably welcome news to vendors here at HR Tech 2018. But Amazon and other big HR technology users warned against being dazzled by vendors’ products and recommended following a disciplined and tough evaluation process.

“I think it’s important to stay abreast with what’s happening in the market,” said Kelly Cartwright, the head of recruiting transformation at Amazon. “I’m really, really passionate about doing experiments and pilots and seeing whether or not something can work,” she said, speaking on a talent acquisition technology panel at HR Tech 2018.

It’s important to “block out time and take those [vendor] calls and listen to what those vendors have to say because one of them actually might have a solution for you that can be a game changer,” Cartwright said.

A warning about new HR tech

But Cartwright also had a clear warning for attendees at HR Tech 2018: It won’t help to invest in a new technology until “you really clarify” what you want to use it for, she said.

What has to happen first in investigating HR trends and new technologies is to “start with a clear problem that you’re trying to solve for,” Cartwright said. She illustrated her point with example questions: Is the problem improving diversity in the pipeline? Or is it ensuring that there are enough potential candidates visiting your recruiting website?

Endorsing this approach was Gail Blum, manager of talent acquisition operations at NBCUniversal, who appeared with Cartwright on the panel.

Blum said NBCUniversal may not always have the budget for a particular new HR technology, but vendors increasingly are offering free pilots. Companies can choose to take a particular problem “and see if that new tool or vendor has the ability to solve that,” she said.

New HR tech is in abundance at the 2018 HR Technology Conference & Expo.

New tech that doesn’t integrate is next to useless

Critical to any new HR technology is its ability to integrate with existing talent systems, such as an applicant tracking system, Blum said. She wants to know: Will the system have a separate log-in? “That’s always something that we ask upfront with all of these vendors.”

“If you are requiring everyone to go to two different systems, the usage probably isn’t going to be great,” Blum said, adding that was NBCUniversal’s experience with some previous rollouts. If the systems don’t integrate, a new technology addition “isn’t really going to solve your problem in the end,” she said.

There was no disagreement on this panel at HR Tech 2018 about the need to be rigorous with vendors to avoid being taken in by a shiny new technology.


If Intel is going to partner with a talent vendor “it’s a long-term play,” said Allyn Bailey, talent acquisition capability adoption transformation leader at the chipmaker.

“We ask really invasive questions of the vendors,” Bailey said. “The vendors really hate it when we do it.”

But Bailey said Intel will probe a vendor’s stability, its financing and whether it is positioning itself to gather some big-name customers and then sell the business. “That freaks me out because my investment with that vendor is around that partnership to build a very customized solution to meet my needs,” she said.

TechTarget, the publisher of SearchHRSoftware, is a media partner for HR Tech 2018.