
Zoom Chat update takes vendor further beyond video

Zoom has launched a Zoom Chat update that demonstrates the video-conferencing vendor’s ambition to become a one-stop shop for business communications.

Zoom has made the Chat messaging app suitable for a broader range of collaboration scenarios. The more extensive capabilities bolster the product’s role in the vendor’s portfolio, which also includes the expanding cloud calling service, Zoom Phone.

The two products, combined with the vendor’s flagship video conferencing service, show that Zoom wants to become an all-in-one UC provider.

However, to reach that goal, Zoom still has to catch up with rivals. The updates to Zoom Chat unveiled this week underscore that the product remains far behind leading messaging apps in features and integrations.

Zoom added support for reply threads and emoji reactions. The former is an essential tool for keeping messaging channels organized. The app also now lets users send notifications to everyone in a channel and lets admins use channels to make announcements.

Users’ profiles are another new feature, showing each employee’s location, department and job title. Colleagues can link to those profiles by tagging their coworker in a message using “@.”

These are standard features that leading apps like Slack and Microsoft Teams have long supported. Nevertheless, by steadily adding features, Zoom is making its portfolio more attractive to businesses that want to buy communications services from a single vendor, analysts said.

Zoom is trying to get businesses to use more of its products by encouraging its video customers to adopt Zoom Phone. But more aggressively pushing Zoom Chat would bring the vendor into conflict with Slack, a close partner. That’s a scenario Zoom executives previously said they wanted to avoid.

“A Zoom Chat app, integrated with Phone and Meetings, makes a great deal of sense,” said Irwin Lazar, an analyst at Nemertes Research. “I think it will be interesting to see how aggressive Zoom is in the next year with adding features and integrations to its own chat app.”

Zoom is now a telephony service provider in six countries through Zoom Phone, with a beta service available in 11 additional locations. Outside of those countries, businesses can power Zoom Phone using third-party telephony services.

At Zoomtopia 2019, the vendor’s annual user conference in October, CEO Eric Yuan said Zoom Phone was raising Zoom Chat’s profile within the portfolio. 

“The more we focus on voice, we need chat,” Yuan said at the time. “As we sell more and more phone systems, I think chat will be key for us as well.”

Meanwhile, Zoom on Thursday delivered a better-than-expected earnings report, bringing in $166 million in revenue in the three months ended Oct. 31. Valued at roughly $19 billion, Zoom now expects to generate between $609 million and $610 million in revenue for the full year.


4 SD-WAN vendors integrate with AWS Transit Gateway

Several software-defined WAN vendors have announced integration with Amazon Web Services’ Transit Gateway. For SD-WAN users, the integrations promise simplified management of policies governing connectivity among private data centers, branch offices and AWS virtual networks.

Stitching together workloads across cloud and corporate networks is complex and challenging. AWS tackles the problem by making AWS Transit Gateway the central router of all traffic emanating from connected networks.

Cisco, Citrix Systems, Silver Peak and Aruba, a Hewlett Packard Enterprise Company, launched integrations with the gateway this week. The announcements came after AWS unveiled the AWS Transit Gateway at its re:Invent conference in Las Vegas.

SD-WAN vendors lining up quickly to support the latest AWS integration tool didn’t surprise analysts. “The ease and speed of integration with leading IaaS platforms are key competitive issues for SD-WAN for 2020,” said Lee Doyle, the principal analyst for Doyle Research.

By acting as the network hub, Transit Gateway reduces operational costs by simplifying network management, according to AWS. Before the new service, companies had to make individual connections between networks outside of AWS and those serving applications inside the cloud provider.
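To make the hub-and-spoke idea concrete, the sketch below shows the two basic AWS API calls involved: creating a Transit Gateway and attaching a VPC to it. It uses boto3 with placeholder resource IDs and is a generic illustration only, not any of the four vendors' integrations, which automate steps like these from their own SD-WAN controllers.

```python
# Generic illustration of the Transit Gateway hub-and-spoke model with boto3.
# Resource IDs below are placeholders; vendor integrations orchestrate this
# from their own SD-WAN controllers.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the Transit Gateway that acts as the central router.
tgw = ec2.create_transit_gateway(
    Description="Hub for branch, data center and VPC traffic",
    Options={
        "DefaultRouteTableAssociation": "enable",
        "DefaultRouteTablePropagation": "enable",
    },
)
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach an existing VPC. Each additional VPC or VPN needs only its own
# attachment to the hub instead of a full mesh of point-to-point connections.
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",           # placeholder
    SubnetIds=["subnet-0123456789abcdef0"],  # placeholder
)
```

Branch and data center VPNs attach to the same gateway in a similar way, which is why routing and segmentation policy can live in one place rather than in per-connection configurations.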

The potential benefits of Transit Gateway made connecting to it a must-have for SD-WAN suppliers. However, tech buyers should pay close attention to how each vendor configures its integration.

“SD-WAN vendors have different ways of doing things, and that leads to some solutions being better than others,” Doyle said.

What the 4 vendors are offering

Cisco said its integration would let IT teams use the company’s vManage SD-WAN controller to administer connectivity from branch offices to AWS. As a result, engineers will be able to apply network segmentation and data security policies universally through the Transit Gateway.

Aruba will let customers monitor and manage connectivity either through the Transit Gateway or Aruba Central. The latter is a cloud-based console used to control an Aruba-powered wireless LAN.

Silver Peak is providing integration between the Unity EdgeConnect SD-WAN platform and Transit Gateway. The link will make the latter the central control point for connectivity.

Finally, Citrix’s Transit Gateway integration would let its SD-WAN orchestration service connect branch offices and data centers to AWS. The connections will be particularly helpful to organizations running Citrix’s virtual desktops and associated apps on AWS.


HCI storage adoption rises as array sales slip

The value and volume of data keep growing, yet in 2019 most primary storage vendors reported a drop in sales.

Part of that has to do with companies moving data to the cloud. Data is also being redistributed on premises, moving from traditional storage arrays to hyper-converged infrastructure (HCI) and to data protection products that have expanded into data management.

That helps explain why Dell Technologies bucked the trend of storage revenue declines last quarter. A close look at Dell’s results shows its gains came from areas outside of traditional primary storage arrays that have been flat or down from its rivals.

Dell’s storage revenue of $4.15 billion for the quarter grew 7% over last year, but much of Dell’s storage growth came from HCI and data protection. According to Dell COO Jeff Clarke, orders of VxRail HCI storage appliances increased 82% over the same quarter in 2018. Clarke said new Data Domain products also grew significantly, although Dell provided no revenue figures for backup.

Hyper-converged products combine storage, servers and virtualization in one box. VxRail, which relies on vSAN software from Dell-owned VMware running on Dell PowerEdge servers, appears to be cutting into sales of both standalone servers and storage. Dell server revenue declined around 10% year-over-year, roughly in line with rival Hewlett Packard Enterprise’s (HPE) server decline.

“We’re in this data era,” Clarke said on Dell’s earnings call last week. “The amount of data created is not slowing. It’s got to be stored, which is probably why we are seeing a slightly different trend from the compute side to the storage side. But I would point to VxRail hyper-convergence, where we’ll bring computing and storage together, helping customers build on-prem private clouds.”


Dell is counting on a new midrange storage array platform to push storage revenue in 2020. Clarke said he expected those systems to start shipping by the end of January.

Dell’s largest storage rivals have reported a pause in spending, partially because of global conditions such as trade wars and tariffs. NetApp revenue has fallen year-over-year in each of the last three quarters, including a 9.6% dip to $1.38 billion last quarter. HPE said its storage revenue of $848 million dropped 12% from last year. HPE’s Nimble Storage midrange array platform grew 2% and SimpliVity HCI increased 14% year-over-year, a sign that 3PAR enterprise arrays fell and the vendor’s new Primera flagship arrays have not yet generated meaningful sales.


IBM storage has also declined throughout the year, dropping 4% year-over-year to $434 million last quarter. Pure Storage’s revenue of $428 million last quarter increased 16% from last year, but Pure had consistently grown revenue at significantly higher rates throughout its history.

Meanwhile, HCI storage revenue is picking up. Nutanix last week reported a leveling of revenue following a rocky start to 2019. Related to VxRail’s increase, VMware said its vSAN license bookings had increased 35%. HPE’s HCI sales grew, while overall storage dropped. Cisco did not disclose revenue for its HyperFlex HCI platform, but CEO Chuck Robbins called it out for significant growth last quarter.

Dell/VMware and Nutanix still combine for most of the HCI storage market. Nutanix’s revenue ($314.8 million) and subscription ($380.0 million) results were better than expected last quarter, although both numbers were around the same as a year ago. It’s hard to accurately measure Nutanix’s growth from 2018 because the vendor switched to subscription billing. But Nutanix added 780 customers and its 66 deals of more than $1 million were its most ever. And the total value of its customer contracts came to $305 million, up 9% from a year ago.

Nutanix’s revenue shift came after the company switched to a software-centric model. It no longer records revenue from the servers it ships its software on. Nutanix and VMware are the dominant HCI software vendors.

“It’s just the two of us, us and VMware,” Nutanix CEO Dheeraj Pandey said in an interview after his company’s earnings call. “Hyper-convergence now is really driven by software as opposed to hardware. I think it was a battle that we had to win over the last three or four years, and the dust has finally settled and people see it’s really an operating system play. We’re making it all darn simple to operate.”


Container backup grows, following container adoption

The popularity of container deployments is reaching a tipping point where all backup vendors will eventually need to be able to support them, industry experts said.

As container technology adoption increases, the need for container backup grows. Until now, most containers have been stateless and required no backup.

“We’re going to be seeing more stateful containers, buoyed by the fact that now there’s ways to protect them,” said Steven Hill, senior analyst at 451 Research.

Tom Barton, CEO of container storage vendor Diamanti, said he is seeing more customers run containers with persistent storage. When containers replace virtual machines, Barton said, they carry the same data protection and disaster recovery (DR) requirements.

“I think containers will generally displace VMs in the long-run,” Barton said.

Diamanti recently launched the beta version of its Spektra platform, a Kubernetes management layer designed for migrating Kubernetes workloads between on premises and cloud. Spektra enables high availability and DR for Kubernetes workloads, and Barton said Diamanti and its competitors partner with data protection vendors to provide container backup.

Other products that offer container backup include Veritas NetBackup, which introduced its Docker container support at the beginning of this year, and IBM Spectrum Protect, which has recently entered this space by rolling out snapshotting for Kubernetes users.

Hill shared similar beliefs about containers replacing VMs but stressed it will not be a one-for-one replacement. Economics will always play a role, he said: Some applications and workloads will make sense to keep on VMs, while others will belong on containers. The situation will vary between organizations, and it wouldn’t be fair to say containers are strictly better than VMs, or vice versa.

Storware vProtect, seen here in version 3.9, supports a wide variety of hypervisors.

“You never do everything with just the one tool,” Hill said.

Hill also stressed that containers themselves aren’t a mature market or technology yet, and vendors are still waiting to see how organizations are using them. Customers putting mission-critical applications on containers have nudged demand for data protection, backup, recovery, availability and failover — the same kind of capabilities expected in any environment. Vendors are responding to this demand, but the tools aren’t ready yet.

“Protecting stateful containers is still relatively new. The numbers aren’t there to define a real market,” Hill said.

Marc Staimer, president of Dragon Slayer Consulting, said containers still lack the security, flexibility and resilience of VMs. He chalks that up to containers’ lack of maturity. As customers put containers into production, they will realize the technology’s shortcomings, and vendors will develop products and features to address those problems. Staimer said the industry has recently reached a tipping point where there’s enough container adoption to catch vendor interest.

Staimer acknowledged that when containers mature to the same point where hypervisors are now, there will be widespread replacement. Like Hill, he does not expect it to be a wholesale replacement.

“We like to believe these things are winner-takes-all, but they’re not,” Staimer said. “In tech, nothing goes away.”

Staimer said that, from a technical standpoint, container backup has unique problems that differentiate it from traditional server, VM and SaaS application backup. The core problem is that containers don’t have APIs that allow backup software to take a snapshot of a container’s state. Most backup vendors install agents in containers to scan and capture what they need to build a recoverable snapshot. That takes time and resources, which runs counter to the intent of containers as lightweight alternatives to VMs.

Trilio CEO David Safaii said installing agents in containers also creates extra hassle for developers because they have to go through an IT admin to conduct their backups. He said there’s a “civil war” between IT managers and DevOps. IT managers need to worry about data protection, security and compliance. These are all important and necessary measures, but they can get in the way of the DevOps philosophy of continuous, agile development.

Trilio recently launched the beta program for its TrilioVault for Kubernetes, which is an agentless container backup offering. Asigra similarly performs container backup without using agents, as does Poland-based Storware’s vProtect.
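The article doesn't describe how these agentless products work internally, but a common agentless pattern is to snapshot the persistent volume behind a stateful container through Kubernetes' CSI VolumeSnapshot API rather than run an agent inside the container. The sketch below, using the Kubernetes Python client with placeholder names, is meant only to illustrate that general pattern, not any vendor's implementation.

```python
# Hedged sketch of an agentless, volume-level snapshot request in Kubernetes.
# Assumes a CSI driver with snapshot support is installed; class, PVC and
# namespace names are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a cluster
api = client.CustomObjectsApi()

snapshot = {
    "apiVersion": "snapshot.storage.k8s.io/v1",  # older clusters use v1beta1
    "kind": "VolumeSnapshot",
    "metadata": {"name": "db-backup-2019-12-06"},
    "spec": {
        "volumeSnapshotClassName": "csi-snapclass",          # placeholder class
        "source": {"persistentVolumeClaimName": "db-data"},  # PVC to protect
    },
}

# Create the snapshot object; the CSI driver does the actual point-in-time copy.
api.create_namespaced_custom_object(
    group="snapshot.storage.k8s.io",
    version="v1",
    namespace="default",
    plural="volumesnapshots",
    body=snapshot,
)
```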

Storware vProtect started in the container backup space by focusing on open platforms first, protecting Red Hat OpenShift and Kubernetes projects. Storware CTO Paweł Mączka said no one asked for container data protection in the early days because container workloads were microservices and applications.

Mączka said customers now use containers as they would VMs. DevOps teams put databases in containers, shifting them from stateless to stateful. However, Mączka doesn’t see containers taking over and proliferating to the same point as hypervisors such as VMware vSphere and Microsoft Hyper-V, which vProtect only started supporting in its latest version 3.9 update.

“I don’t think they’ll rule the world, but it’s important to have the [container backup] feature,” Mączka said.


Pure Storage cloud sales surge, but earnings miss the target

Add Pure Storage to the list of infrastructure vendors sensing a softening in global demand. The all-flash pioneer put the best face on last quarter’s financial numbers, focusing on solid margins and revenue while downplaying its second earnings miss in the last three quarters.

Demand for Pure Storage cloud services boosted revenue to $428.4 million for the quarter that ended Oct. 31. That’s up 15% year over year, but lower than the $440 million expectation on Wall Street.

Pure Storage launched as a startup in 2009 and has grown steadily to a publicly traded company with $1.5 billion in revenue. On Pure’s earnings call last week, CEO Charles Giancarlo blamed the revenue miss on declining flash prices. Giancarlo said U.S. trade tensions with China and uncertainty surrounding Brexit create economic headwinds for infrastructure vendors — concerns also voiced recently by rivals Dell EMC and NetApp.

Pure: Looking for bright spot in cloud

Like most major storage vendors, Pure is rebranding to tap into the burgeoning demand for hybrid cloud. Recent additions to the Pure Storage cloud portfolio include Cloud Block Store, which allows users to run Pure’s FlashArray systems in Amazon Web Services, and consumption-based Pure as a Service (ES2), formerly Pure Evergreen.

Pure said deferred licensing revenue of $643 million rose 39%, fueled by record growth of ES2 sales. The Pure Storage cloud strategy resonates with customers that want storage with cloudlike agility, company executives said.

“Data storage still remains the least cloudlike layer of technology in the data center. Delivering data storage in an enterprise is still an extraordinarily manual process with storage arrays highly customized and dedicated to particular workloads,” Giancarlo said.

Pure claims it added nearly 400 customers last quarter, bringing its total to more than 7,000. That includes cloud IT services provider ServiceNow, which implements Pure Storage all-flash storage to underpin its production cloud.

“Companies are realizing IT services are not their main line of business — that a cloud-hosted services model is generally better. We’re right in the middle of that. We build enterprise data services and do all the work to manage the cloud” for corporate customers, Keith Martin, ServiceNow’s director of cloud capacity engineering, told SearchStorage in an interview this year.

Pure will use its increased product margin — which jumped 4.5 points last quarter to 73% — to ensure it “won’t lose on price” in competitive deals, outgoing president David Hatfield said.

A strong pipeline of Pure Storage cloud and on-premises deals gives it the ability to bundle multiple products and sell more terabytes. “It’s just taking a little bit longer from a deal-push perspective, but our win rates are holding nicely,” Hatfield said.

Hatfield said he is stepping away from president duties to deal with a family health issue, but he will remain Pure’s vice chairman and special advisor to Giancarlo. Former Riverbed Technology CEO Paul Mountford was introduced as Pure’s new COO. Kevan Krysler, most recently VMware’s senior vice president of finance and chief accounting officer, will take over in December as Pure’s CFO. He will replace Tim Ritters, who announced his departure in August.


BI for mobile remains a challenge for vendors

While the demand for analytics software grows and vendors old and new race to come up with the next innovations, the majority of vendors’ research and development resources are being dedicated to desktop applications as opposed to BI for mobile devices.

A handful of vendors stand out as exceptions, but mobile apps remain largely underdeveloped by many others.

BI for mobile, simply put, presents a conundrum for developers. Some have chosen to invest in mobile, attacked the challenge and made headway, while others have elected to focus their attention on their desktop applications.

The problem is the screen.

Digestible data is largely visual — it’s charts and graphs, and often more than just one on a single dashboard. Once, it was numbers on a page, but that time is now in the distant past. 

Mobile screens, however, are tiny compared with computer screens. Recreating desktop dashboards doesn’t work particularly well on a mobile device, so recreating the analytic capabilities of a desktop application doesn’t work either.

Instead, the vendors who have developed successful BI for mobile apps have viewed phones and tablets as different entities than desktop computers, and they’ve created a different experience on their mobile apps.

“[The phone] is not an effective tool for doing data analysis,” said Donald Farmer, principal at TreeHive Strategy in Woodinville, Wash. “It’s an effective tool for conveying short, well-formatted, concise insights. The people who have done a good job … have focused on that. They pick out significant things to tell you on the phone in a format that works for a mobile device, but they’re not trying to give you an analytic tool on a mobile device – that wouldn’t be practical or helpful.”


Similarly, the vendors that have developed good mobile apps have developed their apps specifically for mobile devices, noted Mike Leone, a senior analyst with Enterprise Strategy Group in Milford, Mass.

“First and foremost, [a good mobile BI app is] one that is designed from the ground up for a mobile device,” he said. “I’ve seen all too often organizations try and port their applications and [user interfaces] to a mobile device and the results are underwhelming and in some cases unusable.”

The good and the bad

More than a decade ago, in 2008, Yellowfin introduced its first mobile app. The vendor updated it once, and it didn’t attract many users.

But recently the vendor completely overhauled the app. Instead of attempting to recreate the desktop experience, Yellowfin transformed it into a timeline that looks and acts much like a social media feed. The mobile interface now highlights what it deems to be the most pertinent information and presents it in a way mobile users can easily view.

Yellowfin CEO Glen Rabie said that he realized BI for mobile could be effective only if “the content and experience being delivered are uniquely designed for mobile versus trying to force fit a dashboard desktop experience onto a phone.”

MicroStrategy, which started developing its app in 2009, is another vendor that’s invested aggressively in BI for mobile.

“For us, we’ve always been focused on intelligence everywhere, how to arm as many people as possible,” said Hugh Owen, senior vice president of product marketing at MicroStrategy. “Mobile opened up another opportunity to arm people who aren’t looking at BI on their desktop with BI.”

While some BI software vendors have invested in creating effective mobile apps, others have all but ignored mobile innovation or ineffectively tried to simply recreate the desktop experience on mobile devices.

The vendor’s current BI for mobile capabilities enable clients to build custom apps. Retail customers, for example, can embed the ability to execute transactions.

Meanwhile, MicroStrategy offers HyperIntelligence for Mobile as part of its HyperIntelligence product line. Using augmented intelligence and machine learning, the app provides a level of contextual understanding and proactively serves users information cards.

“If you walk into a store, it gives you a card about the store without you asking,” Owen said. “It can look at your calendar, scan through words and invitations, and match it with cards and give you a push note. It’s proven to be a different approach, and it’s helped us stand out.”

Domo, according to Farmer, is another vendor that has learned how to adapt BI for mobile. So have Qlik and Oracle.

But there are many others that have struggled to develop an effective BI for mobile app and have “an unclear strategy with some mobile being done but nothing very exciting and nothing very compelling,” Farmer said.

Innovation

Despite the limits placed on BI for mobile by a phone’s minuscule screen, there remains room for growth.

Phones and tablets have unique capabilities that desktop computers don’t.

Among the features they possess that desktop devices don’t is GPS. And while desktop devices also have cameras, the cameras on mobile phones and tablets are, well, mobile, while the ones on desktops are rooted in place.

Thanks to GPS, for example, someone who has business in multiple locations can travel to a location and — using a well-designed BI for mobile app — get actionable data about that location delivered directly to their mobile device.

“A good mobile app leverages the physical appendages of the phone or tablet,” Owen said. “It’s aware of your location and takes advantage. It uses the camera to take a picture and scan a QR code or other bar code — you wouldn’t do that with a clamshell laptop.”

The next step in the evolution of BI for mobile, according to Owen, is becoming more proactive rather than reactive by using AI and machine learning and learning behavioral patterns.

“It’s presenting answers back to you before you know you need it,” he said.

Vendors will also need to address security as the development of BI for mobile apps progresses.

“All too often, workers will utilize their personal devices for work,” Leone said. “With security top of mind for virtually every organization, ensuring the right level of controls and governance are in place, not just based on a user, but based on the device, will be important going forward.”

Ultimately, however, mobile devices are tools to connect people so that they can converse. The vendors who view them for what they are and develop BI for mobile apps that take advantage of a mobile device’s unique powers are the ones who will set the pace for innovations.

“It’s not a device for deep contemplation and analysis — it’s a device to look at to glimpse to see what’s important — so a really good mobile app does two things,” Farmer said. “It enables the glimpse of what is important, and it enables the communication of that, because ultimately mobile devices are communication devices.”


Clumio backup seeks to simplify with SaaS

As a new vendor’s first customer, the IT leader of a city could be forgiven for worrying about the product.

But Cory Smith, CIO and CTO of Davenport, Iowa, said he didn’t have concerns with using Clumio for backup and recovery. Clumio, which is based in Santa Clara, Calif., came out of stealth Aug. 13 with its cloud-based backup as a service.

Smith said the city was looking for a new backup product earlier this year when Clumio contacted him about trying out a beta version. He said he felt more at ease with the product after using it in beta and performing backups and restores. The city purchased Clumio as soon as it became generally available April 30.

Though Davenport doesn’t have a major cloud initiative, Smith said going cloud-only for backup is a goal.

“This is one of those situations where the cloud is really good for us,” Smith said.

Striving for simplicity

Clumio CEO and co-founder Poojan Kumar aims for his company to do with backup what Salesforce has done with CRM and ServiceNow with service management. He wants to deliver a true service offering that’s completely run and managed in the cloud by Clumio.

“SaaS is taking over,” Kumar said. “Our founding vision was really around going and building a data management platform on top of the public cloud.”


Kumar said he wants customers to get away from the complex nature of installing software and hardware for backup. In addition, as workloads are moving to the cloud, the practice of using multiple accounts, regions and clouds is increasing complexity.

“We saw all of this as an opportunity for simplification,” Kumar said.

To start, Clumio protects on-premises and cloud-based VMware environments on top of AWS. It also provides native AWS backup for accounts that run Elastic Compute Cloud and Elastic Block Store.

The majority of backup vendors were “born in the world of on premises,” delivering protection through software, hardware or both, which the customer has to manage, Kumar said. He said legacy backup players cannot take advantage of the public cloud “the right way” by building a cloud-native architecture and true SaaS platform.

“By SaaS, I mean a true service that is multi-tenant that frees the customer from the mundane of managing these solutions,” Kumar said.

Andrew Smith, research manager at IDC, noted that Clumio customers don’t need to use anything on premises. They can simply spin up the virtual appliance and start using Clumio. The vendor says it takes 15 minutes to get the product running.

“The way they’re approaching backup as a service as an all-inclusive platform is unique,” Smith said. “The idea is to ‘SaaSify’ the entire backup environment.”

Davenport’s Smith said even with his larger environment — about 70 VMs and 40 TB worth — getting to the cloud was not an issue.

The city, with a population of about 100,000, has to retain some data indefinitely. For example, police video — a data set that’s often large — could be critical in court 10 years from now.

“The city’s not going to go out of business,” he said. “I’ve got to keep it.”

Smith said its price is an advantage. Because Clumio charges per virtual machine rather than by the size of the VM, the cost does not rise as a VM grows larger.


A look at current and future features

Smith said Davenport was looking for a new backup system because its Rubrik platform wasn’t performing well enough, especially with getting data sets to the cloud. The city wanted to get away from running hardware on premises and using traditional disaster recovery, and sought a cheap cloud service.

“Clumio has kind of hit that niche,” Smith said.

He acknowledged that the product is not yet mature and Clumio is still adding functionality. He said he’s looking for the vendor to add Microsoft Exchange and SQL support. Davenport still uses old Veeam licensing for Exchange and SQL Server, but Smith said he thinks eventually the city will only use Clumio for backup. He said he finds the interface and search easy to use.

On the security side, Clumio’s backups are encrypted in transit and at rest. Kumar said their immutability is especially important in the face of data protection threats like ransomware.

“You know that you can go back to the copy [and] it’s going to be kosher,” Kumar said. “We do a whole bunch of things automatically in the platform to make sure that it is restorable to the previous copy. It’s not just about backing it up — it’s about making sure it is restorable.”

Kumar said he expects Clumio will delve into machine learning to help look at potential issues with customer data.

Funding, founders, fighting status quo

Clumio has $51 million in funding over two rounds. Sutter Hill Ventures led the Series A round. Index Ventures drove the Series B round, which also had significant participation from Sutter Hill Ventures.

The company was founded in 2017. Kaustubh Patil, vice president of engineering, and CTO Woon Ho Jung were the other founders with Kumar. All three founders previously worked at VMware, Nutanix and PernixData. Kumar was a founder of PernixData, which was acquired by Nutanix.

Clumio has about 75 employees, Kumar said.

The product is sold exclusively through the channel.

IDC’s Smith said competition will include Veeam, Zerto, Rubrik and Cohesity, as well as the more traditional backup vendors such as Veritas, Dell EMC and Commvault. Druva and Carbonite are also leaders in cloud-based backup.

“They’re going to have to compete with everybody,” Smith said. “It’s going to be pretty difficult.”

It will be important for Clumio to attract customers moving all of their data to the cloud, Smith said, as well as users tackling multi-cloud environments and the added complexity they bring.

Kumar said his biggest competition is the status quo.

“It’s going to be about educating the market that something like this is possible,” Kumar said. “And we can give you freedom from the mundane.”


Analyst forecasts the next big things in BI

Consolidation among business intelligence vendors is driven by what’s perceived to be the next big things in BI, and that was the case during the run of merger and acquisition deals during the first half of 2019.

According to Wayne Eckerson, founder and principal consultant of Eckerson Group, self-service analytics was a key part of what made Looker and Tableau attractive to Google and Salesforce, respectively. When the next consolidation wave hits, according to Eckerson, augmented intelligence could be a big driver, as could cloud-based BI tools — and they will be viewed as the next big things in BI.

Eckerson has more than 25 years of experience in the BI software market and is the author of two books — Secrets of Analytical Leaders: Insights from Information Insiders and Performance Dashboards: Measuring, Monitoring, and Managing Your Business.

In the second part of a two-part Q&A, Eckerson talks about the driving forces behind the recent merger and acquisition deals, self-service analytics, and what excites him about the future of BI. In the first part, Eckerson discusses the divide between enterprises that use data and those that don’t, as well as the importance of DataOps and data strategies and how they play into the data divide.

Among other trends, a wave of consolidation over the last six to 12 months has left fewer vendors but ones with more end-to-end capabilities. What do you see as the next big things in BI that might spark the next wave?

Eckerson: It definitely goes in cycles — we’ve seen this consolidation before. The last big one was in 2007-08 when the three biggest BI players — Business Objects, Cognos, Hyperion — were bought by large application vendors SAP, IBM and Oracle, respectively. Usually these cycles are based on the advent of new technology that’s come into the market. In 2002, we moved from client-server to the web, and now we’re in the age of the cloud and self-service, and Looker and Tableau caught the self-service wave with visualization and desktop tools. The next big disruption to the BI market is the cloud. We’re seeing a lag between when these new BI technologies fully mature on these new platforms and when they get purchased and the market consolidates. If we’re to project out, maybe we’ll see some consolidation around cloud-based, AI-based BI tools where things are much more automated, things are in the cloud, and maybe it’s all embedded and you won’t even notice the BI tools. That’s probably the next wave in five or 10 years.

One thing that jumps out about first Google’s acquisition of Looker and then even more so with Salesforce’s purchase of Tableau is the price. Why are companies suddenly paying so much for BI vendors?

Eckerson: They’ve always gone for a premium, but now the premium is in the billions and not the hundreds of millions. We’re in this data-driven age now, and these are the tools that the business users touch and feel and use, so that maybe gives them a higher premium than middleware or database technology that’s behind the scenes. Tableau has been a meteor, and they probably sold at the right time for them. They’re under duress now from competitors. I think it’s just a testimony to how much data is front and center to the way businesses operate in today’s environment.

Ease of use to make data available to the citizen data scientist has been a significant push. Do you see self-service analytics taking over, or will there always be some things that are just too big and complex for average users and self-service BI will just be part of the picture?

Eckerson: Self-service is an interesting topic because there’s been so much frustration with IT, the IT bottleneck and delivering new applications for analytics that businesses wanted self-service just to get away from IT, but in the end what you really want is a blend of both. There are things that are too complex for the average business person to create on their own. If you want to build an enterprise unit for everybody, no business unit is going to do that alone, so you’d need a central group just for that. And then every division has some complex custom apps that need to be built, so you’ll need a corporate development team to build cutting-edge applications that will really help the company compete. On a day-to-day basis every business unit needs its core data analysts and data scientists to be looking into data to help optimize decisions, help optimize business processes, respond creatively and quickly to events as they happen on the ground, to win business, to avoid losing business, to manage risk — all that stuff. The self-service is really the agile, innovative arm of the business, whereas as the corporate IT team is the run-the-business operational side that will build stuff that’s needed on a long-term basis. You need both sides to operate effectively.

As you analyze the BI industry, what are the next big things in BI that get you excited?

Eckerson: I am excited about AI for BI — it’s really transforming the way people are using data to make decisions, and it’s going to transform these BI tools. Before you needed a hypothesis of what to look for when you’re doing an analysis, and now the tools will dig into the data for you. They’ll do thousands of drill-downs in a matter of seconds and expose and surface only the most relevant correlations for you to look at. That’s pretty interesting. DataOps is pretty interesting, because that will fix the back end — the data that’s being delivered into these analytical tools. I think time-series analytics is the next big wave that we’ll see hit the marketplace. Especially as the internet of things and big data take hold, companies can use time-series analytics to automate decisions. The intersection of time-series analytics, AI and cloud-based computing with its infinite storage and elasticity — the combination of those things is going to bring about a sea change. There’s a lot to be excited about in our space.

Editors’ note: This interview has been edited for clarity and conciseness.


Nuage Networks, Talari SD-WAN tack on multi-cloud connectivity

Software-defined WAN vendors are rushing to enhance their SD-WAN platforms with multi-cloud support, as more enterprises and service providers migrate their workloads to the cloud. This week, both Nuage Networks and Talari made multi-cloud connectivity announcements of their own.

Nuage Networks, a Nokia company, updated its SD-WAN platform — Virtualized Network Services — to better support SaaS and multi-cloud connectivity.

The platform enhancement addresses three specific customer pain points, according to Hussein Khazaal, Nuage’s vice president of marketing and partnerships: multi-cloud connectivity, value-added services and end-to-end security. All three capabilities are already available to customers.

“It’s a single platform that you can deploy today and get connectivity to software as a service,” Khazaal said. “We support customers as they send traffic directly from the branch to the SaaS application.”

In addition to multi-cloud connectivity, Nuage VNS offers customers the option to add value-added services — or virtual network functions (VNFs) — that can be embedded within the SD-WAN platform, hosted in x86 customer premises equipment (CPE) or through service chaining (a set of network services interconnected through the network to support an application). These VNFs are available from more than 40 third-party partners and can include services like next-generation firewalls, voice over IP and WAN optimization, Khazaal said.

While many service providers are leaning toward the VNF and virtual CPE approach, the process isn’t simple, according to Lee Doyle, principal analyst at Doyle Research.

“Many service providers are finding the vCPE and VNF approach to be challenging,” Doyle said. “Those with the resources can, and will, pursue it, and that’s where Nuage could be a piece of the puzzle.”

When it comes to enterprise customers, however, the VNF approach is less attainable, both Doyle and Khazaal noted.

“Nuage is one piece of the puzzle that a customer might add if they’re able to do it themselves,” Doyle said. “But most customers don’t want to piece together different elements.”

For smaller enterprise customers, Khazaal recommended using the option with embedded features, like stateful firewall and URL filtering, built into the SD-WAN platform.

Although Nuage has more than 400 enterprise customers, according to a company statement, its primary market is among service providers. Nuage counts more than 50 service providers as partners that offer managed SD-WAN services — including BT, Cogeco Peer 1, Telefónica and Vertel — and has been a proven partner for service providers over the years, Doyle said.

“Nuage is a popular element of service providers’ managed services strategies, including SD-WAN,” he said. “These enhancements will be attractive mainly to the service providers.”

Nuage VNS is available now with perpetual and subscription-based licenses; pricing varies based on the desired features and capabilities.

Talari launches Cloud Connect for SaaS, multi-cloud connectivity

In an additional multi-cloud move, Talari updated its own SD-WAN offering with Talari Cloud Connect, a platform that supports access to cloud-based and SaaS applications.

Talari also named five accompanying Cloud Connect partners: RingCentral, Pure IP, Evolve IP, Meta Networks and Mode. These partners will run Talari’s Cloud Connect point of presence (POP) technology in their own infrastructure, creating a tunnel from the customer’s Talari software into the cloud or SaaS service, according to Andy Gottlieb, Talari’s co-founder and chief marketing officer.

“The technology at the service provider is multi-tenant, so they only have to stand up one instance to support multiple customers,” Gottlieb said. Meantime, enterprises can use the Cloud Connect tunnel without having to worry about building infrastructure in the cloud, which reduces costs and complexity, he added.

Talari’s partner list reflects the demands of both customers and service providers, he said. Unified communications vendors like RingCentral, for example, require reliable connectivity and low latency for their applications. Meta Networks, on the other hand, offers cloud-based security capabilities, which enterprises are increasingly adding to their networks. Talari SD-WAN already supports multi-cloud connectivity to Amazon Web Services and Microsoft Azure.

Talari Cloud Connect will be available at the end of October. The software comes at no additional charge for Talari customers with maintenance contracts or with subscriptions, Gottlieb said. Also, Cloud Connect partners can use the Cloud Connect POP software free of charge to connect to Talari SD-WAN customers, he added.

HR chatbots from Google, IBM to be in the spotlight at HR Tech 2018

The role of big vendors, such as Google and IBM, in HR technology is expanding as their expertise in conversational robotic intelligence powers some of the chatbots used in HR applications. That observation will be evident this week at the HR Technology Conference & Expo in Las Vegas where HR chatbots will be in the spotlight.

The tech giants’ relationship to HR chatbots is analogous to Intel’s role with PC makers that slap “Intel Inside” stickers on their laptops. The machine learning and natural language processing (NLP) technologies developed by large technology sellers give chatbots conversational capabilities.

“A chatbot stands and falls with the quality of the dialogue,” said Holger Mueller, principal analyst at Constellation Research. “Users will drop and not use [a chatbot] if the answers don’t make sense,” he said.

Conference attendees assessing HR chatbots, in effect, make two bets on any one application. They not only evaluate the HR application but also the capabilities of the vendor that built the underlying, AI-related chatbot technology, whether it’s from Amazon, Microsoft, IBM, Google or some other provider. This technology is key “for the whole solution to work,” Mueller said.

Google’s new Dialogflow powers conversational recruiting


Earlier this year, Google, for instance, announced general availability of its Dialogflow Enterprise Edition, the company’s platform for creating voice and text conversational interfaces, based on its machine learning and NLP development.

Google’s technology was adopted by Brazen Technologies, which provides online hiring chat events and a recruiting platform. In late August, Brazen announced a “conversational recruiting” capability based on Google’s system, which provides the underlying chatbot intelligence.

The chatbot conversational capability is assisted by human recruiters who prewrite answers to expected questions that a candidate might ask. The system also conducts an initial screening to try to find qualified people, said Joe Matar, director of marketing at Brazen. He expects the capabilities of conversational HR chatbots to improve rapidly, but it will be a long time before they replace a recruiter’s core skills, such as relationship building, he said.
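Brazen hasn't published its code, but the basic Dialogflow call a recruiting chatbot makes looks roughly like the sketch below: send the candidate's text to a Dialogflow agent, then read back the matched intent and the reply, which in Brazen's case would draw on the recruiters' prewritten answers. The project, session and question values are placeholders.

```python
# Minimal Dialogflow detect-intent sketch; project and session IDs are
# placeholders, and this is not Brazen's actual implementation.
from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-gcp-project", "candidate-123")

# The candidate's question, as typed into the chat window.
text_input = dialogflow.TextInput(text="Is this role remote?", language_code="en-US")
query_input = dialogflow.QueryInput(text=text_input)

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input}
)

result = response.query_result
print("Matched intent:", result.intent.display_name)
print("Bot reply:", result.fulfillment_text)  # e.g., a recruiter-prewritten answer
```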

IBM Watson powers management coaching

LEADx, which is announcing its learning platform at the start of the HR Technology Conference, is using IBM Watson in its product, Coach Amanda.

Coach Amanda aims to improve managerial skills with the help of a virtual trainer. The system uses the Watson Personality Insights module, as well as Watson’s natural language conversational capabilities. The Insights program diagnoses personality to help shape the chatbot’s responses, as well as the answers and learning materials it delivers to the manager, said Kevin Kruse, founder and CEO of the firm.

Kruse said it works like this: A user can type or speak to the chatbot and ask, for instance, “What is the definition of employee engagement?” The manager may follow with a question about seeking tips on employee engagement. The chatbot answers these questions with material from a resource library based on what it knows about the manager.

The underlying IBM NLP technology has to figure out what the manager is asking about. Is the question about an employee problem? Is the manager seeking advice? Or, said Kruse, is the manager seeking a resource?
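LEADx hasn't published its implementation, but the Watson Personality Insights call it builds on looked roughly like the following in IBM's Python SDK; the API key, service URL and writing sample are placeholders, and IBM has since retired the service, so treat this purely as an illustration of the API described in the article.

```python
# Hedged sketch of a Watson Personality Insights call; credentials, URL and the
# writing sample are placeholders. Personality Insights has since been retired.
from ibm_watson import PersonalityInsightsV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")  # placeholder credential
service = PersonalityInsightsV3(version="2017-10-13", authenticator=authenticator)
service.set_service_url(
    "https://api.us-south.personality-insights.watson.cloud.ibm.com"  # placeholder
)

# A few hundred words written by the manager (emails, chat messages, etc.).
writing_sample = "Placeholder text the manager has written..."

profile = service.profile(
    writing_sample,
    accept="application/json",
    content_type="text/plain",
).get_result()

# Big Five trait percentiles that a coaching bot could use to tailor its advice.
for trait in profile["personality"]:
    print(trait["name"], round(trait["percentile"], 2))
```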

But not all firms use big vendor chatbot platforms to power HR chatbots.


In-house and open source seen as superior by some

Jane.ai is designed to make all of a company’s information available, whether it sits in a PDF or spreadsheet, resides in applications such as ServiceNow, Workday and Salesforce, or is held by team members. HR is one of the major uses of the application, which is why the firm will be at the 2018 HR Technology Conference. SearchHRSoftware is the media partner for the conference.

David Karandish, founder and CEO of Jane.ai, said the system was developed in-house but also used some open source tools, such as software in Stanford CoreNLP, which provides a suite of language tools. Jane.ai developed proprietary algorithms to make matches and mine documents, he said.

An employee can use the chat system, for instance, to check vacation time or ask a question about HR policies. It can put in an IT ticket or schedule a meeting with staff.

The firm is up against the large IT vendors in AI-related development, but Karandish said the big vendor HR chatbots weren’t necessarily designed to solve a business problem. That’s why Jane.ai went with the in-house approach, he said.

“A lot of companies are coming out with cool tech, but they haven’t figured out how to actually go solve real problems with it,” Karandish said.