Kasten backup aims for secure Kubernetes protection

People often talk about Kubernetes “Day 1,” when you get the platform up and running. Now Kasten wants to help with “Day 2.”

Kasten’s K10 is a data management and backup platform for Kubernetes. The latest release, K10 2.0, focuses on security and simplicity.

K10 2.0 includes support for Kubernetes authentication, role-based access control, OpenID Connect, AWS Identity and Access Management roles, customer-managed keys, and integrated encryption of artifacts at rest and in flight.

“Once you put data into storage, the Day 2 operations are critical,” said Krishnan Subramanian, chief research advisor at Rishidot Research. “Day 2 is as critical as Day 1.”

Day 2 — which includes data protection, mobility, backup and restore, and disaster recovery — is becoming a pain point for Kubernetes users, Kasten CEO Niraj Tolia said.

“In 2.0, we are focused on making Kubernetes backup easy and secure,” Tolia said.

The new Kasten backup software, which became generally available earlier in November, also includes a Kubernetes-native API, auto-discovery of the application environment, policy-driven operations, multi-tenancy support, and advanced logging and monitoring. Kasten backup lets operations teams run their environments while supporting developers' ability to use the tools of their choice, according to the vendor.
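
For readers wondering what "Kubernetes-native API" and "policy-driven operations" look like in practice, backup policies in a product like K10 are typically expressed as custom resources and created through the standard Kubernetes API. The sketch below uses the official Kubernetes Python client to create such a policy object; the API group, version, resource plural and spec fields are assumptions made for illustration, not Kasten's documented schema.

```python
# Minimal sketch: creating a backup-policy custom resource through the
# Kubernetes API with the official Python client. The group/version,
# plural and spec fields are illustrative assumptions, not Kasten's
# documented schema.
from kubernetes import client, config


def create_backup_policy(namespace: str, app_name: str):
    config.load_kube_config()  # use load_incluster_config() when running in a pod
    api = client.CustomObjectsApi()

    policy = {
        "apiVersion": "config.kio.kasten.io/v1alpha1",  # assumed group/version
        "kind": "Policy",
        "metadata": {"name": f"{app_name}-daily-backup", "namespace": namespace},
        "spec": {  # hypothetical fields for illustration only
            "frequency": "@daily",
            "retention": {"daily": 7, "weekly": 4},
            "selector": {"matchLabels": {"app": app_name}},
        },
    }

    return api.create_namespaced_custom_object(
        group="config.kio.kasten.io",  # assumed
        version="v1alpha1",            # assumed
        namespace=namespace,
        plural="policies",             # assumed
        body=policy,
    )


if __name__ == "__main__":
    create_backup_policy("kasten-io", "my-app")
```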

[Screenshot: the Kasten K10 dashboard. Kasten K10 provides data management and backup for Kubernetes.]

Kasten backup eyes market opportunity

Kasten, which launched its original product in December 2017, generally releases an update to its customers every two weeks. A typical update is smaller than 2.0, delivering bug fixes, new features and added depth in existing features. Tolia said there were 55 releases between 1.0 and 2.0.

Day 2 is as critical as Day 1.
Krishnan Subramanian, founder and chief research advisor, Rishidot Research

Backup for container storage has become a hot trend in data protection. Kubernetes specifically is an open source system used to manage containers across private, public and hybrid cloud environments. Kubernetes can be used to manage microservice architectures and is deployable on most cloud providers.

“Everyone’s waking up to the fact that this is going to be the next VMware,” as in, the next infrastructure of choice, Tolia said.

Kubernetes backup products are popping up, but it looks like Kasten is a bit ahead of its time, Rishidot’s Subramanian said. He said he is seeing more enterprises using Kubernetes in production, for example, in moving legacy workloads to the platform, and that makes backup a critical element.

“Kubernetes is just starting to take off,” Subramanian said.

Kubernetes backup “has really taken off in the last two or three quarters,” Tolia said.

Subramanian said he is starting to see legacy vendors such as Dell EMC and NetApp tackling Kubernetes backup, as well as smaller vendors such as Portworx and Robin. He said Kasten had needed stronger security but caught up with K10 2.0. Down the road, he said he will look for Kasten to improve its governance and analytics.

Tolia said Kasten backup stands out because it’s “purpose-built for Kubernetes” and extends into multilayered data management.

In August, Kasten, which is based in Los Altos, Calif., closed a $14 million Series A funding round, led by Insight Partners. Tolia did not give Kasten’s customer count but said it has deployments across multiple continents.


Aviso introduces version 2.0 of AI-guided sales platform

Aviso announced version 2.0 of its artificial intelligence guided sales platform last week. The new version is aimed at lowering costs and reducing the time that sales reps spend working on CRM databases by providing them with AI tools that predict deal close probabilities and guide next best actions.

Algorithmic-guided selling, which uses AI technology and existing sales data to guide sellers through deals, is a new but increasingly popular technology. Nearly 51% of sales organizations have already deployed algorithmic-guided selling or plan to deploy it within the next five years, according to a 2019 Gartner survey.

Aviso’s 2.0 sales platform uses AI tools to prioritize sales opportunities and analyze data from sources including CRM systems, emails, user calendars, chat transcripts and support and success tools to deliver real-time insights and suggest next best action for sales teams. The support and success tools are external offerings that Aviso’s platform can connect with, including customer support tools like Zendesk or Salesforce Service Cloud, and customer success tools like Gainsight or Totango, according to Amit Pande, vice president of marketing at Aviso.

The forecasting and sales guidance vendor claims the new version will help sales teams close 20% more deals and reduce spending on non-core CRM licenses by 30% compared with conventional CRM systems. The cost reduction calculation is based on “the number of non-core licenses that can be eliminated, as well as additional costs such as storage and add-ons that can be eliminated when underutilized or unused licenses are eliminated,” Pande said.
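
As a back-of-the-envelope illustration of how that kind of savings claim is computed, the math boils down to dropping seats that are used only for forecast updates, along with the storage and add-on costs tied to them. The figures below are invented for the example and are not Aviso's numbers.

```python
# Hypothetical version of the license-savings arithmetic. All numbers are
# invented for illustration; they are not Aviso's figures.
core_licenses = 100        # sellers who still need full CRM seats
non_core_licenses = 40     # forecast-only seats that could be eliminated
license_cost = 1_500       # annual cost per CRM seat
addons_per_seat = 300      # storage and add-on costs tied to each seat

current_spend = (core_licenses + non_core_licenses) * (license_cost + addons_per_seat)
reduced_spend = core_licenses * (license_cost + addons_per_seat)

savings = current_spend - reduced_spend
print(f"Annual savings: ${savings:,} ({savings / current_spend:.0%} of CRM spend)")
# -> Annual savings: $72,000 (29% of CRM spend)
```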

According to Aviso, new AI-based features in version 2.0 of its sales platform include:

  • Deal Execution Tools, a trio of tools meant to assist in finalizing deals. Bookings Timeline uses machine learning to calculate when deals will close based on an organization’s unique history, and each timeline includes the top factors that influence the prediction (a generic sketch of this kind of prediction follows this list). Opportunity Acceleration helps sales teams determine which opportunities carry the highest probability of closing early if they are pulled into the current quarter. Informed Editing is intended to limit typos and unsaved changes during data entry; the tool gives users contextual help before they commit to edits, telling them what quarter or whose forecast they are updating, and changes are saved automatically.
  • Deal and Forecast Rooms enable users to do what-if analysis, use scenario modeling and automatically summarize forecast calls and deal review transcripts.
  • Coaching Rooms help sales managers improve sales rep performance with data from past and current deals and from team activity in Deal and Forecast Rooms. 
  • Nudges provide reminders for sales reps through an app on mobile devices. Nudges also offer recommendations for course corrections and potential next steps based on insights from the specific deal.
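
To make the Bookings Timeline idea concrete, one common way to predict when a deal will close is to train a regression model on closed deals and surface the model's feature importances as the "top factors." The sketch below, using scikit-learn on invented data, is a generic illustration of that pattern, not Aviso's actual model or feature set.

```python
# Generic illustration of predicting days-to-close from historical deals and
# listing the top factors behind the prediction. Features and data are
# invented; this is not Aviso's model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

features = ["deal_amount", "days_in_stage", "num_meetings", "rep_win_rate"]

# Toy history: one row per closed deal, target = days until it closed.
rng = np.random.default_rng(0)
X = rng.random((500, len(features)))
y = 30 + 60 * X[:, 1] - 20 * X[:, 3] + rng.normal(0, 5, 500)

model = GradientBoostingRegressor().fit(X, y)

open_deal = np.array([[0.8, 0.4, 0.6, 0.7]])
print(f"Predicted days to close: {model.predict(open_deal)[0]:.0f}")

# The "top factors that influence the prediction," via feature importances.
for name, importance in sorted(zip(features, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name}: {importance:.2f}")
```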

Aviso’s 2.0 sales platform is currently in beta with select customers.

Cybersecurity company FireEye has been using the Aviso platform for several years and is among those select customers. Andy Pan, director of Americas and public sector sales operations at FireEye, said the Aviso platform has helped FireEye operate in a more predictive manner through some of its new AI-driven features. “The predictive features helps us review both the macro business as a whole, and the deal-specific features provides guided pathways towards the inspection of deals.”

Other vendors with sales forecasting tools in the market include Salesforce and Clari. Salesforce's forecasting feature enables organizations to make forecasts specific to their needs and lets managers track their teams' performance. Clari's product includes features such as predictive forecasting, which uses AI-based projections to see where the team will land at the end of the quarter, and history tracking to see who last made changes to the forecast.


Oracle looks to grow multi-model database features

Perhaps no single vendor or database platform over the past three decades has been as pervasive as the Oracle database.

Much as the broader IT market has evolved, so too has Oracle’s database. Oracle has added new capabilities to meet changing needs and competitive challenges. With a move toward the cloud, new multi-model database options and increasing automation, the modern Oracle database continues to move forward. Among the executives who have been at Oracle the longest is Juan Loaiza, executive vice president of mission critical database technologies, who has watched the database market evolve, first-hand, since 1988.

In this Q&A, Loaiza discusses the evolution of the database market and how Oracle’s namesake database is positioned for the future.

Why have you stayed at Oracle for more than three decades and what has been the biggest change you’ve seen over that time?

[Photo: Juan Loaiza]

Juan Loaiza: A lot of it has to do with the fact that Oracle has done well. I always say Oracle’s managed to stay competitive and market-leading with good technology.

Oracle also pivots very quickly when needed. How do you survive for 40 years? Well, you have to react and lead when technology changes.

Decade after decade, Oracle continues to be relevant in the database market as it pivots to include an expanding list of capabilities to serve users.

The big change that happened a little over a year ago is that Thomas Kurian [former president of product development] left Oracle. He was head of all development and when he left what happened is that some of the teams, like database and apps, ended rolling up to [Oracle founder and CTO] Larry Ellison. Larry is now directly managing some of the big technology teams. For example, I work directly with Larry.

What is your view on the multi-model database approach?

Loaiza: This is something we're starting to talk more about. The term that people use is multi-model, but we're using a different term, converged database, and the reason for that is that multi-model is just one component of it.

Multi-model really refers to the different data models you can represent inside the database, but we're also doing much more than that. Blockchain is an example of converging into the database a technology that normally isn't even thought of as database technology. So we're going well beyond the conventional kind of multi-model of, hey, I can do this data format and that data format.

Initially, the relational database was the mainstream database people used for both OLTP [online transaction processing] and analytics. What has happened in the last 10 to 15 years is that there have been a lot of new database technologies to come around, things like NoSQL, JSON, document databases, databases for geospatial data and graph databases too. So there’s a lot of specialty databases that have come around. What’s happening is, people are having to cobble together a complex kind of web of databases to solve one problem and that creates an enormous amount of complexity.

With the idea of a converged database, we’re taking all the good ideas, whether it’s NoSQL, blockchain or graph, and we’re building it into the Oracle database. So you can basically use one data store and write your application to that.
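
As a small, concrete illustration of the "one data store" idea, Oracle's relational engine can already hold and query JSON documents next to ordinary columns, so an application does not need a separate document database for that piece. The sketch below uses the cx_Oracle driver; the connection details and table are placeholders invented for the example, not a recommended schema.

```python
# Sketch of the converged-database idea: JSON documents queried with SQL in
# the same Oracle table as relational columns. Connection details and table
# name are placeholders for illustration.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "app_password", "dbhost/orclpdb1")
cur = conn.cursor()

# A relational table whose 'profile' column is constrained to hold JSON.
cur.execute("""
    CREATE TABLE customers (
        id      NUMBER PRIMARY KEY,
        name    VARCHAR2(100),
        profile CLOB CHECK (profile IS JSON)
    )""")

cur.execute(
    "INSERT INTO customers VALUES (:1, :2, :3)",
    [1, "Ada", '{"tier": "gold", "interests": ["analytics", "graph"]}'],
)
conn.commit()

# Relational columns and JSON path expressions in one query; no separate
# document store required.
cur.execute("""
    SELECT name, JSON_VALUE(profile, '$.tier')
    FROM customers
    WHERE JSON_VALUE(profile, '$.tier') = 'gold'""")
print(cur.fetchall())
```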

The analogy that we use is that of a smartphone. We used to have a music device and a phone device and a calendar device and a GPS device and all these things and what’s happened is they’ve all been converged into a smartphone.

Are companies actually shifting their on-premises production database deployments to the cloud?

Loaiza: There’s definitely a switch to the cloud. There are two models to cloud; one is kind of the grassroots. So we’re seeing some of that, for example, with our autonomous database that people are using now. So they’re like, ‘Hey, I’m in the finance department, and I need a reporting database,’ or, ‘hey, I’m in the marketing department, and I need some database to run some campaign with.’ So that’s kind of a grassroots and those guys are building a new thing and they want to just go to cloud. It’s much easier and much quicker to set up a database and much more agile to go to the cloud.

The second model is where somebody up in the hierarchy says, ‘Hey, we have a strategy to move to cloud.’ Some companies want to move quickly and some companies say, ‘Hey, you know, I’m going to take my time,’ and there’s everything in the middle.

Will autonomous database technology mean enterprises will need fewer database professionals?

Loaiza: The autonomous database addresses the mundane aspects of running a database. Things like tuning the database, installing it, configuring it, setting up HA [high availability], among other tasks. That doesn’t mean that there’s nothing for database professionals to do.

Like every other field where there is automation, what you do is you move upstream, you say, ‘Hey, I’m going to work on machine learning or analytics or blockchain or security.’ There’s a lot of different aspects of data management that require a lot of labor.

One of the nice things that we have in this industry is there is no real unemployment crisis in IT. There’s a lot of unfilled jobs.

So it's pretty straightforward for someone who has good skills in data management to just move upstream and do something that's going to add more specific value than just configuring and setting up databases, which is really more of a mechanical process.

This interview has been edited for clarity and conciseness.


Salesforce Trailhead app makes learning more convenient

SAN FRANCISCO — Salesforce customers see the value in the Trailhead learning platform and its new mobile app.

Trailhead Go for iOS is one of two new mobile apps that Salesforce announced here at Dreamforce 2019. Trailhead Go is a mobile extension of Trailhead, Salesforce's free customer success learning platform that enables Salesforce users and nonusers alike to follow different paths to learn Salesforce skills. Trailhead now also offers Amazon Partner Connect, with content for learning AWS and how to build Amazon Alexa skills. By the end of the year, Trailhead plans to roll out live and on-demand training videos.

Salesforce provides customer success tools to users before they even become customers. For most businesses, this model is flipped, providing these tools to users after they sign contracts, said Gerry Murray, a research director at IDC.

“It’s not only about how the product works, it’s about teaching the line-of-business people to elevate their skills or further their careers in and out of their companies,” Murray said. “Trailhead Go makes it all that more convenient.”

Making education accessible

A skills gap costs companies $1.3 trillion each year, said Sarah Franklin, general manager of Trailhead, in a keynote. While many workers think they can fill that gap with education, it has become more and more inaccessible. Over the last 20 years, student tuition has increased by 200%, and student debt has increased by 163%.

Anyone who has access to the Trailhead Go app can learn, said Ray Wang, principal analyst and founder at Constellation Research.

“You don’t have to go to school; you don’t need a computer; you just need a phone,” he said.

Customers see benefits

[Screenshot: the personalized homepage of the Trailhead Go app shows what trails a user is working on, with a quick navigation bar at the bottom.]

Supermums, based in London, equips moms with Salesforce skills through a combination of training, mentoring, work experience and job search support to get them into the Salesforce ecosystem. Trainees go through a customized six-month program where they earn 50 to 100 Trailhead badges. Trainees can benefit from the Trailhead app because they’ll be able to learn on the go, making it easier to fit into their schedules, said Heather Black, a certified Salesforce administrator and CEO of Supermums.

“[Trailhead Go] will help me complete more trails and fit it into my life while I’m busy supporting a team and juggling kids,” she said. “Trailhead Go makes this accessible to more people.”

Trailhead has also branched out beyond technical skills and into functional skills, Black said.

“It helps you develop as a person, as well as help you be successful in a Salesforce career,” she said.

Trailhead is great for helping learn the basics when people are entering the CRM world, said Sayantani Mitra, a data scientist at Goby Inc., a company that specializes in accounts payable automation.

“Read them, learn them, ask the community, ask people questions, do them multiple times,” Mitra said.

The best way to learn anything is practice, practice and practice more.
Sayantani Mitra, data scientist, Goby

But just getting a Salesforce certification won’t get someone a job, Mitra said. They have to know what they’re doing.

“The best way to learn anything is practice, practice and practice more,” Mitra said.

Mitra plans to use the Trailhead Go app particularly on long-haul flights.

“When I go home to India … you cannot watch movies for 20 hours or sleep for 20 hours; you need something more,” she said.

Trailhead Go is generally available now for free on the Apple App Store.


New Salesforce Customer 360 aims to unify data sources

SAN FRANCISCO — Salesforce’s new customer data and identity platform released this week connects data across sales, service, marketing and other departments within an organization to provide users with a single ID for each customer.

The platform, Salesforce Customer 360 Truth (also called Single Source of Truth and SSOT by Salesforce executives) is considered by many to be a customer data platform (CDP) — though that is only one component of the new platform.

Whatever one calls it, Salesforce Customer 360 Truth resembles competing CDPs from Oracle, Adobe and Microsoft in functionality. Customers, analysts and partners said the new feature bundle solves a problem endemic to many CX platforms: reconciling customer IDs and data across sales, marketing, e-commerce and customer service platforms to a golden record.

Features in Customer 360 Truth — which come at no extra charge to Salesforce subscribers — include customer data management, identity management and privacy tools that are available now. Salesforce plans to release a unified customer profile, the most significant feature, in 2020.

The capabilities, taken together, will not only aggregate updated customer data in one place like a CDP, but will go further than CDPs, which are typically used by marketing and advertising teams. Customer 360 Truth features can route that data and push actions and personalizations to the sales, service and support teams dealing with an individual customer, the company said.
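
Stripped of the product specifics, the "golden record" problem described here is identity resolution: records that arrive from different systems with different IDs have to be matched to one canonical customer ID before anything can be routed on top of them. The bare-bones sketch below shows that matching step in generic Python; it illustrates the concept only, not Salesforce's matching logic, and the data and ID scheme are invented.

```python
# Bare-bones identity resolution: records from different systems collapse onto
# one unified ID by a normalized email address. Generic illustration only,
# not Salesforce's matching logic.
from collections import defaultdict

records = [
    {"system": "sales",     "id": "OPP-19", "email": "Ana.Ruiz@example.com"},
    {"system": "service",   "id": "CASE-7", "email": " ana.ruiz@example.com"},
    {"system": "marketing", "id": "MKT-44", "email": "ana.ruiz@EXAMPLE.COM"},
]

def match_key(record):
    """Normalize the attribute used for matching (here, just the email)."""
    return record["email"].strip().lower()

profiles = defaultdict(lambda: {"source_ids": []})
for rec in records:
    unified_id = "cust-" + match_key(rec)  # stand-in for a real ID service
    profiles[unified_id]["source_ids"].append(f"{rec['system']}:{rec['id']}")

for unified_id, profile in profiles.items():
    print(unified_id, profile["source_ids"])
# All three source records end up under a single unified customer ID.
```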

Customer data will be structured on the Cloud Information Model open standard modeled by MuleSoft and developed jointly by AWS, Genesys and Salesforce for The Linux Foundation.

It’s all long overdue, IDC analyst Neil Ward-Dutton said.

“Salesforce should have done this five years ago. If anything’s surprising, it’s that they have managed to not have it until now, because many customers have more than one cloud and there’s been no easy way to get a view of individual, unique IDs,” Ward-Dutton said. “It’s instructive that it’s free. I think that’s the only thing they could do. If they would have charged for this, it would have been a really bad mistake.”

[Screenshot: the Data Manager component of Salesforce Customer 360 Truth. Customer 360 Truth connects customer data across Salesforce clouds and non-Salesforce systems and matches that data to individual customers to create a common profile and issue a single Salesforce ID.]

Customers, partners take stock

Customers generally reacted positively to the news of Salesforce Customer 360 Truth at Dreamforce here this week, with some wondering what data and process housecleaning outside of Salesforce will be required to use the tools.

If anything’s surprising, it’s that they have managed to not have it until now, because many customers have more than one cloud and there’s been no easy way to get a view of individual, unique IDs.
Neil Ward-Dutton, analyst, IDC

That’s the case for e.l.f. Cosmetics, CIO and CTO Ekta Chopra said. Her company runs Salesforce marketing, sales, service and e-commerce clouds, and also is an early adopter of Salesforce’s order management system, processing about a million transactions a year. While Customer 360 Truth features look promising, her company will have to evaluate how to manage different types of profiles such as customers versus wholesalers.

“We have to make sure we’re gathering all that data in the best possible way,” Chopra said. “We’re not just a direct-to-consumer business.”

Hyland Software is both a Salesforce customer and a partner, with its OnBase enterprise content management system integration available on Salesforce’s AppExchange. Salesforce Customer 360 Truth is a move in the right direction to reconcile conflicting customer data, but the process will always require a mix of different vendors’ tools to nail it all down, said Ed McQuiston, Hyland executive vice president and chief commercial officer.

“There is no one, ubiquitous platform that gives you 360,” McQuiston said. “Salesforce plays a critical part for us in terms of understanding the customer, our interactions, et cetera. But we use our own product with it, because I want to see the contracts we have, the support information. I want that complete view.”

Patrick Morrissey, general manager of Salesforce partner Upland Software, said he thinks Customer 360 features will help Salesforce customers use Upland’s Altify revenue management tools more effectively.

“Customer revenue optimization intersects quite nicely with Customer 360,” Morrissey said. “The problem is that the data and processes don’t connect. The vision that Salesforce has around Customer 360 is fantastic, because it brings the data together for the customer and reduces friction.”

CDPs can only go so far

Salesforce might not call Customer 360 Truth a CDP because its capabilities extend beyond what competing CDPs do, said Joe Stanhope, analyst at Forrester Research, who watches the technology closely.

“Salesforce was talking quite a bit about CDPs in the early iterations of Customer 360,” Stanhope said. “But I think, over time, the scope evolved and expanded. Ultimately, Customer 360 is about more than a CDP, and even more than just marketing. Customer 360 is the key to enabling the Salesforce ecosystem with data.”

Arizona State University's CTO of EdPlus online learning, Donna Kidwell, sees the Salesforce tools as a good start toward wrangling sprawling data. Her team is building a blockchain ledger to track the accomplishments of the university's learners, a population that comprises students pursuing degrees, professionals earning certifications, high schoolers attending camps and others who interact in some way with the university.

The ambitious project involves Salesforce CRM data and Salesforce Blockchain as a spoke of a much larger wheel that ultimately will enable data sharing across educational institutions and employers.

CDPs in general — and Salesforce Customer 360 Truth in particular — may help consolidate data that can be fed into the ledger at some point in the future. But ultimately, managing customer data across learning systems, HR applications, Salesforce and other contributing systems is a much larger problem than a CDP can solve.

“I’m definitely tracking the CDPs,” Kidwell said. “I’m hopeful that Salesforce will ease some of those concerns, but I can’t imagine they’ll be the single source. There’s not going to be a single source of truth. We’re actually going to need data strategies, and our technologies will help implement those strategies.”


Tableau analytics platform upgrades driven by user needs

LAS VEGAS — Tableau revealed a host of additions and upgrades to the Tableau analytics platform in the days both before and during Tableau Conference 2019.

Less than a week before its annual user conference, the vendor released Tableau 2019.4, a scheduled update of the Tableau analytics platform. And during the conference, Tableau unveiled not only new products and updates to existing ones, but also an enhanced partnership with Amazon Web Services to help users move to the cloud and a new partner network.

Many of the additions to the Tableau analytics platform have to do with data management, an area Tableau only recently began to explore. Among them are Tableau Catalog and Prep Conductor.

Others, meanwhile, are centered on augmented analytics, including Ask Data and Explain Data.

All of these enhancements to the Tableau analytics platform come in the wake of the news last June that Tableau was acquired by Salesforce, a deal that closed on Aug. 1 but was held up until just last week by a regulatory review in the United Kingdom looking at what effect the combination of the two companies would have on competition.

In a two-part Q&A, Andrew Beers, Tableau’s chief technology officer, discussed the new and enhanced products in the Tableau analytics platform as well as how Tableau and Salesforce will work together.

Part I focuses on data management and AI products in the Tableau analytics platform, while Part II centers on the relationship between Salesforce and Tableau.

Data management has been a theme of new products and upgrades to the Tableau analytics platform — what led Tableau in that direction?

[Photo: Andrew Beers]

Andrew Beers: We’ve been about self-service analysis for a long time. Early themes out of the Tableau product line were putting the right tools in the hands of the people that were in the business that had the data and had the questions, and didn’t need someone standing between them and getting the answers to those questions. As that started to become really successful, then you had what happens in every self-service culture — dealing with all of this content that’s out there, all of this data that’s out there. We helped by introducing a prep product. But then you had people that were generating dashboards, generating data sets, and then we said, ‘To stick to our belief in self-service we’ve got to do something in the data management space, so what would a user-facing prep solution look like, an operationalization solution look like, a catalog solution look like?’ And that’s what started our thinking about all these various capabilities.

Along those lines, what’s the roadmap for the next few years?

Beers: We always have things that are in the works. We are at the beginning of several efforts — Tableau Prep is a baby product that’s a year and a half old. Conductor is just a couple of releases old. You’re going to see a lot of upgrades to those products and along those themes — how do you make prep easier and more approachable, how do you give your business the insight into the data and how it is being used, and how do you manage it? That’s tooling we haven’t built out that far yet. Once you have all of this knowledge and you’ve given people insights, which is a key ingredient in governance along with how to manage it in a self-service way, you’ll start to see the Catalog product grow into ideas like that.

Are these products something customers asked for, or are they products Tableau decided to develop on its own?

Beers: It’s always a combination. From the beginning we’ve listened to what our customers are saying. Sometimes they’re saying, ‘I want something that looks like this,’ but often they’re telling us, ‘Here is the kind of problem we’re facing, and here are the challenges we’re facing in our organization,’ and when you start to hear similar stories enough you generalize that the customers really need something in this space. And this is really how all of our product invention happens. It’s by listening to the intent behind what the customer is saying and then inventing the products or the new capabilities that will take the customer in a direction we think they need to go.

Shifting from data management to augmented intelligence, that’s been a theme of another set of products. Where did the motivation come from to infuse more natural language processing and machine learning into the Tableau analytics platform?

Beers: It's a similar story here, just listening to customers and hearing them wanting to take the insights that their more analyst-style users got from Tableau to a larger part of the organization, which always leads you down the path of trying to figure out how to add more intelligence into the product. That's not new for Tableau. In the beginning we said, 'We want to build this tool for everyone,' but if I'm building it for everyone I can't assume that you know SQL, that you know color design, that you know how to tell a good story, so we had to build all those in there and then let users depart from that. With these smart things, it's how can I extend that to letting people get different kinds of value from their questions. We have a researcher in the NLP space who saw these signals a while ago and started prototyping some of these ideas about how to bring natural language questioning into an analytical workspace, and that really inspired us to look deeply at the space and led us to think about acquisitions.

What’s the roadmap for Tableau’s AI capabilities?

With the way tech has been developing around things like AI and machine learning, there are just all kinds of new techniques that are available to us that weren’t mainstream enough 10 years ago to be pulling into the product.
Andrew Beers, chief technology officer, Tableau

Beers: You’re going to see these AI and machine learning-style capabilities really in every layer of the product stack we have. We showed two [at the conference] — Ask Data and Explain Data — that are very much targeted at the analyst, but you’ll see it integrated into the data prep products. We’ve got some smarts in there already. We’ve added Recommendations, which is how to take the wisdom of the crowd, of the people that are at your business, to help you find things that you wouldn’t normally find or help you do operations that you yourself haven’t done yet but that your community around have done. You’re going to see that all over the product in little ways to make it easier to use and to expand the kinds of people that can do those operations.

As a technology officer, how fun is this kind of stuff for you?

Beers: It’s really exciting. It’s all kinds of fun things that we can do. I’ve always loved the mission of the company, how people see and understand data, because we can do this for decades. There’s so much interesting work ahead of us. As someone who’s focused on the technology, the problems are just super interesting, and I think with the way tech has been developing around things like AI and machine learning, there are just all kinds of new techniques that are available to us that weren’t mainstream enough 10 years ago to be pulling into the product.


HYCU backup for Google Cloud adds SAP HANA support

HYCU enhanced its Google Cloud Platform backup with SAP HANA support, offering it as a managed service that eases the burden on IT.

The HYCU Backup as a Service for Google Cloud is purpose-built for GCP, similar to how HYCU’s first major product was purpose-built for Nutanix data protection. It’s fully integrated into Google Cloud Identity & Access Management.

“It was built with the Google administrator in mind,” so there’s no extra training needed, said Subbiah Sundaram, vice president of products at HYCU.

Offering it as a service is critical to protecting cloud workloads natively, according to Enterprise Strategy Group senior analyst Christophe Bertrand. The firm’s research shows that IT professionals want similar features in cloud-native data protection as in their on-premises environments, but there are gaps.

“Among the key areas are enterprise-class scalability, which HYCU is addressing in this release with enhancements to cloud-native incrementals, scalability, mission-critical application support with SAP HANA and performance optimizations,” Bertrand wrote in an email. “Cloud is about scale, and this means that data protection mechanisms have to adapt.”

Protection for a ‘mission-critical application’

HYCU backup for GCP is supporting SAP HANA for the first time with this release. The support requires a special understanding of the infrastructure being protected and a mechanism to coordinate with SAP HANA to get a consistent copy, according to Sundaram.

The HYCU Backup as a Service uses Google snapshots for database-consistent, impact-free backup and recovery. It includes support for single file recovery.

The use of native storage snapshots is a distinguishing approach, according to Bertrand.

“I expect that we will see a number of HYCU customers scale their environments in time,” Bertrand wrote. “SAP HANA is a mission-critical application in many enterprises, and in combination with GCP, offers a lot of promise for scaling deployments up and out, and the ability to do analytics for business uses beyond just backup or BC/DR.”

Sundaram said Google sellers and partners asked for the SAP HANA support — they want more customers adding SAP HANA on GCP. SAP HANA, an in-memory database for processing high volumes of data in real time, is popular with large retailers.

[Screenshot: HYCU backup for Google Cloud Platform. HYCU backups use changed block tracking to optimize bucket storage use.]

Dive deeper into HYCU’s strategy

HYCU’s GCP backup product originally launched in July 2018. Because it is a service, HYCU takes care of the installation, management and upgrades. HYCU claims one-click backups.

It was built with the Google administrator in mind.
Subbiah Sundaram, vice president of products, HYCU

Users back up to Google Cloud Storage buckets. Starting with this update, HYCU backup uses changed block tracking to enable optimized bucket storage consumption.
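
Changed block tracking itself is a simple idea: keep a fingerprint of every fixed-size block from the previous backup and send only the blocks whose fingerprints changed. The sketch below shows that core loop in generic Python; it is not HYCU's implementation, and the block size and upload callback are arbitrary choices for the example.

```python
# Generic changed-block-tracking loop: upload only blocks whose hash differs
# from the previous backup. Not HYCU's implementation; block size is arbitrary.
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks, chosen only for illustration

def incremental_backup(path, previous_hashes, upload):
    """Return the new block-hash map; call upload(index, data) for changed blocks."""
    new_hashes = {}
    with open(path, "rb") as f:
        index = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            new_hashes[index] = digest
            if previous_hashes.get(index) != digest:
                upload(index, block)  # only changed blocks go to the storage bucket
            index += 1
    return new_hashes

# First run uploads everything; later runs upload only what changed.
# hashes = incremental_backup("/data/disk.img", {}, lambda i, b: print("block", i))
```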

HYCU can keep costs down because the customer doesn’t pay for compute, Sundaram said. The product’s incremental backups and auto-tiering also save money.

The product does not require caching storage, according to HYCU, which means cheaper data transfer for backup and better use of cloud storage.

HYCU, which is based in Boston, Mass., has built its strategy on offering specialized services that go deep in specific environments, according to Bertrand.

“It gives them this best of breed advantage over generalists, and our research shows that IT professionals have no problem using the best cloud backup solutions for the job at hand — meaning using a new solution or an additional vendor,” Bertrand wrote. “I believe that they are well-positioned to deliver additional services beyond backup and BC/DR, such as intelligent data management based on data reuse.”

HYCU Backup as a Service for Google Cloud is available on the GCP Marketplace and through authorized partners. Cost depends on the amount of data under protection and frequency of backup.

HYCU backup automatically updated for current customers in October.

In the coming weeks, HYCU expects to launch its Protégé product for multi-cloud disaster recovery and migration. It’s also planning a major update in early 2020 that will add another supported cloud platform.


Acquia, Drupal founder Dries Buytaert on open source, Vista, CDPs

NEW ORLEANS — In 2000, Acquia CTO Dries Buytaert created his own web platform, Drupal, which became an open source web content management system. And, in turn, he launched Acquia, which became the commercially supported instance of Drupal.

In an interview during the Acquia Engage user conference here this week, Buytaert discussed Vista Equity Partners’ billion-dollar majority stake in the company, the role of Acquia Drupal, marketing automation ambitions and a possible Acquia customer data platform.

Tell me how the Vista deal happened. Were you actively seeking a buyer?

Dries Buytaert: No. We were profitable, we really didn’t need more investment. But at the same time, we have an ambitious roadmap and our competitors are well-funded. We were starting to receive a lot of inbound requests from different firms, including Vista. When they come to you, you’ve got to look at it. It made sense.

How much of Acquia does Vista own; what is ‘a majority?’

Buytaert: I don’t know the exact number, but it’s a majority. Some of our investors got out, some stayed in. AWS, for example, stayed in.

Was CEO Mike Sullivan brought on two years ago to shop Acquia Drupal around for acquisition?

Buytaert: No. It’s funny how people read into these things as if it was all planned out. Mike was brought on because Tom Erickson left. I was in on the search for Mike.

Is your Acquia Drupal role changing now, or do you see it changing soon?

Buytaert: I can’t really speak for Vista, but in the conversation they were very excited about our momentum, our strategy and the team. My belief is that they will leave all of these things intact, including my role.

Last year at this time, you discussed IBM buying Red Hat and Adobe buying Magento and wondered about the future of their open source versions under new ownership. Now you’re in that position with Acquia Drupal.

[Photo: Drupal creator and Acquia CTO Dries Buytaert addressing the open-source faithful at his company's Engage user conference.]

Buytaert: The way I think about it, we’ve always been majority-owned by investors, and we’ve swapped out investors. For me, nothing really changed, it’s the same strategy, the same beliefs about open source. Just different investors. We’ll just keep going, faster and faster.

We surround Drupal with tools and projects and supercharge it. Ultimately, the financials speak for themselves. We are growing the top line and the bottom line — because the model works. I’ve been balancing this for 12 years, giving back to open source. It’s fascinating to people how we do this but to me, it’s so natural it’s hard to explain. The reality is that 99% of our customers use Drupal, so our success is completely tied to the success of Drupal.

Why did Acquia buy Mautic and jump into the marketing automation space, where there’s much more competition than web content management?

Buytaert: Mautic is the only open source marketing automation platform. Open source disrupted the web content management space, and I don’t think there’s any commercial, standalone, proprietary CMS that is really successful. The ones that are successful do a lot more than that, like Sitecore and Adobe, that do e-commerce, et cetera.

Pure-play proprietary CMS vendors have to shift to something else, like digital experience, to stay relevant. No sane person would pay hundreds of thousands of dollars just for the CMS piece.

In your keynote today, you talked about unified customer profiles and consolidating data sources, and said to stay tuned for future announcements. Is Acquia coming out with a customer data platform (CDP)?

Buytaert: I can’t comment on that, specifically, but we are thinking a lot about the data layer. It seems like there are a lot of similarities with CDPs — it’s about integration and unifying the profile with [personalization engine] Acquia Lift and Mautic. Mautic has about a hundred integrations built by community members. It’s not a new challenge; we have been thinking and working on it for a long time.

The whole CDP space is in an emergent state. No one talked about CDPs two years ago, and now everyone’s a CDP. Maybe that’s why I’m avoiding the term. Some CDPs are glorified tag managers, others are integration platforms, yet others positioned themselves two years ago as a content personalization platform. They come in so many flavors today that ‘CDP’ means different things to different people.

What we’re obsessed with — whether you want to call it a CDP or not — is creating a unified user profile, and doing it in a way that’s complementary to what we have.
Dries Buytaert, CTO, Acquia

What we’re obsessed with — whether you want to call it a CDP or not — is creating a unified user profile, and doing it in a way that’s complementary to what we have.

You’re moving into a space controlled by much bigger competitors like Oracle, Salesforce, Microsoft, Adobe and SAP. Is there enough room for Acquia?

Buytaert: Acquia is an open platform, and open source. Developers want that. Literally, no developer wakes up and says, “Let me buy an Adobe product.” Developers love open source, and that’s why there’s enough space for Acquia.

But many marketers who buy these tools are not developers. They want to sign up for a SaaS subscription, turn it on and start marketing.

Buytaert: If you really want to build great experiences, you need developers. You can’t just have marketers build great digital experiences, because the number of integrations you need is so vast — as with custom legacy systems — it’s not just a little drag-and-drop. Really bringing it together is going to require developers, and empowering them to build these things through open source is going to be a big thing.


Tallying the momentous growth and continued expansion of Dynamics 365 and the Power Platform – The Official Microsoft Blog

We’ve seen incredible growth of Dynamics 365 and the Power Platform just in the past year. This momentum is driving a massive investment in people and breakthrough technologies that will empower organizations to transform in the next decade.

We have allocated hundreds of millions of dollars to our business cloud, which powers business transformation across markets and industries and helps organizations solve difficult problems.

This fiscal year we are also heavily investing in the people that bring Dynamics 365 and the Power Platform to life — a rapidly growing global network of experts, from engineers and researchers to sales and marketing professionals. Side-by-side with our incredible partner community, the people that power innovation at Microsoft will fuel transformational experiences for our customers into the next decade.

Accelerating innovation across industries

In every industry, I hear about the struggle to transform from a reactive to proactive organization that can respond to changes in the market, customer needs, and even within their own business. When I talk to customers who have rolled out Dynamics 365 and the Power Platform, the conversation shifts to the breakthrough outcomes they’ve achieved, often in very short time frames.

Customers talk about our unique ability to connect data holistically across departments and teams — with AI-powered insights to drive better outcomes. Let me share a few examples.

This year we’ve focused on a new vision for retail that unifies back office, in-store and digital experiences. One of Washington state’s founding wineries — Ste. Michelle Wine Estates — is onboarding Dynamics 365 Commerce to bridge physical and digital channels, streamline operations with cloud intelligence and continue building brand loyalty with hyper-personalized customer experiences.

When I talk to manufacturers, we often zero in on ways to bring more efficiency to the factory floor and supply chain. Again, it's our ability to harness data from physical and digital worlds and reason over it with AI-infused insights that opens doors to new possibilities. For example, Majans, the Australia-based snack food company, is creating the factory of the future with the help of Microsoft Dynamics 365 Supply Chain Management, Power BI and Azure IoT Hub — bringing Internet of Things (IoT) intelligence to every step in the supply chain, from quality control on the production floor to key performance indicators that track future investments. When everyone relies on a single source of truth about production, inventory and sales performance, the decisions employees make drive the same outcome — all made possible on our connected business cloud.

These connected experiences extend to emerging technologies that bridge digital and physical worlds, such as our investment in mixed reality. We’re working with companies like PACCAR — manufacturer of premium trucks — to improve manufacturing productivity and employee training using Dynamics 365 Guides and HoloLens 2, as well as Siemens to enable technicians to service its eHighway — an electrified freight transport system — by completing service steps with hands-free efficiency using HoloLens and two-way communication and documentation in Dynamics 365 Field Service.

For many of our customers, the journey to Dynamics 365 and the Power Platform started with a need for more personalized customer experiences. Our customer data platform (CDP) featuring Dynamics 365 Customer Insights, is helping Tivoli Gardens — one of the world’s longest-running amusement parks — personalize guest experiences across every touchpoint — on the website, at the hotel and in the park.  Marston’s has onboarded Dynamics 365 Sales and Customer Insights to unify guest data and infuse personalized experiences across its 1,500-plus pubs across the U.K.

The value of Dynamics 365 is compounded when coupled with the Power Platform. As of late 2019, there are more than 3 million monthly active developers on the Power Platform, from non-technical "citizen developers" to Microsoft partners developing world-class, customized apps. In the last year, we've seen 700% growth in Power Apps production apps and 300% growth in monthly active users. All of those users generate a ton of data, with more than 25 billion Power Automate steps run each day and 25 million data models hosted in the Power BI service.

The impact of the Power Platform comes through in the stories our customers share with us. TruGreen, one of the largest lawn care companies in the U.S., onboarded Dynamics 365 Customer Insights and the Microsoft Power Platform to provide more proactive and predictive services to customers, freeing employees to spend more time on higher-value tasks and complex customer issue resolution. And the American Red Cross is leveraging Power Platform integration with Teams to improve disaster response times.

From the Fortune 500 companies below to the thousands of small and medium sized businesses, city and state governments, schools and colleges and nonprofit organizations — Dynamics 365 and the Microsoft Cloud are driving transformative success delivering on business outcomes.

[Image: 24 business logos]

Partnering to drive customer success

We can’t talk about growth and momentum of Dynamics 365 and Power Platform without spotlighting our partner community — from ISVs to System Integrators that are the lifeblood of driving scale for our business. We launched new programs, such as the new ISV Connect Program, to help partners get Dynamics 365 and Power Apps solutions to market faster.

Want to empower the next generation of connected cloud business? Join our team!

The incredible momentum of Dynamics 365 and Power Platform means our team is growing, too. In markets around the globe, we’re looking for people who want to make a difference and take their career to the next level by helping global organizations digitally transform on Microsoft Dynamics 365 and the Power Platform. If you’re interested in joining our rapidly growing team, we’re hiring across a wealth of disciplines, from engineering to technical sales, in markets across the globe. Visit careers.microsoft.com to explore business applications specialist career opportunities.


Author: Microsoft News Center

Microsoft Power Platform adds chatbots; Flow now Power Automate

More bots and automation tools went live on the Microsoft Power Platform, Microsoft announced today. In formally introducing the tools, Microsoft said they will make data sources flow within applications like SharePoint, OneDrive and Dynamics 365 and create more efficiencies with custom apps.

The more than 400 capabilities added to the Microsoft Power Platform focus on expanding its robotic process automation potential for users, as well as new integrations between the platform and Microsoft Teams, according to a blog post by James Phillips, corporate vice president of business applications at Microsoft.

Those additions include robotic process automation (RPA) tools for Microsoft Power Automate, formerly known as Flow, and capabilities that make AI tools easier to add into PowerApps. Also newly available are tools for creating user interfaces in Power Automate.

AI Builder adds a point-and-click means to fold common processes such as forms processing, object detection and text classification into apps — processes commonly used for SharePoint and OneDrive content curation.

Microsoft is adding these tools, as well as new security features to analytics platform Power BI, in part to coax customers who remain on premises into the Azure cloud, said G2 analyst Michael Fauscette.

PowerApps reduce the development needed to create necessary connections between systems in the cloud, such as content in OneDrive and SharePoint with work being done in Dynamics 365 CRM, Teams and ERP applications.

[Screenshot: Microsoft Power Automate, a low-code automation tool, is the new version of Flow.]

Chatbots go live

Also announced as generally available at Microsoft Ignite are Power Virtual Agents, do-it-yourself chatbots on the Microsoft Power Platform.

They’ll likely first be used by customer service teams on Dynamics 365, said Constellation Research analyst R “Ray” Wang, but they could spread to other business areas such as human resources, which could use the bots to answer common questions during employee recruiting or onboarding.

If an agent is costing you $15 an hour and the chatbot 15 cents an hour … it’s all about call deflection.
R ‘Ray’ Wang, analyst, Constellation Research

While some companies may choose outside consultants and developers to build custom chatbots instead of making their own on the Microsoft Power Platform, Wang said some companies may try it to build them internally. Large call centers employing many human agents and running on Microsoft applications would be logical candidates for piloting new bots.

“I think they’ll start coming here to build their virtual agents,” Wang said. “[Bot] training will be an issue, but it’s a matter of scale. If an agent is costing you $15 an hour and the chatbot 15 cents an hour … it’s all about call deflection.”

Microsoft Power Platform evolves

PowerApps, which launched in late 2015, originally found utility with users of Microsoft Dynamics CRM who needed to automate and standardize processes across data sets inside the Microsoft environment and connect to outside platforms such as Salesforce, said Gartner analyst Ed Anderson.

Use quickly spread to SharePoint, OneDrive and Dynamics ERP users, as they found that Flow, a low-code automation tool, enabled the creation of connectors and apps without developer overhead. Third-party consultants and developers also used PowerApps to speed up deliverables to clients. Power BI, Power Automate and PowerApps together became known as the Microsoft Power Platform a year ago.

“PowerApps are really interesting for OneDrive and SharePoint because it lets you quickly identify data sources and quickly do something meaningful with them — connect them together, add some logic around them or customized interfaces,” Anderson said.
