
Sigma analytics platform’s interface simplifies queries

In desperate need of data dexterity, Volta Charging turned to the Sigma analytics platform to improve its business intelligence capabilities and ultimately help fuel its growth.

Volta, based in San Francisco and founded in 2010, provides electric vehicle charging stations. Three years ago, when Mia Oppelstrup started at the company, it faced a significant problem.

Because there aren’t dedicated charging stations the same way there are dedicated gas stations, Volta has to negotiate with organizations — mostly retail businesses — for parking spots where Volta can place its charging stations.

Naturally, Volta wants its charging stations placed in the parking spots with the best locations near the business they serve. But before an organization gives Volta those spots, Volta has to show that the placement makes economic sense: that putting electric car charging stations closest to the door will help boost customer traffic through it.

That takes data. It takes proof.

Volta, however, was struggling with its data. It had the necessary information, but finding the data and then putting it in a digestible form was painstakingly slow. Queries had to be submitted to engineers, and those engineers then had to write code to transform the data before delivering a report.

Any slight change required an entirely new query, which involved more coding, time and labor for the engineers.

But then the Sigma analytics platform transformed Volta’s BI capabilities, Volta executives said.

Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.
Mia Oppelstrup, business intelligence manager, Volta Charging

“If I had to ask an engineer every time I had a question, I couldn’t justify all the time it would take unless I knew I’d be getting an available answer,” said Oppelstrup, who began in marketing at Volta and now is the company’s business intelligence manager. “Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.”

Metrics, Oppelstrup added, that she would otherwise never have been able to find.

“It’s huge for someone like me who never wrote code,” Oppelstrup said. “It would otherwise be like searching a warehouse with a forklift while blindfolded. You get stuck when you have to wait for an engineer.”

Volta looked at other BI platforms — Tableau and Microsoft’s Power BI, in particular — but just under two years ago chose Sigma and has forged ahead with the platform from the 2014 startup.

The product

Sigma Computing was founded by the trio of Jason Frantz, Mike Speiser and Rob Woollen.

Based in San Francisco, the vendor has gone through three rounds of financing and to date raised $58 million, most recently attracting $30 million in November 2019.

Sigma was founded, and the ideas for the Sigma analytics platform first developed, in response to what the founders viewed as a lack of access to data.

“Gartner reported that 60 to 73 percent of data is going unused and that only 30 percent of employees use BI tools,” Woollen, Sigma’s CEO, said. “I came back to that — BI was stuck with a small number of users and data was just sitting there, so my mission was to solve that problem and correct all this.”

Woollen, who previously worked at Salesforce and Sutter Hill Ventures — a main investor in Sigma — and his co-founders set out to make data more accessible. They set out to design a BI platform that could be used by ordinary business users — citizen data scientists — without having to rely so much on engineers, and one that responds quickly no matter what queries users ask of it.

Sigma launched the Sigma analytics platform in November 2018.

Like other BI platforms, Sigma — entirely based in the cloud — connects to a user’s cloud data warehouse in order to access the user’s data. Unlike most BI platforms, however, the Sigma analytics platform is a low-code BI tool that doesn’t require engineering expertise to sift through the data, pull the data relevant to a given query and present it in a digestible form.

A key element of that is the Sigma analytics platform’s user interface, which resembles a spreadsheet.

With the necessary SQL generated automatically in the background, users can simply make entries and notations in the spreadsheet and Sigma will run the query.

“The focus is always on expanding the audience, and 30 percent employee usage is the one that frustrates me,” Woollen said. “We’re focused on solving that problem and making BI more accessible to more people.”

The interface is key to that end.

“Products in the past focused on a simple interface,” Woollen said. “Our philosophy is that just because a businessperson isn’t technical that shouldn’t mean they can’t ask complicated questions.”

With the Sigma analytics platform's spreadsheet interface, users can query their data to examine, for example, sales performance in a certain location, time or week. They can then tweak the query to look at a different time or week, view the results on a monthly basis, compare them year over year, and add and subtract fields and columns at will.
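The mechanics behind this kind of interface can be sketched in a few lines: each spreadsheet-style filter choice maps to a clause in a generated query, so changing a cell simply regenerates and reruns the SQL. This is an illustrative toy, not Sigma's actual code; the `build_query` function and the `sales` table are invented for the example.

```python
def build_query(table, filters):
    """Build a SELECT statement from spreadsheet-style filter choices.

    filters is a mapping of column name -> selected value, in the order
    the user picked them in the grid.
    """
    sql = f"SELECT * FROM {table}"
    if filters:
        where = " AND ".join(f"{col} = '{val}'" for col, val in filters.items())
        sql += f" WHERE {where}"
    return sql

# A user picks a location and a week in the grid; the tool emits the SQL.
print(build_query("sales", {"location": "SF", "week": "2020-W03"}))
```

Swapping the week for a month, or adding a column, just changes one entry in `filters` and produces a new query, which is the "tweak and rerun" loop the spreadsheet interface exposes.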

And rather than file a ticket to the IT department for each separate query, they can run the query themselves.

“The spreadsheet interface combines the power to ask any question of the data without having to write SQL or ask a programmer to do it,” Woollen said.

Giving end users power to explore data

Volta knew it had a data dexterity problem — an inability to truly explore its data given its reliance on engineers to run time- and labor-consuming queries — even before Oppelstrup arrived. The company was looking at different BI platforms to attempt to help, but most of the platforms Volta tried out still demanded engineering expertise, Oppelstrup said.

The outlier was the Sigma analytics platform.

“Within a day I was able to set up my own complex joins and answer questions by myself in a visual way,” Oppelstrup said. “I always felt intimidated by data, but Sigma felt like using a spreadsheet and Google Drive.”

One of the significant issues Volta faced before it adopted the Sigma analytics platform was the inability of its salespeople to show data when meeting with retail outlets and attempting to secure prime parking spaces for Volta’s charging stations.

Because of the difficulty accessing data, the salespeople didn't have the numbers to prove that placing charging stations near the door would increase customer traffic.

With the platform’s querying capability, however, Oppelstrup and her team were able to make the discoveries that armed Volta’s salespeople with hard data rather than simply anecdotes.

They could now show a bank a surge in the use of charging stations near banks between 9 a.m. and 4 p.m., movie theaters a similar surge in the use just before the matinee and again before the evening feature, and grocery stores a surge near stores at lunchtime and after work.

They could also show that the charging stations were being used by actual customers, and not by random people charging up their vehicles and then leaving without also going into the bank, the movie theater or the grocery store.

“It’s changed how our sales team approaches its job — it used to just be about relationships, but now there’s data at every step,” Oppelstrup said.

Sigma enables Oppelstrup to give certain teams access to certain data, everyone access to other data, and importantly, easily redact data fields within a set that might otherwise prevent her from sharing information entirely, she said.

And that gets to the heart of Woollen’s intent when he helped start Sigma — enabling business users to work with more data and giving more people that ability to use BI tools.

“Access leads to collaboration,” he said.


Google Cloud security gets boost with Secret Manager

Google has added a new managed service called Secret Manager to its cloud platform amid a climate increasingly marked by high-profile data breaches and exposures.

Secret Manager, now in beta, builds on existing Google Cloud security services by providing a central place to store and manage sensitive data such as API keys or passwords.

The system employs the principle of least privilege, meaning only a project’s owners can look at secrets without explicitly granted permissions, Google said in a blog post. Secret Manager works in conjunction with the Cloud Audit Logging service to create access audit trails. These data sets can then be moved into anomaly detection systems to check for breaches and other abnormalities.

All data is encrypted in transit and at rest with AES-256 encryption keys. Google plans to add support for customer-managed keys later, according to the blog.

A secrets manager … is really no different than a database, but just with more audit logs and access checking.
Scott Piper, AWS security consultant, Summit Route

Google Cloud customers have been able to manage sensitive data until now with Berglas, an open source project that runs from the command line; Secret Manager adds a layer of abstraction through a set of APIs.

Berglas can be used on its own going forward, as well as directly through Secret Manager beginning with the recently released 0.5.0 version, Google said. Google also offers a migration tool for moving sensitive data out of Berglas and into Secret Manager.

Secret Manager builds on the existing Google Cloud security lineup, which also includes Key Management Service, Cloud Security Command Center and VPC Service Controls.

With Secret Manager, Google has introduced its own take on products such as HashiCorp Vault and AWS Secrets Manager, said Scott Piper, an AWS security consultant at Summit Route in Salt Lake City.


A key management service is used to keep an encryption key and perform encryption operations, Piper said. “So, you send them data, and they encrypt them. A secrets manager, on the other hand, is really no different than a database, but just with more audit logs and access checking. You request a piece of data from it — such as your database password — and it returns it back to you. The purpose of these solutions is to avoid keeping secrets in code.”
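Piper's point — a secrets manager is essentially a key-value store plus access checks and audit logging — can be illustrated with a small sketch. Everything here is invented for illustration (the `ToySecretManager` class, its method names, the principals); it is not Google's API.

```python
class ToySecretManager:
    """A toy secrets store: a dict, plus an ACL check and an audit trail."""

    def __init__(self):
        self._secrets = {}   # secret name -> value
        self._acl = {}       # secret name -> set of principals allowed to read
        self.audit_log = []  # every access attempt is recorded, granted or not

    def store(self, name, value, allowed):
        self._secrets[name] = value
        self._acl[name] = set(allowed)

    def access(self, name, principal):
        granted = principal in self._acl.get(name, set())
        self.audit_log.append((principal, name, "granted" if granted else "denied"))
        if not granted:
            raise PermissionError(f"{principal} may not read {name}")
        return self._secrets[name]

mgr = ToySecretManager()
mgr.store("db-password", "s3cret", allowed={"ci-service"})
print(mgr.access("db-password", "ci-service"))  # prints: s3cret
```

The audit log is what distinguishes this from a plain database lookup: every request, including denied ones, leaves a record that can be fed into the kind of anomaly detection described above.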


Indeed, Google's Key Management Service and Secret Manager target two different audiences within enterprise IT, said Doug Cahill, an analyst at Enterprise Strategy Group in Milford, Mass.

“The former is focused on managing the lifecycle of data encryption keys, while the latter is focused on securing the secrets employed to securely operate API-driven infrastructure-as-code environments,” Cahill said.

As such, data security and privacy professionals and compliance officers are the likely consumers of a key management offering, whereas secret management services are targeted toward DevOps, Cahill added.

Meanwhile, it is surprising that the Google Cloud security portfolio didn’t already have something like Secret Manager, but AWS only released its own version in mid-2018, Piper said. Microsoft released Azure Key Vault in 2015 and has positioned it as appropriate for managing both encryption keys and other types of sensitive data.

Pricing for Secret Manager during the beta period is calculated two ways: Google charges $0.03 per 10,000 operations, and $0.06 per active secret version per regional replica, per month.
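Those two beta prices can be combined into a rough monthly estimate. The workload numbers below are made up for illustration.

```python
OP_PRICE = 0.03 / 10_000  # per access operation
VERSION_PRICE = 0.06      # per active secret version, per regional replica, per month

def monthly_cost(active_versions, replicas_per_version, operations):
    """Estimate a Secret Manager monthly bill from the two beta prices."""
    storage = active_versions * replicas_per_version * VERSION_PRICE
    access = operations * OP_PRICE
    return round(storage + access, 2)

# 20 active secret versions replicated to 3 regions, 1 million accesses a month:
print(monthly_cost(20, 3, 1_000_000))  # prints: 6.6
```

At these rates, storage of the secrets ($3.60 here) and access volume ($3.00) contribute roughly equally for a mid-sized workload.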


Developers could ease DevOps deployment with CircleCI Orbs

CI/CD platform provider CircleCI has introduced a suite of 20 integrations that automate deployment and were developed with prominent partners including AWS, Azure, Google Cloud, VMware and Salesforce.

These integrations, known as CircleCI Orbs, enable developers to quickly automate deployments directly from their CI/CD pipelines. CircleCI launched Orbs in November 2018, and today there are more than 1,200 listed in its registry. But users created the vast majority of them; the difference with CircleCI’s internally created orbs is that they’re backed by vendor support.

CircleCI Orbs are shareable configuration packages for development builds, said Tom Trahan, CircleCI’s vice president of business development. The orbs define reusable commands, executors and jobs so that commonly used pieces of configuration can be condensed into a single line of code, he said.
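The "single line of code" claim refers to the CircleCI config file: one `orbs` stanza imports the package, and a workflow can then invoke its prebuilt jobs by name. The sketch below follows the documented pattern for CircleCI's AWS ECS orb, but the version pin and parameter names are indicative only; check the orb registry for the exact, current values.

```yaml
# Illustrative .circleci/config.yml fragment, not a verbatim copy of any
# CircleCI documentation.
version: 2.1

orbs:
  # Import the packaged AWS ECS integration.
  aws-ecs: circleci/aws-ecs@1.4.0

workflows:
  build-and-deploy:
    jobs:
      # One line invokes the orb's prebuilt deploy job; parameters
      # replace what would otherwise be hand-written deployment scripting.
      - aws-ecs/deploy-service-update:
          family: my-service        # ECS task definition family (hypothetical)
          cluster-name: my-cluster  # target ECS cluster (hypothetical)
```

Without the orb, the same deployment would require writing and maintaining the AWS CLI calls, credential handling and rollback logic inside the pipeline by hand.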

The process of automating deployment can be challenging, which is why CircleCI added this suite of out-of-the-box integrations.

Orbs have two primary benefits for developers, said Chris Condo, an analyst at Forrester Research. “They can be certified by the third parties that create them, and they are maintainable pieces of code that contain logic, actions and connections to CD [continuous delivery] capabilities,” he said.

The orbs help CircleCI operate in an increasingly competitive market that includes open source Jenkins, the commercial CloudBees Jenkins Platform, GitLab and GitHub, as well as cloud platform providers such as AWS and Microsoft.

Orbs are very similar in design to the best package managers that you see — like npm for Node.js, or like the Java library or Ruby Gems.
Tom Trahan, vice president of business development, CircleCI

“When we launched Orbs, it was because our customers were asking us for a way to operate the same way that they operate within the broader open source world, particularly when you think about open source frameworks for various languages,” Trahan said. “Orbs are very similar in design to the best package managers that you see — like npm for Node.js, or like the Java library or Ruby Gems.”

These are all frameworks created so that bundles of code could be packaged up and made available to developers, which is what the CircleCI Orbs do, Trahan added.

Developers don’t want to have to “reinvent the wheel,” when they can simply access bundles of code and best practices that others have already developed, he said.

Multi-cloud trend drives need for easier deployment

Anything that removes boring configuration work from a developer’s plate is likely to be welcome, said James Governor, an analyst at RedMonk, based in Portland, Maine.

“CircleCI building out a catalog of deployment orbs makes a lot of sense, particularly as the market becomes increasingly multi-cloud oriented,” Governor said. “Enterprises want to see their vendors offer a wide range of supported platforms. The Orb approach allows for standardized, repeatable deployments and rollbacks.”

However, the process of automating deployments can be problematic for some teams because of the time it takes to write integrations with services such as AWS ECS or Google Cloud Run, Trahan said. The CircleCI deployment orbs are designed to limit the complexity and time spent creating integrations.

“Customers are asking for simpler ways to connect their dev and CD processes; Orbs helps them do that,” Forrester’s Condo said. “So I see Orbs as a very nice evolutionary step that allows teams to build maintainable abstractions between their development and deployment processes.”

How commercially successful the new suite of Orbs will be remains to be seen, but conceptually, the approach has been embraced by CircleCI users. Since their launch in November 2018, CircleCI orbs are now used by 13,000 user organizations, with around 40,000 repositories and nine million CI/CD pipelines, Trahan said.

Pricing for CircleCI’s CI/CD pipeline services is free for small teams and starts at $30 a month for teams with four or more developers. Pricing for enterprise customers starts at $3,000 a month. The orbs are free for all CircleCI users.


Ex-SAP exec steers Episerver CMS toward digital experience market

NEW YORK — The Episerver CMS is morphing into a digital experience platform, led by CEO Alex Atzberger, the former SAP C/4HANA customer experience platform lead. He departed SAP in October and joined Episerver last month.

We sat down with Atzberger at NRF 2020 Vision: Retail’s Big Show to discuss recent Episerver acquisitions such as Insite Software, future acquisitions, how digital experience and customer experience differ, why he left SAP and his vision for Episerver’s acquisition and product roadmap.

How did you end up at Episerver? Things happened kind of fast after you left SAP.

Alex Atzberger: It happened very fast. When I desired to leave SAP, I looked for a cloud company with triple-digit [hundreds of millions of dollars in] cloud revenue. I was looking for something in CX, the most exciting and fastest-growing part of enterprise software. And I was looking for something that had the right strategic mindset.

[Episerver] had been acquired by Insight Partners, which had put money into the business, so they’re at an inflection point. They are the leader in what is still king, which is content. Even if you look at commerce-centric businesses, content matters a lot. And how do you marry content and commerce together? There are very few companies that have both of those embedded, and Epi is one of them. It worked out well, and the timing was perfect, very fast.

You’ve only been at Episerver for a month, but how would you describe your vision for the company and the product roadmap moving forward?

There are a couple of trends that continue to be very important: One, understanding everything about your customer, and two, serving up the next best action.
Alex Atzberger, CEO, Episerver

Atzberger: We have an untold story. People are really, really happy with this technology. One big part of the strategy, going forward, is expansion in North America, and telling the story of Epi.

Because of the size of the U.S. market, we have to decide on which specific verticals to focus on. There’s a large part of the economy that is not digital, that is somehow forgotten. These companies will not work with [platform vendors] that are too large; they need [vendors] that are large enough to serve, but small enough to care about the results. … Ultimately commerce and content are the face of so many brands, the heart of your business. … We’re going to focus on that market, and bringing automation to content, using AI and automation to scale [digital operations].

Do you feel like you’re competing with your old company SAP, since Episerver CMS is now on its way to being a full-featured digital experience platform with content and commerce clouds?

Atzberger: When I built the SAP CX platform, we built it under the notion of connecting supply chain and demand chain. It was really a relevant message for very large companies that were looking at one platform. Epi is much more focused on the digital experience, truly understanding the digital customer, and doing it in such a way that companies between, say, a million and a billion, are the sweet spot. It’s 80% or 90% whole different [market].

What happened at SAP? Bill McDermott left [in October], and you weren’t far behind. It was all very quick.

Atzberger: If you look at the big picture, it was 15 years [at SAP]. We all want to be CEO of a company. At one point it becomes harder, and you basically end up being part of a company for life.

There’s too much innovation going on, too much excitement going on that I wanted to be part of as well. Ariba and CX are a massive part of SAP. I’m very proud of that and I’m proud of what SAP has done as a company. With the CEO change it was a natural point [to depart].

How do your past experiences at SAP and SAP Ariba color what you’ll be doing at Episerver?

Atzberger: Those involved transformation, and I think it’s going to be a bit of the same here, rallying people around a common cause and a common brand.

The acquisition of B2B e-commerce company Insite Software, which caters to manufacturers and distributors, happened within days after you joined Epi. The deal probably was in the works before you started, right? Did you have final sign-off on the acquisition, or was the deal finished before?

Atzberger: Yes, it was in the works. The strategic direction was important in speaking about it with [Epi’s private equity owners] Insight Partners. There’s a huge B2B commerce opportunity.

When the acquisitions of Insite and [product content tagging automation technology] Idio were discussed with me, not only was I supportive of them but also it attracted me to Episerver as a company. The acquisitions made it so much more compelling to be at this place, at this time. I went to Minneapolis and met [Insite Software CEO] Steve Shaffer, and saw how well they executed against their goals. It left me inspired. I left Minneapolis thinking, ‘This is part of the future of Epi.’

What can you tell me about how you’re thinking about future acquisitions? You’re growing, you’re flush with cash. You can’t be done. What’s next?

Atzberger: We’re not done. Our focus now is the integration of Idio and Insite. What’s interesting to me is that there are a couple of trends that continue to be very important: One, understanding everything about your customer, and two, serving up the next best action. Everything that we do in the foreseeable future will be focused on the digital experience, and helping our customers get better and more informed data about their customers so they can make better decisions.


Using the Windows Admin Center Azure services feature

To drive adoption of its cloud platform, Microsoft is lowering the technical barrier to Azure through the Windows Admin Center management tool.

Microsoft increasingly blurs the lines between on-premises Windows Server operating systems and its cloud platform.

One way the company has done this is by exposing Azure services alongside Windows Server services in the Windows Admin Center. Organizations that might have been reluctant to go through a lengthy deployment process that required PowerShell expertise can use the Windows Admin Center Azure functionality to set up a hybrid arrangement with just a few clicks in some instances.

Azure Backup

One of the Azure services that Windows Server 2019 can use natively is Azure Backup. This cloud service backs up on-premises resources to Azure. This service offers 9,999 recovery points for each instance and is capable of triple redundant storage within a single Azure region by creating three replicas.

Azure Backup can also provide geo-redundant storage, which insulates protected resources against regional disasters.

You access Azure Backup through the Windows Admin Center, as shown in Figure 1. After you register Windows Server with Azure, setting up Azure Backup takes four steps.

Figure 1: The Windows Admin Center walks you through the steps to set up Azure Backup.

Microsoft designed Azure Backup to replace on-premises backup products. Organizations may find that Azure Backup is less expensive than their existing backup system, but the opposite may also be true. The costs vary widely depending on the volume of data, the type of replication and the data retention policy.

Azure Active Directory

Microsoft positions the Windows Admin Center as one of the primary management tools for Windows Server. Because sensitive resources are exposed within the Windows Admin Center console, Microsoft offers a way to add an extra layer of security through Azure Active Directory.

When you enable the requirement for Azure Active Directory security, you will be required to authenticate to both the local machine and Azure Active Directory.

To use Azure Active Directory, you must first register the Windows Server with Azure. You can then require Azure Active Directory authentication by opening the Windows Admin Center and clicking the Settings icon, followed by the Access tab. Figure 2 shows a simple toggle switch to turn Azure Active Directory authentication on or off.

Figure 2: The toggle switch in the Windows Admin Center sets up Azure Active Directory authentication.

Azure Site Recovery

Azure Site Recovery replicates machines running on-premises to the Microsoft Azure cloud. If a disaster occurs, you can fail over mission-critical workloads to use the replica VMs in the cloud. Once on-premises functionality returns, you can fail back workloads to your data center. Using the Azure cloud as a recovery site is far more cost-effective than building your own recovery data center, or even using a co-location facility.

Like other Azure services, Azure Site Recovery is exposed through the Windows Admin Center. To use it, the server must be registered with Azure. Although Hyper-V is the preferred hosting platform for use with Azure Site Recovery, the service also supports the replication of VMware VMs, as well as replication between Azure VMs.

To enable a VM for use with the Azure Site Recovery services, open the Windows Admin Center and click on the Virtual Machines tab. This portion of the console is divided into two separate tabs. A Summary tab details the host’s hardware resource consumption, while the Inventory tab lists the individual VMs on the host.

Click on the Inventory tab and then select the checkbox for the VM you want to replicate to the Azure cloud. You can select multiple VMs and there is also a checkbox above the Name column to select all the VMs on the list. After selecting one or more VMs, click on More, and then choose the Set Up VM Protection option from the drop-down list, shown in Figure 3.

Figure 3: To set up replication to Azure with the Azure Site Recovery service, select one or more VMs and then choose the Set Up VM Protection option.

The console will open a window to set up the host with Azure Site Recovery. Select the Azure subscription to use, then create or select a resource group and a recovery vault. You will also need to select a location, as shown in Figure 4.

Figure 4: After you select the VMs to protect in Azure Site Recovery, finalize the process by selecting a location in the Azure cloud.

Storage Migration Service

The Storage Migration Service migrates the contents of existing servers to new physical servers, VMs or to the Azure cloud. This can help organizations reduce costs through workload consolidation.

You access the Storage Migration Service by selecting the Storage Migration Service tab in the Windows Admin Center, which opens a dialog box outlining the storage migration process as shown in Figure 5. The migration involves getting an inventory of your servers, transferring the data from those servers to the new location, and then cutting over to the new server.

Figure 5: Microsoft developed Storage Migration Services to ease migrations to new servers, VMs or Azure VMs through a three-step process.

As time goes on, it seems almost inevitable that Microsoft will update the Windows Admin Center to expose even more Azure services. Eventually, this console will likely provide access to all of the native Windows Server services and all services running in Azure.


Citrix brings Workspace and micro apps to Google Cloud

The Citrix Workspace platform for Google Cloud is now generally available. In an announcement, Citrix said the move would simplify tasks for IT professionals and users alike by using micro apps and unifying tasks in a single work feed.

The partnership underscores Citrix’s commitment to keep its services agnostic to support its customers’ choice in cloud providers, according to analysts.

Eric Kenney, a senior product manager at Citrix, said IT professionals are, at present, responsible for wrangling a variety of disparate products. These applications may, for example, govern security, file synchronization, file sharing and virtual desktops, and all of them could have different portals and login screens. Citrix Workspace is designed to make it easier to administer a range of end-user computing applications.

“It’s really difficult to manage all of these different vendors and resources,” he said. “With Workspace, IT professionals are able to bring these solutions together, with one partner, to deliver them to users.”

Putting these solutions and the options to manage them in one place helps both desktop administrators and users, Kenney said.

Although Workspace provides a centralized place through which Citrix products, such as Citrix Virtual Apps and Desktops, Citrix Virtual Desktops and Citrix ADC, may be launched, Kenney said the platform goes beyond that. The intent, he said, is to provide a home for whatever application a company wants to deliver to its users, including homegrown and cloud-hosted offerings.

One way Workspace acts to simplify employee workloads is through the use of micro apps, or small programs that can accomplish simple tasks quickly, according to Kenney.

“An analogy we use is the office copier; it has a ton of buttons on it,” he said, noting that, with knowledge of those functions, people can collate, print double-sided copies and perform any number of specialized tasks. Most people, though, only use the big green button. “That’s a way of looking at enterprise applications; you’re using them a lot, but only for a small sliver of their functionality.”

Employees approving an expense report, for example, typically must go into a separate application to review and OK the document. Kenney said that process is less streamlined than it could be and that micro apps can integrate multiple tasks of approving an expense report into one feed, enabling workers to accomplish in seconds what used to take minutes.

“You could review and approve [the report] and never have to leave Workspace,” he said.

Workspace’s new availability also provides Citrix greater integration with Google Cloud services, among them Google’s G Suite, a collection of productivity apps. Kenney said a new cloud service, Citrix Access Control, provides administrators additional control over user actions on Google Drive documents.

For example, if a malware link is inadvertently added to a document, the Access Control settings could ensure the link is opened in an isolated browser that is safely disposed of at the end of a user session. Access Control can also restrict “copy and paste” functionality in certain documents.

Workspace isn’t just for IT

Ulrik Christensen, principal infrastructure engineer at Oncology Venture, said Citrix services, including Workspace, have made things easier for his firm. The drug development company is a global operation with offices and labs in both Denmark and the U.S., and manufacturing operations in India.

“I have four to five people in the U.S., and they’re not even in the same office,” he said, adding that the complexity of supporting the different hardware they use, including Apple machines, Windows machines and Chromebooks, has proven difficult in the past.

Moving to the kind of standardized system offered by Citrix has improved the user experience and lessened the burden on IT, Christensen said.

“It’s a lot easier if something doesn’t work,” he said. “We can help because we know the whole platform… It also made it a lot easier for IT to provide users new applications and updates.”

Security has improved as well, Christensen said. With only one way to access the company's network, it is at less risk, and the firm can be more confident that its data is protected.

Citrix continues to support cloud choice

Andrew Hewitt, an analyst at Forrester Research, said the partnership with Google Cloud makes sense for Citrix, as it bolsters one of the key tenets of its pitch to customers.


“Citrix’s core messaging is around experience, choice and security,” he said. “This announcement sits squarely in its desire to be an agnostic player in the [end-user computing market] that can enable enterprises to pick and choose whatever technologies they want to deploy to their end users.”

Citrix’s core messaging is around experience, choice and security.
Andrew HewittAnalyst, Forrester Research

The move, Hewitt said, seems like a logical extension of past partnerships with Google.

“For example, Citrix has full API access to manage Chromebooks; it supports all the management models for Android Enterprise and provides Citrix Receiver for virtualization support on Chromebooks,” he said. “This announcement is just further deepening of the relationship with Google.”


Enterprise Strategy Group senior analyst Mark Bowker said the partnership is good for Google as well.

“Google is trying to make inroads into the enterprise,” he said, noting pushes with Chromebooks and the Chrome browser.

Bowker added, though, that enterprises must still interact with Windows frequently. By working with Citrix, then, Google can provide its users with easier access to Windows-based services.

Citrix recognizes the importance of being able to provide its services on its customers’ cloud of choice, including a recent announcement of deeper ties with AWS. Still, its closest ties are with Microsoft, Bowker said. “The strength of their integration is ultimately with Microsoft, and always has been,” he said.

Go to Original Article
Author:

Qualtrics XM adds mobile, AI, information governance

Qualtrics XM added AI and information governance tools to its customer and employee experience measurement platform this week and gave its year-old mobile app an infusion of dashboards to put data into the hands of front-line workers on the go.

In some ways, the new features reflect the influence of SAP, which acquired Qualtrics for $8 billion a year ago. The new features, such as mobile dashboarding, likely reflect a step toward making Qualtrics data relevant and available to customer-facing employees who use other SAP applications, in addition to marketing and research teams, Constellation Research principal analyst Nicole France said.

Getting such data into the hands of front-line employees makes the data more likely to be effectively used.

“Simply making these tools more widely available gets people more used to seeing this type of information, and it changes behaviors,” France said, adding that new features like mobile dashboards subtly get more people involved in using real-time performance metrics. “It’s doing it in almost a subliminal way, rather than trying to make it a quick-change program.” 

A number of Qualtrics competitors have also slowly added mobile dashboarding so employees can monitor reaction to a product, customer service or employee initiatives. But they're all trying to find the right balance, lest it degrade employee experience or cause knee-jerk reactions to real-time fluctuations in customer response, Forrester Research senior analyst Faith Adams said.

Qualtrics XM mobile-app upgrades include dashboards to convey real-time customer response data to front-line employees responsible for product and service performance.

“It can be great — but it is also one that you need to be really careful with, too,” Adams said. “Some firms have noted that when they adopt mobile, it sometimes sets an expectation to employees of all levels that they are ‘always on.'”

Both France and Adams noted that the mobile app will help sales teams keep more plugged in to customer sentiment in their territories by getting data to them more quickly.

BMW, an early adopter of the new mobile app, uses it in dealerships to keep salespeople apprised of how individual customers feel about the purchasing process during the sale, and to prevent sales from falling through, according to Kelly Waldher, Qualtrics executive vice president and general manager.

AI and information governance tools debut

Qualtrics XM also added Smart Conversations, an AI-assisted tool to automate customer dialog around feedback. Two other AI features comb unstructured data for insights; one graphically visualizes customer sentiment and the other more precisely measures customer sentiment.

Prior to being acquired by SAP, Qualtrics had built its own AI and machine learning tools, Waldher said, and will continue to invest in them strategically. That said, Qualtrics will likely add features based on SAP's Leonardo AI toolbox down the road.

“We have that opportunity to work more closely with SAP engineers to leverage Leonardo,” Waldher said. “We’re still in the early stages of trying to tap into the broader SAP AI capabilities, but we’re excited to have that stack available to us.”

Also new to Qualtrics XM is a set of information governance features, which Waldher said will enable customers to better comply with privacy rules in both the U.S. and Europe. Qualtrics users will be able to monitor who is using data, and how, within their organizations.

“Chief compliance officers and those within the IT group can make sure that the tools that are being deployed across the organization have advanced security and governance capabilities,” Waldher said. “SAP’s global strength, their presence in Western Europe and beyond, has strongly reinforced the path [of building compliance tools] we were already on.”

The new features are included in most paid Qualtrics plans at no extra charge, with a few of the AI tools requiring different licensing plans to use.


Can AI help save penguins? – Microsoft News Center India

Working on the Microsoft Azure platform, Mohanty and his colleagues used a convolutional neural network model to come up with a solution that can identify and count penguins with a high degree of accuracy. The model can potentially help researchers speed up their studies around the status of penguin populations.
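
The article doesn't detail Gramener's network, but the building blocks a convolutional neural network applies to an image, convolution, activation and pooling, can be sketched in plain NumPy; the blob count at the end is only a toy stand-in for the real penguin counter:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    # Crop to a multiple of the pool size, then take the max per window.
    h, w = x.shape[0] // size * size, x.shape[1] // size * size
    x = x[:h, :w].reshape(h // size, size, w // size, size)
    return x.max(axis=(1, 3))

# Toy 7x7 "image" with two bright 2x2 blobs (stand-ins for penguins).
img = np.zeros((7, 7))
img[0:2, 0:2] = 1.0
img[4:6, 4:6] = 1.0

detector = np.ones((2, 2))           # responds most strongly to 2x2 blobs
fmap = max_pool(relu(conv2d(img, detector)))
count = int(np.sum(fmap == 4.0))     # peak response where a full blob aligns
```

A trained network learns its kernels from labeled data instead of using a hand-set detector, but the convolve-activate-pool pipeline is the same.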

The team is now working on the classification, identification and counting of other species using similar deep learning techniques.

Building AI to save the planet

A long-time Microsoft partner headquartered in Hyderabad in India, Gramener is not new to leveraging AI for social good using Microsoft Azure. It was one of the earliest partners for Microsoft’s AI for Earth program announced in 2017.

“I believe that AI can help make the world a better place by accelerating biodiversity conservation and help solve the biggest environmental challenges we face today. When we came to know about Microsoft’s AI for Earth program over two years ago, we reached out to Microsoft as we wanted to find ways to partner and help with our expertise,” says Kesari.

While the program was still in its infancy, the teams from Gramener and Microsoft worked jointly to come up with quick projects to showcase what’s possible with AI and inspire those out there in the field. They started with a proof of concept for identifying flora and fauna species in a photograph.

“We worked more like an experimentation arm working with the team led by Lucas Joppa (Microsoft’s Chief Environmental Officer, and founder of AI for Earth). We built a model, using data available from iNaturalist, that could classify thousands of different species with 80 percent accuracy,” Kesari reveals.

Another proof of concept revolved around camera traps that are used for biodiversity studies in forests. The camera traps take multiple images whenever they detect motion, which leads to a large number of photos that have to be scanned manually.

Soumya Ranjan Mohanty, Lead Data Scientist, Gramener

“Most camera trap photos are blank as they don’t have any animal in the frame. Even in the frames that do, often the animal is too close to be identified or the photo is blurry,” says Mohanty, who also leads the AI for Earth partnership from Gramener.

The team came up with a two-step solution that first weeds out unusable images and then uses a deep learning model to classify images that have an animal in them. The Microsoft team converted this solution, too, into what is now the Camera Trap API, which AI for Earth grantees, or anyone else, can use freely.
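
A minimal sketch of that two-step flow, with `is_blank` and `classify_species` as illustrative stand-ins rather than Gramener's actual models:

```python
def is_blank(frame):
    """Stage 1: treat near-uniform frames (no subject in view) as blank."""
    mean = sum(frame) / len(frame)
    variance = sum((p - mean) ** 2 for p in frame) / len(frame)
    return variance < 10.0           # threshold chosen for the toy data

def classify_species(frame):
    """Stage 2 stand-in: a real system would run a deep learning model."""
    return "animal" if max(frame) > 200 else "unidentifiable"

def triage(frames):
    labeled, discarded = [], 0
    for frame in frames:
        if is_blank(frame):
            discarded += 1           # most camera-trap photos are blank
        else:
            labeled.append(classify_species(frame))
    return labeled, discarded

frames = [
    [50] * 8,                             # blank: uniform background
    [50, 50, 50, 250, 250, 50, 50, 50],   # animal in frame
    [51, 49, 50, 52, 48, 50, 51, 49],     # blank: sensor noise only
]
labels, discarded = triage(frames)
```

The point of the split is economy: the cheap first stage discards the majority of frames so the expensive classifier only runs on images worth labeling.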

“AI is critical to conservation because we simply don’t have time to wait for humans to annotate millions of images before we can answer wildlife population questions. For the same reason, we need to rapidly prototype AI applications for conservation, and it’s been fantastic to have Gramener on board as our ‘advanced development team’,” says Dan Morris, principal scientist and program director for Microsoft’s AI for Earth program.

Anticipating the needs of grantees, Gramener and Microsoft have also worked on creating other APIs, like the Land Cover Mapping API that leverages machine learning to provide high-resolution land cover information. These APIs are now part of the public technical resources available for AI for Earth grantees or anyone to use, to accelerate their projects without having to build the base model themselves.

Author: Microsoft News Center

SageMaker Studio makes model building, monitoring easier

LAS VEGAS — AWS launched a host of new tools and capabilities for Amazon SageMaker, AWS' cloud platform for creating and deploying machine learning models; drawing the most notice was Amazon SageMaker Studio, a web-based integrated development environment.

In addition to SageMaker Studio, the IDE for building, using and monitoring machine learning models, the other new AWS products aim to make it easier for non-expert developers to create models and to make them more explainable.

During a keynote presentation at the AWS re:Invent 2019 conference here Tuesday, AWS CEO Andy Jassy described five other new SageMaker tools: Experiments, Model Monitor, Autopilot, Notebooks and Debugger.

“SageMaker Studio along with SageMaker Experiments, SageMaker Model Monitor, SageMaker Autopilot and Sagemaker Debugger collectively add lots more lifecycle capabilities for the full ML [machine learning] lifecycle and to support teams,” said Mike Gualtieri, an analyst at Forrester.

New tools

SageMaker Studio, Jassy claimed, is a “fully-integrated development environment for machine learning.” The new platform pulls together all of SageMaker’s capabilities, along with code, notebooks and datasets, into one environment. AWS intends the platform to simplify SageMaker, enabling users to create, deploy, monitor, debug and manage models in one environment.

Google and Microsoft have similar machine learning IDEs, Gualtieri noted, adding that Google plans for its IDE to be based on DataFusion, its cloud-native data integration service, and to be connected to other Google services.

SageMaker Notebooks aims to make it easier to create and manage open source Jupyter notebooks. With elastic compute, users can create one-click notebooks, Jassy said. The new tool also enables users to more easily adjust compute power for their notebooks and transfer the content of a notebook.

Meanwhile, SageMaker Experiments automatically captures input parameters, configuration and results of developers’ machine learning models to make it simpler for developers to track different iterations of models, according to AWS. Experiments keeps all that information in one place and introduces a search function to comb through current and past model iterations.
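
The article doesn't show the Experiments API itself; the hand-rolled tracker below is only a sketch of the concept, capturing parameters and metrics per run and making them searchable, and is not the SageMaker Experiments interface:

```python
class ExperimentTracker:
    """Toy experiment store: one record per model iteration."""

    def __init__(self):
        self.runs = []

    def log_run(self, params, metrics):
        """Capture one iteration's input parameters and results."""
        self.runs.append({"params": params, "metrics": metrics})

    def search(self, **filters):
        """Find past runs whose parameters match all given filters."""
        return [r for r in self.runs
                if all(r["params"].get(k) == v for k, v in filters.items())]

    def best(self, metric):
        """Return the run with the highest value of the given metric."""
        return max(self.runs, key=lambda r: r["metrics"][metric])

tracker = ExperimentTracker()
tracker.log_run({"lr": 0.1, "depth": 3}, {"accuracy": 0.81})
tracker.log_run({"lr": 0.01, "depth": 5}, {"accuracy": 0.87})
best = tracker.best("accuracy")
```

Keeping every iteration's inputs and outputs in one queryable place is what makes it cheap to answer "which configuration produced the best model?" later.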

AWS CEO Andy Jassy talks about new Amazon SageMaker capabilities at re:Invent 2019

“It is a much, much easier way to find, search for and collect your experiments when building a model,” Jassy said.

As the name suggests, SageMaker Debugger enables users to debug and profile their models more effectively. The tool collects and monitors key metrics from popular frameworks, and provides real-time metrics about accuracy and performance, potentially giving developers deeper insights into their own models. It is designed to make models more explainable for non-data scientists.

SageMaker Model Monitor also tries to make models more explainable by helping developers detect and fix concept drift, which refers to the evolution of data and data relationships over time. Unless models are updated in near real time, concept drift can drastically skew the accuracy of their outputs. Model Monitor constantly scans the data and model outputs to detect concept drift, alerting developers when it detects it and helping them identify the cause.
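
Model Monitor's internals aren't described here, but a common baseline form of drift detection it resembles is comparing live feature statistics against the training distribution and alerting on divergence; a minimal sketch, with the three-sigma threshold an arbitrary choice:

```python
from statistics import mean, stdev

def drift_score(baseline, live):
    """Shift of the live mean, in units of baseline standard deviations."""
    return abs(mean(live) - mean(baseline)) / stdev(baseline)

def check_drift(baseline, live, threshold=3.0):
    """Alert when the live window has moved too far from training data."""
    return drift_score(baseline, live) > threshold

# Feature values seen at training time vs. two live windows.
baseline = [10.2, 9.8, 10.1, 10.0, 9.9, 10.0, 10.3, 9.7]
stable = [10.1, 9.9, 10.0, 10.2]
drifted = [14.8, 15.2, 15.0, 14.9]   # the data relationship has moved
```

Production systems typically use richer distribution comparisons than a mean shift, but the loop is the same: compute statistics on live data, compare to the training baseline, alert on divergence.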

Automating model building

With Amazon SageMaker Autopilot, developers can automatically build models without, according to Jassy, sacrificing explainability.

Autopilot is “AutoML with full control and visibility,” he asserted. AutoML essentially is the process of automating machine learning modeling and development tools.

The new Autopilot module automatically selects the correct algorithm based on the available data and use case and then trains 50 unique models. Those models are then ranked by accuracy.
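
Autopilot's implementation isn't public; the candidate-and-leaderboard loop described above can be sketched with toy threshold classifiers standing in for the 50 real models:

```python
def make_threshold_model(t):
    """Toy one-parameter classifier: predict 1 when x >= t."""
    return lambda x: 1 if x >= t else 0

def accuracy(model, data):
    """Fraction of (x, label) pairs the model predicts correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Labeled toy data: the positive class clusters above 5.
data = [(1, 0), (2, 0), (3, 0), (4, 1), (6, 1), (7, 1), (8, 1), (9, 1)]

# "Train" a handful of candidates (Autopilot trains 50 real ones),
# then rank them by accuracy to form a leaderboard.
candidates = {f"threshold>={t}": make_threshold_model(t) for t in range(1, 9)}
leaderboard = sorted(
    ((name, accuracy(m, data)) for name, m in candidates.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
best_name, best_acc = leaderboard[0]
```

Real AutoML explores algorithms and hyperparameters rather than a single threshold, but the structure is the same: generate candidates, evaluate each, rank by the chosen metric.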

“AutoML is the future of ML development. I predict that within two years, 90 percent of all ML models will be created using AutoML by data scientists, developers and business analysts,” Gualtieri said.

SageMaker Autopilot is a must-have for AWS.
Mike GualtieriAnalyst, Forrester

“SageMaker Autopilot is a must-have for AWS, but it probably will help” other vendors also, including such AWS competitors as DataRobot because the AWS move further legitimizes the automated machine learning approach, he continued.

Other AWS rivals, including Google Cloud Platform, Microsoft Azure, IBM, SAS, RapidMiner, Aible and H2O.ai, also have automated machine learning capabilities, Gualtieri noted.

However, according to Nick McQuire, vice president at advisory firm CCS Insight, some of the new AWS capabilities are innovative.

“Studio is a great complement to the other products as the single pane of glass developers and data scientists need and its incorporation of the new features, especially Model Monitor and Debugger, are among the first in the market,” he said.

“Although AWS may appear late to the game with Studio, what they are showing is pretty unique, especially the positioning of the IDE as similar to traditional software development with … Experiments, Debugger and Model Monitor being integrated into Studio,” McQuire said. “These are big jumps in the SageMaker capability on what’s out there in the market.”

Google also recently released several new tools aimed at delivering explainable AI, plus a new product suite, Google Cloud Explainable AI.


Kasten backup aims for secure Kubernetes protection

People often talk about Kubernetes “Day 1,” when you get the platform up and running. Now Kasten wants to help with “Day 2.”

Kasten’s K10 is a data management and backup platform for Kubernetes. The latest release, K10 2.0, focuses on security and simplicity.

K10 2.0 includes support for Kubernetes authentication, role-based access control, OpenID Connect, AWS Identity and Access Management roles, customer-managed keys, and integrated encryption of artifacts at rest and in flight.

“Once you put data into storage, the Day 2 operations are critical,” said Krishnan Subramanian, chief research advisor at Rishidot Research. “Day 2 is as critical as Day 1.”

Day 2 — which includes data protection, mobility, backup and restore, and disaster recovery — is becoming a pain point for Kubernetes users, Kasten CEO Niraj Tolia said.

“In 2.0, we are focused on making Kubernetes backup easy and secure,” Tolia said.

Other features of the new Kasten backup software, which became generally available earlier in November, include a Kubernetes-native API, auto-discovery of the application environment, policy-driven operations, multi-tenancy support, and advanced logging and monitoring. Kasten backup enables teams to operate their environments while supporting developers' ability to use tools of their choice, according to the vendor.

Kasten K10 provides data management and backup for Kubernetes.

Kasten backup eyes market opportunity

Kasten, which launched its original product in December 2017, generally releases an update to its customers every two weeks. An update less major than 2.0 typically includes bug fixes, new features and increased depth in existing features. Tolia said there were 55 releases between 1.0 and 2.0.

Day 2 is as critical as Day 1.
Krishnan SubramanianFounder and chief research advisor, Rishidot Research

Backup for container storage has become a hot trend in data protection. Kubernetes specifically is an open source system used to manage containers across private, public and hybrid cloud environments. Kubernetes can be used to manage microservice architectures and is deployable on most cloud providers.

“Everyone’s waking up to the fact that this is going to be the next VMware,” as in, the next infrastructure of choice, Tolia said.

Kubernetes backup products are popping up, but it looks like Kasten is a bit ahead of its time, Rishidot’s Subramanian said. He said he is seeing more enterprises using Kubernetes in production, for example, in moving legacy workloads to the platform, and that makes backup a critical element.

“Kubernetes is just starting to take off,” Subramanian said.

Kubernetes backup “has really taken off in the last two or three quarters,” Tolia said.

Subramanian said he is starting to see legacy vendors such as Dell EMC and NetApp tackling Kubernetes backup, as well as smaller vendors such as Portworx and Robin. He said Kasten had needed stronger security but caught up with K10 2.0. Down the road, he said he will look for Kasten to improve its governance and analytics.

Tolia said Kasten backup stands out because it’s “purpose-built for Kubernetes” and extends into multilayered data management.

In August, Kasten, which is based in Los Altos, Calif., closed a $14 million Series A funding round, led by Insight Partners. Tolia did not give Kasten’s customer count but said it has deployments across multiple continents.
