Alluxio updates data orchestration platform, launches 2.0

Alluxio has launched Alluxio 2.0, a platform designed for data engineers who manage and deploy analytical and AI workloads in the cloud.

According to Alluxio, the 2.0 version was built particularly with hybrid and multi-cloud environments in mind, with the aim of providing data orchestration to bring data locality, accessibility and elasticity to compute.

Alluxio 2.0 Community Edition and Enterprise Edition provide a handful of new capabilities, including data orchestration for multi-cloud, compute-optimized data access for cloud analytics, AWS support and architectural foundations using open source.

Data orchestration for multi-cloud

There are three main components to the data orchestration capabilities of Alluxio 2.0: policy-driven data management, administration of data access policies and cross-cloud storage data movement using data service.

Policy-driven data management enables data engineers to automate data movement across different storage systems based on predefined policies. Users can also automate tiering of data across any environment or any number of storage systems. Alluxio claims this will reduce storage costs, because data platform teams can keep only the most important data in expensive storage systems while moving less important data to cheaper alternatives.
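Alluxio's actual policy syntax isn't shown in the announcement, but the age-based tiering idea can be sketched in a few lines of Python; the tier names and the 30-day threshold below are hypothetical, not Alluxio defaults:

```python
from datetime import datetime, timedelta

# Hypothetical tiering policy: files untouched for more than
# `max_age_days` move from the expensive "hot" store to a cheaper one.
def assign_tier(last_access: datetime, now: datetime,
                max_age_days: int = 30) -> str:
    """Return the storage tier a file should live in."""
    if now - last_access > timedelta(days=max_age_days):
        return "cold"   # e.g. a cheaper object store
    return "hot"        # e.g. SSD-backed storage managed by the platform

now = datetime(2019, 7, 1)
print(assign_tier(datetime(2019, 6, 28), now))  # hot
print(assign_tier(datetime(2019, 1, 15), now))  # cold
```

A real policy engine would evaluate rules like this continuously and trigger the actual data movement between mounted storage systems.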

The administration of data access policies enables users to configure policies at any directory or folder level to streamline data access and workload performance. This includes defining behaviors for individual data sets for core functions, such as writing data or syncing it with Alluxio storage systems.

With cross-cloud storage data movement using data service, Alluxio claims users get highly efficient data movement across cloud stores, such as AWS S3 and Google Cloud services.

Compute-optimized data access for cloud analytics

The compute-optimized data access capabilities include two components: compute-focused cluster partitioning and integration with external data sources over REST.

Compute-focused cluster partitioning enables users to partition a single Alluxio cluster based on any dimension. This isolates the data sets used by each framework or workload from the others. Alluxio claims this reduces data transfer costs and keeps data within a specific region or zone.

Integration with external data sources over REST enables users to import data from web-based sources, which can then be aggregated in Alluxio for analytics. Users can also point Alluxio at web locations so that files are pulled in on demand.
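The on-demand pull pattern described here can be illustrated generically: a catalog maps logical paths to source URLs, and bytes are fetched only on first access, then served from cache. This is a sketch of the concept, not Alluxio's API; the fetcher is injected so the example needs no network:

```python
# Illustrative lazy-ingestion catalog: a logical path resolves to a
# remote URL, fetched once on first read and cached afterward.
class LazyCatalog:
    def __init__(self, urls, fetch):
        self.urls = urls      # logical path -> source URL
        self.fetch = fetch    # downloader, e.g. urllib-based in practice
        self.cache = {}

    def read(self, path) -> bytes:
        if path not in self.cache:
            self.cache[path] = self.fetch(self.urls[path])
        return self.cache[path]

fetched = []
def fake_fetch(url):          # stand-in for a real HTTP GET
    fetched.append(url)
    return b"data from " + url.encode()

cat = LazyCatalog({"/logs/a": "http://example.com/a"}, fake_fetch)
cat.read("/logs/a")
cat.read("/logs/a")           # second read is served from cache
print(len(fetched))           # 1 -> only one remote fetch occurred
```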

AWS support

The new suite provides Amazon Elastic MapReduce (EMR) service integration. According to Alluxio, Amazon EMR is frequently used during the process of moving to cloud services to deploy analytical and AI workloads. Alluxio is now available as a data layer within EMR for the Spark, Presto and Hive frameworks.

Architectural foundations using open source

According to Alluxio, core foundational elements have been rebuilt using open source technologies. RocksDB is now used to tier the metadata of the files and objects Alluxio manages, enabling hyperscale deployments. Alluxio uses gRPC as the core transport protocol for communication within clusters, as well as between the client and master.

In addition to the main components, other new features include the following:

  • Alluxio Data Service: A distributed clustered service.
  • Adaptive replication: Lets users configure a range for the number of copies of data stored in Alluxio, which are then managed automatically.
  • Embedded journal: A fault tolerance and high availability mode for file and object metadata that uses the Raft consensus algorithm and is independent of external storage systems.
  • Alluxio POSIX API: A Portable Operating System Interface-compatible API that enables frameworks such as TensorFlow, Caffe and other Python-based models to access data in any storage system directly through Alluxio using traditional POSIX file operations.
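Adaptive replication, for example, amounts to clamping a demand-driven copy count to a user-configured range. The demand heuristic below is invented for illustration and is not Alluxio's algorithm:

```python
# Illustrative adaptive replication: scale the number of copies with
# observed read demand, clamped to a configured [min, max] range.
# "One extra copy per 100 reads/hour" is a made-up heuristic.
def target_replicas(reads_per_hour: int, min_copies: int, max_copies: int) -> int:
    desired = 1 + reads_per_hour // 100
    return max(min_copies, min(max_copies, desired))

print(target_replicas(0, 1, 5))       # 1 (cold data stays at the minimum)
print(target_replicas(250, 1, 5))     # 3
print(target_replicas(10_000, 1, 5))  # 5 (hot data is capped at the maximum)
```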

Alluxio 2.0 Community Edition and Enterprise Edition are both generally available now.

Go to Original Article
Author:

Automated transcription services for adaptive applications

Automated transcription services have a variety of applications. Enterprises frequently use them to transcribe meetings, and call centers use them to transcribe phone calls into text to more easily analyze the substance of each call.

The services are widely used to aid the deaf, by automatically providing subtitles to videos and television shows, as well as in call centers that enable the deaf to communicate with each other by transcribing each person’s speech.

VTCSecure and Google

VTCSecure, a several-years-old startup based in Clearwater, Fla., uses Google Cloud’s Speech-to-Text services to power a transcription platform that is used by businesses, non-profits, and municipalities around the world to aid the deaf and hard of hearing.

The platform offers an array of capabilities, including video services that connect users to a real-time sign-language interpreter, and deaf-to-deaf call centers. The call centers, enabling users to connect via video, voice or real-time-text, build on Google Cloud’s Speech-to-Text technology to provide users with automatic transcriptions.

Google Cloud has long sold Speech-to-Text and Text-to-Speech services, which provide developers with the data and framework to create their own transcription or voice applications. For Peter Hayes, CEO of VTCSecure, the services, powered in part by speech technologies developed by parent company Alphabet Inc.'s DeepMind division, were easy to set up and adapt.

"It was one of the best processes," Hayes said. He added that his company has been happy with what it considers a high level of support from Google.

Speech-to-text

Hayes said Google provides technologies, as well as development support, for VTCSecure and for his newest company, TranslateLive.

Hayes also runs the platform on Google Cloud, after doing a demo for the FTC that he said lagged on a rival cloud network.

Google Cloud’s Speech-to-Text and Text-to-Speech technology, as well as the translation technologies used for TranslateLive, constantly receive updates from Google, Hayes said.

Startup Verbit provides automated transcription services that it built in-house. While only two years old, the startup considers itself a competitor to Google Cloud’s transcription services, even releasing a blog post last year outlining how its automated transcription services can surpass Google’s.

Automated transcription services from companies like Verbit are used by the deaf and hard of hearing.

Transcription startup

Verbit, unlike Google, adds humans to the transcription loop, explained Tom Livne, co-founder and CEO of the Israel-based startup. It relies on its homegrown models for an initial transcription, then passes those drafts off to remote human transcribers, who review and edit them to fine-tune the result.
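The two-stage pipeline Livne describes, a machine draft followed by human edits, can be sketched generically; this is not Verbit's implementation, and the edit format is invented for the example:

```python
# Generic human-in-the-loop transcription sketch: an ASR model emits a
# draft, then reviewers submit corrections as {word_index: replacement}
# pairs that are applied on top of the draft.
def apply_edits(draft_words, edits):
    """Return the draft with human corrections applied by position."""
    return [edits.get(i, word) for i, word in enumerate(draft_words)]

draft = "the witness sad she was their".split()   # two ASR errors
human_edits = {2: "said", 5: "there"}
final = " ".join(apply_edits(draft, human_edits))
print(final)  # the witness said she was there
```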

The combined process produces high accuracy, Livne said.

A lawyer, Livne initially started Verbit to specifically sell to law firms. However, the vendor moved quickly into the education space.

“We want to create an equal opportunity for students with disabilities,” Livne said. Technology, he noted, has long been able to aid those with disabilities.

George Mason University, a public university in Fairfax, Va., relies on Verbit to automatically transcribe videos and online lectures.

“We address the technology needs of students with disabilities here on campus,” said Korey Singleton, assistive technology initiative manager at George Mason.

After trying out other vendors, the school settled on Verbit largely because of its competitive pricing, Singleton said. As most of its captioning and transcription comes from the development of online courses, the school doesn’t require a quick turnaround, Singleton said. So, Verbit was able to offer a cheaper price.

“We needed to find a vendor that could do everything we needed to do and provide us with a really good rate,” Singleton said. Verbit provided that.

Moving forward, George Mason will be looking for a way to automatically integrate transcripts with the courses. Now, putting them together is a manual process, but with some APIs and automated technologies, Singleton said he’s aiming to make that happen automatically.

Amazon CTO Werner Vogels on transparency, developers, multi-cloud

Amazon CTO Werner Vogels is known for his work with Amazon Web Services, but he actually leads technology innovation across the entire company. In a keynote talk at this week’s AWS Summit event in New York City, he outlined new product directions and his philosophy for the future of cloud computing.

Vogels sat down with TechTarget to discuss a wide range of issues, from transparency into future development of AWS services to customers’ multi-cloud plans.

In December 2018, AWS posted a public roadmap for its container strategy on GitHub. This was seen as an unusual, maybe unprecedented move. Talk about transparency in terms of a philosophy — will we see more of this kind of thing out of AWS?

Werner Vogels: As always, with respect to customer interaction, we try to experiment. The whole thing with roadmaps is that once you produce it, you have to stick with it. And historically, we’ve always tried to be more secretive. We’ve always tried to keep the roadmap with customers under NDA. Mostly so we could have the opportunity to change our minds.

Because once you promise customers you’re going to deliver X, Y and Z in September, you have to deliver X, Y and Z in September for them.

And so I think given the tremendous interest of developers in containers, this seems like a really great space to start with giving the community access to a roadmap, knowing what’s coming. And I think definitely given our close cooperation with that group we need this sort of ecosystem. I think it was really important to show what our plans are there.

One critique of AWS is that CloudFormation lags too much with regard to support for new AWS features. In response, AWS pledged to provide more transparency around CloudFormation, including a roadmap. What’s going on from your perspective with CloudFormation?

Werner Vogels, vice president and CTO of Amazon

Vogels: Often we have a number of innovations scheduled for CloudFormation, but as you can see we put a lot of effort into the Cloud Development Kit, or CDK. One thing we’ve gotten from developers is that they prefer to write code instead of these large, declarative JSON and XML files. I showed it onstage this morning, with the demo that we did. We’ve put most of our effort in actually going the CDK route more than sort of extending CloudFormation.

Most customers have asked for new features in CloudFormation to get sort of parity with what Terraform is doing. I have great respect for HashiCorp and the speed at which they’re innovating. They’re a great partner. And as such, we’re working with CloudFormation to take it in the direction that customers are asking for.

I think overall, we’re on a good path, the right path. But I love the fact that there is a long list of requests for CloudFormation. It means that customers are passionate about it and want us to do more.

There is a sense these days that enterprises should look to be multi-cloud, not tied to a single provider, for reasons such as cost, vendor management and richer opportunities for innovation. One of your competitors, Google, hopes to be a middleman player with its Anthos multi-cloud deployment platform. What is your stance on multi-cloud, and can we see something like Anthos coming out of AWS someday?

Vogels: It depends a bit on how you define multi-cloud. If you think about if you have this one application that you want to run on any of the providers, you pretty quickly go to a lowest common denominator, which is to use a cloud as a data center. You just use instances as a service. Now you get some elasticity, you get some cost savings out of it, maybe some more reliability, but you get none of the other benefits. You can’t use any of the security tools that Amazon is giving you. Plus, you need to have your workforce, your development force able to be proficient in each and every one of these clouds that you’re using, which seems like a waste.

The few companies that I’ve seen being slightly successful with having a multi-cloud approach are ones that say, oh this is one particular thing that this particular provider is unique in and I really want to make use of that. Well, sometimes that’s as some sort of a vertical, or it might be in a particular location.

The other thing that we're working on with most of our enterprise customers is, what is an exit strategy? What do I need to do, if one moment I decide that I would like to move over to another provider? That for any large enterprise is just good due diligence. If you start using a [SaaS application], you want to know what we need to do to get my data out of there, if I want to move let's say from Salesforce to Workday.

It’s the same for most large enterprises. They want to know how much work is it actually for me to actually move if I decide to go from cloud provider A to cloud provider B, or maybe bring it back on premises.

That’s something that we’ve been working on with most of our large customers, because that’s just good due diligence.

You talked about your strategy for developers today [in the AWS Summit keynote]. Are you satisfied with where AWS is with regard to developer experience?

Vogels: I’m never satisfied. I think this is mostly focused on serverless. Anything serverless is still so much in flux. We see customers building more and more complex and larger applications using only serverless components, and we’re learning from that. What are the kinds of things that customers want?

For example, when we launched [Lambda] Layers, that was purely from feedback from customers saying, ‘Hey you know, we have this whole set of basic components that we are always using for each of our applications, but it doesn’t allow us to actually easily integrate them.’ So we built Layers for customers.

We continue to look at how we can do these things. The same goes for building Custom Runtimes. There [are] only so many languages you can do yourself, but if there’s someone else that wants to do Haskell or Caml, or any let’s say, less popular language, we should be able to enable them. And so we built Custom Runtimes.

Part two of TechTarget’s Q&A with Amazon CTO Werner Vogels will touch on AWS Outposts, AWS’ pace of innovation, and how customers can control cloud costs.

Zoom vulnerability reveals privacy issues for users

Zoom faced privacy concerns after the disclosure of a vulnerability that could allow threat actors to use the video conferencing software to spy on users.

The Zoom vulnerability, originally reported to only affect the Mac version of the software, has been found to partially affect Windows and Linux as well. Jonathan Leitschuh, software engineer at open source project Gradle, disclosed the Zoom vulnerability in a blog post earlier this week and said it “allows any website to forcibly join a user to a Zoom call, with their video camera activated, without the user’s permission.”

"On top of this, this vulnerability would have allowed any webpage to DOS (Denial of Service) a Mac by repeatedly joining a user to an invalid call," Leitschuh added. "Additionally, if you’ve ever installed the Zoom client and then uninstalled it, you still have a localhost web server on your machine that will happily re-install the Zoom client for you, without requiring any user interaction on your behalf besides visiting a webpage."
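Whether such a leftover localhost server is present can be checked with a short, generic port probe (researchers reported the Zoom server listening on port 19421 on macOS). This sketch only tests whether something is listening on a given local port; it identifies nothing about what that listener is:

```python
import socket

# Generic check for a localhost listener, such as the web server the
# Zoom client was reported to leave behind after uninstall.
def localhost_port_open(port: int, timeout: float = 0.5) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 when the TCP connection succeeds
        return s.connect_ex(("127.0.0.1", port)) == 0

# Example: probe the port reported for the Zoom localhost server.
print(localhost_port_open(19421))
```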

According to Leitschuh, it took Zoom 10 days to confirm the vulnerability. In a meeting on June 11, he told Zoom there was a way to bypass the planned fix, but the company did not address those concerns when it reported the vulnerability fixed close to two weeks later. The issue resurfaced on July 7, Leitschuh disclosed it publicly on July 8, and Zoom patched the Mac client on July 9. Zoom also worked with Apple on a silent background update for Mac users, released July 10, which removed the Zoom localhost server from systems.

“Ultimately, Zoom failed at quickly confirming that the reported vulnerability actually existed and they failed at having a fix to the issue delivered to customers in a timely manner,” Leitschuh wrote. “An organization of this profile and with such a large user base should have been more proactive in protecting their users from attack.” 

Zoom — whose video conferencing software is used by more than 4 million users in approximately 750,000 companies around the world — downplayed the severity of the issue and disputed Leitschuh’s characterization of the company.

“Once the issue was brought to our Security team’s attention, we responded within ten minutes, gathering additional details, and proceeded to perform a risk assessment,” Richard Farley, CISO at Zoom, wrote in the company’s response. “Our determination was that both the DOS issue and meeting join with camera on concern were both low risk because, in the case of DOS, no user information was at risk, and in the case of meeting join, users have the ability to choose their camera settings.”

“To be clear, the host or any other participant cannot override a user’s video and audio settings to, for example, turn their camera on,” Farley added. 

Both the disclosure and response from Zoom portrayed the issue as only affecting the Mac client, but Alex Willmer, Python developer for CGI, wrote on Twitter that the Zoom vulnerability affected Windows and Linux as well.

“In particular, if zoommtg:// is registered as a protocol handler with Firefox then [Zoom] joins me to the call without any clicks,” Willmer tweeted. “To be clear, a colleague and I saw the auto-join/auto-webcam/auto-microphone behavior with Firefox, and Chromium/Chrome; on Linux, and Windows. We did not find any webserver on port 19421 on Linux. We didn’t check Windows for the webserver.”

Leitschuh confirmed Willmer’s discovery, but it is unclear if Zoom is working to fix these platform clients. Leitschuh also noted in his disclosure that the issue affects a white-label version of Zoom licensed to VoIP provider RingCentral. It is unclear if RingCentral has been patched.

Leitschuh told SearchSecurity via Twitter DM that “Zoom believes the Windows/Linux vulnerabilities are the browser vendors’ to fix,” but he disagrees.

Zoom did not respond to requests for comment at the time of this post.

Tom Patterson, chief trust officer at Unisys, said the tradeoff between security and ease of use is “not always a fair trade.”

“The fact that uninstalling any app doesn’t completely uninstall all components runs counter to engendering trust. In this case, it’s an architectural decision made by the manufacturers which appears to be designed to make operations much easier for users,” Patterson told SearchSecurity. “This trust tradeoff, between making it easy and making it secure, is something that every consumer should consider.”

Microsoft Azure partners get new tool, migration program

John Moore and Spencer Smith

Microsoft Azure partners are getting a new management tool and a migration program as part of channel investments Microsoft revealed in the run-up to the company’s annual partner conference.

Microsoft introduced Azure Lighthouse, which the company said gives partners “a single control plane” for viewing and managing Azure across customers. The new Azure Migration Program, or AMP, aims to help customers accelerate their cloud adoption.

The moves are part of a broader barrage of announcements ahead of Microsoft Inspire, which will run from July 14 to 18 in Las Vegas. In addition to the Azure initiatives, Microsoft expanded its Teams collaboration software, launched vertical market integrations for Dynamics 365 and took the wraps off a new Microsoft competency and several specializations.

Partners, meanwhile, saw potential in both Azure developments.

“We haven’t started working with it yet, but we are excited about what potential it can provide and some of the solutions to challenges we are hearing it is going to offer,” said Rory McCaw, president of enterprise advisory services at Green House Data, an Azure Expert Managed Services Provider (MSP) based in Cheyenne, Wyo.

He said Lighthouse offers a single platform that provides access into multiple tenants, simplifying cloud management. The management offering will be available to Microsoft Azure partners this month, according to Microsoft.

JD Helms

JD Helms, president of CloudJumper, a cloud workspace vendor and Gold-level Microsoft partner, called Azure Lighthouse an attempt to simplify what he termed the “powerful, yet extremely complex” Azure management portal. “While it’s a step in the right direction, it is still limited in scope and functionality.”

Helms said Lighthouse validates the market demand for a simplified Azure management console, as well as third-party Azure management consoles. CloudJumper offers Cloud Workspace Management Suite, which Helms said provides a single-pane-of-glass console for a range of Azure services.

AMP, meanwhile, provides resources and tools, such as Azure Migrate, a tool for centrally planning and tracking Azure migrations. AMP provides guidance to customers through "Microsoft experts and specialized migration partners," according to Microsoft.

AMP also provides offers to reduce the expense of migration. “Microsoft has always had funding programs and funding mechanisms, but it is clear that AMP is highlighting their interest in seeing a massive migration of workloads to Azure,” McCaw said.

He said the timing is appropriate, given the approaching end of life of Windows Server 2008 and this week’s end of support for SQL Server 2008. Microsoft is offering three years of automated Extended Security Updates for SQL Server 2008 customers that rehost their workloads to Azure.

AMP ties in to VMware on Azure, McCaw added. He said AMP aims to make it easier for enterprise customers using VMware as the hypervisor to move from an on-premises VMware implementation to an Azure-based one, especially for their older Windows Server and SQL Server instances.

Robin Brandl, vice president of strategic alliances at CloudJumper, said AMP “simplifies the migration process for specific workloads and provides an alternative to traditional lift-and-shift deployments.”

In addition to the tools and programs for Microsoft Azure partners, Microsoft rolled out extensions to Microsoft Teams and Dynamics 365. New Teams features include partner integrations that support contact centers, compliance recording and cloud solutions providers, according to Microsoft. Dynamics 365 updates include integrations targeting the automotive and financial services industries. Microsoft also announced the latest version of its Dynamics 365 Nonprofit Accelerator.

Other developments include the availability of a Microsoft competency in security and five advanced specializations focusing on Microsoft Azure: Windows Server and SQL Server Migration to Microsoft Azure; Linux and Open Source Databases Migration to Microsoft Azure; Data Warehouse Migration to Microsoft Azure; Modernization of Web Applications in Microsoft Azure; and Kubernetes on Microsoft Azure.

2nd Watch launches AWS migration service

2nd Watch, a public cloud MSP based in Seattle, has rolled out a service that focuses on AWS Managed Services (AMS).

The company’s AMS Onboarding Accelerator aims to help clients assess, migrate and operationalize applications from on-premises infrastructure to the AWS cloud, according to 2nd Watch. The service is delivered in two parts. Part one consists of a four-hour discovery and planning workshop that is free of charge. Part two provides onboarding acceleration, a phase that includes applications discovery, architecture and infrastructure design, and a migration strategy, among other components.

AWS’ entry into the managed services space in late 2016 raised eyebrows, but MSP executives report AMS hasn’t hurt their businesses.

Jeff Aden

Jeff Aden, executive vice president of marketing and business development at 2nd Watch, said not everything a customer wants to do in the Amazon cloud fits into AMS.

“For us, we see [AMS] as a piece of the cloud journey that gets the customer to be successful,” he said. “We are not shying away from it. We see it as an opportunity for a customer to continue to adopt and scale their cloud usage.”

He said some ISVs have used 2nd Watch’s Onboarding Accelerator to provide their on-premises offerings in a cloud-based solution. He said the ISV example is a good use case but noted the service isn’t exclusive to ISVs.

Other news

  • IBM on Tuesday finalized its $34 billion acquisition of Red Hat. When news of the acquisition first broke in October, channel partners said the success of the merger would hinge on IBM preserving Red Hat’s open source portfolio and culture. IBM maintains that Red Hat will operate independently under its ownership and keep its commitment to open source software. In a blog post, Red Hat said there are no plans to change its existing partner programs.
  • Rackspace has entered a partnership with Tech Mahindra Ltd., which will let Rackspace cross-sell to Tech Mahindra customers and provide joint product and services offerings. The alliance will also improve Rackspace’s “internal business applications and processes,” the company said.
  • BitTitan, a managed services automation company, has enhanced its cloud migration product, MigrationWiz, to support the adoption of Microsoft Teams. The company also launched a Collaboration License to enable migrations of Microsoft Teams instances between Office 365 tenants. The Collaboration License costs $15 per user with a data allowance of 10 GB per license, according to BitTitan.
  • Logicalis Group, an IT solutions and managed services provider based in New York, expanded its operations in South Africa. The company purchased Mars Technologies, an IT services provider headquartered in Cape Town.
  • ThreatLocker has integrated its cybersecurity tools with Kaseya’s IT infrastructure management platform.
  • Montreal-based data center services provider eStruxture has unveiled a global channel partner program. EStruxture said the program provides support for referral and reseller partners.
  • M-Files Corp., an information management software vendor, said it has tripled membership of its Certified Application Partner program since its 2017 launch. The program offers technical and marketing support for partners, ISVs and resellers that develop applications on M-Files’ platform.
  • Distributor Ingram Micro inked a deal with CriticalStart to provide its managed detection and response services to U.S. partners.
  • In other distribution news, cloud distributor Zedsphere agreed to provide Axcient’s business availability products for MSPs in the EMEA market.
  • Anexinet Corp., a digital business transformation solutions provider based in Philadelphia, launched an Enterprise Architecture Modernization Kickstart. The company said the three-week program provides customers with a modern, event-driven architecture strategy.
  • ProcessUnity, which offers SaaS products for risk and compliance management, added U.K.-based MSP DVV Solutions to its partner program. DVV Solutions specializes in third-party risk management.
  • Threat detection and response vendor Lastline appointed Jarrett Miller as its vice president of global channel sales. Miller will oversee Lastline’s global channel program and build out partnerships with distributors, resellers, systems integrators and managed security service providers, the company said.

Market Share is a news roundup published every Friday.

Google’s Elastifile buy shows need for cloud file storage

Google’s acquisition of startup Elastifile underscored the increasing importance of enterprise-class file storage in the public cloud.

Major cloud providers have long offered block storage for applications that customers run on their compute services and focused on scale-out object storage for the massively growing volumes of colder unstructured data. Now they’re also shoring up file storage as enterprises look to shift more workloads to the cloud.

Google disclosed its intention to purchase Elastifile for an undisclosed sum after collaborating with the startup on a fully managed file storage service that launched early in 2019 on its cloud platform. At the time, Elastifile’s CEO, Erwan Menard, positioned the service as a complement to the Google Cloud Filestore, saying his company’s technology would provide higher performance, greater scale-out capacity and more enterprise-grade features than the Google option.

Integration plans

In a blog post on the acquisition, Google Cloud CEO Thomas Kurian said the teams would join together to integrate the Elastifile technology with Google Cloud Filestore. Kurian wrote that Elastifile’s pioneering software-defined approach would address the challenges of file storage for enterprise-grade applications running at scale in the cloud.

“Google now has the opportunity to create hybrid cloud file services to connect the growing unstructured data at the edge or core data centers to the public cloud for processing,” said Julia Palmer, a vice president at Gartner. She said Google could have needed considerably more time to develop and perfect a scale-out file system if not for the Elastifile acquisition.

Building an enterprise-level, high-performance NFS file system from scratch is “insanely difficult,” said Scott Sinclair, a senior analyst at Enterprise Strategy Group. He said Google had several months to “put Elastifile through its paces,” see that the technology looked good, and opt to buy rather than build the sort of file system that is “essential for the modern application environments that Google wants to sell into.”

Target workloads

Kurian cited examples of companies running SAP and developers building stateful container-based applications that require natively compatible file storage. He noted customers such as Appsbroker, eSilicon and Forbes that use the Elastifile Cloud File Service on Google Cloud Platform (GCP). In the case of eSilicon, the company bursts semiconductor design workflows to Google Cloud when it needs extra compute and storage capacity during peak times, Elastifile has said.

“The combination of Elastifile and Google Cloud will support bringing traditional workloads into GCP faster and simplify the management and scaling of data and compute intensive workloads,” Kurian wrote. “Furthermore, we believe this combination will empower businesses to build industry-specific, high performance applications that need petabyte-scale file storage more quickly and easily.”

Elastifile’s Israel-based engineering team spent four years developing the distributed Elastifile Cloud File System (ECFS). They designed ECFS for hybrid and public cloud use and banked on high-speed flash hardware to prevent metadata server bottlenecks and facilitate consistent performance.

Elastifile emerged from stealth in April 2017, claiming 25 customers, including 16 service providers. Target use cases it cited for ECFS included high-performance NAS, workload consolidation in virtualized environments, big data analytics, relational and NoSQL databases, high-performance computing, and the lift and shift of data and applications to the cloud. Elastifile raised $74 million over four funding rounds, including strategic investments from Dell Technologies, Cisco and Western Digital.

One open question is the degree to which Google will support Elastifile’s existing customers, especially those with hybrid cloud deployments that did not run on GCP. Both Google and Elastifile declined to comment.

Cloud NAS competition

The competitive landscape for the Elastifile Cloud File Service on GCP has included Amazon’s Elastic File System (EFS), Dell EMC’s Isilon on GCP, Microsoft’s Azure NetApp Files, and NetApp on GCP.

“Cloud NAS and cloud file systems are the last mile for cloud storage. Everybody does block. Everybody does object. NAS and file services were kind of an afterthought,” said Henry Baltazar, research director of storage at 451 Research.

But Baltazar said as more companies are thinking about moving their NFS-based legacy applications to the cloud, they don’t want to go through the pain and the cost of rewriting them for object storage or building a virtual file service. He sees Google’s acquisition of Elastifile as “a good sign for customers that more of these services will be available” for cloud NAS.

“Google doesn’t really make infrastructure acquisitions, so it says something that Google would make a deal like this,” Baltazar said. “It just shows that there’s a need.”

SAP PartnerEdge initiative offers free S/4HANA Cloud resources

SAP is taking a new tack to grow its public cloud offering and development resources: It’s giving them away for free.

As of July 1, qualified SAP PartnerEdge members can now test and demonstrate systems built on SAP S/4HANA Cloud and SAP C/4HANA free of charge. Partners need to have a valid SAP PartnerEdge status and employ three consultants who are certified in “operations capabilities for SAP applications running on S/4HANA Cloud” to get free access to test and demo systems, according to the company. Partners who have at least three consultants certified for SAP C/4HANA applications also qualify for the initiative.

SAP PartnerEdge is designed to provide SAP partners the resources they can use to develop, sell, service and manage SAP systems and applications. The program currently has more than 19,800 partners worldwide, according to SAP.

The new SAP PartnerEdge initiative has drawn positive reviews from partners, and one analyst called the effort a long-overdue strategic move to build out SAP applications and keep pace with other cloud providers like AWS and Microsoft.

Seeding the cloud applications

The SAP PartnerEdge initiative is an attempt to open up SAP Cloud Platform and S/4HANA to a broader range of developers and seed the market with applications built on SAP technology, according to analyst Joshua Greenbaum, principal at Enterprise Applications Consulting, based in Berkeley, Calif. This is similar to the approach long favored by the likes of Microsoft and AWS.

“They want the developers of future great products — whether they’re internal development teams, startups or professional teams — to think about SAP as their development platform,” Greenbaum said. “That’s their fundamental strategy for growing the uptake of SAP Cloud Platform and S/4HANA.”

It’s being met with enthusiasm. Alain Dubois, chief marketing and business development officer at Beyond Technologies, called the program a “great initiative.”

Beyond Technologies, a Montreal-based SAP partner, specializes in the development and integration of a range of SAP products, including S/4HANA and C/4HANA. It plans to take advantage of the free cloud access the new SAP PartnerEdge program offers.

“We will use it for demos and [proofs of concept] as well as for enablement, which is crucial for us as a value-added reseller because it will help [keep down] customer acquisition costs,” he said.

Shaun Syvertsen, CEO of ConvergentIS, is also looking forward to using the program’s resources. ConvergentIS is an SAP partner based in Calgary, Alberta, that provides SAP technology development and consulting services.

The SAP PartnerEdge program will drive the long-term success of SAP technology with partners, Syvertsen said. It’s particularly valuable for partners to get a deeper understanding of public cloud SAP versions.

“In particular, you have to understand that 20 years of on-premises experience is potentially dangerous in the cloud environment, as the setup and range of flexibility is quite different from on-premises,” Syvertsen said. “So, a new cloud-centric mindset and cloud-specific experience is critical.”

Better late than never

SAP partners have been beating the drum for an initiative like this for years to help them keep pace with competitors like AWS, Microsoft and Salesforce, Enterprise Applications Consulting’s Greenbaum explained.

“The partners and would-be partners have been saying that SAP has to emulate the rest of the market for a while,” he said. “There are lots of open source tools and there’s a huge amount of love and support [for developers] from competitor platform providers, and the partners have always said that SAP has to do something similar or they’ll go somewhere else.”

SAP has some work to do to catch up to Salesforce or AWS, but the game is still young: cloud adoption is just beginning to gain momentum in the enterprise applications market, according to Greenbaum.

Another potential differentiator for SAP is the totality of what it can offer compared with other cloud companies.

“Underneath the hood, SAP provides access to business services — data and processes — that are potentially very valuable,” he said. “Salesforce can do that, but only within the domain of CRM, and AWS doesn’t really do that at all. So, this is a good time for SAP. It would have been a better time two years ago, but the story is hardly over at this point.”

Availability for the new SAP PartnerEdge program for qualified partners began July 1. Registration will remain open until Sept. 30, 2019, according to the company. Partners who have already bought the test and demonstration licenses will receive a migration offer from SAP Partner Licensing Services if they want to migrate their existing services to the free access, according to SAP.

Zoom security issues leave vendor scrambling

Zoom was caught flatfooted this week by the reaction to a security researcher’s report on the vulnerabilities of a web server it had quietly installed on Apple computers. The debacle raised broader questions on whether unified communications vendors were too quick to sacrifice privacy and security for ease of use.

The Zoom security issue stemmed from the use of the web server as a workaround for a privacy feature on version 12 of the Safari web browser, which Apple released for the Mac last fall. The feature forced users to consent to open Zoom’s video app every time they tried to join a meeting. In contrast, browsers like Chrome and Firefox let users check a box instructing the browser to automatically trust Zoom’s app in the future.

Zoom felt the extra click in Safari would undermine its frictionless experience for joining meetings, so it installed the web server on Mac computers to launch a meeting immediately.

That left Mac users vulnerable to being instantly joined to a Zoom meeting by clicking on a spam link or loading a malicious website or pop-up advertisement. A similar risk still exists for all Mac and PC users who choose to have their web browsers automatically launch Zoom.

Another issue with the Mac web server was that it would remain in place even after users deleted the Zoom app, and would automatically reinstall Zoom upon receiving a request to join a meeting, according to the security researcher. It also created an avenue for denial-of-service attacks, a risk that Zoom released an optional patch for in May.

In a broader sense, the permanent installation of a web server on local devices troubled independent researcher Jonathan Leitschuh, who sparked this week’s events with a blog post Monday.

“First off, let me start off by saying having an installed app that is running a web server on my local machine with a totally undocumented API feels incredibly sketchy to me,” Leitschuh wrote in his public disclosure. “Secondly, the fact that any website that I visit can interact with this web server running on my machine is a huge red flag for me as a security researcher.”
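For readers who want to check whether such a server is still present on their own machine, a quick probe of the local port is enough. The sketch below uses only the Python standard library; 19421 is the port the Zoom web server reportedly listened on, per the public disclosure, so treat the number as an assumption for other builds:

```python
import socket

def local_port_open(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the TCP connection succeeds
        return s.connect_ex((host, port)) == 0

# 19421 is the port cited in the public disclosure of the Zoom web server.
if local_port_open(19421):
    print("Something is listening on local port 19421; worth investigating.")
else:
    print("No listener on local port 19421.")
```

A positive result only tells you that some process is listening there; identifying it (for example with `lsof -i` on macOS) is a separate step.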

Leitschuh’s disclosure forced Zoom to issue multiple statements as user outrage grew. The security threat received widespread international news coverage, with many headlines containing the chilling combination of “hacker” and “webcam.” In an interview Wednesday, Zoom’s chief information security officer, Richard Farley, said the news coverage caused “maybe some panic that was unnecessary.”

“Part of the challenge for us, of course, is controlling that message out there that this was not as big a deal as it’s been made out to be,” Farley said. “There’s a lot of misinformation that went out there. … People just didn’t understand it.”

Zoom initially tried to assuage fears about the Mac web server without removing it. The company pointed out that it would be obvious to users they had just joined a meeting because a window would open in the foreground and their webcam’s indicator light would flash on. Also, a hacker couldn’t gain access to a webcam in secret or retain access to that video feed after users exited a meeting.  

Ultimately, Zoom reversed its original position and released a software update Tuesday that removed the web server from its Mac architecture. The next day, Apple pushed out a software patch that wiped the web server from all Mac devices, even for users who had previously deleted Zoom.

“We misjudged the situation and did not respond quickly enough — and that’s on us,” Zoom CEO Eric Yuan wrote in a blog post. “We take full ownership, and we’ve learned a great deal.”

Zoom’s default preferences added fuel to the fire. Unless users go out of their way to alter Zoom’s out-of-the-box settings, their webcams will be on by default when joining meetings. Also, Zoom does not by default have a pre-meeting lobby in which users confirm their audio and video settings before connecting.

Zoom said it would release an update over the July 13 weekend to make it easier for new users to control video settings. The first time a user joins a meeting, they will be able to instruct the app to join them to all future sessions with their webcams turned off.

Zoom has also taken heat for allowing embedded iframe codes to launch Zoom meetings. In a statement, the company said iframes, a method for embedding HTML content in webpages, were necessary to support its integrations.

Leitschuh first raised the security issues with Zoom in March. The company invited him to its private bug bounty program, offering money in exchange for Leitschuh agreeing not to disclose his research publicly. Leitschuh, who said the proposed bounty was less than $1,000, declined because of the demand for secrecy.

Despite clashing over whether to remove the web server, Leitschuh and Zoom were able to agree on the severity of the risk it posed. They gave it a Common Vulnerability Scoring System rating of 5.4 out of 10. That score is in the “medium” range — riskier than “low” but not as severe as “high” or “critical.”

Zoom’s response to Leitschuh’s concerns was an indicator that companies have to verify the security architectures of UC vendors, analysts said.

“This event should be a clear reminder to both vendors and customers using UC and collaboration tools that there are very real threats to their platforms,” said Michael Brandenburg, analyst at Frost & Sullivan. “We are long past the days of only having to worry about toll fraud, and businesses have to be as mindful of the security risks on their UC platforms as they are with any other business application.”

AWS Summit widens net with services for containers, devs

NEW YORK — AWS pledges to maintain its torrid pace of product and services innovations and continue to expand the breadth of both to meet customer needs.

“You decide how to build software, not us,” said Werner Vogels, Amazon vice president and CTO, in a keynote at the AWS Summit NYC event. “So, we need to give you a really big toolbox so you can get the tools you need.”

But AWS, which holds a healthy lead over Microsoft and Google in the cloud market, also wants to serve as an automation engine for customers, Vogels added.

“I strongly believe that in the future … you will only write business logic,” he said. “Focus on building your application, drop it somewhere and we will make it secure and highly available for you.”

Parade of new AWS services continues

Vogels sprinkled a series of news announcements throughout his keynote, two of which centered on containers. First, Amazon CloudWatch Container Insights, a service that provides container-level monitoring, is now in preview for monitoring clusters in Amazon Elastic Container Service and Amazon Fargate, in addition to Amazon EKS and Kubernetes. In addition, AWS for Fluent Bit, which serves as a centralized environment for container logging, is now generally available, he said.
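As an illustration of what centralized container logging looks like in practice, a Fluent Bit output section that forwards container logs to CloudWatch Logs might resemble the sketch below. The key names follow the AWS-provided cloudwatch output plugin, but the region, log group and stream names are placeholders; check the plugin documentation for your version:

```
[OUTPUT]
    Name              cloudwatch
    Match             *
    region            us-east-1
    log_group_name    /ecs/sample-app
    log_stream_name   sample-app-stream
    auto_create_group true
```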

Serverless compute also got some attention with the release of Amazon EventBridge, a serverless event bus to take in and process data across AWS’ own services and SaaS applications. AWS customers currently do this with a lot of custom code, so “the goal for us was to provide a much simpler programming model,” Vogels said. Initial SaaS partners for EventBridge include Zendesk, OneLogin and Symantec.
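The simpler programming model Vogels describes, a bus where rules match on event fields and route matching events to targets, can be illustrated with a toy in-process sketch. This is not the EventBridge API itself; the class and field names below are invented for illustration:

```python
class EventBus:
    """Toy event bus: each rule pairs a field-matching pattern with a target."""

    def __init__(self):
        self.rules = []  # list of (pattern, target) pairs

    def add_rule(self, pattern: dict, target) -> None:
        self.rules.append((pattern, target))

    def put_event(self, event: dict) -> None:
        # Deliver the event to every target whose pattern matches its fields.
        for pattern, target in self.rules:
            if all(event.get(k) == v for k, v in pattern.items()):
                target(event)

received = []
bus = EventBus()
bus.add_rule({"source": "zendesk"}, received.append)
bus.put_event({"source": "zendesk", "detail": "ticket created"})
bus.put_event({"source": "other", "detail": "ignored"})
print(len(received))  # prints 1: only the zendesk event matched
```

The point of the real service is that this matching and routing happens in managed infrastructure rather than in the custom glue code customers write today.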

Focus on building your application, drop it somewhere and we will make it secure and highly available for you.
Werner Vogels, CTO, AWS

AWS minds the past, with eye on the future

Most customers are moving away from the concept of a monolithic application, “but there are still lots of monoliths out there,” such as SAP ERP implementations that won’t go away anytime soon, Vogels said.

But IT shops with a cloud-first mindset focus on newer architectural patterns, such as microservices. AWS wants to serve both types of applications with a full range of instance types, containers and serverless functionality, Vogels said.

He cited customers such as McDonald’s, which has built a home-delivery system with Amazon Elastic Container Service. It can take up to 20,000 orders per second and is integrated with partners such as Uber Eats, Vogels said.

Vogels ceded the stage for a time to Steve Randich, executive vice president and CIO of the Financial Industry Regulatory Authority (FINRA), a nonprofit group that seeks to keep brokerage firms fair and honest.

FINRA moved wholesale to AWS and its systems now ingest up to 155 billion market events in a single day — double what it was three years ago. “When we hit these peaks, we don’t even know them operationally because the infrastructure is so elastic,” Randich said.

FINRA has designed the AWS-hosted apps to run across multiple availability zones. “Essentially, our disaster recovery is tested daily in this regard,” he said.

AWS’ ode to developers

Developers have long been a crucial component of AWS’ customer base, and the company has built out a string of tool sets aimed to meet a broad set of languages and integrated development environments (IDEs). These include AWS Cloud9, IntelliJ, Python, Visual Studio and Visual Studio Code.

VS Code is Microsoft’s lightweight, open source code editor, which has seen strong initial uptake. AWS’ toolkit for VS Code is now generally available, Vogels said to audience applause.

Additionally, AWS Cloud Development Kit (CDK) is now generally available with support for TypeScript and Python. AWS CDK makes it easier for developers to use high-level constructs to define cloud infrastructure in code, said Martin Beeby, AWS principal developer evangelist, in a demo.

AWS seeks to keep the cloud secure

Vogels also used part of his AWS Summit talk to reiterate AWS’ views on security, as he did at the recent AWS re:Inforce conference dedicated to cloud security.

“There is no line in the sand that says, ‘This is good-enough security,'” he said, citing newer techniques such as automated reasoning as key advancements.

Werner Vogels, CTO of AWS, on stage at the AWS Summit in New York.

Classic security precautions have become practically obsolete, he added. “If firewalls were the way to protect our systems, then we’d still have moats [around buildings],” Vogels said. Most attack patterns AWS sees are not brute-force front-door efforts, but rather spear-phishing and other techniques: “There’s always an idiot that clicks that link,” he said.

The full spectrum of IT, from operations to engineering to compliance, must be mindful of security, Vogels said. This is true within DevOps practices such as CI/CD from both an external and internal level, he said. The first involves matters such as identity access management and hardened servers, while the latter brings in techniques including artifact validation and static code analysis.

AWS Summit draws veteran customers and newcomers

The event at the Jacob K. Javits Convention Center drew thousands of attendees with a wide range of cloud experience, from FINRA to fledgling startups.

“The analytics are very interesting to me, and how I can translate that into a set of services for the clients I’m starting to work with,” said Donald O’Toole, owner of CeltTools LLC, a two-person startup based in Brooklyn. He retired from IBM in 2018 after 35 years.

AWS customer Timehop offers a mobile application oriented around “digital nostalgia,” which pulls together users’ photographs from various sources such as Facebook and Google Photos, said CTO Dmitry Traytel.

A few years ago, Timehop found itself in a place familiar to many startups: low on venture capital and with no viable monetization strategy. The company created its own advertising server on top of AWS, dubbed Nimbus, rather than rely on third-party products. Once a user session starts, the system conducts an auction among multiple prominent mobile ad networks, which results in the best possible price for its ad inventory.
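The core of such an ad auction is simple to sketch. The snippet below is a minimal first-price auction across networks, assuming one bid per network; the network names and CPM values are hypothetical, not Timehop's actual integrations:

```python
def run_auction(bids: dict) -> tuple:
    """Return the (network, bid) pair with the highest bid (first-price)."""
    if not bids:
        raise ValueError("no bids received")
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Hypothetical CPM bids collected from three ad networks.
bids = {"network_a": 1.25, "network_b": 2.10, "network_c": 1.80}
winner, price = run_auction(bids)
print(f"{winner} wins at ${price:.2f} CPM")  # network_b wins at $2.10 CPM
```

A production server would add timeouts for slow networks and fallback behavior when no bids arrive, but the winning logic reduces to picking the highest price.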

“Nimbus let us pivot to a different category,” Traytel said.

Microsoft Partner Network licensing changes put channel on alert

Pending Microsoft Partner Network policy changes affecting product licensing have alarmed some partners, with more than 5,000 people signing a petition to register their disapproval.

A key area of contention is Microsoft’s plan to eliminate the internal use rights (IUR) association with product licenses included in Microsoft Action Pack and those included with a competency. Action Pack gives partners access to product licenses and technical enablement services, through which they can create applications and develop service offerings. Microsoft positions Action Pack, whose licenses range from OSes to business applications, as a way for new MPN members to get started. Competencies are business specializations in areas such as cloud business applications and data analytics.

The revised IUR association policy will compel Microsoft partners to pay for licenses they have been using in-house under the current Microsoft Partner Network membership terms. The new policy goes into effect July 1, 2020.

Paul Katz, president and chief software architect at EfficiencyNext, a software developer in Washington, D.C., said the policy change will cause the company to purchase five Office 365 Enterprise E3 seats. In addition, EfficiencyNext stands to lose the Microsoft Azure credits the company uses to run its website, although Katz said the policy change’s effect on the Azure benefit is somewhat ambiguous at this point. The licensing fees coupled with the potential loss of Azure credits would result in a net cost of about $2,400 a year, he added.

“That’s a thorn in the side, but it doesn’t change our world,” Katz said.

The stakes are much higher, he said, for larger partners with more licenses they will need to pay for. A partner with 100 Office 365 E3 licenses, for example, would need to shell out $24,000 annually, based on the $20 per user, per month seat fee.
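The arithmetic behind that figure is straightforward; a worked sketch, using the $20 per user, per month rate cited above:

```python
def annual_seat_cost(seats: int, monthly_rate: float = 20.0) -> float:
    """Annual cost of per-seat licenses billed monthly."""
    return seats * monthly_rate * 12

print(f"${annual_seat_cost(100):,.0f} per year")  # $24,000 per year
```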

Charles Weaver, CEO of MSPAlliance, an association representing managed service providers (MSPs), said he found out about the Microsoft policy change when a board member sent him the online petition. “It’s going to sting most of them,” he said of the licensing shift’s effect on service providers. “It is probably not going to be received well by the rank-and-file MSPs.”

The partner petition, posted on Change.org, stated Microsoft’s policies represent unfair treatment, noting partners “have been so loyal to the Microsoft business.” Microsoft couldn’t be reached for comment.

Microsoft Partner Network: Policy consequences

Katz advised partners to “get licensed up” in light of the IUR change, noting that Microsoft has been aggressive in the past with software asset management engagements.

Weaver, however, said he hopes that won’t be the case.

“I can’t think of anything more destructive to the relationship between Microsoft and the channel than that,” he said, noting the audits software vendors pursue tend to target large customers, where millions of dollars are at stake.

People don’t want to come to terms with the fact that we are resellers and we don’t, in any way, shape or form, control the products.
Stanley Louissaint, president, Fluid Designs Inc.

In addition to causing some partners to incur higher licensing costs, the Microsoft IUR policy shift could also hinder partners’ use-what-you-sell strategies. Resellers and service providers that use a vendor’s products to help run their business gain technology experience, which they can transfer to end customers when deploying those products.

Katz said “dogfooding” — as in, eating one’s own dog food — is the best way to test products, especially for companies that can’t afford to set up a separate test environment.

But the restriction on IUR would discourage this approach and could cause Microsoft to miss out on opportunities down the road.

Weaver pointed to a potential unintended consequence of Microsoft’s action: “They stop the freeloading of MSPs from using their software, as they look at it, and they lose potentially thousands of MSPs who no longer try that stuff out and no longer have access to it and may go to different vendors and different solutions.”

A part of doing business

Stanley Louissaint, president of Fluid Designs Inc., an IT services provider in Union, N.J., said the MPN policy changes don’t affect his company but noted the unease among partners. Louissaint suggested changes in vendor policies are simply part of doing business as a channel partner.

“People don’t want to come to terms with the fact that we are resellers and we don’t, in any way, shape or form, control the products,” he said. “If [Microsoft] changes how they want to deal with us, it is what it is.”

Louissaint said the bottom line is Microsoft wants partners to become paying customers when using the vendor’s products to run their businesses. As for creating test beds to assess products, channel partners still can download software on a trial basis — for up to 180 days, in some cases, he added.

Jeff Aden, executive vice president of marketing and business development at 2nd Watch, a Seattle MSP, said the new policy “is not going to change what we do” unless there is an unforeseen effect. 2nd Watch is a Microsoft Gold partner and an AWS Premier Consulting Partner.

EfficiencyNext’s Katz said the licensing changes don’t mean Microsoft is greedy. He noted Windows Insider members can download preview versions of Windows for free, and there is a community version of Visual Studio that is free for up to five users in nonenterprise organizations.

“They are still a great company, and we are still happy to be working with them,” he said.
