
AWS Outposts brings hybrid cloud support — but only for Amazon

LAS VEGAS — AWS controls nearly half of the public IaaS market today, and based on the company’s rules against use of the term ‘multi-cloud,’ would be happy to have it all, even as rivals Microsoft and Google make incremental gains and more customers adopt multi-cloud strategies.

That’s the key takeaway from the start of this year’s massive re:Invent conference here this week, which was marked by the release of AWS Outposts for hybrid clouds and a lengthy keynote from AWS CEO Andy Jassy that began with a tongue-in-cheek invite to AWS’ big tent in the cloud.

“You have to decide what you’re going to bring,” Jassy said of customers who want to move workloads into the public cloud. “It’s a little bit like moving from a home,” he added, as a projected slide comically depicted moving boxes affixed with logos for rival vendors such as Oracle and IBM sitting on a driveway.

“It turns out when companies are making this big transformation, what we see is that all bets are off,” Jassy said. “They reconsider everything.”

For several years now, AWS has used re:Invent as a showcase for large customers in highly regulated industries that have made substantial, if not complete, migrations to its platform. One such company is Goldman Sachs, which has worked with AWS on several projects, including Marcus, a digital banking service for consumers. A transaction banking service that helps companies manage their cash in a cloud-native stack on AWS is coming next year, said Goldman Sachs CEO David Solomon, who appeared during Jassy’s talk. Goldman is also moving its Marquee market intelligence platform into production on AWS.

Along with showcasing enthusiastic customers like Goldman Sachs, Jassy took a series of shots at the competition, some veiled and others overt.

“Every industry has lots of companies with mainframes, but everyone wants to move off of them,” he claimed. The same goes for databases, he added. Customers are trying to move away from Oracle and Microsoft SQL Server due to factors such as expense and lock-in, he said. Jassy didn’t mention that similar accusations have been lodged at AWS’ native database services.

Jassy repeatedly took aim at Microsoft, which has the second most popular cloud platform after AWS, albeit with a significant lag. “People don’t want to pay the tax anymore for Windows,” he said.

But it isn’t as if AWS would actually shun Microsoft technology, since it has long been a host for many Windows Server workloads. In fact, it wants as much as it can get. This week, AWS introduced a new bring-your-own-license program for Windows Server and SQL Server designed to make it easier for customers to run those licenses on AWS, versus Azure.

AWS pushes hybrid cloud, but rejects multi-cloud

One of the more prominent, although long-expected, updates this week is the general availability of AWS Outposts. These specialized server racks provided by AWS reside in customers’ own data centers, in order to comply with regulations or meet low-latency needs. They are loaded with a range of AWS software, are fully managed by AWS and maintain continuous connections to local AWS regions.

The company is taking the AWS Outposts idea a bit further with the release of new AWS Local Zones. These will consist of Outpost machines placed in facilities close to large cities, giving customers who don’t want, or don’t have, their own data centers, but still have low-latency requirements, another option. Local Zones, the first of which is in the Los Angeles area, provide this capability and tie back to AWS’ larger regional zones, the company said.

Outposts, AWS Local Zones and the previously launched VMware Cloud on AWS constitute a hybrid cloud computing portfolio for AWS — but you won’t hear Jassy or other executives say the phrase multi-cloud, at least not in public.

In fact, partners who want to co-brand with AWS are forbidden from using that phrase and similar verbiage in marketing materials, according to an AWS co-branding document provided to SearchAWS.com.

“AWS does not allow or approve use of the terms ‘multi-cloud,’ ‘cross cloud,’ ‘any cloud,’ ‘every cloud,’ or any other language that implies designing or supporting more than one cloud provider,” the co-branding guidelines, released in August, state. “In this same vein, AWS will also not approve references to multiple cloud providers (by name, logo, or generically).”

An AWS spokesperson didn’t immediately reply to a request for comment.

The statement may not be surprising in the context of AWS’ market lead, but it does stand in contrast to recent approaches from Google, with its Anthos multi-cloud container management platform, and Microsoft, whose Azure Arc uses native Azure tools but has multi-cloud management aspects.

AWS customers may well want multi-cloud capabilities, and they can protect themselves by using portable products and technologies, such as Kubernetes at the lowest level, with the tradeoff being the manual labor involved, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif.

“To be fair, Azure and Google are only at the beginning of [multi-cloud],” he said.

Meanwhile, many AWS customers have apparently grown quite comfortable moving their IT estates onto the platform. One example is Cox Automotive, known for its digital properties such as Autotrader.com and Kelley Blue Book.

In total, Cox has more than 200 software applications, many of which it accrued through a series of acquisitions, and the company expects to move them all onto AWS, said Chris Dillon, VP of architecture, during a re:Invent presentation.

Cox is using the AWS Well-Architected Framework, a collection of best practices for deployments on AWS, to manage the transition.

“When you start something new and do it quickly you always run the risk of not doing it well,” said Gene Mahon, director of engineering operations. “We made a decision early on that everything would go through a Well-Architected review.”


SAP sees S/4HANA migration as its future, but do customers?

The first part of our 20-year SAP retrospective examined the company’s emerging dominance in the ERP market and its transition to the HANA in-memory database. Part two looks at the release of SAP S/4HANA in February 2015. The “next-generation ERP” was touted by the company as the key to SAP’s future, but it ultimately raised questions that in many cases have yet to be answered. The S/4HANA migration remains the most consequential initiative, and the biggest open question, for the company’s future.

The questions about SAP’s future have shifted in the past year, as the company has undergone an almost complete changeover in its leadership ranks. Most of the SAP executives who drove the strategy around S/4HANA and the intelligent enterprise have left the company, including former CEO Bill McDermott. New co-CEOs Jennifer Morgan and Christian Klein are SAP veterans, and analysts don’t think the change in leadership will make for significant changes in the company’s technology and business strategy.

But they will take over the most daunting task SAP has faced: convincing customers of the business value of the intelligent enterprise, a data-driven transformation of businesses with S/4HANA serving as the digital core. As part of the transition toward intelligence, SAP is pushing customers to move off of tried and true SAP ECC ERP systems (or the even older SAP R/3), and onto the modern “next-generation ERP” S/4HANA. SAP plans to end support for ECC by 2025.


S/4HANA is all about enabling businesses to make decisions in real time as data becomes available, said Dan Lahl, SAP vice president of product marketing and a 24-year SAP veteran.

“That’s really what S/4HANA is about,” Lahl said. “You want to analyze the data that’s in your system today. Not yesterday’s or last week’s information and data that leads you to make decisions that don’t even matter anymore, because the data’s a week out. It’s about giving customers the ability to make better decisions at their fingertips.”

S/4HANA migration a matter of when, not if

Most SAP customers see the value of an S/4HANA migration, but they are concerned about how to get there, with many citing concerns about the cost and complexity of the move. This is a conundrum that SAP acknowledges.

“We see that our customers aren’t grappling with if [they are going to move], but when,” said Lloyd Adams, managing director of the East Region at SAP America. “One of our responsibilities, then, is to provide that clarity and demonstrate the value of S/4HANA, but to do so in the context of the customers’ business and their industry. Just as important as showing them how to move, we need to do it as simply as possible, which can be a challenge.”


S/4HANA is the right platform for the intelligent enterprise because of the way it can handle all the data that the intelligent enterprise requires, said Derek Oats, CEO of Americas at SNP, an SAP partner based in Heidelberg, Germany, that provides migration services.

In order to build the intelligent enterprise, customers need to have a platform that can consume data from a variety of systems — including enterprise applications, IoT sensors and other sources — and ready it for analytics, AI and machine learning, according to Oats. S/4HANA uses SAP HANA, a columnar, in-memory database, to do that and then presents the data in an easy-to-navigate Fiori user interface, he said.

“If you don’t have that ability to push out of the way a lot of the work and the crunching that has often occurred down to the base level, you’re kind of at a standstill,” he said. “You can only get so much out of a relational database because you have to rely on the CPU at the application layer to do a lot of the crunching.”
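The columnar layout Oats describes can be illustrated with a toy example (plain Python, not SAP HANA code): totaling one field in a row store means visiting every record, while a column store scans a single contiguous array, which is the kind of "crunching" that gets pushed down to the database layer.

```python
# Toy illustration of row-store vs. column-store aggregation.
# All names and values here are invented for the example.

# Row store: each record holds every field.
rows = [
    {"order_id": 1, "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "region": "APJ",  "amount": 75.5},
    {"order_id": 3, "region": "EMEA", "amount": 42.0},
]

# Column store: one list per field.
columns = {
    "order_id": [1, 2, 3],
    "region":   ["EMEA", "APJ", "EMEA"],
    "amount":   [120.0, 75.5, 42.0],
}

# A row-store scan touches whole records to total one field ...
row_total = sum(r["amount"] for r in rows)

# ... while the column store reads only the relevant column.
col_total = sum(columns["amount"])

assert row_total == col_total == 237.5
```

The same totals come out either way; the difference is how much data each scan has to touch, which is what makes the columnar layout favor analytics over record-at-a-time transaction processing.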

S/4HANA business case difficult to make

Although many SAP customers understand the benefits of S/4HANA, SAP has had a tough sell in getting its migration message across to its large customer base. The majority of customers plan to remain on SAP ECC and have only vague plans for an S/4HANA migration.


“The potential for S/4HANA hasn’t been realized to the degree that SAP would like,” said Joshua Greenbaum, principal at Enterprise Applications Consulting. “More companies are really looking at S/4HANA as the driver of genuine business change, and recognize that this is what it’s supposed to be for. But when you ask them, ‘What’s your business case for upgrading to S/4HANA?’ The answer is ‘2025.’”


One of the problems that SAP faces when convincing customers of the value of S/4HANA and the intelligent enterprise is that no simple use case drives the point home, Greenbaum said. Twenty years ago, Y2K provided an easy-to-understand reason why companies needed to overhaul their enterprise business systems, and the fear that computers wouldn’t adapt to the year 2000 led in large measure to SAP’s early growth.

“Digital transformation is a complicated problem and the real issue with S/4HANA is that the concepts behind it are relatively big and very specific to company, line of business and geography,” he said. “So the use cases are much harder to justify, or it’s much more complicated to justify than, ‘Everything is going to blow up on January 1, 2000, so we have to get our software upgraded.'”

Evolving competition faces S/4HANA

Jon Reed, analyst and co-founder of ERP news and analysis firm Diginomica.com, agrees that SAP has successfully embraced the general concept of the intelligent enterprise with S/4HANA, but struggles to present understandable use cases.


“The question of S/4HANA adoption remains central to SAP’s future prospects, but SAP customers are still trying to understand the business case,” Reed said. “That’s because agile, customer-facing projects get the attention these days, not multi-year tech platform modernizations. For those SAP customers that embrace a total transformation — and want to use SAP tech to do it — S/4HANA looks like a viable go-to product.”

SAP’s issues with driving S/4HANA adoption may not come from the traditional enterprise competitors like Oracle, Microsoft and Infor, but from cloud-based business applications like Salesforce and Workday, said Eric Kimberling, president of Third Stage Consulting, a Denver-based firm that provides advice on ERP deployments and implementations.


“They aren’t direct competitors with SAP; they don’t have the breadth of functionality and the scale that SAP does, but they have really good functionality in their best-of-breed world,” Kimberling said. “Companies like Workday and Salesforce make it easier to add a little piece of something without having to worry about a big SAP project, so there’s an indirect competition with S/4HANA.”

SAP customers are going to have to adapt to evolving enterprise business conditions regardless of whether or when they move to S/4HANA, Greenbaum said.

“Companies have to build business processes to drive the new business models. Whatever platform they settle on, they’re going to be unable to stand still,” he said. “There’s going to have to be this movement in the customer base. The question is will they build primarily on top of S/4HANA? Will they use an Amazon or an Azure hyperscaler as the platform for innovation? Will they go to their CRM or workforce automation tool for that? The ‘where’ and ‘what next’ is complicated, but certainly a lot of companies are positioning themselves to use S/4HANA for that.”


Microsoft cybersecurity strategy, hybrid cloud in focus at Ignite

Microsoft CEO Satya Nadella has hinted that the big news at the company’s Ignite conference will involve cybersecurity and updates to its approach to hybrid and distributed cloud applications.

“Rising cyber threats and increasing regulation mean security and compliance is a strategic priority for every organization,” Nadella said on Microsoft’s earnings call for the first quarter of 2020 this week. He highlighted that the company has offerings across identity, security and compliance that span people, devices, apps, developer tools, data and infrastructure “to protect customers in today’s zero trust environment.”

In addition to Microsoft cybersecurity-related comments, Nadella addressed investor questions about the company’s hybrid cloud business.

“Our approach has always been about this distributed computing fabric, or thinking about hybrid not as some transitory phase, but as a long-term vision for how computing will meet the real-world needs,” he replied in the call.


Microsoft’s hybrid cloud offerings include Azure Stack, which takes a subset of Azure’s software foundation and installs it on specialized hardware to be run in customer-controlled environments.

At Ignite, “you will see us take the next leap forward even in terms of how we think about the architecture inclusive of the application models, programming models on what distributed computing looks like going forward,” Nadella said.

Microsoft targets cybersecurity, hybrid cloud

Given that cybersecurity and hybrid cloud computing are two of the hottest areas in enterprise tech today, Nadella’s teases aren’t especially surprising. But the specific details of what Microsoft has planned are worth delving into, analysts said.

It was a bit surprising that Nadella didn’t mention Azure Stack in his remarks on the conference call, given the progress that product has made in the market, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif.

However, Ignite’s session agenda includes a fair number of Azure Stack sessions, covering matters such as migration planning and operational best practices. One possibility is that Microsoft will announce expansions of Azure Stack’s footprint so it’s more on par with the Azure cloud’s full capabilities, Mueller added. 

Azure CTO Mark Russinovich is scheduled to speak at Ignite on multiple occasions. One session will focus on new innovations in Azure’s global architecture and another targets next-generation application development and deployment.

On Twitter, Russinovich said he’ll discuss matters such as DAPR, Microsoft’s recently launched open source runtime for microservices applications. He also plans to talk about Open Application Model, a specification for cloud-native app development, and Rudr, a reference implementation of the Open Application Model (OAM).

The OAM is a project under the Open Web Foundation. It serves as a specification that separates the application description from the details of how the application is deployed and managed by the infrastructure.
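The separation the spec describes can be sketched as a component definition of the kind the early OAM drafts used. The field names below approximate the v1alpha1 drafts and the values are invented; the schema in the OAM repository is authoritative:

```yaml
# Illustrative OAM-style component: the developer describes what the
# app is and what it needs, not where or how it runs.
apiVersion: core.oam.dev/v1alpha1
kind: ComponentSchematic
metadata:
  name: orders-service            # hypothetical service name
spec:
  workloadType: core.oam.dev/v1alpha1.Server
  containers:
    - name: orders
      image: example/orders:1.0   # hypothetical image
      resources:
        cpu:
          required: 0.5           # resource requirements only;
        memory:                   # placement, scaling and networking
          required: 128M          # are left to operations-owned config
      ports:
        - name: http
          containerPort: 8080
```

Deployment details, such as replica counts or traffic routing, would live in separate operator-owned configuration rather than in the component itself.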

According to a source familiar with the company’s plans, Microsoft released OAM so that an application can be defined by developers and then handed off to an operations team for execution. DAPR, the source added, is a way to build applications that are designed to be componentized.

“Developers don’t have to worry about where (an application) will run,” the source said. “They just describe its resource requirements, focus on building a microservices application and don’t worry about how each component will communicate with the others.”

Going into Ignite, the hyperscale cloud market is being driven by a couple of factors, said Jay Lyman, an analyst with 451 Research.

“AWS, Microsoft and Google sort of define the modern enterprise IT operational paradigm with their breadth of services, innovation and competition,” Lyman said. “At the same time, the market serves as a discipline for them.”


Hybrid cloud is an example of this, having emerged to meet customer needs to run on-premises infrastructure in a similar manner to public clouds, he added. Azure Stack, Google Kubernetes Engine On-Prem and AWS Outposts are some early answers to the problem.

Meanwhile, Google’s Anthos and IBM Red Hat’s OpenShift platform target multi-cloud application deployments.

“I wouldn’t be surprised to see Microsoft announce something around support for other public clouds,” Lyman said.

Microsoft cybersecurity portfolio gains gravity

Some analysts believe Microsoft is already well positioned in the cybersecurity market on the proven reliability of Windows Defender, Active Directory, Azure Active Directory, Azure Sentinel and Office 365 Advanced Threat Protection.

“Many enterprises trust Microsoft to manage the identities of their users accessing information both from on-prem and cloud-based applications,” said Doug Cahill, senior analyst and group director at the Enterprise Strategy Group (ESG) in Milford, Mass. “They’re already a formidable cybersecurity competitor,” he said.

In a recent survey conducted by ESG, IT pros said one of the most important attributes they look for in an enterprise-class cybersecurity vendor is the reliability of products across their portfolio and that they are “well-aligned” with their particular IT initiatives.

“Obviously, Microsoft is one of the leading IT vendors,” Cahill said. “They have Active Directory, which is broadly adopted, serving as a foundational piece of their cybersecurity strategy,” he said.

Logically, the next step for Microsoft is to extend its platform to play across the broader attack surface, which includes the rapidly growing Office 365.

During the earnings call, Nadella ran down what he believes are the individual strengths of the company’s cybersecurity offerings. He made special note of the cloud-based Sentinel and its ability to analyze security vulnerabilities across an entire organization using AI to “detect, investigate and automatically remediate threats.”

Nadella said the company would reveal more details about its “expanding opportunities in the cybersecurity market” at Ignite.


Panasas storage roadmap includes route to software-defined

Panasas is easy to overlook in the scale-out NAS market. The company’s products don’t carry the name recognition of Dell EMC Isilon, NetApp NAS filers and IBM Spectrum Scale. But CEO Faye Pairman said her team is content to fly below the radar — for now — concentrating mostly on high-performance computing, or HPC.

The Panasas storage flagship is the ActiveStor hybrid array with the PanFS parallel file system. The modular architecture scales performance in a linear fashion, as additional capacity is added to the system. “The bigger our solution gets, the faster we go,” Pairman said.
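The linear scaling Pairman describes is characteristic of parallel file systems, which stripe each file across storage nodes so clients can read from all of them at once. A minimal toy sketch of striping, assuming nothing about PanFS internals:

```python
# Toy model (not PanFS) of parallel-file-system striping: a file is
# split into fixed-size chunks distributed round-robin across storage
# nodes, so aggregate read bandwidth grows with the node count.

def stripe(data: bytes, n_nodes: int, chunk: int = 4):
    """Distribute fixed-size chunks across n_nodes, round-robin."""
    nodes = [bytearray() for _ in range(n_nodes)]
    for i in range(0, len(data), chunk):
        nodes[(i // chunk) % n_nodes].extend(data[i:i + chunk])
    return nodes

def read_back(nodes, total_len: int, chunk: int = 4):
    """Reassemble the file by pulling chunks from each node in turn."""
    out = bytearray()
    offsets = [0] * len(nodes)
    turn = 0
    while len(out) < total_len:
        idx = turn % len(nodes)
        out.extend(nodes[idx][offsets[idx]:offsets[idx] + chunk])
        offsets[idx] += chunk
        turn += 1
    return bytes(out)

data = b"0123456789abcdefghij"          # 20-byte "file"
nodes = stripe(data, n_nodes=4)
# Each node holds roughly a quarter of the file, so four nodes can
# serve a read concurrently; adding nodes adds aggregate bandwidth.
assert read_back(nodes, len(data)) == data
```

In a real system the chunks are large, placement is tracked in metadata, and redundancy is layered on top, but the reason "the bigger our solution gets, the faster we go" is the same: every added node is another spindle or flash device serving a slice of every large file.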

Panasas founder Garth Gibson launched the object-based storage architecture in 2000. Gibson, a computer science professor at Carnegie Mellon University in Pittsburgh, was one of the developers of the RAID storage taxonomy. He serves as Panasas’ chief scientist.

Panasas has gone through many changes over the past several years, marked by varying degrees of success to broaden into mainstream commercial NAS. That was Pairman’s charter when she took over as CEO in 2010. Key executives left in a 2016 management shuffle, and while investors have provided $155 million to Panasas since its inception, the last reported funding was a $52.5 million venture round in 2013.

As a private company, Panasas does not disclose its revenue, but “we don’t have the freedom to hemorrhage cash,” Pairman said.

We caught up with Pairman recently to discuss Panasas’ growth strategy, which could include offering a software-only license option for PanFS. She also addressed how the vendor is moving to make its software portable and why Panasas isn’t jumping on the object-storage bandwagon.

Panasas storage initially aimed for the high end of the HPC market. You were hired to increase Panasas’ presence in the commercial enterprise space. How have you been executing on that strategy?

Faye Pairman: It required looking at our parallel file system and making it more commercially ready, with features added to improve stability and make it more usable and reliable. We’ve been on that track until very recently.

We have an awesome file system that is very targeted at the midrange commercial HPC market. We sell our product as a fully integrated appliance, so our next major objective — and we announced some of this already — is to disaggregate the file system from the hardware. The reason we did that is to take advantage of commodity hardware choices on the market.

Once the file system is what we call ‘portable,’ meaning you can run it on any hardware, there will be a lot of new opportunity for us. That’s what you’ll be hearing from us in the next six months.

Would Panasas storage benefit by introducing an object storage platform, even as an archive device?

Pairman: You know, this is a question we’ve struggled with over the years. Our customers would like us to service the whole market. [Object storage] would be a very different financial profile than the markets we serve. As a small company, right now, it’s not a focus for us.

We differentiate in terms of performance and scale. Normally, what you see in scale-out NAS is that the bigger it gets, the more sluggish it tends to be. We have linear scalability, so the bigger our solution gets, the faster we go.

That’s critically important to the segments we serve. It’s different from object storage, which is all about being simple and the ability to get bigger and bigger. And performance is not a consideration.

Which vendors do you commonly face off with in deals? 

Pairman: Our primary competitor is IBM Spectrum Scale, with a file system and approach that is probably the most similar to our own and a very clear target on commercial HPC. We also run into Isilon, which plays more to commercial — meaning high reads, high usability features, but [decreased] performance at scale.

And then, at the very high end, we see DataDirect Networks (DDN) with a Lustre file system for all-out performance, but very little consideration for usability and manageability.


Which industry verticals are prominent users of Panasas storage architecture? Are you a niche within the niche of HPC?

Pairman: The niche is in the niche. We target very specific markets and very specific workloads. We serve all kinds of application environments, where we manage very large numbers of users and very large numbers of files.

Our target markets are manufacturing, which is a real sweet spot, as well as life sciences and media and entertainment. We also have a big practice in oil and gas exploration and all kinds of scientific applications, and even some manufacturing applications within the federal government.

Panasas storage is a hybrid system, and we manage a combination of disk and flash. With every use case, while we specialize in managing very large files, we also have the ability to manage the smaller files a company keeps on flash.

What impact could DDN’s acquisition of open source Lustre exert on the scale-out sector, in general, and Panasas in particular?

Pairman: I think it’s a potential market-changer and might benefit us, which is why we’re keeping a close eye on where Lustre ends up. We don’t compete directly with Lustre, which is more at the high end.

Until now, Lustre always sat in pretty neutral hands. It was in a peaceful place with Intel and Seagate, but they both exited the Lustre business, and Lustre ended up in DDN’s hands. It remains to be seen what that portends. But there is a long list of vendors that depend on Lustre remaining neutral, and now it’s in the hands of the most aggressive competitor in that space.

What happens to Lustre is less relevant to us if it stays the same. If it falters, we think we have an opportunity to move into that space. It’s potentially a big shakeup that could benefit vendors like us who build a proprietary file system.

Juniper boosting performance of SRX5000 firewall for IoT, 5G

Juniper Networks has introduced a security acceleration card that boosts the performance of the company’s SRX5000 line of firewalls to future-proof the data centers of service providers, cloud providers and large enterprises.

Juniper designed the services processing card, SPC3, for organizations anticipating large data flows from upcoming multi-cloud, internet-of-things and 5G applications. Besides meeting future demand, the SPC3 can also accommodate current traffic increases due to video conferencing, media streaming and other data-intensive applications.

The SPC3 multiplies performance up to a factor of 11 across key metrics for the SRX5000 line, Juniper said. Organizations using the Juniper SPC2 can upgrade to the SPC3 without service interruptions.

What’s in the SRX5000 line?

The SRX5000 line’s security services include a stateful firewall, an intrusion prevention system, unified threat management and a virtual private network. Network operators manage security policies for SRX5000 hardware through Juniper’s Junos Space Security Director.

With the addition of an SPC, the SRX5000 line can support up to 2 Tbps of firewall throughput. The line’s I/O cards offer a range of connectivity options, including 1 Gigabit Ethernet, 10 GbE, 40 GbE and 100 GbE interfaces.

Security is one area where Juniper has reported quarterly revenue growth while overall sales have declined. For the quarter ended June 30, Juniper reported last month that revenue from its security business increased to $79.5 million, from $68.7 million a year earlier.

However, overall revenue fell 8% to $1.2 billion, and the company said sales in the current quarter would also be down. Nevertheless, the company expects to return to quarterly revenue growth in the fourth quarter.

Cisco Viptela integrated with IOS XE on ISR, ASR

Cisco has integrated its Viptela software-defined WAN with the company’s IOS XE network operating system, effectively making the cloud-controlled SD-WAN product an option for distributing network traffic from Cisco ISR and ASR routers.

Announced this week, the integration means companies using Cisco’s legacy SD-WAN product, Intelligent WAN — often used with the Integrated Services Router (ISR) — can switch to a much simpler system. IWAN’s complexity precluded broad market adoption, so when Cisco acquired Viptela last year for $610 million, many analysts predicted the company would eventually migrate customers to Viptela.

Connecting Cisco Viptela to IOS XE adds a cloud-controlled element to IOS XE hardware through the SD-WAN product’s vManage console. The cloud-based software is the centralized component for configuration management and monitoring network traffic going to and from the ISR and Aggregation Services Router (ASR) hardware.

As a router network operating system, IOS XE includes dozens of services beyond routing and switching, such as encryption, authentication, firewall capabilities and policy enforcement.

Next for Cisco Viptela

In March, Cisco launched cloud-based predictive analytics for Viptela, called vAnalytics. The software, which companies access through vManage, provides network managers with answers to what-if scenarios.

Over the next 18 months, Cisco plans to merge vManage into DNA Center, a centralized software console for managing campus networks built on top of Cisco’s Catalyst 9000 campus switches. The integration would provide network managers with a single view of their LAN, WAN and campus networks.

Companies use SD-WAN for traffic distribution across broadband, Long Term Evolution and MPLS links connecting campuses and remote offices to the internet and the corporate data center. In the first quarter, companies refreshing their campus and branch networks contributed to a more than 5% increase year to year in 1 Gb Ethernet revenue and a nearly 16% rise in port shipments, according to IDC.

Cisco claimed organizations use more than 1 million ISR and ASR routers globally. ASR routers are designed for high-bandwidth applications, such as video streaming, while ISR systems are for small or midsize networks found in small businesses and branch offices.

AWS SSO puts Amazon at the center of IT access

AWS’ latest service is another step in the company’s goal to be the hub for corporations’ IT activity.

AWS Single Sign-On (AWS SSO), added with little fanfare after AWS re:Invent 2017, is a welcome addition for many users. The service centralizes the management of multiple AWS accounts, as well as additional third-party applications tethered to those accounts.

AWS SSO uses AWS Organizations and can be extended with a configuration wizard to Security Assertion Markup Language (SAML) applications. It also comes with built-in integrations with popular services such as Box, Office 365, Salesforce and Slack.

Users of the service access AWS and outside applications through a single portal, under individually assigned access policies. Sign-in activities and administrative changes are tracked by AWS CloudTrail, and companies can audit employee use of those services themselves or use an outside service such as Splunk or Sumo Logic to analyze those logs.
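The audit workflow described above can be sketched against simplified, CloudTrail-style JSON. `ConsoleLogin` is a real CloudTrail event name, but the records below are invented and carry far fewer fields than real ones:

```python
import json

# Simplified, illustrative CloudTrail-style log file (real records
# include many more fields; see the CloudTrail docs for the schema).
log_batch = json.dumps({"Records": [
    {"eventName": "ConsoleLogin", "eventTime": "2018-01-15T09:12:03Z",
     "userIdentity": {"userName": "alice"}},
    {"eventName": "CreateUser",   "eventTime": "2018-01-15T10:01:44Z",
     "userIdentity": {"userName": "admin"}},
    {"eventName": "ConsoleLogin", "eventTime": "2018-01-15T11:30:00Z",
     "userIdentity": {"userName": "bob"}},
]})

def sign_ins(batch: str):
    """Return (user, time) pairs for sign-in events in one log file."""
    records = json.loads(batch)["Records"]
    return [(r["userIdentity"].get("userName"), r["eventTime"])
            for r in records if r["eventName"] == "ConsoleLogin"]

assert sign_ins(log_batch) == [
    ("alice", "2018-01-15T09:12:03Z"),
    ("bob",   "2018-01-15T11:30:00Z"),
]
```

A log-analysis service like Splunk or Sumo Logic does essentially this at scale: ingest the JSON batches CloudTrail delivers, then filter and correlate events of interest.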

Permissions to various Amazon cloud services and outside apps can be configured in AWS SSO for common IT job categories or by creating custom groupings. The service also connects to on-premises Microsoft Active Directory to identify credentials and manage which employees or groups of employees can access which AWS accounts.

The service has limitations. It’s currently confined to the U.S. East region in Virginia, and can’t be accessed through the AWS Command Line Interface or via an API. Also, any changes to permissions can only be made by a master account.

AWS has a reputation for going after segments of IT that it sees as vulnerable, and this could be a direct shot at prominent SSO providers on the market. Okta in particular is popular in the enterprise market, so a free alternative from AWS could be attractive, said Adam Book, principal cloud engineer at Relus Technologies, an AWS consulting partner in Peachtree Corners, Ga.


“You can manage all your apps in one place and not pay for a third party,” he said. “Amazon then becomes your one trusted source for everything.”

AWS solved some of the complexity around managing accounts when it enabled administrators to establish roles for users, but this simplifies things further with a single point to track work across development, QA and production accounts, Book said. It also helps to manage onboarding and removal of employees’ credentials based on their employment status.

“For large organizations single sign-on is important,” he said. “I don’t think it’s as much for the Amazon accounts, but once you get into third-party apps your users don’t want to remember 50 different passwords.”


Others see AWS SSO as a way not just to unseat Okta, but to go after Active Directory as well. SSO can be used with or without the Microsoft directory service, which isn’t ideal for cloud environments despite the updated version available in Microsoft Azure, said Joe Emison, founder and CTO of BuildFax, an AWS customer in Austin, Texas.

“Active Directory, at its core, is really based around the idea that everyone is going to be connected to a local network to start up their computer and connect to a master server and get rules and policies from there,” he said. “That’s nice if everyone goes into the office, but this is not the world we live in.”

Compared to AWS Identity and Access Management (IAM), Active Directory lacks fine-grained access control to assign permissions and can be difficult to integrate with SAML-based applications, Emison said. By incorporating IAM tools within SSO and extending that level of control to outside applications, AWS could eventually supplant Active Directory as organizations’ preferred means to manage employee access.

Trevor Jones is a senior news writer with SearchCloudComputing and SearchAWS. Contact him at [email protected].

Cloud App Discovery spotlights shadow IT users

Do you know what end users do with a company’s data? Do they use Dropbox to share documents with clients? Discuss trade secrets via Slack? Plan secret projects on Trello? The Cloud App Discovery feature in Office 365 reveals the shadow IT practices admins need to know about to secure the enterprise.

End users often enlist cloud services to perform their jobs, but the practice of introducing unsanctioned apps invites risk. It circumvents security practices, which potentially opens the company to an unexpected compliance issue or a cyberattack. Cloud App Discovery uncovers shadow IT without the need to implement agent-based software on users’ computers and mobile devices.

Here’s how to identify and monitor use of unauthorized cloud services within the organization — and what to do about it.

Find hidden app usage with Cloud App Discovery

Office 365’s E5 subscription includes Cloud App Discovery, a component of Office 365 Cloud App Security. This service interprets log files from web proxy servers, firewalls and network devices, such as wireless access points and switches, to build a visual picture of the shadow IT services used in the organization.

Cloud App Security dashboard
Figure 1. The Discover tab in Office 365 Cloud App Security presents a visual summary of shadow IT services used in the organization.

The Office 365 version of Cloud App Discovery detects services with functions similar to Office 365 apps, especially productivity services. The discovered apps section therefore does not include nonproductivity applications; we’ll show how to uncover those later in this article.

Create reports of productivity apps

Cloud App Discovery uses logs taken from a network device that sits between end users and the internet. The service supports common log file formats, such as those generated by Cisco access points, open source web proxy servers or third-party products such as Forcepoint Websense.

The admin then accesses the Cloud App Discovery feature from the Security & Compliance Center. Download a log file from the network device in a format that Cloud App Discovery supports, navigate to the main console and choose Discover > Create new snapshot report.

Search for and specify the log format from the list, then upload the log file. Office 365 takes up to 24 hours to process and display the results.

Log file upload
Figure 2. To create a new snapshot report, search for the log format you want to use, and upload the log file.

Navigate to Discover > Manage snapshot reports to see the uploaded file. Office 365 shows processed reports as Ready.

Manage snapshot reports
Figure 3. The snapshot reports section indicates when the admin uploaded the report and its status.

The report shows the productivity apps in use from the Office 365 platform and from other cloud services. Select an app to open an Excel spreadsheet for more details, such as how many users accessed the service, how many times users accessed it and the amount of traffic uploaded to and downloaded from the service.
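The figures in that report can be sketched locally. The illustrative Python below aggregates parsed proxy-log records into the per-app totals the report shows (distinct users, access count, bytes uploaded and downloaded); the input tuple format is invented for illustration, not Cloud App Discovery's actual schema:

```python
from collections import defaultdict

def summarize(records):
    """Aggregate (user, app, bytes_up, bytes_down) records into the
    per-app totals a discovery report presents."""
    stats = defaultdict(lambda: {"users": set(), "hits": 0, "up": 0, "down": 0})
    for user, app, up, down in records:
        entry = stats[app]
        entry["users"].add(user)
        entry["hits"] += 1
        entry["up"] += up
        entry["down"] += down
    # Replace the user set with a distinct-user count for reporting.
    return {app: {**v, "users": len(v["users"])} for app, v in stats.items()}

# Fabricated sample traffic.
lines = [
    ("alice", "dropbox.com", 1200, 300),
    ("bob",   "dropbox.com", 800, 100),
    ("alice", "trello.com",  50,  900),
]
print(summarize(lines))
```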

Discovered apps
Figure 4. View the report to see the productivity apps that are in use and to see detailed information about each app.

Automate the log upload process

Organizations that subscribe to Enterprise Mobility + Security (EMS) E3 can extend Cloud App Discovery’s functionality in several powerful ways.

The continuous reports feature automates log uploads through a customized VM with a syslog server and an HTTPS uploader.

To configure continuous reports, use the Discover > Upload logs automatically option in Cloud App Security. The admin adds a data source, which takes the place of the manually uploaded log file, then defines a log collector and links it to the data source, which generates the information needed to deploy the Hyper-V or VMware VM.

After the VM deploys, configure one or more network devices to send data to the log collector in the format that matches the defined data source. Figure 5 shows an example of a Cisco Meraki device set up to send URL data in syslog format to the log collector’s VM IP address.

Configure URL data
Figure 5. Configure a network device to send data to the VM IP address for the log collector.
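For reference, the syslog lines such a device emits follow a simple shape. Below is a minimal sketch of an RFC 3164-style message formatter; the facility and severity values are illustrative, and the timestamp padding is simplified relative to the RFC:

```python
import time

def syslog_line(facility, severity, host, message, now=None):
    """Format a minimal RFC 3164-style syslog line, the kind a network
    device would send to the log collector VM."""
    pri = facility * 8 + severity  # PRI = facility * 8 + severity
    ts = time.strftime("%b %d %H:%M:%S", now or time.localtime())
    return f"<{pri}>{ts} {host} {message}"

# facility 16 (local0), severity 6 (informational)
print(syslog_line(16, 6, "meraki-ap1", "urls: GET https://www.dropbox.com/"))
```

The log collector simply needs each device configured to emit lines like this to its IP address, in a format matching the defined data source.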

After about 24 hours, results from logged data will appear in the Cloud App Discovery section. The admin accesses both real-time and historic information related to app usage.

Cloud App Discovery dashboard
Figure 6. The Cloud App Discovery dashboard shows current app usage statistics and provides access to historical information.

See the threat level of shadow IT services

Aside from productivity services — such as webmail, cloud storage and content sharing — Cloud App Discovery also provides visibility into other areas. The EMS-based version of the tool detects internet of things devices, cloud service use from providers such as Amazon Web Services and visits to websites.

Cloud App Discovery ranks the discovered services on a risk score from 1 to 10; a lower score indicates a more suspicious application. The service determines the rank by assessing security policies, such as where the data resides, who has access, who has control and whether organizations can prevent unauthorized access.

Apps designed for enterprise use, such as Google’s G Suite, get good scores. Services that provide less organizational control, such as WhatsApp, receive poor grades.

WhatsApp is considered a risky service because the business has no administrative control. For example, a financial advisor who communicates with a client over WhatsApp could breach regulations because the business cannot record the conversation for future discovery.
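The scoring idea can be illustrated with a toy model. The attributes and weights below are invented; only the 1-to-10 scale, with higher meaning safer, mirrors the real service:

```python
def risk_score(app):
    """Toy 1-to-10 risk score (higher = safer) from pass/fail checks.
    The checks and weighting are invented for illustration."""
    checks = [
        app.get("admin_control", False),
        app.get("audit_trail", False),
        app.get("data_at_rest_encryption", False),
        app.get("known_data_center", False),
    ]
    # Map 0..4 passed checks onto the 1..10 scale.
    return 1 + round(9 * sum(checks) / len(checks))

gsuite = {"admin_control": True, "audit_trail": True,
          "data_at_rest_encryption": True, "known_data_center": True}
whatsapp = {"data_at_rest_encryption": True}
print(risk_score(gsuite), risk_score(whatsapp))  # 10 3
```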

View the detailed report on each service, and decide whether to approve the cloud service.

Figure 7 lists the services with usage statistics and threat level:

Discovered apps tab
Figure 7. The Discovered apps tab lists the services used on the company network with details on the traffic used and the risk score.

Take action against shadow IT

Administrators should take action when armed with data from Cloud App Discovery. If workers use Trello, Slack and Box, then admins should deploy the corresponding Office 365 services — Planner, Teams and OneDrive for Business, respectively.

However, IT should still take action even if the business can’t make these Office 365 apps immediately available. In that case, let end users know that the company plans to roll out Microsoft services to replace shadow IT apps. Explain the benefits of the move, such as service integration across the Office 365 suite.

The EMS-integrated capabilities give admins a way to configure security alerts when workers use these unsanctioned apps, and the continuous reports feature offers partial control over app use. For example, an admin creates a rule that identifies when a user downloads a large amount of data from Office 365 and then uploads a large amount of data to Dropbox. When the rule detects this activity, the admin gets an alert and can have the security team block that user’s access to Office 365.
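That kind of rule amounts to a simple check over an event stream. Everything in the sketch below, the field layout included, is illustrative rather than Cloud App Security's actual rule engine:

```python
def exfil_alerts(events, threshold_bytes=100_000_000, window_s=3600):
    """Flag users who make a large Office 365 download followed by a
    large Dropbox upload within the time window. Illustrative only."""
    alerts = []
    downloads = {}  # user -> timestamp of last big Office 365 download
    for ts, user, app, direction, nbytes in sorted(events):
        if app == "office365" and direction == "down" and nbytes >= threshold_bytes:
            downloads[user] = ts
        elif app == "dropbox" and direction == "up" and nbytes >= threshold_bytes:
            if user in downloads and ts - downloads[user] <= window_s:
                alerts.append(user)
    return alerts

# Fabricated events: (timestamp_s, user, app, direction, bytes)
events = [
    (100, "alice", "office365", "down", 250_000_000),
    (900, "alice", "dropbox",   "up",   240_000_000),
    (100, "bob",   "office365", "down", 1_000),
]
print(exfil_alerts(events))  # ['alice']
```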

Next Steps

Slack or Microsoft Teams: Which one makes more sense?

Shadow IT dangers present best opportunity to use cloud access security brokers

Regulate shadow IT to reduce risk

Advanced machine learning, database automation touted at OpenWorld

SAN FRANCISCO — Oracle founder and CTO Larry Ellison this week detailed the company’s autonomous Oracle Database 18c for the cloud, which Ellison said will rely on advanced machine learning techniques to greatly reduce database administration tasks, such as tuning and patching. 

At the heart of this Oracle cloud database is extensive use of machine learning, which Ellison called “the first branch of artificial intelligence that really works.”

This application of machine learning, which employs neural networks and other modeling algorithms to sift large amounts of log data and detect recurring patterns of database activity, is also part of a cybersecurity product for automatically patching databases that Ellison pledged to discuss further at the event.

For Oracle Database 18c, machine learning allows it to “patch itself while running, all without any downtime whatsoever,” according to Ellison, who spoke at Oracle OpenWorld 2017. He also said operations improvements allow his company to offer an Oracle Database 18c SLA that guarantees 99.995% reliability and availability, while reducing planned and unplanned downtime to less than 30 minutes per year.

Automation of database administration

The system, which Ellison dubbed “self-driving” and “the world’s first autonomous database,” may be unique by some measures, but it is also part of a long-standing trend that is well under way.

Automation of database cluster deployment on cloud has become increasingly common, and wider automation can be anticipated, according to Tony Baer, an analyst at Ovum.

“You can see how cloud databases are doing automation — with database sharding as a major example,” Baer said. Meanwhile, query performance and other database activities are also being affected by advanced machine learning technology, he said.

Baer noted that “Oracle has all kinds of database activity logs. That is big data that acts as a corpus for machine learning that can figure out what is a normal pattern, and highlight queries that are going to cause trouble.”
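A minimal stand-in for that kind of log mining is outlier detection over query runtimes. The z-score sketch below is illustrative only and implies nothing about Oracle's actual models:

```python
import statistics

def flag_slow_queries(durations_ms, history, z_threshold=3.0):
    """Flag query runtimes that deviate sharply from historical behavior,
    a crude stand-in for learning a 'normal pattern' from activity logs."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero
    return [d for d in durations_ms if (d - mean) / stdev > z_threshold]

# Fabricated history of runtimes for one query, in milliseconds.
history = [100, 110, 95, 105, 90, 100, 108, 92]
print(flag_slow_queries([98, 110, 450], history))  # [450]
```

Real systems would learn per-query baselines over far richer features, but the principle, a corpus of normal behavior against which outliers stand out, is the same.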

Advanced machine learning adds another element to the mix, but the latest Oracle moves are best viewed as part of an evolution in process automation, according to Vinod Bhutani, database services manager at DBAMart Database Services in Broomfield, Colo.

“There is a whole lot of automation for the database already. For example, there are such tools as Oracle SQL Tuning Advisor and Segment Advisor,” Bhutani said in an interview at Oracle OpenWorld.

“In my view, the database is 60% to 70% automated already,” he said, adding that the amount of automation employed is often based on the database administrators’ comfort levels with such automation’s effectiveness.

Bhutani said he would be looking for additional details, particularly on Oracle’s cybersecurity offerings, to see how much further Oracle takes database automation.

Whither the DBA?

In his Oracle OpenWorld keynote, Ellison admitted the move to greater automation for the Oracle cloud database could be seen as a threat to DBA job security. But he was basically sanguine on the prospects.

“Yes, you are automating the ways of database professionals, but they already have more work than they can possibly ever get to,” he said.

Greater database automation will free up DBAs from routine patching and repetitive tuning, he said, enabling them to focus more on schema design, analytics — including advanced machine learning styles of analytics — and securing data.

Noel Yuhanna, an analyst at Forrester, on hand at Oracle OpenWorld, agreed. “The DBA job is being changed toward more data-driven initiatives, with more emphasis on security and governance — and architecting the future of the data,” he said.

“The DBA will focus more on business value, as opposed to technology,” he said.

Meanwhile, analyst Baer also pointed to an increasingly important role for the DBA. “There is definitely a future for the DBA. There is just no question about it,” he said. “You can’t automate everything.”

Hearing Redshift steps

Ellison said Oracle Database 18c, running on Exadata infrastructure in the Oracle Cloud or on Cloud at Customer, will become generally available in December for data warehousing only, with a transactional version to follow in June 2018.

This “data warehouse-first approach” emphasizes Oracle’s intention to compete more fully with Amazon, its cloud and its Redshift cloud data warehouse. At Oracle OpenWorld, Ellison repeatedly cited Redshift as a competitor, claiming superior uptime and better relative pricing for Oracle.

“We guarantee our bill will be less than half of what Amazon Redshift will be,” he said. “We will write that in your contract.”

The company further moved to sweeten its cloud pricing deal recently, introducing a “bring your own license” policy for existing customers moving databases, middleware and more to the Oracle cloud platform.

With the “18c” designation, Oracle adopts a model-year naming format for its database, not unlike that of Microsoft SQL Server. Aligning database names with calendar years is in part a bow to the growing use of yearly, subscription-based pricing models for cloud databases.