Oracle Cloud Infrastructure SVP shares competitive strategy

SAN FRANCISCO — Oracle’s late entry to the public cloud space has been met with skepticism, but the company has a few strategies, particularly around autonomous cloud, unexpected partnerships and shared responsibility, that executives expect to make up for its tardiness.

Oracle Cloud Infrastructure (OCI), the company’s second attempt at a public IaaS, was built with the help of former AWS technical employees like Clay Magouyrk, who joined Oracle in 2014.

Now senior vice president of engineering for OCI, Magouyrk spoke with TechTarget during Oracle’s OpenWorld conference about the company’s market position, its efforts to attract customers and what the future may hold.

A common refrain is that Oracle is very late — maybe too late — to arrive at a viable IaaS strategy. What’s your response to this?

Clay Magouyrk: I think you have to understand where we’re at in our evolution of the cloud. I remember when everybody had a kind of, you know, a dumb phone. And I remember it took two years to get to a place where everyone had a smartphone. The switching speed was amazing.

So then the question is, [the industry has] been doing cloud infrastructure for 15 years. Why isn’t it all [migrated over from on-premises] yet? It’s still 90% greenfield. One of the things I worked on at Amazon before I left was the Fire Phone, and the Fire Phone was too late, in the same way that the Windows Phone reboot was too late. Once Apple and Android had 90%-plus market share it was impossible [to catch up].

The reason I spend my day here every day working really hard is because we’re still at the 10% penetration [level]. People act like we’re 90% of the way into the cloud infrastructure transition, but we’re not anywhere close to it.

One of the biggest OCI announcements at OpenWorld was Maximum Security Zones, which you positioned as a response to data breaches at IaaS providers caused by misconfigured systems. But it doesn’t fully reject the shared responsibility stance on security advanced by AWS and others.

Magouyrk: I think, fundamentally, in the long term, you always end up with some shared responsibility. What we’re trying to say here at Oracle is that we are going to push that boundary.

If you look at cloud in general, cloud providers have the responsibility of patching the hypervisor. And then, okay, your job is to patch your OS. Cool, well, with Autonomous Linux, now it’s our job to patch the OS. So we’re bringing that up the stack.

If you look at the vast majority of security incidents these days, they’re not [committed by] massively sophisticated hackers. It’s misconfiguration. When you moved to the cloud, we gave you all these tools. And now you’ve got a million tools. And you have this programmable infrastructure. It’s so easy to do stuff, right?

We’re going to create a construct where you can’t misconfigure it. We’re taking the stance that this is not just your responsibility, we are going to work with you on it.

Think about it from our perspective on SaaS. We take on a ton of responsibility for that. There’s no way for you to mess it up. As we make infrastructure where you can’t mess it up, I don’t see any reason why we can’t take on more responsibility the same way we do in SaaS. The problem is that if people still need that control, to be able to mess it up, then it has to be on them.
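
To make that stance concrete, here is a minimal Python sketch of the deny-by-default idea behind a construct "you can't misconfigure": a zone validates every requested change against its policies and rejects violations outright. The class, rule and resource names are hypothetical, not Oracle's actual Maximum Security Zones implementation.

```python
# Hypothetical sketch of a deny-by-default security zone: every requested
# change is validated against zone policies, and violating requests are
# rejected rather than created and flagged later. Illustrative only; not
# Oracle's Maximum Security Zones API.

from dataclasses import dataclass, field

@dataclass
class BucketRequest:
    name: str
    public_access: bool = False  # public buckets are a classic misconfiguration
    encrypted: bool = True

@dataclass
class SecurityZone:
    name: str
    violations: list = field(default_factory=list)

    def validate(self, req: BucketRequest) -> bool:
        """Return True only if the request satisfies every zone policy."""
        self.violations = []
        if req.public_access:
            self.violations.append("public access is forbidden in this zone")
        if not req.encrypted:
            self.violations.append("unencrypted storage is forbidden in this zone")
        return not self.violations

    def create_bucket(self, req: BucketRequest) -> str:
        # Reject the operation outright instead of creating a misconfigured
        # resource and cleaning up after the fact.
        if not self.validate(req):
            raise PermissionError(f"{self.name}: " + "; ".join(self.violations))
        return f"bucket {req.name} created"

zone = SecurityZone("max-security-zone")
print(zone.create_bucket(BucketRequest("payroll-data")))  # allowed
try:
    zone.create_bucket(BucketRequest("leaky", public_access=True))
except PermissionError as err:
    print(f"rejected: {err}")
```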

You plan to dramatically expand OCI’s global footprint, bringing it to 36 regions by the end of 2020. Tell us about Oracle’s process here.

Magouyrk: There are things under the hood that people don’t see. One of those things is that we’ve massively optimized how we build regions, both from an infrastructure perspective, as well as a software perspective.

When I joined Oracle in 2014, we had zero OCI regions. I knew that if we were going to compete, we had to build regions that are way faster than our competitors. I knew how [AWS] built them, because I had worked there. And we hired people that also worked there. What we did is we took a much more aggressive approach.

The way most companies build these regions is you have 200 teams, and they all do it by hand. So the physical hardware gets rolled in, and the actual software teams spend all their time installing the stuff.

When you bought a Windows CD back in the day, Microsoft didn’t send an engineer to install it for you, right? They had an installer. Well, you can do that for the cloud; it’s a bunch of work. But we’ve made that investment from a software perspective.
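
As a rough illustration of that installer analogy, here is a Python sketch assuming a hypothetical service manifest: each region service declares its dependencies, and an automated bootstrapper brings everything up in dependency order the way an installer sequences components. The service names are invented, not OCI's actual build system.

```python
# Sketch of an "installer for the cloud": a declarative manifest of region
# services and their dependencies, installed in topological order by
# automation instead of by hand. Service names are hypothetical.

from graphlib import TopologicalSorter  # Python 3.9+

# Declarative manifest: service -> services it depends on.
REGION_MANIFEST = {
    "physical-network": set(),
    "identity": {"physical-network"},
    "block-storage": {"physical-network"},
    "compute": {"identity", "block-storage"},
    "object-storage": {"identity"},
    "database": {"compute", "object-storage"},
}

def bootstrap_region(manifest: dict) -> list:
    """Return the order in which services get installed."""
    order = list(TopologicalSorter(manifest).static_order())
    for service in order:
        print(f"installing {service} ...")  # stand-in for real automation
    return order

bootstrap_region(REGION_MANIFEST)
```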

If you look back to when Amazon got into this, they didn’t know where they were going. It was very early days. And they were like, we’re going to build these big honkin’ regions, and they’re going to cost a jillion dollars and there is going to be so much stuff there.

We realized, by the time I came to Oracle, we were going to need to put regions everywhere, but they’re going to start small and be able to grow. We’ve engineered for that. From a technical perspective, that gives us a lot more flexibility than some of our competitors.

It’s my understanding that Oracle is doing this buildout largely through co-location agreements and not actually building its own data centers, in the interest of speed. Can you elaborate?

Magouyrk: I think the term [co-location] has changed. If you go back 10 or 15 years, co-lo meant you get one rack in a small data center, that kind of a thing. But as you actually see these large cloud providers — the way they’re doing these worldwide rollouts — co-lo providers have actually changed their model. They still do that kind of retail stuff. But they also have a wholesale model. And that’s what has enabled a lot of this rapid global expansion.

Every major cloud provider uses a ton of colocation facilities. There’s just no way that it makes sense otherwise. As a co-lo provider, you can do things where you put stuff in a campus. You buy a bunch of land, and then you start small. But then, as that fills up, you build new buildings and you can deeply interconnect that with fiber. As a [co-lo] customer, you have this expansion plan in that same area, but you don’t have to build a giant data center [yourself].

One OCI announcement this week at OpenWorld concerned a new “always free” tier, which seemed clearly aimed at attracting developers to the platform. Tell us more about your efforts in this area.

Magouyrk: A lot of it is social proof. While we would like to think that everyone’s making a fully informed, highly educated decision, the reality is that most humans make close approximations based on what everyone else does. The key is to create a flywheel that gets enough going, and then it becomes, “Oh, well, they all chose Oracle, why can’t we?”

We have dedicated startup investments where we go to startups and give them a bunch of free credits and incentives. Around developers in general, we put a lot of energy into attracting people out of college. We work with universities to give them access to cloud computing resources. They use that in their classrooms to get people familiar with it.

I view [the free tier] as the start of something … Because it’s not just making it free, it’s making the signup easy, the support experience easy. Do you have the collateral and the ecosystem around it? Do you have the right forums for people to ask questions? You do all of those things in a row, and it builds.

The initial use case you put forward with your interoperability partnership between Microsoft Azure and OCI was to split an application up, with the logic and presentation tiers on Azure tied back to an Oracle database running on Exadata inside OCI. Why would a customer want to do that?

Magouyrk: When [Oracle CTO and chairman] Larry [Ellison] gets up there, and he’s like, look, let me show you the performance difference between what you can get from this versus an Exadata, I don’t think people actually believe it, but it’s true. There are all these amazing workloads that need giant relational databases that just can’t move anywhere.

My job is to get you in OCI and then just keep pulling more. Maybe you add some Analytics Cloud on top of it to analyze that data. You just get people hooked that way.

What about going further and bringing Exadata boxes right into Azure data centers? That would eliminate the need for the high-speed interconnect between OCI sites and Microsoft’s.

Magouyrk: As you can imagine, those types of conversations do happen in the abstract. I’m sure Microsoft would love that. The thing you have to understand is that we did a lot of work in Oracle Cloud Infrastructure to make Exadata run well. We offer bare metal; we offer off-the-box virtualized networking. There’s a whole bunch of features.

Let’s say that we were to make a deal and I gave Microsoft a bunch of Exadatas. It would be off in a little tiny part of the network. It’s not actually integrated into their experience. They wouldn’t have a database service wrapped around it. The experience would be terrible. For them, it’s not important [to host Exadatas].

For us, it’s so valuable that we make it really, really good. What we’re not going to do is take the thing that we think is incredibly valuable and then have other people do it badly.

What is next for this deal? Will we see others like it, say with Google? Going even further, is a détente between Oracle and AWS possible?

Magouyrk: Right now, we have great ideas, and we have good buzz, and we have very interesting customers. What we have to do over the next six months is convert those into a bunch of very happy, very public reference customers. That’s the next level of uptick in the process. Until we have that play out, I don’t think anyone’s going to know.

In terms of where we’re taking this with Microsoft, I think it’s about us working much better together. We’re making Oracle Linux work better in [Microsoft’s] cloud. A big part of the reason we chose them is because we have such customer overlap. It might be interesting, technically, for us to do it with Google. But it’s not like Google is already in every single one of our customer accounts. We’re doing this from a customer-driven perspective.

Sinkholed Magecart domains resurrected for advertising schemes

Sinkholed Magecart domains previously used for payment card skimmers could pose new threats such as ad fraud and malvertising.

RiskIQ, a San Francisco-based threat intelligence vendor, discovered that a handful of sinkholed domains formerly used by Magecart cybercriminals have been subtly purchased and re-registered by unknown groups. Instead of using these old Magecart domains for payment card skimming, the threat actors are using them as traffic sources for advertising schemes.

In a blog post, Yonathan Klijnsma, head of threat research at RiskIQ, explained that registrars often put domains up for sale again after they have been taken down due to malicious activity.

“Here’s the catch: when these domains come back online, they retain their call-outs to malicious domains placed on breached websites by attackers, which means they also retain their value to threat actors,” Klijnsma wrote in the blog post.

A “secondary market” has emerged around Magecart domains where other threat actors use the domains, which are still receiving significant traffic after being taken down, to run advertisements. Klijnsma told SearchSecurity it’s common for a formerly malicious domain with an attractive or common name to be purchased by domain name speculators for advertising purposes.

“They buy it up and the main domain gets a parking page and the parking page will have ads,” he said. “And that’s their way of monetizing it in a white hat sort of way.”

But the purchasers of these Magecart domains went a step further, Klijnsma said. The threat actors took advantage of the malicious JavaScript the Magecart cybercriminals previously used to call out to their skimmers; instead of downloading skimmers, the JavaScript now injects ads. In addition, the revised JavaScript calls out to another remote script that counts traffic to the domains.
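
To illustrate what a crawler might see, here is a Python sketch that fetches the script served at a retained call-out path and applies simple pattern heuristics to distinguish skimmer-like code from ad injection. The patterns and the example path are illustrative assumptions, not RiskIQ's actual detection logic.

```python
# Rough heuristic classifier: skimmer-like scripts harvest form fields and
# exfiltrate them, while the repurposed scripts inject ad markup. These
# patterns are simplified illustrations, not RiskIQ's crawler.

import re
import requests

SKIMMER_HINTS = [
    r"document\.forms",              # harvesting form input
    r"(cc_number|cardnumber|cvv)",   # payment-field names
    r"btoa\(",                       # base64-encoding stolen data before exfil
]
AD_HINTS = [
    r"createElement\(['\"]iframe['\"]\)",  # injecting ad frames
    r"(adsystem|doubleclick|popunder)",    # ad-network references
]

def classify_script(url: str) -> str:
    body = requests.get(url, timeout=10).text
    if any(re.search(p, body, re.I) for p in SKIMMER_HINTS):
        return "possible skimmer"
    if any(re.search(p, body, re.I) for p in AD_HINTS):
        return "ad injection"
    return "unknown"

# Hypothetical retained call-out path on a re-registered domain:
# print(classify_script("https://cdnanalytics.net/js/analytics.js"))
```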

Klijnsma said the new threat actors can’t “play the ignorance card”; they wouldn’t use the exact same file path as the Magecart skimmers and then log the traffic to the domains unless they knew what the domains had been used for previously and were aware of how to monetize them.

“They have some knowledge of what’s going on, which is curious and illegitimate in our eyes,” Klijnsma said.

Potential threats on old domains

Klijnsma said he found the secondary market for Magecart domains accidentally. He discovered RiskIQ’s platform had flagged a handful of the old domains by crawling the pages, but it hadn’t flagged any skimmer activity, which prompted him to take a closer look at one of the domains that had been sinkholed — cdnanalytics.net.

“I noticed the injection of ads in the page, which is definitely not a skimmer,” he said. “After that, looking at the actual domain, I noticed it was re-registered.”

The re-registration was done in a “very subtle way”: the new threat actors used the same registrar as the Magecart cybercriminals, and the only change in the WHOIS data for the domain was the name server. Klijnsma said it’s unclear why the new owners used the same registrar for all of the Magecart domains, but he believes it was “purposeful.”
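
A minimal Python sketch of spotting this kind of subtle re-registration, assuming the third-party python-whois package and a hypothetical archived snapshot: compare the domain's current WHOIS record against the snapshot and flag cases where only the name servers changed.

```python
# Compare a domain's current WHOIS record against an archived snapshot and
# flag re-registrations where only the name servers changed. Assumes the
# python-whois package (pip install python-whois); snapshot values are
# hypothetical.

import whois

WATCHED_FIELDS = ("registrar", "name_servers")

def diff_whois(domain: str, snapshot: dict) -> dict:
    """Return the watched WHOIS fields that differ from the snapshot."""
    current = whois.whois(domain)  # live network lookup
    return {
        key: (snapshot.get(key), current.get(key))
        for key in WATCHED_FIELDS
        if current.get(key) != snapshot.get(key)
    }

# Hypothetical snapshot taken while the domain was sinkholed:
snapshot = {
    "registrar": "ExampleRegistrar Inc.",
    "name_servers": ["ns1.sinkhole.example"],
}

changes = diff_whois("cdnanalytics.net", snapshot)
if set(changes) == {"name_servers"}:
    print("re-registered: only the name servers changed:", changes)
```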

Klijnsma also noticed that the advertising script contained another domain — cleverjump.org — and that an analysis of host pairs involving the cleverjump.org script revealed “several hundred domains” for 2019 alone, including several old Magecart domains.

“If you look at those other domains, you’ll find a lot of attractive, nicely made domain names,” he said. “I think they just keep buying this stuff up to deliver ads and get whatever traffic they can get.”

Klijnsma said the Magecart domains present a legal and ethical gray area. While the new threat actors are using malicious Magecart code and monetizing illegitimate traffic for ad fraud, there are currently no signs of malvertising or other direct threats to users on the sites. The ads being served on the domains are thus far legitimate and are being served from several ad networks, which RiskIQ declined to name.

While RiskIQ flagged the Magecart domains, the company won’t blacklist them again unless it detects skimmers, malware or other malicious activity on them. However, Klijnsma urged caution in his blog post.

“While ads themselves aren’t malicious, they are exploiting the vulnerabilities in websites while the site owners don’t benefit,” he wrote. “Moreover, in the future, threat actors may also engage in other schemes and threat activity far more malicious than advertising.”

Swim DataFabric platform helps to understand edge streaming data

The new Swim DataFabric platform aims to help IT professionals categorize and make sense of large volumes of streaming data in real time.

The startup, based in San Jose, Calif., emerged from stealth in April 2018, with the promise of providing advanced machine learning and artificial intelligence capabilities to meet data processing and categorization challenges.

With the new Swim DataFabric, released Sept. 18, the vendor is looking to make it easier for more users to analyze data. The Swim DataFabric platform integrates with Microsoft Azure cloud services, including the Azure IoT suite and Data Lake Storage, to classify and analyze data and make predictions in real time.

The Swim DataFabric platform helps users get the most out of their real-time data in any distributed application, including IoT and edge use cases, said Krishnan Subramanian, chief research advisor at Rishidot Research.

“Gone are those days where REST is a reasonable interface for real-time data because of latency and scalability issues,” Subramanian said. “This is where Swim’s WARP protocol makes more sense and I think it is going to change how the distributed applications are developed as well as the user experience for these applications.”

Why the Swim DataFabric is needed

A big IT challenge today is that users are getting streams of data from assets that are essentially boundless, said Simon Crosby, CTO at Swim. “A huge focus in the product is on really making it extraordinarily simple for customers to plug in their data streams and to build the model for them, taking all the pain out of understanding what’s in their data,” Crosby said.

Swim’s technology is being used by cities across the U.S. to help with road traffic management. The vendor has a partnership with Trafficware for a program that receives data from traffic sensors as part of a system that helps predict traffic flows.

The Swim DataFabric platform moves the vendor into a different space, one focused on enabling Microsoft Azure cloud adopters to benefit from the Swim platform.

“It has an ability to translate any old data format from the edge into the CDM (Common Data Model) format which Microsoft uses for the ADLS (Azure Data Lake Storage) Gen2,” Crosby said. “So, a Microsoft user can now just click on the Swim DataFabric, which will figure out what is in the data, then labels the data and deposits it into ADLS.”

Screenshot: Swim architecture

With the labeled data in the data lake, Crosby explained that the user can then use whatever additional data analysis tool they want, such as Microsoft’s Power BI or Azure Databricks.
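
As a rough illustration of the flow Crosby describes, here is a Python sketch using the azure-storage-file-datalake package: a raw edge record is labeled into a CDM-style envelope and deposited into ADLS Gen2. The connection string, filesystem, entity name and envelope shape are placeholder assumptions, and the envelope only loosely approximates the Common Data Model.

```python
# Simplified translate-label-deposit flow: wrap a raw edge record in a
# CDM-style envelope, then write it to Azure Data Lake Storage Gen2.
# Placeholders throughout; the envelope is a loose sketch of CDM.

import json
from datetime import datetime, timezone
from azure.storage.filedatalake import DataLakeServiceClient

def to_cdm_envelope(entity: str, record: dict) -> str:
    """Label a raw record with the metadata downstream tools expect."""
    return json.dumps({
        "entity": entity,
        "ingestedAt": datetime.now(timezone.utc).isoformat(),
        "attributes": record,
    })

def deposit(conn_str: str, filesystem: str, path: str, payload: str) -> None:
    service = DataLakeServiceClient.from_connection_string(conn_str)
    fs = service.get_file_system_client(filesystem)
    file_client = fs.create_file(path)  # creates (or truncates) the file
    file_client.upload_data(payload, overwrite=True)

raw = {"sensor_id": "intersection-42", "vehicles_per_min": 37}
payload = to_cdm_envelope("TrafficSensorReading", raw)
# deposit("<connection-string>", "datafabric", "traffic/reading-0001.json", payload)
```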

He noted that Swim also has a customer that has chosen to use Swim technology on Amazon Web Services, but he emphasized that the Swim DataFabric platform is mainly optimized for Azure, due to that platform’s strong tooling and lifecycle management capabilities.

Swim DataFabric digital twin

One of the key capabilities that the Swim DataFabric provides is what is known as a digital twin model. The basic idea is that a data model is created that is a twin or a duplicate of something that exists in the real world.

“What we want is independent, concurrent, parallel processing of things, each of which is a digital twin of a real-world data source,” Crosby explained.

The advantage of the digital twin approach is fast processing as well as the ability to correlate and understand the state of data. With the large volumes of data that can come from IoT and edge devices, Crosby emphasized that understanding the state of a device is increasingly valuable.
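
The pattern can be sketched in a few lines of Python: one stateful object per real-world source, each consuming its own stream concurrently, with correlation reduced to reading local state. The sensor names and fields here are illustrative, not Swim's actual API.

```python
# Digital twin sketch: one stateful object per real-world data source,
# processed concurrently, so correlating across sources is a local read.
# Illustrative only; not Swim's API.

import asyncio
import random

class SensorTwin:
    """Digital twin of one sensor: mirrors its latest state."""

    def __init__(self, sensor_id: str):
        self.sensor_id = sensor_id
        self.last_reading = 0

    async def run(self, readings: int) -> None:
        for _ in range(readings):
            await asyncio.sleep(random.uniform(0.01, 0.05))  # stand-in for a stream
            self.last_reading = random.randint(0, 100)

async def main() -> None:
    # Independent, concurrent, parallel processing: one twin per source.
    twins = [SensorTwin(f"sensor-{i}") for i in range(5)]
    await asyncio.gather(*(t.run(readings=10) for t in twins))
    # Correlation across sources becomes a read of in-memory twin state.
    busiest = max(twins, key=lambda t: t.last_reading)
    print(f"busiest source: {busiest.sensor_id} ({busiest.last_reading})")

asyncio.run(main())
```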

“Everything in Swim is about transforming data into streamed insights,” Crosby said.

Microsoft 365 Business update targets nonprofits

Microsoft is rolling out some updates to its productivity offerings, hoping to capture the business of nonprofits looking for a path to digital transformation.

Digital transformation tools can help organizations improve the security, cost-effectiveness and efficiency of their processes, but small nonprofits often can’t afford these systems. Recognizing this, Microsoft is offering nonprofits 10 free licenses for Microsoft 365 Business, a productivity suite that includes access to Word, Excel, Outlook, OneDrive, Teams, SharePoint and more. Each additional license will cost $5 per month.

These free licenses do not give access to the Dynamics 365 CRM system and its Nonprofit Accelerator, but Microsoft does provide discounts to nonprofits looking to adopt its CRM.

The Microsoft Digital Skills Center for Nonprofits is another digital transformation resource that is already available, launched in partnership with learning platform TechSoup. This service provides free product training for nonprofits on how to use Microsoft 365 Business, among other services.

On Oct. 1, Microsoft plans to launch the Nonprofit Operations Toolkit. Built on Power Platform, the toolkit integrates PowerApps, Flow, cloud storage and Excel to help nonprofits manage projects and awards, such as tracking donor transactions. It will include extra security features to ensure donor privacy, said Justin Spelhaug, general manager of technology for social impact at Microsoft.

Microsoft 365 has numerous security features, even at the individual level. If an employee accidentally tries to send sensitive information, the system can stop it. And if an employee leaves a mobile device on a bus, the nonprofit can wipe information from the phone.
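
As a generic illustration of the kind of check described above, here is a Python sketch that scans outgoing text for card-like numbers and validates them with the Luhn checksum before letting a message through. It shows the mechanism in general, not Microsoft 365's actual data loss prevention implementation.

```python
# Generic DLP-style check: find card-like digit runs in outgoing text and
# confirm them with the Luhn checksum. Not Microsoft's implementation.

import re

def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2])
    for d in digits[1::2]:
        total += d * 2 - 9 if d * 2 > 9 else d * 2
    return total % 10 == 0

def contains_card_number(text: str) -> bool:
    for match in re.finditer(r"\b(?:\d[ -]?){13,19}\b", text):
        candidate = re.sub(r"[ -]", "", match.group())
        if 13 <= len(candidate) <= 19 and luhn_valid(candidate):
            return True
    return False

outgoing = "Donor card on file: 4111 1111 1111 1111"
if contains_card_number(outgoing):
    print("blocked: message appears to contain a payment card number")
```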

“Nonprofits have some incredibly important info in IT systems about beneficiaries,” Spelhaug said. “Maintaining trust with donors and with beneficiaries is the lifeblood of organizations, and part of maintaining trust is having appropriate security backstops.”

Combined with Microsoft’s other tools, these systems can form the basis of a digital transformation strategy.

How can companies use these systems?

Meals on Wheels of Greenville County in South Carolina uses Microsoft 365 Business to optimize the delivery of meals to homebound and senior citizens.

The biggest challenge Meals on Wheels had in crafting its own digital transformation strategy was training volunteers to use new technology, said Catriona Carlisle, executive director of Meals on Wheels of Greenville. The initial 15 courses offered through the Digital Skills Center will help them more easily train volunteers in the future, she said.

“The majority of our volunteers have smartphones,” Carlisle said. “They know tech, so it was a natural transition. But we have some volunteers with us almost 50 years who struggled with the change.”

By removing manual data entry, adding mobile management capabilities and automating volunteer scheduling, the nonprofit was able to expand, Carlisle said. Microsoft 365 Business saved the nonprofit time and money, giving it the chance to initiate a partnership with two local agencies that added 400 to 500 meals a day to an operation that had been serving 1,500 meals a day. The nonprofit was also able to source food locally, rather than from distant organizations, and began sending extra meals to a local school for children with special needs and disabilities.

Meals on Wheels looks forward to the new changes, Carlisle said.

“One of the best things a nonprofit can do, but often doesn’t do, is to operate like a business,” she said. “We need to make business decisions to make sure we’re around for the future, looking not only at making investments here and now, but also for the future.”
