Tag Archives: AUSTIN

DattoCon 2018: Focus on MSPs poised for SMB growth

AUSTIN, Texas — Managed service providers stand on the cusp of explosive growth in the small and medium-sized business market, but they must focus their sales and look for points of differentiation to cash in.

That’s one takeaway from this week’s DattoCon, Datto Inc.’s annual partner conference. The three-day event, which concluded June 20, attracted about 1,650 service providers, a record for the Norwalk, Conn., vendor of data protection, networking and managed service provider (MSP) business management products.

“The SMB spend rate is climbing really fast,” said Austin McChord, CEO and founder of Datto. “The opportunity is absolutely massive.”

Austin McChord, founder and CEO, Datto

Citing research from MarketsandMarkets, McChord said SMB IT spend will grow from about $40 billion in 2017 to nearly $72 billion in 2022. He estimated half of SMB transactions currently involve MSPs — so while the SMB market is already large, there is still room for the MSP industry to capture more business.
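
Taken at face value, those figures imply a compound annual growth rate of roughly 12% to 13% over the five-year span: (72 / 40)^(1/5) - 1 ≈ 0.125, or about 12.5% a year. The precise rate depends on the underlying MarketsandMarkets estimates, which the keynote cited only in round numbers.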

“Small businesses will be scrambling to rethink their IT infrastructure [and will] need to turn to managed service providers,” McChord said during his DattoCon keynote address.

Valuations climb

While demand expands, MSP market valuations are also growing.

Paul Dippell, CEO of Service Leadership, a company that benchmarks the financial performance and operational maturity of solution providers, said the arrival of private equity investment firms is one force driving valuations. IT services companies, such as Reliam, are partnering with private equity firms to do deals.

“Private equity groups have found the MSP industry,” Dippell said during a DattoCon presentation.

MSPs with $5 million or more in annual profit, the typical target for private equity investors, are rare, which makes those companies more valuable. Meanwhile, MSP owners looking to sell to such buyers may need to acquire another MSP to boost the bottom line and become a desirable target, Dippell noted. An MSP with $3 million in profit, for example, may look to acquire an MSP with $2 million in profit. But even MSPs at the $2 million profit mark are unusual.

Deals have been rampant in the MSP and cloud consulting sectors of late, including Green House Data’s merger with Infront Consulting Group in May.

Chart: Recent mergers and acquisitions in the IT services industry. New buyers in the MSP market, such as private equity groups, are driving up service provider valuations.

MSP industry execs: Focus is key

High growth and ample valuations, of course, are not guaranteed. Industry executives suggested several factors that could improve a service provider’s ability to make the most of these trends. Among those factors is focus: Pick a target market and stick to it. That target can then become a point of differentiation.

“They have got to differentiate themselves — just figure out what they are best at and sell that,” McChord said.

That approach could mean only selling services to dental practices or biomedical research companies, he said. The idea is to understand an industry and its lingo, cultivate vendor partners to serve that space, and develop a differentiated business model. Indiscriminate and opportunistic selling, on the other hand, may backfire, McChord suggested.

Work in a fishbowl, not an ocean.
David Pence, founder and CEO, Acumen IT

“Just taking any revenue that shows up almost always puts you in a bad spot,” he said.

David Pence, founder and CEO at service provider Acumen IT, based in Greenville, S.C., echoed McChord’s sentiments.

“Work in a fishbowl, not an ocean,” he told attendees during a DattoCon presentation on MSP selling strategies.

He suggested MSPs create a ‘dream 100’ list of customers to pursue rather than buy a mailing list and trigger an email blast to 100,000 companies. And when service providers call on their dream list, they should focus on delivering a superior customer experience rather than leading with technology.

“It’s not technology and blinking lights,” Pence said. “When you are talking to customers, be consultative. Don’t be a tech engineer.”

Key attributes

Dippell cited focused selling as one of five key attributes of top-performing service providers. Companies that have a well-defined target customer profile tend to outperform peers that don’t.

The other attributes, he said, are whether the service provider charges for technology assessments, drives technology standards across customers, conducts quarterly business reviews and cross-sells the breadth of its service portfolio to all customers.

A yes on those counts can help service providers join the ranks of best-in-class companies.

Kubernetes roadmap looks to smooth container management bumps

AUSTIN, Texas — “This job is too hard.”

It wasn’t a message the DevOps faithful at KubeCon 2017 last week would have expected from a Microsoft distinguished engineer and Kubernetes co-creator.

Brendan Burns, Microsoft Azure’s director of engineering, introduced a personal project called Metaparticle at the annual gathering of Kubernetes users and contributors. With Metaparticle, which translates complex distributed systems concepts into snippets of Java and JavaScript code, Burns aims to make distributed systems a Computer Science 101-level exercise.

In that same vein, Kubernetes project leaders know the container management platform will gain rapid acceptance only if it is accessible to more people. The Cloud Native Computing Foundation (CNCF) revealed features on the Kubernetes roadmap and introduced a Kubernetes mentoring program for administrators to make it easier to manage clusters across multiple clouds.

Third-party integrations, such as Pivotal Cloud Foundry 2.0, which is now available, will also improve on-premises Kubernetes management and, eventually, hybrid cloud management for enterprises, said Larry Carvalho, an analyst at IDC.

Traditional enterprise IT vendors run hands-on training programs — Pivotal Labs, Red Hat Open Innovation Labs, IBM Cloud Garage — to impart distributed systems skills to enterprise IT staff, Carvalho said. “[These programs] not only lead a horse to water, but force it down his throat,” he said.

“Startups are going gangbusters, but more than half of enterprises still don’t have a production workload in containers,” Carvalho said. “There’s an opportunity, but for them to start adopting it really requires a culture shift.”

Kubernetes users want secure multicluster management

Enterprises with some Kubernetes experience echoed Burns’ desire for simplicity, particularly to manage multiple container orchestration clusters, as all got their first look at the Kubernetes roadmap for 2018.

Production-ready, federated Kubernetes clusters topped the wish list for Rick Moss, infrastructure operations engineer at MailChannels, an email service provider in Vancouver, B.C.

“We want to be able to set up and tear down Kubernetes in different clouds, and federation is the only way to do that securely,” Moss said.

Multiple separate clusters can serve a multi-cloud Kubernetes deployment, but rather than stand up and debug a new cluster each time, Moss said he wants the ability to roll out part of the same system. However, Kubernetes federation last saw a major update in release 1.5 last year, and it has been difficult to operate in real-world environments. Kubernetes is at release 1.9 at the time of publication.
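
Until federation matures, a multi-cloud setup like the one Moss describes generally means managing each cluster through its own kubeconfig context. The sketch below, written against client-go, illustrates that status quo by iterating over named contexts and querying each cluster in turn; the kubeconfig path and context names are hypothetical, and the exact List signature (for example, whether it takes a context.Context argument) depends on the client-go release.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig holding one context per cloud.
	kubeconfig := "/home/ops/.kube/config"
	clusters := []string{"aws-prod", "gcp-prod"} // assumed context names

	for _, name := range clusters {
		// Build a client config for the named context.
		cfg, err := clientcmd.NewNonInteractiveDeferredLoadingClientConfig(
			&clientcmd.ClientConfigLoadingRules{ExplicitPath: kubeconfig},
			&clientcmd.ConfigOverrides{CurrentContext: name},
		).ClientConfig()
		if err != nil {
			fmt.Printf("%s: %v\n", name, err)
			continue
		}

		clientset, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			fmt.Printf("%s: %v\n", name, err)
			continue
		}

		// Each cluster is queried and managed on its own, which is the
		// per-cluster bookkeeping federation is meant to eliminate.
		nodes, err := clientset.CoreV1().Nodes().List(context.TODO(), metav1.ListOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err)
			continue
		}
		fmt.Printf("%s: %d nodes\n", name, len(nodes.Items))
	}
}

Federation, and later the Cluster API work described below, aims to replace that kind of loop with a single control point.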

It’s not easy to do hybrid [cloud deployments] today, but Cluster API will be the great equalizer for deploying Kubernetes on different systems.
Aparna Sinha, Kubernetes project management lead, Google

Bloomberg LP engineers said they’re not interested in the nascent federated clusters, but will track their progress in 2018. In the meantime, engineers at the financial services company headquartered in New York must occasionally restart specific hosts in on-premises Kubernetes clusters, and they want instance addressability within Kubernetes to help with that. The ability to dynamically provision local persistent storage volumes would help move stateful apps closer to production on Kubernetes, said Steven Bower, search and data science infrastructure lead at Bloomberg.

Enterprise IT shops also look forward to the Kubernetes roadmap’s security features disclosed by Kubernetes project managers at KubeCon. Pluggable ID, for example, will allow Kubernetes identity management and role-based access control to plug into existing identity management systems, such as the Lightweight Directory Access Protocol (LDAP).

“It’s nice they have identity management support for Amazon [Web Services] and Google Cloud [Platform], but on-premises LDAP is where they need to focus,” Bower said.
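
Whatever directory supplies the identities, authorization inside the cluster still flows through Kubernetes RBAC objects. As a rough illustration of where pluggable identity would meet RBAC, the hedged client-go sketch below binds the built-in, read-only ‘view’ ClusterRole to a group name as an LDAP directory might express it; the kubeconfig path and group DN are hypothetical, and call signatures again vary by client-go release.

package main

import (
	"context"
	"fmt"

	rbacv1 "k8s.io/api/rbac/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path for the cluster admin.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/home/ops/.kube/config")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Grant the stock "view" ClusterRole to a group whose name is asserted by
	// an external identity provider (for example, an LDAP group surfaced
	// through an OIDC or webhook authenticator).
	binding := &rbacv1.ClusterRoleBinding{
		ObjectMeta: metav1.ObjectMeta{Name: "ldap-ops-view"},
		Subjects: []rbacv1.Subject{{
			Kind:     rbacv1.GroupKind,
			APIGroup: rbacv1.GroupName,
			Name:     "cn=ops,ou=groups,dc=example,dc=com", // hypothetical LDAP group DN
		}},
		RoleRef: rbacv1.RoleRef{
			APIGroup: rbacv1.GroupName,
			Kind:     "ClusterRole",
			Name:     "view",
		},
	}

	if _, err := clientset.RbacV1().ClusterRoleBindings().Create(
		context.TODO(), binding, metav1.CreateOptions{}); err != nil {
		panic(err)
	}
	fmt.Println("created ClusterRoleBinding ldap-ops-view")
}

The point of pluggable ID is that the group name in the subject could come straight from a corporate directory rather than from cloud-provider-specific identity plumbing.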

A special interest group within the CNCF plans to integrate Kubernetes with SPIFFE, which stands for Secure Production Identity Framework for Everyone, an open source project that defines a set of standards to identify and secure communications between web-based services. It’s still too early to tell whether the effort will succeed, Bower said.

Brendan Burns, distinguished engineer at Microsoft Azure, presents the Metaparticle distributed systems management project at KubeCon 2017.

Cluster API project aspires to be ‘the great equalizer’

KubeCon attendees also saw Cluster API, a plan by the SIG-Cluster-Lifecycle group to create a set of standards to install Kubernetes clusters in multiple infrastructures.

“It’s a declarative way of deploying and upgrading clusters that abstracts the infrastructure behind Kubernetes,” said Aparna Sinha, project management lead for Kubernetes at Google. “It’s not easy to do hybrid [cloud deployments] today, but Cluster API will be the great equalizer for deploying Kubernetes on different systems.”

Also in the works is a declarative application management project that builds on the open source ksonnet configuration tools to define applications on Kubernetes in a nonrestrictive way, Sinha said. Though it’s still in its early stages, there is a working group.

Another trend expected in 2018 is increased attention to serverless technologies and how they compete with and integrate with containers. Several open source function-as-a-service projects are in progress, but the CNCF has yet to align itself with any of them. CNCF officials think the community should remain neutral, but KubeCon observers said they expect one project to emerge naturally and eventually earn CNCF support next year.

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Freese: Cyber-risk management is the key to good infosec hygiene

AUSTIN — Don Freese said infosec professionals too often lead with fear and emotion when discussing security issues, when they should instead speak a language that C-level executives and board members understand: risk.

Freese, deputy assistant director of the FBI and former head of the bureau’s National Cyber Investigative Joint Task Force (NCIJTF), spoke Monday morning at the (ISC)2 Security Congress about the importance of cyber-risk management and how the lack of proper practices is hurting enterprise security postures.

“When we start to use emotion and fear to drive the conversation – and often times it’s said in the security game that our worst problem is people – we’re failing in that fundamental message,” Freese said during a keynote discussion with Brandon Dunlap, senior manager of security, risk and compliance at Amazon.

Trying to spur executives into action through fear isn’t effective, Freese said. Instead, security professionals need to identify and measure the various risks to an organization and determine which ones are most pressing and need a portion of the organization’s limited resources in order to be mitigated. “That’s the way we connect with the business world,” he said. “We want to talk about increasing the rigor in how we manage risk.”

Good cyber-risk management starts, Freese said, with enterprise security teams distinguishing between a risk and a threat. However, Freese said that “regrettably, …often times we conflate the two [risks and threats],” which leads to every conceivable risk being viewed as an impending threat.

“That’s simply not a good way to communicate what we’re trying to do. It’s not giving us traction in the world about how we prioritize our resources against those particular threats,” Freese said, adding that it confuses the message. “We’re crying wolf.”

Instead, security teams must delineate between what cyber threats are possible (pretty much everything, he said) and what’s probable (a much smaller and more manageable pool) while analyzing the intent and capability of the potential threat actor, the frequency of the threat and the potential impact of a successful attack.

“If we can start the conversation with not only probability but describe the frequency and the magnitude of the impacts based on the intent and capability, then we start to set up a much more understandable paradigm,” Freese said. “And let me pause and say it’s difficult to do, and that’s why we’re not doing it yet.”

Cybersecurity insurance: not the answer, yet

Dunlap asked Freese about the growth of the cybersecurity insurance market and if it could help organizations with cyber-risk management. “Cyber insurance hasn’t really settled in as a real robust mechanism yet. It’s still mostly business insurance, but that’s because we don’t measure the risks very well,” Freese said.

However, Freese said insurance actuaries are working on the issue, and there is potential for collaboration between the two fields. “There are several different actuarial groups that are looking at cyber risk to measure that in a way that’s quantifiable for pure insurance purposes,” he said.

Still, he said organizations must at least start moving toward a defined cyber-risk management plan. Freese said in his role at the FBI he has worked with companies across the globe on addressing cybersecurity issues and threats, and he stressed that the companies that are successful and don’t find themselves in a data breach headline all have one thing in common.

“They’re managing risk in a very measurable, very incremental and consistent type of way,” he said. “They know what’s going on in their networks and they know what type of data they have.”

How can MSPs evolve into cybersecurity companies?

AUSTIN, Texas — Cybersecurity has become a crucial business area that channel companies are trying to wrap their heads around, especially as their customers grow more wary of emerging threats.

That theme came up again and again at CompTIA Inc.’s ChannelCon 2017 event, held this week in Austin, where channel partners discussed their ongoing evolution into cybersecurity companies. Conference sessions showed that many channel executives remained in the very early stages of that transition and were still thinking through their first steps, while more established players in the cybersecurity market described the disparate paths they have taken to build out their security businesses. Vendors crowded the exhibition areas, meanwhile, looking to push channel firms deeper into the underpenetrated market.

The security journey for MSPs

In explaining their successful security practices, managed service providers (MSPs) pointed to a few key considerations for transitioning into cybersecurity companies. One of those considerations is that MSPs must either establish security capabilities internally or partner to provide security to their customers.

In a ChannelCon 2017 panel discussion, MJ Shoer, CTO at Internet and Telephone, a company based in North Andover, Mass., said his journey to security has “been an interesting ride.” About a year ago, Internet and Telephone, which was recently acquired by Onepath, built out a security platform based on the National Institute of Standards and Technology cybersecurity framework and the Open Systems Interconnection reference model, he said. Under the Onepath umbrella of business units, the company now operates both a security assessment division and a managed security division. The two divisions are prohibited from communicating with each other, which lets the company examine a customer’s security posture through assessments such as penetration testing and vulnerability scans while also providing managed security services.

If a [penetration] test reveals there are ports open that shouldn’t be, we throw ourselves under the bus, just as we would any other MSP.
MJ Shoer, CTO, Internet and Telephone

“If a [penetration] test reveals there are ports open that shouldn’t be, we throw ourselves under the bus, just as we would any other MSP. … We’ll point out where the deficiencies are, and we’ve got to turn around quickly and remediate those for the customer’s benefit,” Shoer said.

Despite the buzz in the market that MSPs need to become managed security services providers (MSSPs), he said he doesn’t think “you can truly be both unless you can set up the kind of division” that his company has. However, he acknowledged that this security business model isn’t feasible for every MSP. Internet and Telephone could do it because of “our size and scale,” he said.

Jim Turner, president of Hilltop Consultants Inc., an MSP based in Washington, D.C., said that while Hilltop has developed security capabilities, it doesn’t call itself an MSSP. The company, which focuses on the legal vertical, takes an approach to security similar to vendor management, he said.

“What I have are people who are on the inside, who are knowledgeable on security, and they’re managing the different vendors that we have partnered with to provide the [security] services,” Turner said.

“We are having a lot of success in generating revenue by providing services for our clients with our partners. We’re able to keep out the competitors that are claiming that they’re MSSPs, even though I know that they’re not really MSSPs,” he added.

Working with law firms, he noted, Hilltop has to grapple with unique security concerns. Hackers, for example, would relish the chance to break into MSPs because they aggregate access to their law firm clients, which in turn aggregate sensitive data, including contracts and intellectual property, from all of their business clients.

“Hackers want to get in,” he said.

Stratosphere Networks, an MSP based in Evanston, Ill., built in-house security capabilities after deciding to pivot its business to cybersecurity several years ago. “A lot of things need to be internally done before you add security solutions and services into your portfolio,” said Kevin Rubin, Stratosphere’s president.

“The challenge that you’ll have in the managed services environment is we have IT leaders versus security professionals,” he noted. “It is very challenging when you go to your engineering staff and say, ‘I have this great idea about services and solutions from a security standpoint that I’d like to roll out,’ because it’s kind of an unknown turf for them. … So, you have to build your own set of internal team [members] to focus on security if you’re going to do something in house.”

Dmitry Bezrukov, a ChannelCon attendee and CEO of ITsecura, a three-person MSP based in Oregon House, Calif., said his company places a lot of emphasis on cybersecurity and continues to develop its security practice.

“Even though I think we’re doing very good … there is always [something] you can do better,” Bezrukov said. “We are striving for that unreachable excellence, and security … is not straightforward.”

Educating people on security has become one of his top missions, he added.

Vendors urge partners to embrace cybersecurity

Intronis MSP Solutions by Barracuda, a security and data protection provider, was one of many vendors at the ChannelCon 2017 event with insight to offer channel companies.

According to Neal Bradbury, co-founder and vice president of channel development at Intronis, the advanced security market is underpenetrated by MSPs. Citing findings from the company’s latest study of the managed services market, he said that while MSPs commonly have foundational security offerings like antivirus, only about 15% of MSPs were offering advanced security services such as security information and event management, advanced threat protection and compliance services.

Sean Sykes, managing director at Avast Software, said the biggest issue that MSPs are faced with today is business transformation. “For [MSPs] to continue to be relevant … they need to be finding ways to evolve their business. What they need are tools that are going to allow them to serve that need to be not only an IT consultant, but a security consultant at the same time,” he said.

“The reality is this: In the market today, there is an increase in the number of attacks targeting the SMB, and there is an incredible shortage of cybersecurity professionals here in the U.S.,” Sykes added. Customers will turn to their providers for cybersecurity support; those that can support them as cybersecurity companies will do well, while those that can’t risk being replaced by a competitor.

Paraphrasing a quote that has resonated with him, he said, “Every individual in an IT job today is [in] a cybersecurity role, whether they know it or not.”

Microsoft and Docker collaboration puts Linux containers on Windows

AUSTIN, Texas — Microsoft might have been late to embrace Linux, but its new open source initiative that runs Linux-based containers on Windows Server is an attempt to make up for lost time.

The ability to run Linux containers on Windows won’t entice Linux shops to migrate to Windows, but it could appeal to smaller companies or larger Windows shops that also need a Linux component.

“The simplicity of managing just one flavor of operating system can make a lot of sense,” said Ezra Gottheil, principal analyst at Technology Business Research based in Hampton, N.H.

The project also aims to lighten the load on Docker’s engineering team, which numbers only about 150 people, by removing the need to create a Windows-compatible version of Docker Datacenter and other Docker components. Running Linux containers on Windows will allow Windows container users to run Docker Datacenter, which is itself based on a collection of Linux containers, on top of Windows Server.

We have Linux, Mac and Windows. The more we can move stuff back and forth, the better off we are.
Simon Webster, security engineer, University Corporation for Atmospheric Research

In fact, part of the motivation for the project — which is still in the works — was the challenge for Microsoft to support Linux containers in Azure, said Taylor Brown, principal lead program manager at Microsoft.

“[Customers] are just giving us a container image and we have to figure out how to run that, how to make it an efficient experience, all while maintaining multi-tenant isolation,” Brown said. “The ability to run Linux and Windows side by side is really valuable for us.”

The project, which Microsoft revealed here at DockerCon, also promises to give developers more platform options and enable operations teams to worry less about where and how they deploy containerized applications.

“It’s going to be a big deal for the future for interoperability, moving stuff from platform to platform,” said Simon Webster, security engineer at the University Corporation for Atmospheric Research based in Boulder, Colo. “We have Linux, Mac and Windows. The more we can move stuff back and forth, the better off we are.”

Portability across different clouds is also appealing, especially for a semi-governmental agency, as it could make it much easier to collect competitive bids, he added. Others saw the feature as more of a checkbox for Microsoft, or a novelty that wouldn’t appeal to most enterprises.

“It’s nice that you can more natively run Linux stuff on Windows, but I don’t think it’s something large companies would use in production,” said Nima Esmaili Mokaram, a software engineer at Quicken Loans. “I can see smaller .NET shops that have a couple Linux containers they want to run too; that would be a great audience. But if you have the capital to buy some Linux servers, that enhances the performance and potentially would be less headaches.”

Microsoft, in partnership with Docker, rolled out two flavors of Windows-based containers — Windows Server Containers and Hyper-V Containers — with the release of Windows Server 2016 last year. However, Docker’s containerization model began as a Linux project, and the open source community has championed most of the development and innovation that’s made containerization the technology du jour.

Microsoft’s plan to run Linux containers on Windows Server isn’t a technical breakthrough, and it borrows concepts that Intel’s Clear Containers and VMware’s Photon projects advanced. The company plans to work with the open source community using Docker’s LinuxKit to create a custom Hyper-V VM running a lightweight OS purpose-built to host a container image. Like similar VM-optimization projects, Microsoft’s approach would allow the VM to shed unnecessary components and drivers; the reduced overhead improves efficiency over a standard VM.
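
For users, the day-to-day experience is meant to stay close to the familiar Docker CLI. In the Linux-containers-on-Windows preview builds shown around this time, selecting the Linux platform on a Windows Docker host looked roughly like docker run --rm --platform linux alpine uname -a, with Docker then booting the image inside the LinuxKit-based utility VM; the flag was experimental, and its availability and behavior vary by Docker release.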

Microsoft’s awkward Linux embrace

Docker and Microsoft have continued to work closely despite the historical divisions between Windows and Linux practitioners.

“Microsoft’s hesitancy of supporting other platforms is gone,” TBR’s Gottheil said. “The company is much more open to interoperability than it was under [former CEO Steve Ballmer], although Ballmer wasn’t really all that bad.”

Even if Microsoft is anxious to bury the hatchet, the divide was still palpable at DockerCon.

“Historically, there’s been a huge divide: You’re either a Windows guy or a Linux guy,” Webster said. “The fact that Microsoft has taken this step means that they’re potentially bringing together different IT groups. Breaking down walls between those two would be useful.”

That divide was the topic of discussion at a DockerCon session about media and analyst perspectives on Docker. (It also carried over to the public DockerCon Slack channel, where a comment that Windows developers are not “real developers” collected dozens of thumbs-up from attendees.)

The Windows community often still feels segregated, but things like a shared Docker experience will help bridge the two worlds and foster communication, said Donnie Berkholz, research director at 451 Research, during the session.

“Right now, there are hardly grounds for a conversation between a Linux- or Mac-based developer and a Windows developer,” he said.

Nick Martin is executive editor for TechTarget’s Modern Infrastructure e-zine, and former senior site editor for SearchServerVirtualization.com. Contact him at nmartin@techtarget.com.

Next Steps

Docker for Windows Server 2016 GA details

Container support debuts in Windows Server 2016

How do Windows Server containers affect applications?
