
Breaking down the Exchange Online vs. on-premises choice

We all know the cloud is there, but how does an organization determine if a move from an on-premises platform is the right one?

Many companies currently running Exchange Server cannot escape the siren call of the cloud. Untold numbers of organizations will weigh the pros and cons of Exchange Online vs. on-premises Exchange Server. There are many reasons to move to the cloud, just as there are reasons to stay put.

Whether the cloud is better requires some deeper analysis. I’ve spent most of the last eight years migrating organizations of every size to Office 365. Over that time, I’ve grown familiar with the motivations to move to the cloud, as well as the ones to maintain the status quo.

This article will dig into the Exchange Online vs. on-premises Exchange Server debate, examine the differences between the two offerings and look at where each has the advantage.

Is Exchange Online less expensive?

In many cases, the first selling point of Exchange Online is cost. Since Exchange Online and on-premises Exchange are very different products, it’s difficult to make an apples-to-apples comparison. To get started, you must look at several factors.

The first factor to weigh is how long you plan to keep your on-premises servers. If you upgrade your on-premises servers every three years, then it’s likely those costs will exceed the payments for Exchange Online. If you plan to keep your on-premises Exchange servers for 10 years, then you’ll likely pay considerably less than Exchange Online.

There are a number of costs associated with on-premises Exchange, such as hardware, electricity, data center space and repairs. With all of these factors in play, the real answer is far more complicated than Microsoft’s stock response that the cloud is always cheaper. Of course, it’s to the vendor’s benefit to sign up as many companies as possible for Office 365 subscriptions.
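To see how the refresh cycle drives the answer, here is a rough total-cost sketch. Every figure in it (user count, per-user fee, hardware and operating costs) is a hypothetical placeholder, not Microsoft pricing; the point is only the shape of the comparison.

```python
# Illustrative back-of-the-envelope comparison of Exchange Online
# subscription costs vs. on-premises Exchange costs over a time horizon.
# All dollar amounts are made-up placeholders, not real pricing.

def online_cost(users, monthly_fee_per_user, years):
    """Cumulative subscription cost over the period."""
    return users * monthly_fee_per_user * 12 * years

def onprem_cost(hardware_refresh, refresh_interval_years, annual_opex, years):
    """Hardware refreshes plus yearly operating costs
    (electricity, data center space, repairs)."""
    refreshes = -(-years // refresh_interval_years)  # ceiling division
    return refreshes * hardware_refresh + annual_opex * years

users, horizon = 500, 10
cloud = online_cost(users, monthly_fee_per_user=8.0, years=horizon)
for interval in (3, 10):
    onprem = onprem_cost(hardware_refresh=60_000,
                         refresh_interval_years=interval,
                         annual_opex=25_000, years=horizon)
    print(f"refresh every {interval:2d} yr: "
          f"on-prem ${onprem:,.0f} vs. online ${cloud:,.0f}")
```

With these placeholder numbers, a three-year refresh cycle over a decade costs slightly more than the subscription, while keeping the same hardware for ten years costs considerably less, which is the pattern described above.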

Is Exchange Online more reliable?

Just as there are several ways to look at the question of cost, it’s also difficult to determine reliability in the Exchange Online vs. on-premises equation.

Microsoft touts its 99.9% uptime guarantee for Office 365. Upon closer inspection, does that assurance hold up?

Open any Office 365 tenant at any time and look at the service health dashboard. Every tenant I check has items marked in red almost every day, but those customers still pay for the full subscription. I’m not saying Office 365 has a lot of downtime, but that 99.9% uptime guarantee is more gray than it is black and white.
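The 99.9% figure is easy to translate into a concrete downtime budget. A quick sketch of the arithmetic, using nothing but the SLA percentage:

```python
# Convert an uptime SLA percentage into an allowed-downtime budget.
# Purely arithmetic; no Office 365 specifics are assumed.

def downtime_budget_minutes(sla, days):
    """Minutes of downtime permitted over `days` at the given SLA fraction."""
    return days * 24 * 60 * (1 - sla)

per_month = downtime_budget_minutes(0.999, 30)   # ~43.2 minutes
per_year = downtime_budget_minutes(0.999, 365)   # ~525.6 minutes (~8.8 hours)
print(f"99.9% allows {per_month:.1f} min/month, {per_year:.1f} min/year")
```

In other words, a tenant can be down for roughly 43 minutes every month, or nearly nine hours a year, without technically breaching the guarantee.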


As for on-premises Exchange, there is no single way to evaluate the overall reliability of Exchange Server; it varies widely by deployment. I’ve seen organizations that almost never have problems, while others experience numerous major outages. I don’t think Office 365 is more reliable than on-premises Exchange, but my expectation is that data loss is less likely with Exchange Online.

Exchange Server is a very complicated and difficult product to manage. Unless you have some very talented Exchange admins, Exchange Online is the more stable choice.

Do you get newer features with Exchange Online?

In this area, there is no doubt which platform has the advantage. Due to its nature as a cloud service, Exchange Online gets new features well before on-premises Exchange. Not only that, but there are many features that are exclusive to Exchange Online. For a company that wants all the latest and greatest features, the clear choice is Exchange Online.


However, there is a downside to the constant stream of new features. Both users and administrators can experience culture shock after a migration to Exchange Online when they realize the feature set changes constantly; there is always something new to learn. Many workers would rather not spend part of their workday keeping up with changes to the email system.

What’s the final verdict?

Now that you’ve gone through the Exchange Online vs. on-premises deliberation, which is better? With the sheer number of factors to consider, there is no definitive answer.

Every organization has specific needs it must consider, and quite often the traditional on-premises mail system does the job. For example, a company that relies on public folders might see some difficulties migrating that feature to Exchange Online and decide to stay with the on-premises Exchange.

It’s no secret Microsoft wants its customers to move to the company’s cloud services, but it continues to develop on-premises versions of its software.

Microsoft plans to release Exchange 2019 later this year. When that offering arrives, take the time to evaluate all the features in that release and determine whether it’s worth moving to the cloud. For some organizations, on-premises email might continue to be a better fit.

New VEP Charter promises vulnerability transparency

The White House wants more transparency in how federal agencies determine whether or not to disclose software vulnerabilities, but there are early questions regarding how it might work.

The Vulnerabilities Equities Process (VEP) was designed to organize how federal agencies would review vulnerabilities and decide if a flaw should be kept secret for use in intelligence or law enforcement operations or disclosed to vendors. The new VEP Charter announced by Rob Joyce, special assistant to the President and cybersecurity coordinator for the National Security Council, aims to ensure the government conducts “the VEP in a manner that can withstand a high degree of scrutiny and oversight from the citizens it serves.”

“I believe that conducting this risk/benefit analysis is a vital responsibility of the Federal Government,” Joyce wrote in a public statement. “Although I don’t believe withholding all vulnerabilities for operations is a responsible position, we see many nations choose it. I also know of no nation that has chosen to disclose every vulnerability it discovers.”

Joyce laid out the “key tenets” of the new VEP Charter, including increased transparency and an annual report, improved standardization of the process regarding the interests of various stakeholders and increased accountability.

“We make it clear that departments and agencies with protective missions participate in VEP discussions, as well as other departments and agencies that have broader equities, like the Department of State and the Department of Commerce. We also clarify what categories of vulnerabilities are submitted to the process and ensure that any decision not to disclose a vulnerability will be reevaluated regularly,” Joyce wrote. “There are still important reasons to keep many of the specific vulnerabilities evaluated in the process classified, but we will release an annual report that provides metrics about the process to further inform the public about the VEP and its outcomes.”

Questions about the VEP Charter

The VEP has previously been criticized by experts for being optional rather than codified into law, but the VEP Charter does not include language making the process a requirement, nor does it acknowledge the PATCH Act, a bill proposed in Congress that would enforce a framework for using the VEP.

Heather West, senior policy manager and Americas principal at Mozilla, noted in a blog post that “many of the goals of the PATCH Act [are] covered in this process release, [but] our overarching goal in codifying the VEP in law to ensure compliance and permanence cannot be met by unilateral executive action.”

Early readings of the VEP Charter have revealed what some consider a conflict of interest, in that the NSA is designated as the VEP Executive Secretariat with the responsibility to “facilitate information flow, discussions, determinations, documentation, and recordkeeping for the process.”

However, the VEP Charter also states that any flaw found in NSA-certified equipment or systems should be “reported to NSA as soon as practical. NSA will assume responsibility for this vulnerability and submit it formally through the VEP Executive Secretariat.”

Additionally, some have taken issue with the following clause in the VEP Charter: “The [U.S. government’s] decision to disclose or restrict vulnerability information could be subject to restrictions by foreign or private sector partners of the USG, such as Non-Disclosure Agreements, Memoranda of Understanding, or other agreements that constrain USG options for disclosing vulnerability information.”

Edward Snowden said on Twitter that this could be considered an “enormous loophole permitting digital arms brokers to exempt critical flaws in U.S. infrastructure from disclosure” by using an NDA.

DHS cyberinsurance research could improve security

The Department of Homeland Security has undertaken a long-term cyberinsurance study to determine if insurance can help improve cybersecurity overall, but experts said that will depend on the data gathered.

The DHS began researching cyberinsurance in 2014 by gathering breach data into its Cyber Incident Data and Analysis Repository (CIDAR). DHS uses CIDAR to collect cyber incident data along 16 categories, including the type, severity and timeline of an incident, the apparent goal of the attacker, contributing causes, specific control failures, assets compromised, detection and mitigation techniques, and the cost of the attack.
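As a rough illustration of the kind of record CIDAR collects, the categories named above could be modeled as a simple data structure. The field names and sample values below are my own paraphrase of those categories, not the actual CIDAR schema.

```python
# Hypothetical sketch of a CIDAR-style incident record, built from the
# categories described in the DHS material. Not the real CIDAR schema.
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class IncidentRecord:
    incident_type: str          # e.g. "phishing", "ransomware"
    severity: str               # e.g. "low" / "medium" / "high"
    timeline: str               # detection-to-containment summary
    attacker_goal: str          # apparent objective of the attacker
    contributing_causes: List[str] = field(default_factory=list)
    control_failures: List[str] = field(default_factory=list)
    assets_compromised: List[str] = field(default_factory=list)
    detection_techniques: List[str] = field(default_factory=list)
    mitigation_techniques: List[str] = field(default_factory=list)
    cost_usd: float = 0.0       # estimated total cost of the attack

record = IncidentRecord(
    incident_type="phishing",
    severity="medium",
    timeline="detected day 2, contained day 3",
    attacker_goal="credential theft",
    contributing_causes=["no multifactor authentication"],
    control_failures=["mail filtering"],
    assets_compromised=["user mailboxes"],
    detection_techniques=["user report"],
    mitigation_techniques=["password resets", "MFA rollout"],
    cost_usd=12_500.0,
)
print(asdict(record)["incident_type"])
```

Capturing contributing causes and control failures alongside cost is what would distinguish a repository like this from a plain indicator feed, as Shabat suggests below.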

According to the DHS, it hoped to “promote greater understanding about the financial and operational impacts of cyber events.”

“Optimally, such a repository could enable a novel information sharing capability among the federal government, enterprise risk owners, and insurers that increases shared awareness about current and historical cyber risk conditions and helps identify longer-term cyber risk trends,” the DHS wrote in a report about the value proposition of CIDAR. “This information sharing approach could help not only enhance existing cyber risk mitigation strategies but also improve and expand upon existing cybersecurity insurance offerings.”

The full cyberinsurance study by the DHS could take 10 to 15 years to complete, but Matt Shabat, strategist and performance manager in the DHS Office of Cybersecurity and Communications, told TechRepublic that he hopes there can be short-term improvements to cybersecurity with analysis of the data as it is gathered.

Shabat said he hopes the added context gathered by CIDAR will improve the usefulness of its data compared to other threat intelligence sharing platforms. Experts said this was especially important because as Ken Spinner, vice president of global field engineering at Varonis, told SearchSecurity, “A data repository is only as good as the data within it, and its success will likely depend on how useful and thorough the data is.”

“Sector-based Information Sharing and Analysis Centers were implemented more than a decade ago, so creating a centralized cyber incident data repository for the purpose of sharing intelligence across sectors is a logical next step and a commendable endeavor,” Spinner added. “A data repository could have greater use beyond its original intent by helping researchers find patterns in security incidents and criminal tactics.”

Philip Lieberman, president of Lieberman Software, a cybersecurity company headquartered in Los Angeles, said speed was the key to threat intel sharing.

“The DHS study on cyberinsurance is a tough program to implement because of missing federal laws and protocols to provide safe harbor to companies that share intrusion information,” Lieberman told SearchSecurity. “The data will be of little use in helping others unless threat dissemination is done within hours of an active breach.”


Scott Petry, co-founder and CEO of Authentic8, a secure cloud-based browser company headquartered in Mountain View, Calif., said the 16 data elements used by the DHS could provide “a pretty comprehensive overview of exploits and responses, if a significant number of organizations were to contribute to CIDAR.”

“The value of the data would be in the volume and its accuracy. Neither feel like short term benefits, but there’s no question that understanding more about breaches can help prevent similar events,” Petry told SearchSecurity. “But many organizations may be reluctant to share meaningful data because of the difficulty in anonymizing it and the potential for their disclosure to be used against them. It goes against their nature for organizations to share detailed breach information.”

The DHS appears to understand these concerns and outlined potential ways to overcome the “perceived obstacles” to enterprises sharing attack data with CIDAR. However, experts noted many of the DHS suggestions may be less effective than hoped, because they tend to boil down to working with organizations rather than offering new solutions to these longstanding issues.

DHS did not respond to requests for comment at the time of this post.

Using cyberinsurance to improve security

Still, experts said if the DHS can gather quality data, the cyberinsurance study could help enterprises to improve security.

Spinner said cyberinsurance is a valid risk mitigation tool.

“Counterintuitively, having a cyberinsurance policy can foster a culture of security. Think of it this way: When it comes to auto insurance, safer drivers who opt for the latest safety features on their vehicles can receive a discount,” Spinner said. “Similarly, organizations that follow best practices and take appropriate steps to safeguard the data on their networks can also be rewarded with a lower rate quote.”

Lieberman said the efficacy of cyberinsurance on security is limited because the “industry is in its infancy with both insurer and insured being not entirely clear as to what constitutes due and ordinary care of IT systems to keep them free of intruders.”

“Cyberinsurance does make sense if there are clear definitions of minimal security requirements that can be objectively tested and verified. To date, no such clear definitions nor tests exist,” Lieberman said. “DHS would do the best for companies and taxpayers by assisting the administration and [the] legislative branch in drafting clear guidelines with both practices and tests that would provide safe harbor for companies that adopt their processes.”

Petry said the best way for cyberinsurance to help improve security would be to require “an organization to meet certain security standards before writing the policy and by creating an ongoing compliance requirement.”

“It’s a big market, and insurers are certainly making money, but that doesn’t mean it’s a mature market. Many organizations require their vendors to carry cyberinsurance, which will continue to fuel that growth, but the insurers aren’t taking reasonable steps to understand the exposure of the organizations they’re underwriting. When I get health insurance, they want to know if I’m a smoker and what my blood pressure is. Cyberinsurance doesn’t carry any of the same real-world assessments of ‘the patient.'”

Spinner said the arrangement between the cybersecurity industry and cyberinsurance is “very much still a work in progress.”

“The cybersecurity market is evolving rapidly; to some extent it is still in the experimental phase, in that providers are continuing to learn what approach works best, just as companies are trying to figure out just how much insurance is adequate,” Spinner said. “It’s a moving target and we’ll continue to see the industry and policies evolve. The industry needs to work towards a standard for assessing risk so they can accurately determine rates.”