Tag Archives: Advanced

Oracle Cloud Infrastructure updates home in on security

SAN FRANCISCO — Oracle hopes a focus on advanced security can help its market-lagging IaaS gain ground against the likes of AWS, Microsoft and Google.

A new feature called Maximum Security Zones lets customers denote enclaves within their Oracle Cloud Infrastructure (OCI) environments that have all security measures turned on by default. Resources within the zones are limited to configurations that are known to be secure. The system will also prevent alterations to configurations and provide continuous monitoring and defenses against anomalies, Oracle said on the opening day of its OpenWorld conference.

Through Maximum Security Zones, customers “will be better protected from the consequences of misconfigurations than they are in other cloud environments today,” Oracle said in an obvious allusion to recent data breaches, such as the Capital One-AWS hack, which have been blamed on misconfigured systems that gave intruders a way in.

“Ultimately, our goal is to deliver to you a fully autonomous cloud,” said Oracle executive chairman and CTO Larry Ellison, during a keynote. 

“If you spend the night drinking and get into your Ford F-150 and crash it, that’s not Ford’s problem,” he said. “If you get into an autonomous Tesla, it should get you home safely.”

Oracle wants to differentiate itself and OCI from AWS, which consistently promotes a shared responsibility model for security between itself and customers. “We’re trying to leapfrog that construct,” said Vinay Kumar, vice president of product management for Oracle Cloud Infrastructure.

“The cloud has always been about, you have to bring your own expertise and architecture to get this right,” said Leo Leung, senior director of products and strategy at OCI. “Think about this as a best-practice deployment automatically. … We’re going to turn all the security on and let the customer decide what is ultimately right for them.”

Security is too important to rely solely on human effort.
Holger Mueller, vice president and principal analyst, Constellation Research

Oracle’s Autonomous Database, which is expected to be a big focal point at this year’s OpenWorld, will benefit from a new service called Oracle Data Safe. This provides a set of controls for securing the database beyond built-in features such as always-on encryption and will be included as part of the cost of Oracle Database Cloud services, according to a statement.

Finally, Oracle announced Cloud Guard, which it says can spot threats and misconfigurations and “hunt down and kill” them automatically. It wasn’t immediately clear whether Cloud Guard is a homegrown Oracle product or made by a third-party vendor. Security vendor Check Point offers an IaaS security product called CloudGuard for use with OCI.

Starting in 2017, Oracle began to talk up new autonomous management and security features for its database, and the OpenWorld announcements repeat that mantra, said Holger Mueller, an analyst at Constellation Research in Cupertino, Calif. “Security is too important to rely solely on human effort,” he said.

OCI expansions target disaster recovery, compliance

Oracle also said it will broadly expand OCI’s global cloud footprint, with the launch of 20 new regions by the end of next year. The rollout will bring Oracle’s region count to 36, spread across North America, Europe, South America, the Middle East, Asia-Pacific, India and Australia.

This expansion will add multiple regions in certain geographies, allowing for localized disaster recovery scenarios as well as improved regulatory compliance around data location. Oracle plans to add multi-region support in every country where it offers OCI and claimed this approach is superior to the practice of packing multiple availability zones into a single region.

Oracle’s recently announced cloud interoperability partnership with Microsoft is also getting a boost. The interconnect that ties together OCI and Azure, now available in Virginia and London, will also be offered in the Western U.S., Asia and Europe over the next nine months, according to a statement. In most cases, Oracle is leasing data center space from providers such as Equinix, according to Kumar.


SaaS vendors are another key customer target for Oracle with OCI. To that end, it announced new integrated third-party billing capabilities for the OCI software marketplace released earlier this year. Oracle also cited SaaS providers who are taking advantage of Oracle Cloud Infrastructure for their own underlying infrastructure, including McAfee and Cisco.

There’s something of value for enterprise customers in OCI attracting more independent software vendors, an area where Oracle also lags against the likes of AWS, Microsoft and Google, according to Mueller.

“In contrast to enterprises, they bring a lot of workloads, often to be transferred from on-premises or even other clouds to their preferred vendor,” he said. “For the IaaS vendor, that means a lot of scale, in a market that lives by economies of scale: More workloads means lower prices.”


Data ethics issues create minefields for analytics teams

GRANTS PASS, Ore. — AI technologies and other advanced analytics tools make it easier for data analysts to uncover potentially valuable information on customers, patients and other people. But, too often, consultant Donald Farmer said, organizations don’t ask themselves a basic ethical question before launching an analytics project: Should we?

In the age of GDPR and like-minded privacy laws, though, ignoring data ethics isn’t a good business practice for companies, Farmer warned in a roundtable discussion he led at the 2019 Pacific Northwest BI & Analytics Summit. IT and analytics teams need to be guided by a framework of ethics rules and motivated by management to put those rules into practice, he said.

Otherwise, a company runs the risk of crossing the line in mining and using personal data — and, typically, not as the result of a nefarious plan to do so, according to Farmer, principal of analytics consultancy TreeHive Strategy in Woodinville, Wash. “It’s not that most people are devious — they’re just led blindly into things,” he said, adding that analytics applications often have “unforeseen consequences.”

For example, he noted that smart TVs connected to home networks can monitor whether people watch the ads in shows they’ve recorded and then go to an advertiser’s website. But acting on that information for marketing purposes might strike some prospective customers as creepy, he said.

Shawn Rogers, senior director of analytic strategy and communications-related functions at vendor Tibco Software Inc., pointed to a trial program that retailer Nordstrom launched in 2012 to track the movements of shoppers in its stores via the Wi-Fi signals from their cell phones. Customers complained about the practice after Nordstrom disclosed what it was doing, and the company stopped the tracking in 2013.

“I think transparency, permission and context are important in this area,” Rogers said during the session on data ethics at the summit, an annual event that brings together a small group of consultants and vendor executives to discuss BI, analytics and data management trends.

AI algorithms add new ethical questions

Being transparent about the use of analytics data is further complicated now by the growing adoption of AI tools and machine learning algorithms, Farmer and other participants said. Increasingly, companies are augmenting — or replacing — human involvement in the analytics process with “algorithmic engagement,” as Farmer put it. But automated algorithms are often a black box to users.

Mike Ferguson, managing director of U.K.-based consulting firm Intelligent Business Strategies Ltd., said the legal department at a financial services company he works with killed a project aimed at automating the loan approval process because the data scientists who developed the deep learning models to do the analytics couldn’t fully explain how the models worked.

We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.
Mike Ferguson, managing director, Intelligent Business Strategies Ltd.

And that isn’t an isolated incident in Ferguson’s experience. “There’s a loggerheads battle going on now in organizations between the legal and data science teams,” he said, adding that the specter of hefty fines for GDPR violations is spurring corporate lawyers to vet analytics applications more closely. As a result, data scientists are focusing more on explainable AI to try to justify the use of algorithms, he said.

The increased vetting is driven more by legal concerns than data ethics issues per se, Ferguson said in an interview after the session. But he thinks that the two are intertwined and that the ability of analytics teams to get unfettered access to data sets is increasingly in question for both legal and ethical reasons.

“It’s pretty clear that legal is throwing their weight around on data governance,” he said. “We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.”

Jill Dyché, an independent consultant who’s based in Los Angeles, said she expects explainable AI to become “less of an option and more of a mandate” in organizations over the next 12 months.

Code of ethics not enough on data analytics

Staying on the right side of the data ethics line takes more than publishing a corporate code of ethics for employees to follow, Farmer said. He cited Enron’s 64-page ethics code, which didn’t stop the energy company from engaging in the infamous accounting fraud scheme that led to bankruptcy and the sale of its assets. Similarly, he sees such codes having little effect in preventing ethical missteps on analytics.

“Just having a code of ethics does absolutely nothing,” Farmer said. “It might even get in the way of good ethical practices, because people just point to it [and say], ‘We’ve got that covered.'”

Instead, he recommended that IT and analytics managers take a rules-based approach to data ethics that can be applied to all three phases of analytics projects: the upfront research process, design and development of analytics applications, and deployment and use of the applications.
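As a purely illustrative sketch of what such a rules-based approach could look like in code, the checklist gate below blocks a project at a given phase until every rule for that phase is satisfied; the phase names mirror the three phases above, but the individual rules are invented for this example and are not Farmer's framework.

# Illustrative only: a minimal rules-based gate for analytics projects.
# The phases mirror the article (research, design/development, deployment);
# the individual rules are hypothetical, not a published framework.
RULES = {
    "research": [
        "purpose_documented",          # why the data is being collected at all
        "consent_or_legal_basis",      # GDPR-style lawful basis recorded
    ],
    "design": [
        "data_minimized",              # only the fields the application needs
        "model_explainable",           # someone can explain how the model works
    ],
    "deployment": [
        "usage_transparent_to_subjects",
        "retention_limit_set",
    ],
}


def gate(phase, checklist):
    """Return the rules still unmet for the given project phase."""
    return [rule for rule in RULES[phase] if not checklist.get(rule, False)]


# Example: a project whose deep learning model nobody can explain yet.
project = {
    "purpose_documented": True,
    "consent_or_legal_basis": True,
    "data_minimized": True,
    "model_explainable": False,
}

missing = gate("design", project)
if missing:
    print("Blocked at design phase; unmet rules:", missing)
else:
    print("Design phase gate passed.")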


For Sale – Meraki Bundle (MX64, MS120-8LP, MR33, 100 MDM licences) 3 year licence

Brand new and still boxed with unclaimed license key for 3 years.

  • 1x Meraki MX64 Advanced Security License and support 3 years
  • 1x Meraki MR Enterprise License 3 years
  • 1x Meraki MS120-8LP Enterprise License and support 3 years
  • 2x Meraki AC Power Cords UK

Due to the cost and size of the item I would rather the buyer comes and collects in person. The license has not been claimed, so it can be added to an existing or new Meraki registration.

I will add pictures tonight or tomorrow, but it is all still boxed.

Price and currency: 1200.00
Delivery: Goods must be exchanged in person
Payment method: Cash on collection
Location: Hampshire
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected


2018 MIT Sloan CIO Symposium: A SearchCIO guide

Introduction

Today’s enterprise can be divided into two groups: the departments that are acquiring advanced digital capabilities and those that are lagging behind. This bifurcation of digital prowess was evident at the 2018 MIT Sloan CIO Symposium, where we asked CIOs and digital experts to expound on the factors driving digitalization at enterprises and the barriers holding them back. Not surprisingly, the departments that are customer-facing, such as marketing, are leading the digital transformation charge.

While the transition to a digitalized enterprise is happening at varied speeds across companies, the need to develop a viable digital business model is universally recognized. Indeed, this year’s event was all about taking action; it is no longer enough just to have a vision for digital transformation. Sessions featured leading CIOs, IT practitioners, consultants and academics from across the globe dispensing hard-won advice on planning and executing a future-forward digital transformation strategy.

In this SearchCIO conference guide, experience the 2018 MIT Sloan CIO Symposium by delving into our comprehensive coverage. Topics include building an intelligent enterprise, talent recruitment, the expanding CIO role and integration of emerging technologies like AI, machine learning, cloud and more.

To view our complete collection of video interviews filmed at this year’s event, please see our video guide: “MIT CIO 2018 videos: Honing a digital leadership strategy.”

1. Thriving in a digital economy

Digital transformation strategy and advice

Implementing a digital transformation strategy requires a clear set of objectives, IT-business alignment, recruitment of the right talent, self-disruption and building what experts call an “intelligent enterprise,” among other things. In this section, the pros discuss the intricacies of leading the digital transformation charge.

2. Technology transformation

Utilizing emergent tech like AI, machine learning and cloud

Every digital transformation requires a future-forward vision that takes advantage of up-and-coming tools and technologies. In this section, academics and IT executives discuss the enterprise challenges, benefits, questions and wide-ranging potential that AI, machine learning, edge computing, big data and more bring to the enterprise.

3. Evolving CIO role

The CIO’s ever-expanding role in a digital world

Digital transformation not only brings with it new technologies and processes, it also brings new dimensions and responsibilities to the CIO role. In this section, CIOs and IT executives detail the CIO’s place in an increasingly digital, threat-laden and customer-driven world and offer timely advice for staying on top of it all.

4. Videos

Interviews filmed on site

During the 2018 MIT Sloan CIO Symposium, SearchCIO staff had the pleasure of conducting several one-on-one video interviews with consultants and IT executives on the MIT campus in Cambridge, Mass. Below is a sampling of the videos.

A link to our full collection of videos filmed at the 2018 MIT Sloan CIO Symposium can be found at the top of this guide.

Driving digital transformation success with Agile


In this SearchCIO video, Bharti Airtel CIO Mehta, an MIT Sloan CIO Leadership Award winner, explains why implementing Agile methodologies can help organizations scale their digital transformation projects.

For Sale – Bundle for Sale (AMD FX 8320 Cpu-Board-16gb Ram-Cooler) + Modular Psu

CPU: – AMD FX 8320 Processor – Black Edition – 8 Core

Cooler :
be quiet Dark Rock C1 Advanced Intel/AMD CPU Air Cooler

Motherboard : – Asus M5A97 Version 2

RAM : – 2 packs of Corsair 8GB (2x4GB) DDR3 1600MHz Low Profile Vengeance Memory Kit, White Colour, CL9, 1.35V (so there are four sticks in total)

£150 delivered
————————————————————–
PSU : –
EVGA SuperNOVA 550 G2, 80+ GOLD 550W, Fully Modular

£40 delivered

All parts are in excellent condition, from a PC I built at the beginning of the year.

Would consider splitting if I get buyers for all parts.

Open to options.

Price and currency: £150 , £40
Delivery: Delivery cost is included within my country
Payment method: PayPal gift or BT
Location: London
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference


MU-MIMO technology boosts system capacity for WLANs

It was legendary science-fiction writer Arthur C. Clarke who wrote, “Any sufficiently advanced technology is indistinguishable from magic.”

Many would put wireless LAN in the magical category. And if that’s the case, multiple input, multiple output (MIMO) and, most recently, multiuser MIMO (MU-MIMO) technology would really have to be the rabbits in the hat. Even those of us who’ve spent the majority of our careers in wireless networking are amazed at what those technologies can do.

MIMO’s been with us since 802.11n, and it unlocked the amazing performance improvements that extend into the current era. Yet, MIMO is still little understood, even among many network engineers. No surprise there, as MIMO seems to violate many of the laws that govern communications theory.

In a nutshell, MIMO technology takes two-dimensional radio waves — you can think of them as having just frequency and amplitude, which are the variables that we use to modulate a carrier signal, thus embossing the information we wish to transmit on this simple structure — and makes them three-dimensional. The third dimension is space, and that’s why MIMO is often referred to as spatial multiplexing. More dimensions result in a greater capacity to carry information. That’s why MIMO yields such amazing results, without violating any physical laws whatsoever.
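For readers who want the "more dimensions means more capacity" claim in equation form, the standard textbook capacity expression for a MIMO link (not a figure from this article), with N_t transmit antennas, N_r receive antennas, channel matrix H and total signal-to-noise ratio \rho shared equally across the transmit antennas, is:

C = \log_2 \det\left( \mathbf{I}_{N_r} + \frac{\rho}{N_t}\, \mathbf{H}\mathbf{H}^{H} \right) \quad \text{bits/s/Hz}

When the environment provides rich scattering, H is well conditioned and C grows roughly in proportion to \min(N_t, N_r), rather than only logarithmically with transmit power, which is exactly the extra spatial dimension described above.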

Overcoming the wireless interference problem

But there’s more to the performance of wireless communications than simply cramming more bits into a given transmission cycle. Wireless has historically been a serial-access medium: Only one transmitter can be active in any given location at a particular frequency at any given moment in time. Two transmitters in close proximity attempting simultaneous transmission will likely cause mutual interference and at least some degradation to overall system capacity. While licensed radio services, like cellular, can schedule transmissions to avoid this problem, the unlicensed bands have no such controls; stations cannot coordinate with one another and must accept as reality any interference they encounter.


In response, the wireless industry developed many techniques to deal with the potential damage inherent in interference, mostly related to how information is modulated and coded before it’s sent over the air. But a large problem remains: Only one station can transmit at any moment in time. Until recently, this has meant only one receiving client station could be served — again, at any moment in time — and any others desiring communication would have to wait. The result: Overall system capacity was limited.

And that’s just the problem MU-MIMO technology solves. In yet another incarnation of magic, MU-MIMO enables an access point to transmit — simultaneously — to multiple clients in one transmit cycle, with each client receiving a unique data stream from the others also sent at the same time. Now, several stations can receive the data they need with less overall waiting. System capacity goes up. And more users go home — or, better said, stay at work — happy.
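To make the idea concrete, here is a toy NumPy sketch of zero-forcing precoding, one textbook way to form those per-client streams; the antenna counts, channel matrix and QPSK symbols are invented for the example, and shipping 802.11ac/ax products use far more sophisticated (and proprietary) precoding and channel sounding.

# Toy zero-forcing MU-MIMO downlink: a 4-antenna AP sends an independent
# QPSK symbol to each of 3 single-antenna clients in one transmit cycle.
# Illustrative only; not any vendor's implementation.
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_users = 4, 3

# Random Rayleigh-fading channel: row i is what client i sees from the AP antennas.
H = (rng.standard_normal((n_users, n_tx)) +
     1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

# Zero-forcing precoder: right pseudo-inverse of H, chosen so that H @ W = I.
# Each client then receives only the symbol intended for it.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

# One QPSK symbol per client, all radiated simultaneously from the AP antennas.
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
symbols = qpsk[rng.integers(0, 4, size=n_users)]
tx = W @ symbols

# Each client's single antenna sees its own row of H applied to tx, plus noise.
noise = 0.01 * (rng.standard_normal(n_users) + 1j * rng.standard_normal(n_users))
rx = H @ tx + noise

print("sent:    ", np.round(symbols, 2))
print("received:", np.round(rx, 2))   # ~equal to 'sent': the streams don't interfere

In practice the access point first has to learn H through channel sounding, and the gain erodes as clients move and the estimate goes stale, which is one reason real-world numbers land below the theoretical maximum.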

802.11ax standard introduces new flavor of MU-MIMO technology

Now, how this technique is actually implemented is so complex that the math involved would make great bedtime reading. Suffice it to say, MU-MIMO technology works very, very well — Farpoint Group’s own testing shows performance gains close to the theoretical maximum. Your mileage will likely vary, but the potential here is enormous. The upcoming 802.11ax standard is expected to add bi-directional MU-MIMO, meaning stations will be able to transmit simultaneously to an access point. Now, we’re talking real magic.

Farpoint Group recommends new Wi-Fi infrastructure and client purchases specify support for MU-MIMO — yes, you’ll need new gear on both ends; field upgrades are not possible in most cases. But you’ll be glad you specified this requirement. Of course, MU-MIMO technology isn’t really magic, but it certainly looks that way.

GPU implementation is about more than deep learning

When you consider a typical GPU implementation, you probably think of some advanced AI application. But that’s not the only place businesses are putting the chips to work.

“[GPUs] are obviously applicable for Google and Facebook and companies doing AI. But for startups like ours that have to justify capital spend in today’s business value, we still want that speed,” said Kyle Hubert, CTO at Simulmedia Inc.

The New York-based advertising technology company is using GPUs to make fairly traditional processes, like data reporting and business intelligence dashboards, work faster. Using a platform from MapD, Simulmedia has built a reporting and data querying tool that lets sales staff and others in the organization visualize how certain television ads are performing and answer any client inquiries as they come in.

Using GPUs for more than deep learning


GPU technology is getting lots of attention today, primarily due to how businesses are using it. The chips power the training underlying some of the most advanced AI use cases, like image recognition, natural language translation and self-driving cars. But, of course, they were originally built to power video game graphics. Their main appeal is speedy processing power. And while that may be crucial for enabling neural networks to churn through millions of training examples, there are also other use cases in which the speed that comes from a GPU implementation is beneficial.
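As a small, hedged illustration of that raw speed outside of deep learning (not code from Simulmedia), the snippet below times the same large matrix multiplication on the CPU with NumPy and on a GPU with the CuPy library; it assumes a CUDA-capable GPU and a matching CuPy install, and the exact speedup will vary widely with hardware.

# Minimal sketch: the same bulk number-crunching on CPU (NumPy) vs. GPU (CuPy).
# Assumes a CUDA-capable GPU and an installed CuPy build; results vary by hardware.
import time

import numpy as np
import cupy as cp

n = 4000
a_cpu = np.random.random((n, n)).astype(np.float32)
a_gpu = cp.asarray(a_cpu)                 # copy the matrix into GPU memory

# Warm up the GPU once so kernel compilation doesn't distort the timing.
_ = a_gpu @ a_gpu.T
cp.cuda.Stream.null.synchronize()

t0 = time.perf_counter()
cpu_result = a_cpu @ a_cpu.T              # CPU matrix multiply
t_cpu = time.perf_counter() - t0

t0 = time.perf_counter()
gpu_result = a_gpu @ a_gpu.T              # same multiply on the GPU
cp.cuda.Stream.null.synchronize()         # wait for the GPU to finish
t_gpu = time.perf_counter() - t0

print(f"CPU: {t_cpu:.3f}s   GPU: {t_gpu:.3f}s")
# Copy a small corner back to host memory to confirm the results agree.
print(np.allclose(cpu_result[:5, :5], cp.asnumpy(gpu_result[:5, :5]), rtol=1e-3))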

Simulmedia, founded in 2008, helps clients better target advertising on television networks. Initially, the team used spreadsheets to track metrics on how clients’ advertisements performed. But the data was too large — Simulmedia uses a combination of Nielsen and Experian data sets to target ads and assess effectiveness — and the visualization options were too limited.

Reports had to be built by the operations team, and there was little capability to do ad hoc queries. The MapD tool enables sales and product management teams to view data visualization reports and to do their own queries using a graphical interface or through SQL code.
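For context, an ad hoc question from the sales team might boil down to something like the hedged sketch below, which uses pymapd, the PEP 249-style Python client MapD shipped at the time; the connection details and the ad_impressions table and columns are hypothetical, invented purely for illustration.

# Hedged sketch of an ad hoc query against a GPU-backed MapD database.
# The credentials and the ad_impressions schema are hypothetical.
from pymapd import connect

con = connect(user="mapd", password="changeme", host="localhost", dbname="mapd")
cur = con.cursor()
cur.execute("""
    SELECT network,
           COUNT(*)       AS airings,
           AVG(spot_lift) AS avg_lift
    FROM   ad_impressions
    WHERE  campaign_id = 1234
    GROUP  BY network
    ORDER  BY avg_lift DESC
    LIMIT  20
""")
for network, airings, avg_lift in cur.fetchall():
    print(f"{network:<12} {airings:>8} {avg_lift:8.3f}")
con.close()

The same question could be answered through the graphical interface; the point is that neither path requires waiting on the operations team.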

Business focus pays off in GPU experience

There’s a lot of implicit knowledge that’s required to get GPUs up and running.
Kyle Hubert, CTO, Simulmedia

Some benefits of a GPU implementation focused on a standard business process go beyond simply speeding up that process. Hubert said it also prepares the business to implement the chips more pervasively and readies it for a more AI-driven future.

He said the process of predicting which ads will perform best during particular time slots and on certain networks is heavy on data science. Simulmedia is looking at adding deep learning to its targeting, and these models will train on GPUs. Hubert said starting with GPUs in a standard business application has helped the team build a solid foundation on which to build out more GPU capability.

“There’s a lot of implicit knowledge that’s required to get GPUs up and running,” he said.

Aside from building institutional knowledge around how GPUs work, starting by applying the chips to more traditional use cases also helps to justify the cost, which can be substantial.

“They’re costly when you say, ‘I want a bunch of GPUs, and I don’t know what kind of results I’m going to get,'” Hubert said. “That’s a lot of capital investment when you don’t know your returns. When you do a dual-track approach, you can say, ‘I can get these GPUs, set them up for business users now, and I have a concrete ability to get immediate gratification. Then, I can carve out some of that to be future-looking.'”

Windows Server version 1709 hits turbulence upon release

Enterprises that use DevOps methodologies for advanced cloud-based applications will likely gain a new appreciation for Microsoft and Windows Server. That’s because the release of Windows Server version 1709, which came out in October, improves container support and has added functions that fortify its software-defined networking capabilities.

Every six months, Microsoft plans to introduce a new edition of Windows Server for businesses that want the newest features and updates. Admins need to know what’s in Windows Server version 1709 and how it differs from the original Windows Server 2016 release, which arrived in October 2016. Here is a roundup of those changes and several others that are worthy of further scrutiny.

Microsoft makes containers the focus

Microsoft changed the mission of Nano Server in Windows Server version 1709. No longer considered a lighter version of Server Core to host various infrastructure workloads, Nano Server is now only available as a base image for containers. This role change allowed Microsoft to shrink Nano Server to about 80 MB, a drop from about 400 MB. This reduction means Nano Server no longer includes Windows PowerShell, .NET Core and Windows Management Instrumentation by default. Microsoft also removed the servicing stack from Nano Server, so admins have to redeploy the image for every update or patch. And all troubleshooting? That’s done in Docker, too.

There are other container improvements in Windows Server version 1709:

  • The Server Core container image is much smaller. According to Microsoft, it is just under 3 GB when it had been nearly 6 GB in the Windows Server 2016 release-to-manufacturing (RTM) version.
  • Windows Server version 1709 supports Linux containers on Hyper-V. These containers act like Docker containers but get kernel isolation from Hyper-V, so they are completely independent. By comparison, traditional containers share a kernel but virtualize the rest of the OS.

These are great changes for admins with significant investments in containers. For businesses that don’t need this kind of application virtualization, Microsoft says the updated Server Core in the Semi-Annual Channel release is where admins should put their infrastructure workloads.

Say aloha to Project Honolulu

Around the time Microsoft released Windows Server version 1709, the company also provided a technical preview of Project Honolulu — a free, GUI-based remote server management tool. Project Honolulu makes it easier to manage Server Core for admins who aren’t fluent in PowerShell.

Project Honolulu is a responsive web interface that enables admins to manage multiple remote servers, both on premises and in the cloud. It runs on a client machine or on a Windows Server instance and has similar functionality to local Microsoft Management Console-based GUI tools and Server Manager. Admins can use Project Honolulu to manage machines that run Windows Server 2012, including Server Core and the free Hyper-V Server.

Project Honolulu wraps up a number of administrative tools into a unified interface. It makes Server Core management less onerous and improves things to the point where I can recommend Server Core as the preferred installation option for any infrastructure servers you plan to deploy.

Microsoft improves SDN features

Windows Server version 1709 also added enhancements to its networking features, such as these two that were designed specifically for software-defined networking (SDN).

  • This Semi-Annual Channel release extends support for shielded VMs to Linux workloads. Microsoft introduced shielded VMs in Windows Server 2016 RTM. The feature enables these VMs to only run on authentic, verified hypervisor hosts. They remain encrypted and unbootable if an admin tries to access them from another host.
  • Microsoft added Virtual Network Encryption, which enables admins to mark subnets that connect different VMs as “Encryption Enabled” so that traffic over those links must be encrypted rather than sent in clear text.

There were also several improvements in IPv6 support as that technology moves closer to widespread use in production. Those changes include support for domain name system configuration using router advertisements, flow labels for more efficient load balancing and the deprecation of Intra-Site Automatic Tunnel Addressing Protocol and 6to4 support.

Storage Spaces Direct drops from version 1709

In a curious move, Microsoft pulled support for Storage Spaces Direct (S2D) clusters, one of the better aspects of the original Windows Server 2016 release, in Windows Server version 1709.

S2D creates clusters of file servers with directly attached storage. This provides an easier and more cost-effective storage option for companies that would otherwise attach a cluster of servers to a storage area network or a just-a-bunch-of-disks (JBOD) enclosure. S2D presents all of the directly attached disks as one big pool of storage, which the admin divvies up into volumes.

Admins cannot create new S2D clusters on version 1709, and a version 1709 machine cannot participate in an existing S2D cluster. If you use S2D clusters — or plan to — version 1709 is not for you. Microsoft says S2D is alive and well as a technology, but the company just couldn’t get it right in time for the 1709 release.

Growing pains for Windows Server

As Microsoft will offer a new version of Windows Server every six months, the removal of S2D should make admins wonder whether the company will continue to play feature roulette in the Semi-Annual Channel. If an organization adopts a new feature, what happens if it’s pulled in the next release? And if Microsoft can’t hit its six-month targets, why promise them at all? It’s too early to make a final judgment, but more conservative businesses, especially those that aren’t all-in on containers, might want to wait for Windows Server version 1803 to make sure further features don’t fall by the wayside before they commit to the Semi-Annual Channel.

Next Steps

Server Core can help and hinder IT

Pets vs. cattle and the future of management

Windows Server 2016 innovations challenge admins