
With support for Windows 7 ending, a look back at the OS

With Microsoft’s support for Windows 7 ending this week, tech experts and IT professionals remembered the venerable operating system as a reliable and trustworthy solution for its time.

The OS was launched in 2009, and its official end of life came Tuesday, Jan. 14.

Industry observers spoke of Windows 7 ending, remembering the good and the bad of an OS that managed to hold its ground during the explosive rise of mobile devices and the growing popularity of web applications.

An old reliable

Stephen Kleynhans, research vice president at Gartner, said Windows 7 was a significant step forward from Windows XP, the system that had previously gained dominance in the enterprise.


“Windows 7 kind of defined computing for most enterprises over the last decade,” he said. “You could argue it was the first version of Windows designed with some level of true security in mind.”

Windows 7 introduced several new security features, including enhanced Encrypting File System protection, increased control of administrator privileges and allowing for multiple firewall policies on a single system.

The OS, according to Kleynhans, also provided a comfortable familiarity for PC users.

“It was a really solid platform that businesses could build on,” he said. “It was a good, solid, reliable OS that wasn’t too flashy, but supported the hardware on the market.”

“It didn’t put much strain on its users,” he added. “It fit in with what they knew.”

Eric Klein, analyst at VDC Research Group Inc., said the launch of Windows 7 was a positive move from Microsoft following the “debacle” that was Windows Vista — the immediate predecessor of Windows 7, released in 2007.

“Vista was a very big black eye for Microsoft,” he said. “Windows 7 was more well-refined and very stable.”


The fact that Windows 7 could be more easily administered than previous iterations of the OS, Klein said, was another factor in its enterprise adoption.

“So many businesses, small businesses included, really were all-in for Windows 7,” he said. “It was reliable and securable.”

Windows 7’s longevity, Klein said, was also due to slower hardware refresh rates, as companies often adopt new OSes when buying new computers. With web applications, there is less of a need for individual desktops to have high-end horsepower — meaning users can get by with older machines for longer.


“Ultimately, it was a well-tuned OS,” said Mark Bowker, senior analyst at Enterprise Strategy Group. “It worked, so it became the old reliable for a lot of organizations. Therefore, it remains on a lot of organizations’ computers, even at its end of life.”

Even Microsoft saw the value many enterprises placed in Windows 7 and responded by continuing support, provided customers pay for the service, according to Bowker. The company is allowing customers to pay for extended support for a maximum of three years past the Jan. 14 end of life.

Early struggles for Windows 7

Kleynhans said, although the OS is remembered fondly, the switch from Windows XP was far from a seamless one.

“What people tend to forget about the transition from XP to 7 was that it was actually pretty painful,” he said. “I think a lot of people gloss over the fact that the early days with Windows 7 were kind of rough.”

The biggest issue with that transition was with compatibility, Kleynhans said.

“At the time, a lot of applications that ran on XP and were developed on XP were not developed with a secure environment in mind,” he said. “When they were dropped into Windows 7, with its tighter security, a lot of them stopped working.”


Daniel Beato, director of technology at IT consulting firm TNTMAX, recalled some grumbling about a hard transition from Windows XP.

“At first, like with Windows 10, everyone was complaining,” he said. “As it matured, it became something [enterprises] relied on.”

A worthy successor?

Windows 7 is survived by Windows 10, an OS that experts said is in a better position to deal with modern computing.

“Windows 7 has fallen behind,” Kleynhans said. “It’s a great legacy system, but it’s not really what we want for the 2020s.”

Companies, said Bowker, may be hesitant to upgrade OSes, given the complications of the change. Still, he said, Windows 10 features make the switch more alluring for IT admins.

“Windows 10, especially with Office 365, starts to provide a lot of analytics back to IT. That data can be used to see how efficiently [an organization] is working,” he said. “[Windows 10] really opens eyes with the way you can secure a desktop… the way you can authenticate users. These things become attractive [and prompt a switch].”

Klein said news this week of a serious security vulnerability in Windows underscored the importance of regular support.

“[The vulnerability] speaks to the point that users cannot feel at ease, regardless of the fact that, in 2020, Windows is a very, very enterprise-worthy and robust operating system that is very secure,” he said. “Unfortunately, these things pop up over time.”

The news, Klein said, only underlines the fact that, while some companies may wish to remain with Windows 7, there is a large community of hackers who are aware of these vulnerabilities — and aware that the company is ending support for the OS.

Beato said he still had customers working on Windows 7, but most people with whom he worked had made the switch to Windows 10. Microsoft, he said, had learned from Windows XP and provided a solid pathway to upgrade from Windows 7 to Windows 10.

The future of Windows

Klein noted that news about the next version of Windows would likely be coming soon. He wondered whether the trend toward keeping the smallest amount of data possible on local PCs would affect its design.

“Personally, I’ve found Microsoft to be the most interesting [of the OS vendors] to watch,” he said, calling attention to the company’s willingness to take risks and innovate, as compared to Google and Apple. “They’ve clearly turned the page from the [former Microsoft CEO Steve] Ballmer era.”


Citrix’s performance analytics service gets granular

Citrix introduced an analytics service to help IT professionals better identify the cause of slow application performance within its Virtual Apps and Desktops platform.

The company announced the general availability of the service, called Citrix Analytics for Performance, at its Citrix Summit, an event for the company’s business partners, in Orlando on Monday. The service carries an additional cost.

Steve Wilson, the company’s vice president of product for workspace ecosystem and analytics, said many IT admins must deal with performance problems as part of the nature of distributed applications. When they receive a call from workers complaining about performance, he said, it’s hard to determine the root cause — be it a capacity issue, a network problem or an issue with the employee’s device.

Performance, he said, is a frequent pain point for employees, especially remote and international workers.

“There are huge challenges that, from a performance perspective, are really hard to understand,” he said, adding that the tools available to IT professionals have not been ideal in identifying issues. “It’s all been very technical, very down in the weeds … it’s been hard to understand what [users] are seeing and how to make that actionable.”

Part of the problem, according to Wilson, is that traditional performance-measuring tools focus on server infrastructure. Keeping track of such metrics is important, he said, but they do not tell the whole story.

“Often, what [IT professionals] got was the aggregate view; it wasn’t personalized,” he said.

When the aggregate performance of the IT infrastructure is “good,” Wilson said, that could mean that half an organization’s users are seeing good performance, a quarter are seeing great performance, but a quarter are experiencing poor performance.


With its performance analytics service, Citrix is offering a more granular picture of performance by providing metrics on individual employees, beyond those of the company as a whole. That measurement, which Citrix calls a user experience (UX) score, evaluates such factors as an employee’s machine performance, user logon time, network latency and network stability.
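Citrix does not publish how the UX score is computed. As a purely hypothetical illustration, though, a score like this could be a weighted blend of the factors named above; the weights, target thresholds and scaling below are all invented:

```python
# Hypothetical sketch of a per-user experience (UX) score.
# The factors mirror those named in the article, but the formula is invented.

def ux_score(machine_perf, logon_seconds, latency_ms, stability):
    """Combine per-user metrics into a single 0-100 score.

    machine_perf and stability are assumed already normalized to 0..1;
    logon time and latency are penalized past an illustrative target.
    """
    logon_factor = min(1.0, 30.0 / max(logon_seconds, 1.0))   # target: 30 s logon
    latency_factor = min(1.0, 50.0 / max(latency_ms, 1.0))    # target: 50 ms RTT
    score = (0.25 * machine_perf
             + 0.25 * logon_factor
             + 0.25 * latency_factor
             + 0.25 * stability)
    return round(100 * score)

# A user with a fast machine but a slow logon drags the score down:
print(ux_score(machine_perf=0.9, logon_seconds=120, latency_ms=40, stability=0.95))
```

The point of the sketch is the drill-down the article describes: a single aggregate number per user, decomposable into the individual factors that produced it.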

“With this tool, as a system administrator, you can come in and see the entire population,” Wilson said. “It starts with the top-level experience score, but you can very quickly break that down [to personal performance].”

Wilson said IT admins who had tested the product said this information helped them address performance issues more expeditiously.

“The feedback we’ve gotten is that they’ve been able to very quickly get to root causes,” he said. “They’ve been able to drill down in a way that’s easy to understand.”

A proactive approach


Eric Klein, analyst at VDC Research Group Inc., said the service represents a more proactive approach to performance problems, as opposed to identifying issues through remote access of an employee’s computer.

“If something starts to degrade from a performance perspective — like an app not behaving or slowing down — you can identify problems before users become frustrated,” he said.


Klein said IT admins would likely welcome any tool that, like this one, could “give time back” to them.

“IT is always being asked to do more with less, though budgets have slowly been growing over the past few years,” he said. “[Administrators] are always looking for tools that will not only automate processes but save time.”

Enterprise Strategy Group senior analyst Mark Bowker said in a press release from Citrix announcing the news that companies must examine user experience to ensure they provide employees with secure and consistent access to needed applications.


“Key to providing this seamless experience is having continuous visibility into network systems and applications to quickly spot and mitigate issues before they affect productivity,” he said in the release.

Wilson said the performance analytics service was the product of Citrix’s push to the cloud during the past few years. One of the early benefits of that process, he said, has been in the analytics field; the company has been able to apply machine learning to the data it has garnered and derive insights from it.

“We do see a broad opportunity around analytics,” he said. “That’s something you’ll see more and more of from us.”


Windows 10 issues top list of most read stories for IT pros

Windows 10 — and the challenges posed to IT professionals by its updates — dominated the enterprise desktop discussion in 2019. Troubleshooting and understanding the eccentricities of 2019’s Windows 10 issues comprised many of our top 10 most popular stories this year.

With the sunset of Windows 7 scheduled for the first month of 2020, interest in other Microsoft OSes, including Windows 10, may intensify in the coming year.

Below is a countdown of the top 10 most-read SearchEnterpriseDesktop stories, based on page views.

  10. Micro apps, AI to power new version of Citrix Workspace

Citrix announced a new version of Citrix Workspace, which enables IT admins to provide employees with virtual access to an organization’s desktop and applications, at May’s Citrix Synergy event in Atlanta. The company cited micro apps or small, task-based applications as a key feature, saying they would handle complicated tasks more efficiently by bringing them into a unified work feed. The addition of micro apps was made possible through a $200 million acquisition of Sapho in 2018.

  9. Lenovo to launch ThinkBook brand, next-gen ThinkPad X1

Lenovo started a new subbrand — called ThinkBook — this past spring, with two laptops aimed at younger employees in the workforce. The 13- and 14-inch laptops were intended to incorporate a sleek design with robust security, reliability and support services. The company also launched a laptop for advanced business users, ThinkPad X1 Extreme Gen 2, and the ultrasmall desktop ThinkCentre M90n-1 Nano in the same time frame.

  8. Learn about the device-as-a-service model and its use cases

The device-as-a-service model, in which a vendor leases devices to a business, may help IT admins fulfill their responsibility to support, maintain and repair equipment. The model has its pros and cons. It can provide a single point of contact for troubleshooting and enable more frequent hardware refreshes, but it can also limit an organization’s device choices and pose complications for a company’s BYOD plan.

  7. Lenovo powers new ThinkPad series with AMD Ryzen Pro processors

Lenovo released three Windows 10 laptops with AMD processors this past spring, the first time it had used non-Intel chips in its higher-end ThinkPad T and X series devices. The company hoped its T495, T495s and X395 computers would provide better performance and security at a lower cost; it said the AMD-powered T and X series laptops saw an 18% performance increase over the previous generation.

  6. Windows 10 security breach highlights third-party vulnerabilities

Microsoft detected a security vulnerability in Windows 10, introduced through Huawei PCManager driver software. Microsoft Defender Advanced Threat Protection, a feature that finds and blocks potential compromises, found the problem before the vulnerability could cause serious damage, but industry professionals said the incident highlighted the risks posed by third-party kernels such as device drivers and the importance of working with trusted companies.

  5. Samsung Notebook 9 Pro 2-in-1 impresses with specs and looks

Samsung released a redesign of its flagship Windows 10 laptop this year, opting for an aluminum chassis in place of the plastic from previous iterations. The device offered comparable specs to other high-end laptop offerings, with a slate of features including a backlit keyboard, a variety of inputs and the Samsung Active Pen.

  4. With the new Windows 10 OS update, trust but verify

Dave Sobel, senior director and managed services provider at SolarWinds in Austin, Texas, expounded on the then-forthcoming May 2019 Windows 10 update a month before its scheduled release. Sobel acknowledged the security importance of patching systems but stressed that IT professionals should remain vigilant for complications, a warning made notable by an October 2018 patch that deleted files of users who worked with Known Folder Redirection.

  3. Citrix CEO David Henshall addresses Citrix news, sale rumors

In a Q&A, Citrix CEO David Henshall talked about the future of the 30-year-old company, downplaying rumors that it would be sold. Henshall spoke of the venerable firm’s history of connecting people and information on demand and saw the coming years as a time when Citrix would continue to simplify and ease that connection to encourage productivity.

  2. Latest Windows 10 update issues cause more freezing problems

The April 9 Windows 10 update caused device freezing upon launch. Those in IT had already noted freezing in devices using Sophos Endpoint Protection; after a few days, they learned that the patch was clashing with antivirus software, causing freezing both during startup and over the course of regular operation of the computer. Microsoft updated its support page to acknowledge the issue and provided workarounds shortly thereafter.

  1. IT takes the good with the bad in Windows 10 1903 update

After experiencing problems with previous Windows 10 updates, June’s 1903 version came with initial positive — but wary — reception. Microsoft’s Windows-as-a-service model drew complaints for the way it implemented updates. Among its changes, 1903 enabled IT professionals to pause feature and monthly updates for up to 35 days. Also new was Windows Sandbox, providing IT with the ability to test application installations without compromising a machine. The new version of Windows 10 did not launch bug-free, however; issues with Wi-Fi connectivity, Bluetooth device connection and USB devices causing the rearrangement of drive letters were reported.


Swim DataFabric platform helps to understand edge streaming data

The new Swim DataFabric platform aims to help IT professionals categorize and make sense of large volumes of streaming data in real time.

The startup, based in San Jose, Calif., emerged from stealth in April 2018, with the promise of providing advanced machine learning and artificial intelligence capabilities to meet data processing and categorization challenges.

With the new Swim DataFabric, released Sept. 18, the vendor is looking to make it easier for more users to analyze data. The platform integrates with Microsoft Azure cloud services, including the Azure IoT suite and Azure Data Lake Storage, to classify and analyze data and to help make predictions in real time.

The Swim DataFabric platform helps users get the most out of their real-time data with any distributed application, including IoT and edge use cases, said Krishnan Subramanian, chief research advisor at Rishidot Research.

“Gone are those days where REST is a reasonable interface for real-time data because of latency and scalability issues,” Subramanian said. “This is where Swim’s WARP protocol makes more sense and I think it is going to change how the distributed applications are developed as well as the user experience for these applications.”

Why the Swim DataFabric is needed


A big IT challenge today is that users are getting streams of data from assets that are essentially boundless, said Simon Crosby, CTO at Swim. “A huge focus in the product is on really making it extraordinarily simple for customers to plug in their data streams and to build the model for them, taking all the pain out of understanding what’s in their data,” Crosby said.

Swim’s technology is being used by cities across the U.S. to help with road traffic management. The vendor has a partnership with Trafficware for a program that receives data from traffic sensors as part of a system that helps predict traffic flows.

The Swim DataFabric moves the vendor into a different space, focusing on enabling customers that have adopted the Microsoft Azure cloud to benefit from the Swim platform.

“It has an ability to translate any old data format from the edge into the CDM (Common Data Model) format which Microsoft uses for the ADLS (Azure Data Lake Storage) Gen2,” Crosby said. “So, a Microsoft user can now just click on the Swim DataFabric, which will figure out what is in the data, then labels the data and deposits it into ADLS.”
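Swim's actual translation layer and Microsoft's Common Data Model schema are far richer than can be shown here, but the labeling step Crosby describes can be sketched roughly. The entity shape, field names and type labels below are invented for illustration:

```python
# Illustrative sketch only: take an arbitrary edge record and emit a labeled,
# uniformly typed entity of the kind a data lake such as ADLS Gen2 could ingest.
# This is not Swim's API or the real CDM schema.

def to_cdm_entity(entity_name, raw_record, attribute_types):
    """Label a raw edge record with an entity name and typed attributes.

    attribute_types maps raw field names to (attribute_name, type_label);
    fields not in the map are dropped rather than guessed at.
    """
    attributes = []
    for field, value in raw_record.items():
        if field in attribute_types:
            name, type_label = attribute_types[field]
            attributes.append({"name": name, "dataType": type_label, "value": value})
    return {"entity": entity_name, "attributes": attributes}

sensor_reading = {"ts": 1568800000, "spd": 42.5, "junk": "xx"}
schema = {"ts": ("timestamp", "dateTime"), "spd": ("speedMph", "double")}
entity = to_cdm_entity("TrafficSensor", sensor_reading, schema)
print(entity["entity"], len(entity["attributes"]))  # prints: TrafficSensor 2
```

Once every record carries the same labeled shape, downstream tools such as Power BI or Azure Databricks can query the lake without knowing the original edge formats.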

[Figure: Swim architecture]

With the labeled data in the data lake, Crosby explained that the user can then use whatever additional data analysis tool they want, such as Microsoft’s Power BI or Azure Databricks.

He noted that Swim also has a customer that has chosen to use Swim technology on Amazon Web Services, but he emphasized that the Swim DataFabric platform is mainly optimized for Azure, due to that platform’s strong tooling and lifecycle management capabilities.

Swim DataFabric digital twin

One of the key capabilities that the Swim DataFabric provides is what is known as a digital twin model. The basic idea is that a data model is created that is a twin or a duplicate of something that exists in the real world.

“What we want is independent, concurrent, parallel processing of things, each of which is a digital twin of a real-world data source,” Crosby explained.

The advantage of the digital twin approach is fast processing as well as the ability to correlate and understand the state of data. With the large volumes of data that can come from IoT and edge devices, Crosby emphasized that understanding the state of a device is increasingly valuable.
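The twin-per-source pattern Crosby describes can be sketched in miniature. The class and routing below are illustrative only, not Swim's actual runtime, which uses stateful distributed agents:

```python
# Minimal sketch of the digital twin idea: one small stateful object per
# real-world data source, each updated independently as its stream arrives.

class SensorTwin:
    """Mirrors the last-known state of one physical sensor."""

    def __init__(self, sensor_id):
        self.sensor_id = sensor_id
        self.last_value = None
        self.updates = 0

    def on_event(self, value):
        self.last_value = value
        self.updates += 1

    def is_stale(self):
        # A twin with no events yet has no known state.
        return self.updates == 0

twins = {}

def ingest(sensor_id, value):
    # Route each event to the twin for its source, creating it on first sight.
    twin = twins.setdefault(sensor_id, SensorTwin(sensor_id))
    twin.on_event(value)

for sid, v in [("s1", 10), ("s2", 7), ("s1", 12)]:
    ingest(sid, v)
print(twins["s1"].last_value, twins["s1"].updates)  # prints: 12 2
```

Because each twin holds only its own source's state, a query like "which sensors are stale?" becomes a cheap in-memory check rather than a scan over raw event logs.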

“Everything in Swim is about transforming data into streamed insights,” Crosby said.


A quick take on the State of Hybrid Cloud survey

What does hybrid cloud mean to IT professionals, and why are so many companies using it? Microsoft conducted a survey with research firm Kantar TNS in January 2018, asking more than 1,700 respondents to chime in. Surveys were collected from IT professionals, developers, and business decision makers to identify how they perceive hybrid cloud, what motivates adoption, and what features they see as most important. Survey participants in the United States, the United Kingdom, Germany, and India were asked their thoughts about hybrid, which for the survey was defined as consisting of “private cloud or on-premises resources/applications integrated with one or more public clouds”. We’ve created a summary infographic of the survey that you can review. A few survey highlights:

  • Hybrid is common, with a total of 67 percent of respondents now using or planning to deploy a hybrid cloud. Many of those hybrid users have made the move recently, 54 percent of users in the past two years.
  • Cost, a consistent IT experience, and the ability to scale quickly were all given as important reasons for moving to hybrid cloud.
  • The perceived benefits of hybrid cloud, as well as some of the challenges, vary by the geographic location of respondents. For example, increased security was the top benefit cited in the United Kingdom and Germany, while the top United States benefit was better scalability of compute resources.
  • The top use case given for hybrid cloud was controlling where important data is stored at 71 percent. Using the cloud for backup and disaster recovery was a close second at 69 percent.


We invite you to download and share our infographic on the state of today’s hybrid cloud. For a more complete review of the State of Hybrid Cloud 2018 survey findings, watch the on-demand webinar Among the Clouds: Enterprises Still Prefer Hybrid in 2018.

For more information about hybrid networking, identity, management and data on Azure, you can also check out this new Azure Essentials segment, Integrating Azure with your on-premises infrastructure.

State of Hybrid Cloud 2018 survey

Participants for this online survey were recruited from (non-Microsoft) local market lists selected by Microsoft and the international research firm Kantar TNS, which was hired to conduct the outreach. Survey participants included IT professionals, professional developers, and business decision makers/influencers who use, are planning, or have considered a hybrid cloud deployment. Surveyed company sizes ranged from mid-market to enterprise (250+ employees). The survey was conducted January 4-24, 2018. For the purposes of this survey, hybrid cloud was defined as private cloud or on-premises resources/applications integrated with one or more public clouds.

Attending Black Hat USA 2018? Here’s what to expect from Microsoft.

Black Hat USA 2018 brings together professionals at all career levels, encouraging growth and collaboration among academia, world-class researchers, and leaders in the public and private sectors. This is an exciting time as our Microsoft researchers, partners, and security experts will showcase the latest collaborations in defense strategies for cybersecurity, highlight solutions for security vulnerabilities in applications, and bring together an ecosystem of intelligent security solutions. Our objective is to arm business, government, and consumers with deeply integrated intelligence and threat protection capabilities across platforms and products.

Security researchers play an essential role in Microsoft’s security strategy and are key to community-based defense. To show our appreciation for their hard work and partnership, each year at Black Hat USA, the Microsoft Security Response Center (MSRC) highlights the contributions of these researchers through the list of “Top 100” security researchers reporting to Microsoft (either directly or through a third party) during the previous 12 months. The volume of a researcher’s fixed reports is one criterion for the ranking, but the severity and impact of those reports also weigh heavily. Given the number of individuals reporting to Microsoft, anyone ranked among the Top 100 is among the top talent in the industry.

In addition to unveiling the Top 100 and showcasing Microsoft security solutions at Booth #652, there are a number of featured Microsoft speakers and sessions.

Join us at these sessions during the week of August 4-9, 2018 in Las Vegas and continue the discussion with us in Booth #652, where we will have product demonstrations, theatre presentations, and an opportunity to learn more about our Top 100 and meet with some of Microsoft’s security experts and partners.

New types of authentication take root across the enterprise

BOSTON — When IT professionals develop a strategy for user password and authentication management, they must consider the two key metrics of security and usability.

IT professionals are looking for ways to minimize the reliance on passwords as the lone authentication factor, especially because 81% of hacking breaches occur due to stolen or weak passwords, according to Verizon’s 2017 Data Breach Investigations Report. Adding other types of authentication to supplement — or even replace — user passwords can ensure security improves without hurting usability.

“Simply put, the world has a password problem,” said Brett McDowell, executive director of the FIDO Alliance, based in Wakefield, Mass., here in a session at Identiverse.

A future without passwords?

Types of authentication that only require a single verification factor could be much more secure if users adopted complex, harder-to-predict passwords, but this pushes up against the idea of usability. The need for complex passwords, along with the 90- to 180-day password refreshes that are an industry standard in the enterprise, means that reliance on passwords alone can’t meet security and usability standards at the same time.
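This tension is easy to quantify: a rough upper bound on a password's strength is its length times the base-2 log of its alphabet size, so meaningful security gains demand lengths and character classes that most users resist memorizing. A quick back-of-the-envelope sketch (the bound assumes uniformly random choice, which real passwords never achieve):

```python
import math

# Upper-bound entropy estimate: a password drawn uniformly at random from an
# alphabet of size A with length L carries L * log2(A) bits. Human-chosen
# passwords fall well short of this, so treat it as a ceiling, not a guarantee.

def entropy_bits(length, alphabet_size):
    return length * math.log2(alphabet_size)

# An 8-character lowercase password vs. a 14-character password drawn from
# all 94 printable ASCII characters:
print(round(entropy_bits(8, 26)))
print(round(entropy_bits(14, 94)))
```

The gap between those two numbers is the usability problem in miniature: closing it by memorization alone asks a lot of users, which is exactly the case for adding other factors instead.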

“If users are being asked to create and remember incredibly complex passwords, IT isn’t doing its job,” said Don D’Souza, a cybersecurity manager at Fannie Mae, based in Washington, D.C.

IT professionals today are turning to two-factor authentication, relying on biometric and cryptographic methods to supplement passwords. The FIDO Alliance, a user authentication trade association, pushes for two-factor authentication that entirely excludes passwords in their current form.


McDowell broke down authentication methods into three categories:

  • something you know, such as a traditional password or a PIN;
  • something you possess, such as a mobile device or a token card; and
  • something you are, which includes biometric authentication methods, such as voice, fingerprint or gesture recognition.
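As a toy example of pairing the first two categories, the sketch below checks a knowledge factor (a password stored only as a salted hash) together with a possession factor (a TOTP-style code derived from a secret held on the user's device, modeled loosely on RFC 6238). This is not FIDO's cryptographic protocol, and none of it is production-grade:

```python
import hashlib
import hmac
import struct
import time

def verify_password(stored_salt, stored_hash, attempt):
    # Knowledge factor: compare a salted PBKDF2 hash in constant time.
    digest = hashlib.pbkdf2_hmac("sha256", attempt.encode(), stored_salt, 100_000)
    return hmac.compare_digest(digest, stored_hash)

def totp_code(secret, timestep=30, now=None):
    # Possession factor: 6-digit code from a shared secret and the clock.
    counter = int((time.time() if now is None else now) // timestep)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return code % 1_000_000

def two_factor_login(salt, pw_hash, secret, password_attempt, code_attempt, now=None):
    # Both factors must pass; a phished password alone is not enough.
    return (verify_password(salt, pw_hash, password_attempt)
            and code_attempt == totp_code(secret, now=now))

salt, secret = b"demo-salt", b"device-secret"
pw_hash = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100_000)
print(two_factor_login(salt, pw_hash, secret, "hunter2",
                       totp_code(secret, now=1_000_000), now=1_000_000))  # prints: True
```

The design point is the last function: compromising the password no longer compromises the account, because the code requires the device-held secret.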

The FIDO Alliance advocates for organizations to shift toward the latter two of these options.

“We want to take user vulnerability out of the picture,” McDowell said.

Taking away password autonomy from the user could improve security in many areas, but none more directly than phishing. Even if a user falls for a phishing email, their authentication is not compromised if two-factor authentication is in place, because the hacker lacks the cryptographic or biometric factor.

“With user passwords as a single-factor authentication, the only real protection against phishing is testing and training,” D’Souza said.

Trickle-down benefits of new types of authentication

Added types of authentication increase the burden on IT when it comes to privileged access management (PAM) and staying up-to-date on user information. But as organizations move away from passwords entirely, IT doesn’t need to worry as much about hackers gaining access to authentication information, because that is only one piece of the puzzle. This also leads to the benefit of cutting down on account access privileges, said Ken Robertson, a principal technologist at GE, based in Boston.

With stronger types of authentication in place, for example, IT can feel more comfortable handing over some simple administrative tasks to users — thereby limiting its own access to user desktops. IT professionals won’t love giving up access privilege, however.

“People typically start a PAM program for password management,” Robertson said. “But limiting IT logon use cases minimizes vulnerabilities.”

Organizations are taking steps toward multifactor authentication that doesn’t include passwords, but the changes can’t happen immediately.

“We will have a lot of two-factor authentication across multiple systems in the next few years, and we’re looking into ways to limit user passwords,” D’Souza said.

The growing ties between networking roles and automation

For years now, network professionals have heard they need to adapt to changing technologies or risk extinction. The messages are plentiful:

  • Learn these programming skills to stay relevant.
  • Take these training courses, but don’t get too vendor-focused.
  • Change your mindset.
  • Change your organization’s culture.
  • Change your skill sets to keep up with shifting networking roles and responsibilities.

All of these suggestions are increasingly valid and can prove valuable in the evolving networking industry. In fact, skill sets revolving around network programmability, cloud computing and cybersecurity are central to in-demand IT positions and roles, according to Mark Leary, directing analyst at Learning@Cisco, part of Cisco Services, who discussed a Cisco-sponsored report about IT jobs and skill sets released by research firm IDC.

As those networking roles and responsibilities evolve, network professionals are evolving with them. For example, organizations seek IT staff with skills in Python, Java, Linux, development, administration, support and engineering, among others, Leary said. Employees also need to be able to communicate with other teams, including security and development, on cross-team business projects and initiatives.

The dirty word: Automation

This industry and skill set evolution is necessary for the transition to the automated network, according to Zeus Kerravala, founder of ZK Research in Westminster, Mass. Modern network infrastructure doesn’t work well with heavily manual command-line interface configurations, he said during a recent Cisco webinar on the evolution of network engineering. Instead, it uses automation and APIs to deliver services and information.

But automation has traditionally been considered a dirty word that sends employees in all industries into a panic, just like it did in the 1800s and 1900s. Automation was expected to steal jobs and replace human intelligence. But as network automation use cases have matured, Kerravala said, employees and organizations increasingly see how automating menial network tasks can benefit productivity.

To automate, however, network professionals need programming skills to determine the desired network output. They need to be able to tell the network what they want it to do.

All of this brings me to an obvious term that’s integral to automation and network programming: program, which means to input data into a machine to cause it to do a certain thing. Another definition says to program is “to provide a series of instructions.” If someone wants to give effective instructions, a person must understand the purpose of the instructions being relayed. A person needs the foundation — or the why of it all — to get to the actual how.
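As a trivial illustration of "a series of instructions" in a network context, a few lines of scripting can stamp out identical port configurations that would otherwise be typed one at a time at a CLI. The command syntax and VLAN numbering here are invented for the example; real automation would push the result through an API or a tool such as Ansible or NETCONF:

```python
# Generate the same access-port configuration for many switch ports
# instead of typing each stanza by hand. Purely illustrative syntax.

def access_port_config(interface, vlan, description):
    return [
        f"interface {interface}",
        f" description {description}",
        " switchport mode access",
        f" switchport access vlan {vlan}",
        " no shutdown",
    ]

config = []
for port in range(1, 5):  # ports 1-4 get identical treatment
    config.extend(access_port_config(f"GigabitEthernet1/0/{port}", 20, "user desk"))

print(len(config))  # prints: 20
```

The "why" shows up the moment the VLAN plan changes: one edit regenerates every port consistently, which is the productivity gain behind automating menial network tasks.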

Regarding network automation, the why is to ultimately achieve network readiness for what the network needs to handle, whether that’s new applications or more traffic, Cisco’s Leary said.

“One of the reasons you develop skills in network programming is to leverage all the automation tools,” he said. “As a result, you’re making use of those technologies and data to make sure your network isn’t just up and available, but [is] now network-ready.”

Vendors have a part in this, too


But network readiness — and the related issue of network programmability — goes beyond skills and the ability to input data, according to Lee Doyle, principal analyst at Doyle Research. The impetus also falls on the networking vendors to provide products that help professionals in their networking roles.

Yes, we’ve seen the early versions of products focused on achieving expressed intent and outcomes. But we’ve also seen the hazy sheen of marketing fade away to reveal frizzled shreds of hype.

Ultimately, we need to determine what we want to accomplish with our networks and why. This likely results in myriad opinions, but most of us would consider growth beneficial. Learning new things offers the opportunity for more knowledge. Knowledge can benefit the employee, the organization and maybe even society. This idea may gravitate toward the idyllic, but consider some effects of remaining stagnant: irrelevant skills or knowledge, lost productivity and inefficacy.

“A business needs to be agile, but it’s only as agile as its least agile component,” Kerravala said. While Kerravala considered that component to be the network, the network could encompass organizations, vendors and network professionals.

So, I bring these questions to you — the network professional. Do you think you need to learn new skills in order to keep up with shifting networking roles? Do you want to reskill? Or, do you think vendors need to up their game?

Federal HR wants to modernize cybersecurity recruiting, pay

The U.S. Department of Homeland Security wants dramatic changes in the hiring and management of cybersecurity professionals. It seeks 21st Century HR practices and technologies, with the goal of making the federal HR program as competitive as the private sector's.

This effort is intended to streamline hiring and improve cybersecurity recruiting. DHS wants a pay system for cybersecurity professionals based on an "individual's skills and capabilities." New HR technologies are sought as well.

The proposed federal HR improvements appear in a request for information to vendors. In this knowledge-gathering effort, vendors are asked to estimate the cost and outline the expertise and technologies needed to achieve the reform. The request doesn't obligate the government, but it sets the stage for contract proposals. Its goals are sweeping.

DHS, for instance, said it wanted to end 20th Century federal HR practices, such as annual reviews. Instead, it wants 21st Century methods, such as continuous performance management.

The goal is modernizing federal HR technologies and processes, but with a focus on improving cybersecurity recruiting and retention.

Analysts see DHS moving in the right direction

HR analysts contacted about the planned federal cybersecurity recruiting reform seemed impressed.

“The scope of this is really big and it’s very ambitious,” said Kyle Lagunas, research manager in IDC’s talent acquisition and staffing research practice. “I’m really encouraged to see this. It really captures, I think, where the industry is going.”

“This sounds like good stuff to me,” said Josh Bersin, founder and principal of Bersin by Deloitte Consulting. “It’s all in the right direction,” he said.

Both analysts said that if DHS achieves its goals, it will rank with leading businesses in HR best practices.

DHS employs some 11,000 cybersecurity professionals and leads government efforts to secure public and private critical infrastructure systems.

The U.S. said in 2016 that there weren’t enough cybersecurity professionals to meet federal HR needs. President Barack Obama’s administration called for a “government-wide” federal HR cybersecurity recruitment strategy. President Donald Trump’s administration is reaching out to vendors for specifics.

DHS published its request for information for reforming federal HR in early May, asking for cost estimates and ideas for modernizing cybersecurity hiring and management. It sought specific capabilities, such as the ability to process as many as 75,000 applicants per year. It also wants applicant assessment technologies, which can include virtual environments for testing "real-world application of technical cybersecurity competencies."

Feds boldly make a case for reform of cybersecurity recruiting

But what distinguished this particular federal HR request from so many other government requests for information was its dramatic framing of the goal.

The 20th Century way of recruiting involves posting a job and "hoping the right candidates apply," said DHS in its request to vendors. The new 21st Century method is to "strategically recruit from a variety of sources on an ongoing basis, and use up-to-date, cybersecurity-focused standards and validated tools to screen, assess and select talent."

DHS also wants to adopt “market-sensitive pay” to more readily compete for people, a smart move, according to Lagunas. “If they want to bring in top cybersecurity talent they are going to have to make sure they are very competitive in their pay and practices.”

In what may be a nod to the growing contingent workforce, DHS wants a federal HR plan for “dynamic careers.” This involves “streamlined movement” from the private sector to government and back again.

The deadline for vendor responses to the government’s request for information is May 25.

Windows Server hardening still weighs heavily on admins

In these heady times of software-defined technologies and container virtualization, many IT professionals continue to grapple with an issue that has persisted since the advent of the server: security.

Ever since businesses discovered the advantages of sharing resources in a client-server arrangement, there have also been intruders attempting to bypass the protections at the perimeter of the network. These attackers angle for any weak point — outdated protocols, known vulnerabilities in unpatched systems — or go the direct route and deliver a phishing email in the hopes that a user will click on a link to unleash a malicious payload onto the network.

Windows Server hardening remains top of mind for most admins. Just as there are many ways to infiltrate a system, there are multiple ways to blunt those attacks. The following compilation highlights the most-viewed tutorials on SearchWindowsServer in 2017, several of which addressed the ways IT can reduce exposure to a server-based attack.

5. Manage Linux servers with a Windows admin’s toolkit


It took a while, but Microsoft eventually realized that spurning Linux also steered away potential customers. About 40% of the workloads on the Azure platform run some variation of Linux, Microsoft is a Platinum member of the Linux Foundation, and the company released SQL Server for Linux in September.

Many Windows shops now have a sprinkling of servers that use the open source operating system, and those administrators must figure out the best way to manage and monitor those Linux workloads. The cross-platform PowerShell Core management and automation tool promises to address this need, but until the offering reaches full maturity, this tip provides several options to help address the heterogeneous nature of many environments.

4. Disable SMB v1 for further Windows Server hardening

Unpatched Windows systems are tempting targets for ransomware and the latest malware du jour, Bitcoin miners.

A layered security approach helps, but it's even better to pull out threat enablers by the roots to blunt future attacks. Long before the spate of early 2017 cyberattacks that exploited Server Message Block (SMB) v1 and locked up thousands of Windows machines around the world, administrators had been warned to disable the outdated protocol. This tip details the techniques to search for signs of SMB v1 and how to extinguish it from the data center.
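The first step the tip describes, finding where SMB v1 is still enabled, amounts to an audit across the fleet. The sketch below triages a hypothetical inventory export in Python; the record format is an assumption for illustration, since on real systems you would query the servers directly (for example, with PowerShell).

```python
# Hypothetical inventory export: one record per host, with a flag for
# whether SMB v1 is still enabled on that machine.
inventory = [
    {"host": "file01", "smb1_enabled": True},
    {"host": "file02", "smb1_enabled": False},
    {"host": "print01", "smb1_enabled": True},
]

def hosts_needing_remediation(records: list) -> list:
    """Return the names of hosts where SMB v1 is still enabled."""
    return [r["host"] for r in records if r["smb1_enabled"]]

flagged = hosts_needing_remediation(inventory)
```

A list like `flagged` gives administrators a concrete work queue for disabling the protocol host by host.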

3. Microsoft LAPS puts a lock on local admin passwords

For the sake of convenience, many Windows shops will use the same administrator password on each machine. While this practice helps administrators with the troubleshooting or configuration process, it’s also tremendously insecure. If that credential falls into the wrong hands, an intruder can roam through the network until they obtain ultimate system access — domain administrator privileges. Microsoft introduced its Local Administrator Password Solution (LAPS) in 2015 to help Windows Server hardening efforts. This explainer details the underpinnings of LAPS and how to tune it for your organization’s needs.
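The core idea behind LAPS is that every machine gets its own randomly generated local administrator password, so one stolen credential no longer unlocks the whole fleet. The following is a standalone Python demonstration of that idea, not the actual LAPS implementation; the machine names and password length are assumptions for the example.

```python
import secrets
import string

# Character set for generated passwords; real policies may require symbols.
ALPHABET = string.ascii_letters + string.digits

def generate_local_password(length: int = 20) -> str:
    """Generate a cryptographically random password for one machine."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Each machine receives a distinct password, stored centrally in LAPS's
# case (in Active Directory) rather than reused across the fleet.
machines = ["ws-001", "ws-002", "ws-003"]
passwords = {name: generate_local_password() for name in machines}
```

The `secrets` module is used rather than `random` because password generation is a security-sensitive task.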

2. Chocolatey sweetens software installations on servers

While not every Windows administrator is comfortable away from the familiarity of point-and-click GUI management tools, more in IT are taking cues from the world of DevOps to implement automation routines. Microsoft offers a number of tools to install applications, but a package manager helps streamline this process through automated routines that pull in the right version of the software and make upgrades less of a chore. This tip walks administrators through the features of the Chocolatey package manager, ways to automate software installations and how an enterprise with special requirements can develop a more secure deployment method.

1. Reduce risks through managed service accounts

Most organizations employ service accounts for enterprise-grade applications such as Exchange Server or SQL Server. These accounts provide the elevated authorizations needed to run the program's services. To avoid downtime, administrators quite often either do not set an expiration date on a service account password or use the same password for each service account. Needless to say, this practice makes less work for an industrious intruder looking to compromise a business. A managed service account automatically generates new passwords to remove the need for administrative intervention. This tip explains how to use this feature to lock down these accounts as part of IT's overall Windows Server hardening efforts.
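The rotation logic behind managed service accounts can be sketched simply: once a password reaches its maximum age, it is regenerated automatically, with no administrator involvement. In the Python sketch below, the 30-day interval, account names and record format are assumptions for illustration, not a description of Windows internals.

```python
from datetime import date, timedelta

# Assumed maximum password age before an automatic rotation is due.
MAX_AGE = timedelta(days=30)

def needs_rotation(last_set: date, today: date) -> bool:
    """True when a service account password has exceeded its maximum age."""
    return today - last_set >= MAX_AGE

# Hypothetical service accounts and the dates their passwords were last set.
accounts = {
    "svc-sql": date(2017, 10, 1),
    "svc-exchange": date(2017, 11, 20),
}

due = [name for name, last in accounts.items()
       if needs_rotation(last, date(2017, 12, 1))]
```

In the sketch, only the account whose password is older than the maximum age lands in the rotation queue; the real feature performs the regeneration itself on schedule.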