Tag Archives: Windows

For Sale – iMac 2017 – 27in 5K – Not even 1 month old.

I have tried to move over from Windows to Mac, but it just isn’t for me. I know about Boot Camp, but that is really not what I want.

I purchased this iMac on 09.01.2020, and AppleCare can still be added until 09.03.2020 if you wish.

Mint condition and will be fully boxed.

Specs

i5 3.8GHz
Radeon Pro 580 8GB
2TB Fusion Drive
40GB memory
27in 5K screen

Not looking for any trades and collection only please due to size, weight & value.

I’ll take some photos in a while.


Battle lines over Windows Server 2008 migration drawn

With technical support for Windows Server 2008 ending this week, the battle between Microsoft and AWS for the hearts and wallets of its corporate users is underway.

At its re:Invent conference last month, AWS introduced its appropriately named AWS End-of-Support Migration Program (EMP) for Windows Server, aimed at helping users with their Windows Server 2008 migration efforts. The program promises to make it easier to shift users’ existing Windows Server 2008 workloads over to newer versions of Windows running on servers in AWS’ data centers. The EMP technology decouples the applications from the underlying operating system, thereby allowing AWS partners to migrate mission-critical applications over to the newer versions of Windows Server.

The technology reportedly identifies whatever dependencies the application has on Windows Server 2008 and then pulls together the resources needed for the application to run on the updated version of Windows Server. The package of software includes all application files, runtimes, components and deployment tools, along with an engine that redirects API calls from your application to files within the package, the company said.

Punching back in a blog post this week, Vijay Kumar, director of Windows Server and Azure products at Microsoft, stressed the advantages of his company’s products for users undergoing Windows Server 2008 migration efforts. Users can deploy Windows Server workloads in Azure in a number of ways, he wrote, including Virtual Machines on Azure, Azure VMware Solutions and Azure Dedicated Host. Users can also apply the Azure Hybrid Benefit to use their existing Windows Server licenses in Azure.

Kumar also noted that users can take advantage of Microsoft’s Extended Security Updates program specifically aimed at Windows Server 2008/R2 users, which provides an additional three years of security updates. This can buy users more time to plan their transition paths for core applications and services, he wrote.

The battle to own Windows Server 2008 migration

AWS has long targeted Windows Server users and, in fact, has convinced more than a few IT shops to switch over to its EC2 cloud environment. It stepped up those efforts with the introduction of its AWS-Microsoft Workload Competency program for partners last fall, according to one analyst.


“[AWS] had as many as 14,000 Windows Server customers running on EC2 as of July 2019,” said Meaghan McGrath, a senior analyst at Technology Business Review. “That number is a fivefold increase over 2015.”

Microsoft has stemmed some of the bleeding, however, McGrath added. For instance, the company has convinced many of its partners to push its free migration assessment program, which gives users a more precise estimate of the total cost of ownership of keeping their SQL Server workloads in Microsoft environments compared with migrating them to AWS’ EC2. But the company is applying some financial pressure as well.

“As of last fall, there is a caveat in the Software Assurance contracts among [SQL Server] users that made it much more expensive for them to bring their licenses over to another vendor’s hosted environment,” McGrath said. “The other financial incentive is [Microsoft’s] Azure Hybrid Benefit program, which offers users a discount on Azure services for migrating their workloads from licensed software.”

32-bit apps snagging Windows Server 2008 migration efforts

Last summer, Microsoft officials said the operating system still represented 60% of the company’s overall server installed base — a number that’s likely so large because it’s the last 32-bit version of Windows Server. Many corporate users developed customized applications for the platform, which can be expensive and time-consuming to migrate to 64-bit platforms. Users can also have difficulty migrating a 32-bit app purchased from a reputable third-party developer, typically because that developer has discontinued support for the offering.


“When you are dealing with a [Windows Server] 2008 app, you can’t assume there will be a 64-bit version of that app available,” said Paul Delory, a research director at Gartner. “Users have to coordinate with all their vendors from whom they bought commercial software to know if they are supporting their app on the new OS. If not, you have to factor in the associated costs there.”

Still, the expense of adapting an existing 32-bit app from Windows Server 2008 is far lower than the cost of remaining on the existing versions of the operating system and associated applications. With the product going out of technical support this week, users will have to pay for Microsoft’s Extended Support, which could double the cost of the technical support they were getting under their initial services agreement.

“You can go to extended support, which gets you three years’ worth of updates, but that requires you to have Software Assurance,” Delory said. “Extended support costs you 75% of your annual licensing costs, and SA [Software Assurance] is an additional 25%, making it twice as much.”

He said a practical and less expensive solution for users facing this situation is to consider gravitating to a SaaS-based offering such as Office 365 or a similar offering with the same capabilities.

“Something like [Office 365] will be the path of least resistance for many companies because it offers them the chance to sidestep some of these problems,” Delory said. “You can make these problems someone else’s in exchange for a reasonable monthly fee.”

Other options for users leaning away from a Windows Server 2008 migration are much less attractive. They can leave the server in place and mitigate the vulnerabilities as best they can, Delory said, or tuck it behind a firewall and whitelist only certain IP addresses or leave certain ports open.

“You can bring in an Intrusion Prevention System to detect vulnerabilities, but that system must have an understanding of Windows Server 2008 vulnerabilities and be able to maintain them across all your applications,” Delory said.


With support for Windows 7 ending, a look back at the OS

With Microsoft’s support for Windows 7 ending this week, tech experts and IT professionals remembered the venerable operating system as a reliable and trustworthy solution for its time.

The OS was launched in 2009, and its official end of life came Tuesday, Jan. 14.

Industry observers spoke of Windows 7 ending, remembering the good and the bad of an OS that managed to hold its ground during the explosive rise of mobile devices and the growing popularity of web applications.

An old reliable

Stephen Kleynhans, research vice president at Gartner, said Windows 7 was a significant step forward from Windows XP, the system that had previously gained dominance in the enterprise.


“Windows 7 kind of defined computing for most enterprises over the last decade,” he said. “You could argue it was the first version of Windows designed with some level of true security in mind.”

Windows 7 introduced several new security features, including enhanced Encrypting File System protection, increased control of administrator privileges and support for multiple firewall policies on a single system.

The OS, according to Kleynhans, also provided a comfortable familiarity for PC users.

“It was a really solid platform that businesses could build on,” he said. “It was a good, solid, reliable OS that wasn’t too flashy, but supported the hardware on the market.”

“It didn’t put much strain on its users,” he added. “It fit in with what they knew.”

Eric Klein, analyst at VDC Research Group Inc., said the launch of Windows 7 was a positive move from Microsoft following the “debacle” that was Windows Vista — the immediate predecessor of Windows 7, released in 2007.

“Vista was a very big black eye for Microsoft,” he said. “Windows 7 was more well-refined and very stable.”


The fact that Windows 7 could be more easily administered than previous iterations of the OS, Klein said, was another factor in its enterprise adoption.

“So many businesses, small businesses included, really were all-in for Windows 7,” he said. “It was reliable and securable.”

Windows 7’s longevity, Klein said, was also due to slower hardware refresh rates, as companies often adopt new OSes when buying new computers. With web applications, there is less of a need for individual desktops to have high-end horsepower — meaning users can get by with older machines for longer.


“Ultimately, it was a well-tuned OS,” said Mark Bowker, senior analyst at Enterprise Strategy Group. “It worked, so it became the old reliable for a lot of organizations. Therefore, it remains on a lot of organizations’ computers, even at its end of life.”

Even Microsoft saw the value many enterprises placed in Windows 7 and responded by continuing support, provided customers pay for the service, according to Bowker. The company is allowing customers to pay for extended support for a maximum of three years past the January 14 end of life.

Early struggles for Windows 7

Kleynhans said, although the OS is remembered fondly, the switch from Windows XP was far from a seamless one.

“What people tend to forget about the transition from XP to 7 was that it was actually pretty painful,” he said. “I think a lot of people gloss over the fact that the early days with Windows 7 were kind of rough.”

The biggest issue with that transition was with compatibility, Kleynhans said.

“At the time, a lot of applications that ran on XP and were developed on XP were not developed with a secure environment in mind,” he said. “When they were dropped into Windows 7, with its tighter security, a lot of them stopped working.”


Daniel Beato, director of technology at IT consulting firm TNTMAX, recalled some grumbling about a hard transition from Windows XP.

“At first, like with Windows 10, everyone was complaining,” he said. “As it matured, it became something [enterprises] relied on.”

A worthy successor?

Windows 7 is survived by Windows 10, an OS that experts said is in a better position to deal with modern computing.

“Windows 7 has fallen behind,” Kleynhans said. “It’s a great legacy system, but it’s not really what we want for the 2020s.”

Companies, said Bowker, may be hesitant to upgrade OSes, given the complications of the change. Still, he said, Windows 10 features make the switch more alluring for IT admins.

“Windows 10, especially with Office 365, starts to provide a lot of analytics back to IT. That data can be used to see how efficiently [an organization] is working,” he said. “[Windows 10] really opens eyes with the way you can secure a desktop… the way you can authenticate users. These things become attractive [and prompt a switch].”

Klein said news this week of a serious security vulnerability in Windows underscored the importance of regular support.

“[The vulnerability] speaks to the point that users cannot feel at ease, regardless of the fact that, in 2020, Windows is a very, very enterprise-worthy and robust operating system that is very secure,” he said. “Unfortunately, these things pop up over time.”

The news, Klein said, only underlines the fact that, while some companies may wish to remain with Windows 7, there is a large community of hackers who are aware of these vulnerabilities — and aware that the company is ending support for the OS.

Beato said he still had customers working on Windows 7, but most people with whom he worked had made the switch to Windows 10. Microsoft, he said, had learned from Windows XP and provided a solid pathway to upgrade from Windows 7 to Windows 10.

The future of Windows

Klein noted that news about the next version of Windows would likely be coming soon. He wondered whether the trend toward keeping the smallest amount of data possible on local PCs would affect its design.

“Personally, I’ve found Microsoft to be the most interesting [of the OS vendors] to watch,” he said, calling attention to the company’s willingness to take risks and innovate, as compared to Google and Apple. “They’ve clearly turned the page from the [former Microsoft CEO Steve] Ballmer era.”


Using wsusscn2.cab to find missing Windows updates

Keeping your Windows Server and Windows desktop systems updated can be tricky, and finding missing patches in conventional ways might not be reliable.

There are a few reasons why important security patches might not get installed. They could be mistakenly declined in Windows Server Update Services or get overlooked in environments that lack an internet connection.

Microsoft provides a Windows Update offline scan file, also known as wsusscn2.cab, to help you check Windows systems for missing updates. The CAB file contains information about most patches for Windows and Microsoft applications distributed through Windows Update.

The challenge with the wsusscn2.cab file is its size. It weighs in at around 650 MB, and distributing it to all the servers to perform a scan can be tricky and time-consuming. This tutorial explains how to avoid those issues and run the scan on all of your servers in a secure and timely manner by using IIS for the file transfer instead of SMB or PowerShell sessions.

Requirements for offline scanning

There are some simple requirements to use this tutorial:

  • a server or PC running Windows Server 2012 or newer or Windows 10;
  • a domain account with local administrator on the servers you want to scan; and
  • PowerShell remoting enabled on the servers you want to scan.
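Before moving on, it may help to confirm the remoting requirement is actually met. A minimal sketch, where server1 is a placeholder for one of your own hosts:

```powershell
# Check that the WinRM service answers on a target machine
# 'server1' is a placeholder - substitute one of your own servers
Test-WSMan -ComputerName server1

# Optionally run a trivial remote command as the admin account you plan to use
Invoke-Command -ComputerName server1 -ScriptBlock { $env:COMPUTERNAME }
```

If either command fails, enable remoting on the target with Enable-PSRemoting before continuing.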

Step 1. Install IIS

First, we need a web server we can use to distribute the wsusscn2.cab file. There are several ways to copy the file, but they all have different drawbacks.

For example, we could distribute the wsusscn2.cab file with a regular file share, but that requires a double-hop. You could also copy the wsusscn2.cab file over a PowerShell session, but that causes a lot of overhead and is extremely slow for large files. An easier and more secure way to distribute the file is through HTTP and IIS.

Installing on Windows Server

Start PowerShell as admin and type the following to install IIS:

Install-WindowsFeature -name Web-Server -IncludeManagementTools

Installing on Windows 10

Start PowerShell as an admin and type the following to install IIS:

Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServer

The IIS role should now be installed. The default site will serve files from C:\inetpub\wwwroot.

We can now proceed to download wsusscn2.cab from Microsoft.

Step 2. Download wsusscn2.cab

The link for this file can be tricky to find. You can either download it from this link and save it to the C drive or run the following script as admin on the IIS server:

# Default Site path, change if necessary
$IISFolderPath = "C:\inetpub\wwwroot"

# Download wsusscn2.cab
Start-BitsTransfer -Source "http://go.microsoft.com/fwlink/?linkid=74689" -Destination "$IISFolderPath\wsusscn2.cab"

The script downloads the file to the wwwroot folder. We can verify the download by browsing to http://<servername>/wsusscn2.cab.
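You can also verify it from PowerShell rather than a browser. A quick check, run on the IIS server itself:

```powershell
# Request only the headers to confirm IIS is serving the file
Invoke-WebRequest -Uri "http://localhost/wsusscn2.cab" -Method Head -UseBasicParsing
```

A 200 status code and a Content-Length of roughly 650 MB indicate the file is in place.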

You also need to get the hash value of wsusscn2.cab to verify it. After saving it, run the following PowerShell command to check the file hash:

(Get-FileHash C:\inetpub\wwwroot\wsusscn2.cab).Hash

31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

Step 3. Run the check on a server

Next, you can use a PowerShell script to download the wsusscn2.cab file and scan a PC or server for missing updates. The script works on Windows Server 2008 or newer, which avoids compatibility issues. To do this in a secure and effective manner over HTTP, we get the file hash of the downloaded wsusscn2.cab file and compare it with the file hash of the CAB file on the IIS server.

We can also use the file hash to see when Microsoft releases a new version of wsusscn2.cab.

Copy and save the following script as Get-MissingUpdates.ps1:

Param(
    [parameter(mandatory)]
    [string]$FileHash,

    [parameter(mandatory)]
    [string]$Wsusscn2Url
)


Function Get-Hash($Path){
    
    $Stream = New-Object System.IO.FileStream($Path,[System.IO.FileMode]::Open) 
    
    $StringBuilder = New-Object System.Text.StringBuilder 
    $HashCreate = [System.Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($Stream)
    $HashCreate | Foreach {
        $StringBuilder.Append($_.ToString("x2")) | Out-Null
    }
    $Stream.Close() 
    $StringBuilder.ToString() 
}

$DataFolder = "$env:ProgramData\WSUS Offline Catalog"
$CabPath = "$DataFolder\wsusscn2.cab"

# Create download dir
mkdir $DataFolder -Force | Out-Null

# Check if cab exists
$CabExists = Test-Path $CabPath


# Compare hashes to see if a fresh download is needed
if($CabExists){
    Write-Verbose "Comparing hashes of wsusscn2.cab"

    $HashMatch = $FileHash -eq (Get-Hash -Path $CabPath)

    if(!$HashMatch){
        Write-Warning "Filehash of $CabPath did not match $($FileHash) - downloading"
        Remove-Item $CabPath -Force
    }
    Else{
        Write-Verbose "Hashes matched"
    }
}

# Download wsusscn2.cab if it doesn't exist or the hashes mismatched
if(!(Test-Path $CabPath)){
    Write-Verbose "Downloading wsusscn2.cab"
    # Works on Windows Server 2008 as well
    (New-Object System.Net.WebClient).DownloadFile($Wsusscn2Url, $CabPath)

    if($FileHash -ne (Get-Hash -Path $CabPath)){
        Throw "$CabPath did not match $($FileHash)"
    }
}

Write-Verbose "Checking digital signature of wsusscn2.cab"

$CertificateIssuer = "CN=Microsoft Code Signing PCA, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
$Signature = Get-AuthenticodeSignature -FilePath $CabPath
$SignatureOk = $Signature.SignerCertificate.Issuer -eq $CertificateIssuer -and $Signature.Status -eq "Valid"


If(!$SignatureOk){
    Throw "Signature of wsusscn2.cab is invalid!"
}


Write-Verbose "Creating Windows Update session"
$UpdateSession = New-Object -ComObject Microsoft.Update.Session
$UpdateServiceManager  = New-Object -ComObject Microsoft.Update.ServiceManager 

$UpdateService = $UpdateServiceManager.AddScanPackageService("Offline Sync Service", $CabPath, 1) 

Write-Verbose "Creating Windows Update Searcher"
$UpdateSearcher = $UpdateSession.CreateUpdateSearcher()  
$UpdateSearcher.ServerSelection = 3
$UpdateSearcher.ServiceID = $UpdateService.ServiceID.ToString()
 
Write-Verbose "Searching for missing updates"
$SearchResult = $UpdateSearcher.Search("IsInstalled=0")

$Updates = $SearchResult.Updates

$UpdateSummary = [PSCustomObject]@{

    ComputerName = $env:COMPUTERNAME    
    MissingUpdatesCount = $Updates.Count
    Vulnerabilities = $Updates | Foreach {
        $_.CveIDs
    }
    MissingUpdates = $Updates | Select Title, MsrcSeverity, @{Name="KBArticleIDs";Expression={$_.KBArticleIDs}}
}

Return $UpdateSummary

Run the script on one of the servers or computers to check for missing updates. To do this, copy the script to the machine and run it with the URL to wsusscn2.cab on the IIS server and the hash value from step two:

PS51> Get-MissingUpdates.ps1 -Wsusscn2Url "http://<IIS server>/wsusscn2.cab" -FileHash 31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

If there are missing updates, you should see output similar to the following:

ComputerName     MissingUpdatesCount Vulnerabilities  MissingUpdates
------------     ------------------- ---------------  --------------
UNSECURESERVER                    14 {CVE-2006-4685, CVE-2006-4686,
CVE-2019-1079, CVE-2019-1079...} {@{Title=MSXML 6.0 RTM Security Updat

If the machine is not missing updates, then you should see this type of output:

ComputerName MissingUpdatesCount Vulnerabilities MissingUpdates
------------ ------------------- --------------- --------------
SECURESERVER                   0

The script gives a summary of the number of missing updates, what those updates are and the vulnerabilities they patch.

This process is a great deal faster than searching for missing updates online. But this manual method is not efficient when checking a fleet of servers, so let’s learn how to run the script on all systems and collect the output.

Step 4. Run the scanning script on multiple servers at once

The easiest way to collect missing updates from all servers with PowerShell is with a PowerShell job. The PowerShell jobs run in parallel on all computers, and you can fetch the results.

On a PC or server, save the file from the previous step to the C drive — or another directory of your choice — and run the following as a user with admin permissions on your systems:

# The servers you want to collect missing updates from
$Computers = @(
        'server1',
        'server2',
        'server3'
)

# These are the arguments that will be sent to the remote servers
$RemoteArgs = @(
    # File hash from step 2
    "31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05",
    "http://$env:COMPUTERNAME/wsusscn2.cab"
)

$Params = @{
    ComputerName = $Computers
    ArgumentList = $RemoteArgs
    AsJob        = $True
    # Filepath to the script on the server/computer you are running this command on
    FilePath = "C:\Scripts\Get-MissingUpdates.ps1"
    # Maximum number of active jobs
    ThrottleLimit = 20
}

$Job = Invoke-Command @Params

# Wait for all jobs to finish
$Job | Wait-Job

# Collect Results from the jobs
$Results = $Job | Receive-Job

# Show results
$Results

This runs the Get-MissingUpdates.ps1 script on all servers in the $Computers variable in parallel to save time and make it easier to collect the results.
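Once collected, the summaries are ordinary objects, so reporting is straightforward. A sketch that writes them to a CSV file (the report path is arbitrary):

```powershell
# Flatten each per-server summary into one CSV row
$Results |
    Select-Object ComputerName, MissingUpdatesCount,
        @{Name = "Vulnerabilities"; Expression = { $_.Vulnerabilities -join ";" }} |
    Export-Csv -Path "C:\Reports\MissingUpdates.csv" -NoTypeInformation
```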

You should run these PowerShell jobs regularly to catch servers with a malfunctioning Windows Update and to be sure important updates get installed.
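One way to run the check regularly is a scheduled job. A sketch using the built-in PSScheduledJob module, assuming you saved the collection snippet as C:\Scripts\Collect-MissingUpdates.ps1 (a hypothetical path):

```powershell
# Re-run the collection script every night at 03:00
$Trigger = New-JobTrigger -Daily -At "03:00"
Register-ScheduledJob -Name "MissingUpdateScan" `
    -FilePath "C:\Scripts\Collect-MissingUpdates.ps1" `
    -Trigger $Trigger
```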


Windows 10 issues continue with Autopilot update

Although the latest issue with a Windows 10 update does not appear to have caused damage, users spoke of the need for further testing and oversight as Microsoft rolls out new versions of its OS.

Last week, Microsoft issued an update intended solely for Windows Autopilot-configured computers to the general public. Autopilot is a way for enterprises to automatically set up and configure new devices.

In its notes about the release, Microsoft indicated that the update should have no effect on users who don’t use Autopilot. Still, users offered mixed reactions to the news — especially given that past updates have caused Windows 10 issues such as data deletion, device freezing and the re-lettering of a computer’s drives.

William Warren, the owner of Brunswick, Maryland-based IT services company Emmanuel Technology Consulting, said he was concerned about continuing issues with Windows updates.

“They just can’t seem to get it right,” he said. “When you’re talking about an install base as diverse as [the one] Windows has to deal with, I don’t expect things to be totally problem-free but, with something like this, it’s like, ‘C’mon guys, pay attention.'”


Warren said he had been keeping his clients on an older version of Windows 10 and would likely have them stay there until required to update.

“At least with this one, it didn’t become a huge data destruction debacle,” he said, referring to an October 2018 patch that deleted files of users who work with Known Folder redirection.

How software is currently developed is the real culprit, according to Warren. He pointed to across-the-board issues with software, including the flawed Boeing 737 Max software that led to plane crashes.

“What people need to do is stop fast iteration,” he said. “They need a second, third or fourth pair of eyes looking at code before it’s out the door. The mindset now is to fix bugs after release.”

With such a model, Warren said, companies treat customers as beta testers — or, in some cases, alpha testers.

“The file deletion debacle never should’ve gotten past QA,” he said, adding that he believed Microsoft had curtailed its use of testers in favor of having insiders find problems. “It’s Microsoft being tone-deaf.”

He did note that the latest problem did not seem to cause major issues.

“With this one, I see it as someone who didn’t tick a box they should have, or did tick a box they shouldn’t have,” he said. “It could have been a lot worse.”


Daniel Beato, director of technology at New Jersey IT consulting firm TNTMAX, said he hadn’t used Windows Autopilot and hadn’t been affected by the recent update. Still, he noted a lack of consistency with Microsoft’s rollouts.

“I think there needs to be more testing,” he said.

Beato said there seemed to be fewer Windows 10 issues with updates over time, although minor snags, like losing Google Chrome extensions after updating, remained.

“There were not a lot of big problems, and not a lot of problems with applications,” he said. “They have done a little better.”

Beato said, although the latest problem seems to be an oversight, it’s important to recognize that everyone can make mistakes.

“There’s a human part of things that, I think, sometimes we forget,” he said.

Automation tech continues to mature


Mark Bowker, a senior analyst at Enterprise Strategy Group, said the tension between updating software and maintaining stability was an old one, although one that has heightened in recent years.

“Updates have always been a thorn in the side of IT pros, given their responsibility for deploying and maintaining them,” he said. “Five years ago, the cadence was much further apart. Now, it’s coming at a much faster pace, and it has some IT pros thrown.”

Updates, Bowker said, can bring two things: broken applications or an improved user experience.


As far as Autopilot goes, Forrester Research analyst Andrew Hewitt said he believed users would still have confidence in that service.

“There’s widespread belief that these new deployment automation technologies are still fairly new. They are going to have their bugs and issues and still require a fair amount of work to get them up and running,” he said. “I expect this space to mature rapidly over the next one to two years as customers find ways to optimize the service and Microsoft continues to mature [it].”


Are containers on Windows the right choice for you?

It’s nearly the end of the road for Windows Server 2008/2008 R2. Some of the obvious migration choices are a newer version of Windows Server or moving the workload into Azure. But does a move to containers on Windows make sense?

After Jan. 14, 2020, Microsoft ends extended support for the Windows Server 2008 and 2008 R2 OSes, which also means no more security updates unless one enrolls in the Extended Security Update program. While Microsoft prefers that its customers move to the Azure cloud platform, another choice is to use containers on Windows.

Understand the two different virtualization technologies

If you are thinking about containerizing Windows Server 2008/2008 R2 workloads, then you need to consider the ways containers differ from a VM. The most basic difference is a container is much lighter than a VM. Whereas each VM has its own OS, containers share a base OS image. The container generally includes application binaries and anything else necessary to run the containerized application.

Containers share a common kernel, which has advantages and disadvantages. One advantage is containers can be extremely small in size. It is quite common for a container to be less than 100 MB, which enables them to be brought online very quickly. The low overhead of containers makes it possible to run far more containers than VMs on the host server.

However, the shared kernel is also a single point of failure. If the kernel fails, then all the containers that depend on it will also fail. Similarly, a poorly written application can destabilize the kernel, leading to problems with other containers on the system.

VMs vs. Docker

As a Windows administrator considering containerizing legacy Windows Server workloads, you need to consider the fundamental difference between VMs and containers. While containers do have their place, they are a poor choice for applications with high security requirements due to the shared kernel or for applications with a history of sporadic stability issues.

Another major consideration with containers is storage. Early on, containers were used almost exclusively for stateless workloads because they could not store data persistently. Unlike a VM, a container discards all data held inside it when it is removed.

Container technology has evolved to support persistent storage through the use of data volumes. Even so, it can be difficult to work with data volumes. Applications that have complex storage requirements usually aren’t a good fit for containerization. For example, database applications tend to be poor candidates due to their complex storage configuration.
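To illustrate, persistent storage for a Windows container is typically supplied by mounting a host directory into the container at run time. A sketch using the docker CLI from PowerShell; the paths and container name are illustrative:

```powershell
# Bind-mount C:\ContainerData on the host to C:\Data inside the container,
# so files written under C:\Data survive after the container is removed
docker run -it --name demo `
    -v C:\ContainerData:C:\Data `
    mcr.microsoft.com/windows/servercore:ltsc2019 cmd
```

Anything the application writes outside the mounted path still disappears with the container, which is part of why apps with complex storage layouts are hard to containerize.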

If you are used to managing physical or virtual Windows Server machines, you might think of setting up persistent storage as a migration-specific task. While there is a requirement to provide a containerized application with the persistent storage it needs, that is a one-time task completed as part of the application migration. It is important to remember that containers are designed to be completely portable. A containerized application can move from a development and test environment to a production server or to a cloud host without the need to repackage the application. Setting up complex storage dependencies can undermine that portability, so an organization will need to consider whether a newly containerized application will ever need to move to another location.

What applications are suited for containers?

As part of the decision-making process related to using containers on Windows, it is worth considering what types of applications are best suited for this type of deployment. Almost any application can be containerized, but the ideal candidate is a stateless application with varying scalability requirements. For example, a front-end web application is often an excellent choice for a containerized deployment for a few reasons. First, web applications tend to be stateless. Data is usually saved on a back-end database that is separate from the front-end application. Second, container platforms work well to meet an application’s scalability requirements. If a web application sees a usage spike, additional containers can instantly spin up to handle the demand. When the spike ebbs, it’s just a matter of deleting the containers.
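The scalability point can be sketched with Docker running in swarm mode; the service name webfrontend is hypothetical:

```powershell
# Scale a stateless web front end up to handle a usage spike...
docker service scale webfrontend=10

# ...and back down once the spike ebbs
docker service scale webfrontend=2
```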

Before migrating any production workloads to containers, the IT staff needs to develop the necessary expertise to deploy and manage containers. While container management is not usually overly difficult, it is completely different from VM management. Windows Server 2019 supports the use of Hyper-V containers, but you cannot use Hyper-V Manager to create, delete and migrate containers in the same way that you would perform these actions on a VM.

Containers are a product of the open source world and are therefore managed from the command line using Linux-style commands that are likely to be completely foreign to many Windows administrators. There are GUI-based management tools, such as the Kubernetes dashboard, but even these tools require some time and effort to learn. As such, proper training is essential to a successful container deployment.
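As a taste of that command-line workflow, here is a minimal sketch of pulling and running a Windows Server Core container with Docker from a PowerShell prompt. It assumes Docker is already installed on the host; the image tag and container name are illustrative:

```powershell
# Pull a Windows Server Core base image from the Microsoft Container Registry
docker pull mcr.microsoft.com/windows/servercore:ltsc2019

# Run it in the background as a Hyper-V isolated container
docker run -d --name demo --isolation=hyperv mcr.microsoft.com/windows/servercore:ltsc2019 ping -t localhost

# List running containers, then stop and remove the demo container
docker ps
docker stop demo
docker rm demo
```

None of these commands involve Hyper-V Manager, which illustrates the point above: container lifecycle management is a different discipline from VM management.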

Despite their growing popularity, containers are not an ideal fit for every workload. While some Windows Server workloads are good candidates for containerization, other workloads are better suited as VMs. As a general rule, organizations should avoid containerizing workloads that have complex, persistent storage requirements or require strict kernel isolation.

Go to Original Article
Author:

Get back on the mend with Active Directory recovery methods

Active Directory is the bedrock of most Windows environments, so it’s best to be prepared if disaster strikes.

AD is an essential component in most organizations. You should monitor and maintain AD, for example by clearing out user and computer accounts you no longer need. With routine care, AD will run properly, but unforeseen issues can still arise. There are a few common Active Directory recovery procedures you can follow using out-of-the-box technology.

Loss of a domain controller

Many administrators see losing a domain controller as a huge disaster, but the Active Directory recovery effort is relatively simple — unless your AD was not properly designed and configured. You should never rely on a single domain controller in your domain, and large sites should have multiple domain controllers. Correctly configured site links will keep authentication and authorization working even if the site loses its domain controller.

You have two possible approaches to resolve the loss of a domain controller. The first option is to try to recover the domain controller and bring it back into service. The second option is to replace the domain controller. I recommend adopting the second approach, which requires the following actions:

  • Transfer or seize any flexible single master operation roles to an active domain controller. If you seize the role, then you must ensure that the old role holder is never brought back into service.
  • Remove the old domain controller’s account from AD. This will also remove any metadata associated with the domain controller.
  • Build a new server, join it to the domain, install Active Directory Domain Services and promote it to a domain controller.
  • Allow replication to repopulate the AD data.
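Sketched in PowerShell, the replacement process might look like the following. The domain controller names are hypothetical, and you should only seize roles when the old holder is permanently out of service:

```powershell
# On a surviving DC: seize the FSMO roles held by the failed domain controller
Move-ADDirectoryServerOperationMasterRole -Identity 'DC02' -Force `
    -OperationMasterRole SchemaMaster, DomainNamingMaster, PDCEmulator, RIDMaster, InfrastructureMaster

# On the replacement server: install the role, then promote it into the existing domain
Install-WindowsFeature -Name AD-Domain-Services -IncludeManagementTools
Install-ADDSDomainController -DomainName 'sphinx.org' -InstallDns -Credential (Get-Credential)
```

After the promotion completes, replication repopulates the AD data on the new domain controller.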

How to protect AD data

Protecting data can go a long way to make an Active Directory recovery less of a problem. There are a number of ways to protect AD data. These techniques, by themselves, might not be sufficient. But, when you combine them, they provide a defense in depth that should enable you to overcome most, if not all, disasters.

First, enable accidental deletion protection on all of your organizational units (OUs), as well as on user and computer accounts. This won't stop administrators from removing an account, but the warning it triggers might prevent an accident.
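A quick way to sweep up unprotected OUs is to find every OU without the flag and set it. This is a sketch using the ActiveDirectory module:

```powershell
# Enable accidental deletion protection on every OU that lacks it
Get-ADOrganizationalUnit -Filter * -Properties ProtectedFromAccidentalDeletion |
    Where-Object { -not $_.ProtectedFromAccidentalDeletion } |
    Set-ADOrganizationalUnit -ProtectedFromAccidentalDeletion $true
```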

protect from accidental deletion option
Select the option to protect from accidental deletion when creating an organizational unit in AD Administrative Center.

Recover accounts from the AD recycle bin

Another way to avoid trouble is to enable the AD recycle bin. This is an optional feature used to restore a deleted object.

Enable-ADOptionalFeature -Identity 'Recycle Bin Feature' -Scope ForestOrConfigurationSet -Target sphinx.org -Confirm:$false

After installing the feature, you may need to enable it through AD Administrative Center. Once added, you can’t uninstall the recycle bin.

Let’s run through a scenario where a user, whose properties are shown in the screenshot below, has been deleted.

Active Directory user account
An example of a typical user account in AD, including group membership

To check for deleted user accounts, run a search in the recycle bin:

Get-ADObject -Filter {objectclass -eq 'user' -and Deleted -eq $true} -IncludeDeletedObjects

The output for this command returns a deleted object, the user with the name Emily Brunel.

Active Directory recycle bin
An AD object found in the recycle bin

For a particularly volatile AD, you may need to apply further filters to identify the account you wish to restore.

If you have a significant number of objects in the recycle bin, use the object globally unique identifier (GUID) to identify the object to restore.

Get-ADObject -Filter {ObjectGUID -eq '73969b9d-05fa-4b45-a667-79baba1ac9a3'} `
-IncludeDeletedObjects -Properties * | Restore-ADObject

The screenshot shows the restored object and its properties, including the group membership.

restored Active Directory user account
Restoring an AD user account from recycle bin

Generate AD snapshots

The AD recycle bin helps restore an object, but what do you do when you restore an account with incorrect settings?

To fix a user account in that situation, it helps to create AD snapshots to view previous settings and restore attributes. Use the following command from an elevated prompt:

ntdsutil snapshot 'Activate Instance NTDS' Create quit quit

The Ntdsutil command-line tool installs with AD and generates the output in this screenshot when creating the snapshot.

Active Directory snapshot
The command-line output when creating an AD snapshot

You don’t need to take snapshots on every domain controller. The number of snapshots will depend on the geographic spread of your organization and the arrangement of the administration team.

The initial snapshot captures the entire AD. Subsequent snapshots take incremental changes. The frequency of snapshots should be related to the amount of movement of the data in your AD.
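To take snapshots on a regular cadence, you could wrap the Ntdsutil command in a scheduled task. The following sketch assumes a daily 2 a.m. schedule; adjust the task name and timing to suit your environment:

```powershell
# Register a daily task that takes an AD snapshot (schedule is illustrative)
$action  = New-ScheduledTaskAction -Execute 'ntdsutil.exe' `
    -Argument 'snapshot "Activate Instance NTDS" Create quit quit'
$trigger = New-ScheduledTaskTrigger -Daily -At 02:00
Register-ScheduledTask -TaskName 'AD-Snapshot' -Action $action -Trigger $trigger `
    -User 'SYSTEM' -RunLevel Highest
```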

Restore data from a snapshot

In this test scenario, let’s assume that the group memberships of a user account have been incorrectly changed. Run the following PowerShell commands to remove the user’s group memberships:

Remove-ADGroupMember -Identity finance -Members (Get-ADUser -Identity EmilyBrunel) -Confirm:$false
Remove-ADGroupMember -Identity department1 -Members (Get-ADUser -Identity EmilyBrunel) -Confirm:$false
Remove-ADGroupMember -Identity project1 -Members (Get-ADUser -Identity EmilyBrunel) -Confirm:$false

You need to identify the snapshot from which you will restore the data. The following command lists the snapshots:

ntdsutil snapshot 'List All' quit quit

Active Directory snapshots list
The Ntdsutil utility produces a list of the available AD snapshots.

To mount the snapshot, run the following command:

ntdsutil snapshot "mount f828eb4e-3a06-4bcb-8db6-2b07b54f9d5f" quit quit

Run the following command to open the snapshot:

dsamain -dbpath 'C:\$SNAP_201909161530_VOLUMEC$\Windows\NTDS\ntds.dit' -ldapport 51389

The Dsamain utility gets added to the system when you install AD Domain Services. Note that the console you use to mount and open the snapshot is locked.

Active Directory snapshot
Mount and open the AD snapshot.

When you view the group membership of the user account in your AD, it will be empty. The following command will not return any output:

Get-ADUser -Identity EmilyBrunel -Properties memberof | select -ExpandProperty memberof

When you view the same account from your snapshot, you can see the group memberships:

Get-ADUser -Identity EmilyBrunel -Properties memberof -Server TTSDC01.sphinx.org:51389  | select -ExpandProperty memberof
CN=Project1,OU=Groups,DC=Sphinx,DC=org
CN=Department1,OU=Groups,DC=Sphinx,DC=org
CN=Finance,OU=Groups,DC=Sphinx,DC=org

To restore the group memberships, run the following:

Get-ADUser -Identity EmilyBrunel -Properties memberof -Server TTSDC01.sphinx.org:51389  | select -ExpandProperty memberof | 
ForEach-Object {Add-ADGroupMember -Identity $_ -Members (Get-ADUser -Identity EmilyBrunel)}

After reinserting the group memberships from the snapshot version of the account, add the user into those groups in your production AD.

Your user account now has the correct group memberships:

Get-ADUser -Identity EmilyBrunel -Properties memberof | select -ExpandProperty memberof
CN=Project1,OU=Groups,DC=Sphinx,DC=org
CN=Department1,OU=Groups,DC=Sphinx,DC=org
CN=Finance,OU=Groups,DC=Sphinx,DC=org

Press Ctrl-C in the console in which you ran Dsamain, and then unmount the snapshot:

ntdsutil snapshot "unmount *" quit quit

Run an authoritative restore from a backup

In the last scenario, imagine you lost a whole OU’s worth of data, including the OU. You could do an Active Directory recovery using data from the recycle bin, but that would mean restoring the OU and any OUs it contained. You would then have to restore each individual user account. This could be a tedious and error-prone process if the data in the user accounts in the OU changes frequently. The solution is to perform an authoritative restore.

Before you can perform a restore, you need a backup. We’ll use Windows Server Backup because it is readily available. Run the following PowerShell command to install:

Install-WindowsFeature -Name Windows-Server-Backup

The following code will create a backup policy and run a system state backup:

Import-Module WindowsServerBackup
$wbp = New-WBPolicy

$volume = Get-WBVolume -VolumePath C:
Add-WBVolume -Policy $wbp -Volume $volume

Add-WBSystemState $wbp

$backupLocation = New-WBBackupTarget -VolumePath R:
Add-WBBackupTarget -Policy $wbp -Target $backupLocation

Set-WBVssBackupOptions -Policy $wbp -VssCopyBackup

Start-WBBackup -Policy $wbp

The following command creates a backup of the system state, including the AD database:

Add-WBSystemState $wbp

The following code creates a scheduled backup of the system state at 8 a.m., noon, 4 p.m. and 8 p.m.

Set-WBSchedule -Policy $wbp -Schedule 08:00, 12:00, 16:00, 20:00
Set-WBPolicy -Policy $wbp

In this example, let’s say an OU called Test with some critical user accounts got deleted.

Reboot the domain controller on which you performed the backup, and go into Directory Services Restore Mode (DSRM). If your domain controller is a VM, you may need to use Msconfig to set the boot option rather than using the F8 key to reach the boot options menu. Once in DSRM, run the following commands to restore the system state:

$bkup = Get-WBBackupSet | select -Last 1
Start-WBSystemStateRecovery -BackupSet $bkup -AuthoritativeSysvolRecovery

Type Y, and press Enter to restore to the original location.

At the prompt, restart the domain controller to boot back into recovery mode.

You need to mark the restored OU as authoritative by using Ntdsutil:

ntdsutil
C:\Windows\system32\ntdsutil.exe: activate instance NTDS
Active instance set to "NTDS".
C:\Windows\system32\ntdsutil.exe: authoritative restore
authoritative restore: restore subtree "ou=test,dc=sphinx,dc=org"

A series of messages will indicate the progress of the restoration, including the number of objects restored.

Exit Ntdsutil:

authoritative restore: quit
C:\Windows\system32\ntdsutil.exe: quit

Restart the domain controller. Use Msconfig before the reboot to reset to a normal start.

The OU will be restored on your domain controller and will replicate to the other domain controllers in AD.

A complete loss of AD requires intervention

In the unlikely event of losing your entire AD forest, you'll need to work through Microsoft's Active Directory forest recovery guide. If you have a support agreement with Microsoft, then this would be the ideal time to use it.


For VMware, DSC provides ESXi host and resource management

PowerShell Desired State Configuration has been a favorite among Windows infrastructure engineers for years, and the advent of the VMware DSC module means users who already use DSC to manage Windows servers can use it to manage VMware, too. As VMware has continued to develop the module, it has increased the number of vSphere components the tool can manage, including VMware Update Manager.

DSC has been the configuration management tool of choice for Windows since it was released. No other tool offers such a wide array of capabilities to manage a Windows OS in code instead of through a GUI.

VMware also uses PowerShell technology to manage vSphere. The vendor officially states that PowerCLI, its PowerShell module, is the best automation tool it offers. So, it only makes sense that VMware would eventually incorporate DSC so that its existing PowerShell customers can manage their assets in code.

Why use DSC?

Managing a machine through configuration as code is not new, especially in the world of DevOps. You can write a server’s desired state in code, which ensures you can quickly resolve any drift in configuration by applying that configuration frequently.

In vSphere, ESXi hosts in particular are prime candidates for this type of management. An ESXi host's configuration does not change often, and when it does change, an admin must make the change deliberately. With DSC, any change to the configuration code is applied to the hosts automatically.

You can use this tool to manage a number of vSphere components, such as VMware Update Manager and the vSphere Standard Switch.

How the LCM works

In DSC, the Local Configuration Manager (LCM) makes up the brains of a node. It takes in the configuration file, then parses and applies the changes locally.

ESXi and vCenter do not have an LCM, so in the context of vSphere, you must use an LCM proxy, which is a Windows machine running PowerShell v5.1 and PowerCLI 10.1.1.

Installing the module

Installing the module is simple, as the DSC module is part of PowerShell Gallery. It only takes a single cmdlet to install the module on your LCM proxy:

C:> Install-Module -Name VMware.vSphereDSC

Updating the module when VMware releases a new version is also a simple task. You can use the Update-Module cmdlet:

C:> Update-Module vmware.vspheredsc

Resources

DSC ties a resource to a particular area of a system it can manage. The DSC module vmware.vspheredsc, for example, can manage various aspects of vSphere, such as the following:

C:\Users\dan> Get-DscResource -Module vmware.vspheredsc | Select Name

Name
----
Cluster
Datacenter
DatacenterFolder
DrsCluster
Folder
HACluster
PowerCLISettings
vCenterSettings
vCenterStatistics
VMHostAccount
VMHostDnsSettings
VMHostNtpSettings
VMHostSatpClaimRule
VMHostService
VMHostSettings
VMHostSyslog
VMHostTpsSettings
VMHostVss
VMHostVssBridge
VMHostVssSecurity
VMHostVssShaping
VMHostVssTeaming

Many such resources are associated with ESXi hosts. You can manage settings such as host accounts, Network Time Protocol and services through DSC. For clusters, you can manage settings such as HAEnabled, the DRS automation level and DRS distribution. To view the properties that a resource exposes, use the Get-DscResource cmdlet with the -Syntax switch:

C:> Get-DscResource -Name Cluster -Module vmware.vspheredsc -Syntax
Cluster [String] #ResourceName
{
[DependsOn = [String[]]]
[PsDscRunAsCredential = [PSCredential]]
Server = [String]
Credential = [PSCredential]
Name = [String]
Location = [String]
DatacenterName = [String]
DatacenterLocation = [String]
Ensure = [String]
[HAEnabled = [Boolean]]
[HAAdmissionControlEnabled = [Boolean]]
[HAFailoverLevel = [Int32]]
[HAIsolationResponse = [String]]
[HARestartPriority = [String]]
[DrsEnabled = [Boolean]]
[DrsAutomationLevel = [String]]
[DrsMigrationThreshold = [Int32]]
[DrsDistribution = [Int32]]
[MemoryLoadBalancing = [Int32]]
[CPUOverCommitment = [Int32]]
}
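Putting one of these resources to work looks like any other DSC configuration. The following is a minimal sketch using the VMHostNtpSettings resource; the ESXi host name, vCenter address and NTP servers are all hypothetical:

```powershell
Configuration EsxiNtp {
    param ([PSCredential] $Credential)

    Import-DscResource -ModuleName VMware.vSphereDSC

    Node 'localhost' {
        # Point the ESXi host at a pair of NTP servers
        VMHostNtpSettings 'NtpSettings' {
            Name       = 'esxi01.sphinx.org'    # ESXi host to configure (hypothetical)
            Server     = 'vcenter.sphinx.org'   # vCenter connection (hypothetical)
            Credential = $Credential
            NtpServer  = @('0.pool.ntp.org', '1.pool.ntp.org')
        }
    }
}

# Compile the MOF on the LCM proxy and apply it
EsxiNtp -Credential (Get-Credential) -OutputPath C:\DSC
Start-DscConfiguration -Path C:\DSC -Wait -Verbose
```

Because the LCM proxy applies the configuration on behalf of the host, this runs on the Windows machine with PowerCLI installed, not on the ESXi host itself.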

With the capabilities of DSC now available to VMware admins, as well as Windows admins, they can control a variety of server variables through code and make vSphere and vCenter automation easy and accessible. They can apply broad changes across an entire infrastructure of hosts and ensure consistent configuration.


How to install and test Windows Server 2019 IIS

Transcript – How to install and test Windows Server 2019 IIS

In this video, I want to show you how to install Internet Information Services, or IIS, and prepare it for use.

I’m logged into a domain-joined Windows Server 2019 machine and I’ve got the Server Manager open. To install IIS, click on Manage and choose the Add Roles and Features option. This launches the Add Roles and Features wizard. Click Next on the welcome screen and choose role-based or feature-based installation for the installation type and click Next.

Make sure that My Server is selected and click Next. I’m prompted to choose the roles that I want to deploy. We have an option for web server IIS. That’s the option I’m going to select. When I do that, I’m prompted to install some dependency features, so I’m going to click on Add Features and I’ll click Next.

I’m taken to the features screen. All the dependency features that I need are already being installed, so I don’t need to select anything else. I’ll click Next, Next again, Next again on the Role Services — although if you do need to install any additional role services to service the IIS role, this is where you would do it. You can always enable these features later on, so I’ll go ahead and click Next.

I’m taken to the Confirmation screen and I can review my configuration selections. Everything looks good here, so I’ll click install and IIS is being installed.
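As an aside for anyone who prefers to skip the wizard, the same installation can be scripted; this one-liner should produce an equivalent result:

```powershell
# Install the Web Server (IIS) role along with its management tools
Install-WindowsFeature -Name Web-Server -IncludeManagementTools
```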

Testing Windows Server 2019 IIS

The next thing that I want to do is test IIS to make sure that it’s functional. I’m going to go ahead and close this out and then go to local server. I’m going to go to IE Enhanced Security Configuration. I’m temporarily going to turn this off just so that I can test IIS. I’ll click OK and I’ll close Server Manager.

The next thing that I want to do is find this machine’s IP address, so I’m going to right-click on the Start button and go to Run and type CMD to open a command prompt window, and then from there, I’m going to type ipconfig.

Here I have the server’s IP address, so now I can open up an Internet Explorer window and enter this IP address and Internet Information Services should respond. I’ve entered the IP address, then I press enter and I’m taken to the Internet Information Services screen. IIS is working at this point.

I’ll go ahead and close this out. If this were a real-world deployment, one of the next things that you would probably want to do is begin uploading some of the content that you’re going to use on your website so that you can begin testing it on this server.

I’ll go ahead and open up File Explorer, and I’ll go to This PC, the C: drive, the inetpub folder and the wwwroot subfolder. This is where you would copy all of the files for your website. You can configure IIS to use a different folder, but this is the one used by default for IIS content. You can see the files right here that make up the page that you saw a moment ago.

How to work with the Windows Server 2019 IIS bindings

Let’s take a look at a couple of the configuration options for IIS. I’m going to go ahead and open up Server Manager and what I’m going to do now is click on Tools, and then I’m going to choose the Internet Information Services (IIS) Manager. The main thing that I wanted to show you within the IIS Manager is the bindings section. The bindings allow traffic to be directed to a specific website, so you can see that, right now, we’re looking at the start page and, right here, is a listing for my IIS server.

I’m going to go ahead and expand this out and I’m going to expand the site’s container and, here, you can see the default website. This is the site that I’ve shown you just a moment ago, and then if we look over here on the Actions menu, you can see that we have a link for Bindings. When I open up the Bindings option, you can see by default we’re binding all HTTP traffic to port 80 on all IP addresses for the server.

We can edit [the site bindings] if I select [the site] and click on it. You can see that we can select a specific IP address. If the server had multiple IP addresses associated with it, we could link a different IP address to each site. We could also change the port that’s associated with a particular website. For example, if I wanted to bind this particular website to port 8080, I could do that by changing the port number. Generally, you want HTTP traffic to flow on port 80. The other thing that you can do here is to assign a hostname to the site, for example www.contoso.com or something to that effect.

The other thing that I want to show you in here is how to associate HTTPS traffic with a site. Typically, you’re going to have to have a certificate to make that happen, but assuming that that’s already in place, you click on Add and then you would change the type to HTTPS and then you can choose an IP address; you can enter a hostname; and then you would select your SSL certificate for the site.

You’ll notice that the port number is set to 443, which is the default port that’s normally used for HTTPS traffic. So, that’s how you install IIS and how you configure the bindings for a website.
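The same binding changes can also be made from PowerShell with the WebAdministration module. The port, hostname and site name below are illustrative, and the HTTPS binding still needs a certificate assigned separately:

```powershell
Import-Module WebAdministration

# Move the default site's HTTP binding from port 80 to port 8080
Set-WebBinding -Name 'Default Web Site' -BindingInformation '*:80:' -PropertyName Port -Value 8080

# Add an HTTPS binding on port 443 with a hostname
New-WebBinding -Name 'Default Web Site' -Protocol https -Port 443 -HostHeader 'www.contoso.com'
```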



Can Windows Server Standard Really Only Run 2 Hyper-V VMs?

Q. Can Windows Server Standard Edition really only run 2 Hyper-V virtual machines?

A. No. Standard Edition can run just as many virtual machines as Datacenter Edition.

I see and field this particular question quite frequently. A misunderstanding of licensing terminology and a lot of tribal knowledge have created the image of an artificial limitation in Standard Edition. The two editions differ in licensing and in a handful of Hyper-V-related features; otherwise, they share functionality.

The True Limitation

The correct statement behind the misconception: a physical host with the minimum Windows Standard Edition license can operate two virtualized instances of Windows Server Standard Edition, as long as the physically-installed instance only operates the virtual machines. That’s a lot to say. But, anything less does not tell the complete story. Despite that, people try anyway. Unfortunately, they shorten it all the way down to, “you can only run two virtual machines,” which is not true.

Virtual Machines Versus Instances

First part: a “virtual machine” and an “operating system instance” are not the same thing. When you use Hyper-V Manager or Failover Cluster Manager or PowerShell to create a new virtual machine, that’s a VM. That empty, non-functional thing that you just built. Hyper-V has a hard limit of 1,024 running virtual machines. I have no idea how many total VMs it will allow. Realistically, you will run out of hardware resources long before you hit any of the stated limits. Up to this point, everything applies equally to Windows Server Standard Edition and Windows Server Datacenter Edition (and Hyper-V Server, as well).

The previous paragraph refers to functional limits. The misstatement that got us here sources from licensing limits. Licenses are legal things. You give money to Microsoft, they allow you to run their product. For this discussion, their operating system products concern us. The licenses in question allow us to run instances of Windows Server. Each distinct, active Windows kernel requires sufficient licensing.

Explaining the “Two”

The “two” is the most truthful part of the misconception. One Windows Server Standard Edition license pack allows for two virtualized instances of Windows Server. You need a certain number of license packs to reach a minimum level (see our eBook on the subject for more information). As a quick synopsis, the minimum license purchase applies to a single host and grants:

  • One physically-installed instance of Windows Server Standard Edition
  • Two virtualized instances of Windows Server Standard Edition

This does not explain everything — only enough to get through this article. Read the linked eBook for more details. Consult your license reseller. Insufficient licensing can cost you a great deal in fines. Take this seriously and talk to trained counsel.

What if I Need More Than Two Virtual Machines on Windows Server Standard Edition?

If you need to run three or more virtual instances of Windows Server, then you buy more licenses for the host. Each time you satisfy the licensing requirements, you have the legal right to run another two Windows Server Standard instances. Due to the per-core licensing model introduced with Windows Server 2016, the minimums vary based on the total number of cores in a system. See the previously-linked eBook for more information.
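As a rough illustration of the math (deliberately simplified; always confirm the details with your license reseller), consider a 16-core host that needs to run six Windows Server Standard instances:

```powershell
$coresPerHost = 16   # physical cores in the host
$vmCount      = 6    # simultaneous Windows Server Standard instances

$setsForCores = [math]::Ceiling($coresPerHost / 16)  # one license set covers up to 16 cores
$setsForVMs   = [math]::Ceiling($vmCount / 2)        # each set grants two virtualized instances
$setsNeeded   = [math]::Max($setsForCores, $setsForVMs)

"License sets needed: $setsNeeded"   # License sets needed: 3
```

In other words, every additional pair of instances means licensing all of the host's cores again, which is what eventually makes Datacenter Edition the cheaper option at higher densities.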

What About Other Operating Systems?

If you need to run Linux or BSD instances, then you run them (some distributions do have paid licensing requirements; the distribution manufacturer makes the rules). Linux and BSD instances do not count against the Windows Server instances in any way. If you need to run instances of desktop Windows, then you need one Windows license per instance at the very least. I do not like to discuss licensing desktop Windows, as it has complications and nuances; definitely consult a licensing expert about those situations. In any case, the two virtualized instances granted by a Windows Server Standard license can only apply to Windows Server Standard.

What About Datacenter Edition?

Mostly, people choose Datacenter Edition for the features. If you need Storage Spaces Direct, then only Datacenter Edition can help you. However, Datacenter Edition allows for an unlimited number of running Windows Server instances. If you run enough on a single host, then the cost for Windows Server Standard eventually meets or exceeds the cost of Datacenter Edition. The exact point depends on the discounts you qualify for. You can expect to break even somewhere around ten to twelve virtual instances.

What About Failover Clustering?

Both Standard and Datacenter Edition can participate as full members in a failover cluster. Each physical host must have sufficient licenses to operate the maximum number of virtual machines it might ever run simultaneously. Consult with your license reseller for more information.

Go to Original Article
Author: Eric Siron