
Wanted – 2 or 4 drive NAS for Plex

I’ll admit, I used an external server to do all my transcoding, so I’m unable to answer the questions re: streaming.

Overall, yes, the HDDs have a fair few hours on them (36,200 hours each), but the Synology SMART check on them shows no issues. In terms of usage, it’s quite light (hence the sale), so there has been very little read/write to the drives overall for their age (which is approximately 3-4 years from memory).

It’s literally the NAS, the power supply and the drives that I’ve got (no box etc.), and postage is likely to be pricey. All in, I’d probably be shooting for £135 delivered: £100 for the NAS, £25 for the pair of drives and £10 for delivery. There aren’t many to compare prices with on here, but it’s comparable with eBay after fees etc.


Battle lines over Windows Server 2008 migration drawn

With technical support for Windows Server 2008 ending this week, the battle between Microsoft and AWS for the hearts and wallets of its corporate users is underway.

At its re:Invent conference last month, AWS introduced its appropriately named AWS End-of-Support Migration Program (EMP) for Windows Server, aimed at helping users with their Windows Server 2008 migration efforts. The program promises to make it easier to shift users’ existing Windows Server 2008 workloads over to newer versions of Windows running on servers in AWS’ data centers. The EMP technology decouples the applications from the underlying operating system, thereby allowing AWS partners to migrate mission-critical applications over to the newer versions of Windows Server.

The technology reportedly identifies whatever dependencies the application has on Windows Server 2008 and then pulls together the resources needed for applications to run on the updated version of Windows Server. The package of software includes all applications files, runtimes, components and deployment tools, along with an engine that redirects API calls from your application to files within the package, the company said.

Punching back in a blog this week, Vijay Kumar, director of Windows Server and Azure products at Microsoft, stressed the advantages of his company’s products for users undergoing Windows 2008 server migration efforts. Users can deploy Windows Server workloads in Azure a number of ways, he wrote, including the company’s Virtual Machines on Azure, Azure VMware Solutions and Azure Dedicated Host. Users can also apply Azure Hybrid Benefit service to leverage their existing Windows Server licenses in Azure.

Kumar also noted that users can take advantage of Microsoft’s Extended Security Updates program specifically aimed at Windows Server 2008/R2 users, which provides an additional three years of security updates. This can buy users more time to plan their transition paths for core applications and services, he wrote.

The battle to own Windows Server 2008 migration

AWS has long targeted Windows Server users and, in fact, has convinced more than a few IT shops to switch over to AWS EC2 cloud environment. It stepped up those efforts with the introduction of its AWS-Microsoft Workload Competency program for partners last fall, according to one analyst.


“[AWS] had as many as 14,000 Windows Server customers running on EC2 as of July 2019,” said Meaghan McGrath, a senior analyst at Technology Business Review. “That number is a fivefold increase over 2015.”

Microsoft has stemmed some of the bleeding, however, McGrath added. For instance, the company has convinced many of its partners to push its free migration assessment program, which gives users a more precise estimate of what their total cost of ownership will be by keeping their SQL Server workloads in Microsoft environments compared to migrating them to AWS’s EC2. But the company is also applying some financial pressure, as well.

“As of last fall, there is a caveat in the Software Assurance contracts among [SQL Server] users that made it much more expensive for them to bring their licenses over to another vendor’s hosted environment,” McGrath said. “The other financial incentive is [Microsoft’s] Azure Hybrid Benefit program, which offers users a discount on Azure services for migrating their workloads from licensed software.”

32-bit apps snagging Windows Server 2008 migration efforts

Last summer, Microsoft officials said the operating system still represents 60% of the company’s overall server installed base, a number that’s likely so large because it’s the last 32-bit version of Windows Server. Many corporate users developed customized applications for the platform, which can be expensive and time-consuming to migrate to 64-bit platforms. Users can also have difficulty moving a 32-bit app purchased from a reputable third-party developer to a 64-bit environment, typically because that developer has discontinued support for the offering.


“When you are dealing with a [Windows Server] 2008 app, you can’t assume there will be a 64-bit version of that app available,” said Paul Delory, a research director at Gartner. “Users have to coordinate with all their vendors from whom they bought commercial software to know if they are supporting their app on the new OS. If not, you have to factor in the associated costs there.”

Still, the expense of adapting an existing 32-bit app on Windows Server 2008 is not nearly as high as the cost of remaining on the existing versions of the operating system and associated applications. With the product going out of technical support this week, users will have to pay for Microsoft’s Extended Support, which could double the cost of the technical support they were getting under their initial services agreement.

“You can go to extended support, which gets you three years’ worth of updates, but that requires you to have Software Assurance,” Delory said. “Extended support costs you 75% of your annual licensing costs, and SA [Software Assurance] is an additional 25%, making it twice as much.”
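The arithmetic behind Delory’s estimate is easy to work through. The annual spend figure below is purely hypothetical, for illustration only:

```python
# Sketch of Delory's extended-support arithmetic.
# The annual licensing spend is a hypothetical figure for illustration.
annual_licensing = 10_000  # assumed annual Windows Server licensing cost

extended_support = 0.75 * annual_licensing    # extended support: 75% of annual licensing
software_assurance = 0.25 * annual_licensing  # Software Assurance: an additional 25%

extra_cost = extended_support + software_assurance
# The extra outlay equals the annual licensing itself, i.e. the total doubles.
print(extra_cost)
```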

He said a practical and less expensive solution for users facing this situation is to consider gravitating to a SaaS-based offering such as Office 365 or a similar offering with the same capabilities.

“Something like [Office 365] will be the path of least resistance for many companies because it offers them the chance to sidestep some of these problems,” Delory said. “You can make these problems someone else’s in exchange for a reasonable monthly fee.”

Other options for users leaning away from a Windows Server 2008 migration are much less attractive. They can leave the server in place and mitigate the vulnerabilities as best they can, Delory said, or tuck it behind a firewall and whitelist only certain IP addresses or leave certain ports open.

“You can bring in an Intrusion Prevention System to detect vulnerabilities, but that system must have an understanding of Windows Server 2008 vulnerabilities and be able to maintain them across all your applications,” Delory said.


For Sale – For parts or complete. Desktop CAD/Photoshop etc. i7, Nvidia quadro…

Selling my project PC. Has been used (successfully) as a CCTV server for the past 18 months – 2 years without ever being pushed. All parts were bought new but no retail packaging. Please assume no warranty. No operating system installed either. Selling as we’ve now upgraded to a dedicated Xeon server. Parts listed below.

Generic desktop tower case.
Supermicro C7H270-CG-ML motherboard.
Intel i7 7700 3.6 ghz with stock cooler.
PNY Nvidia quadro M2000 4gb.
Kingston hyperx fury DDR4 16gb RAM (2x8gb).
Seagate Skyhawk 4tb HDD (NO OS).
ACBEL 300w PSU.

Aside from the PSU, this is a solid machine with decent potential. Could easily be used for gaming with one or two changes and could be used for CAD or Photoshop as is (or just change the PSU). This handled Hikvision and up to 56 cameras (we had 13 on screen at any one time, could handle more) but admittedly struggled with playback on any more than four cameras at once (all 4K cameras). The case has a dent or two in it but is entirely usable. Did intend to keep it for the Mrs for her photography but she’s bought a MacBook instead.

Cost around £2000 new. Asking £700 including postage but collection preferred (from Plymouth). Very open to offers as I’ve struggled to price this up to be honest.

Cheers, Chocky.


Using wsusscn2.cab to find missing Windows updates

Keeping your Windows Server and Windows desktop systems updated can be tricky, and finding missing patches in conventional ways might not be reliable.

There are a few reasons why important security patches might not get installed. They could be mistakenly declined in Windows Server Update Services or get overlooked in environments that lack an internet connection.

Microsoft provides a Windows Update offline scan file, also known as wsusscn2.cab, to help you check Windows systems for missing updates. The CAB file contains information about most patches for Windows and Microsoft applications distributed through Windows Update.

The challenge with the wsusscn2.cab file is its size. It weighs in at around 650 MB, and distributing it to all the servers to perform a scan can be tricky and time-consuming. This tutorial explains how to avoid those issues and run the scan on all of your servers in a secure and timely manner using IIS for file transfer instead of SMB or PowerShell sessions.

Requirements for offline scanning

There are some simple requirements to use this tutorial:

  • a server or PC running Windows Server 2012 or newer or Windows 10;
  • a domain account with local administrator on the servers you want to scan; and
  • PowerShell remoting enabled on the servers you want to scan.

Step 1. Install IIS

First, we need a web server we can use to distribute the wsusscn2.cab file. There are several ways to copy the file, but they all have different drawbacks.

For example, we could distribute the wsusscn2.cab file with a regular file share, but that requires a double-hop. You could also copy the wsusscn2.cab file over a PowerShell session, but that causes a lot of overhead and is extremely slow for large files. An easier and more secure way to distribute the file is through HTTP and IIS.

Installing on Windows Server

Start PowerShell as admin and type the following to install IIS:

Install-WindowsFeature -name Web-Server -IncludeManagementTools

Installing on Windows 10

Start PowerShell as an admin and type the following to install IIS:

Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServer

The IIS role is now installed. By default, the site serves content from the C:\inetpub\wwwroot folder.

We can now proceed to download wsusscn2.cab from Microsoft.

Step 2. Download wsusscn2.cab

The link for this file can be tricky to find. You can either download it from this link and save it to the C drive or run the following script as admin on the IIS server:

# Default Site path, change if necessary
$IISFolderPath = "C:\inetpub\wwwroot"

# Download wsusscn2.cab
Start-BitsTransfer -Source "http://go.microsoft.com/fwlink/?linkid=74689" -Destination "$IISFolderPath\wsusscn2.cab"

The script downloads the file to the wwwroot folder. We can verify the download by browsing to http://<server name>/wsusscn2.cab.

You also need to get the hash value of wsusscn2.cab to verify it. After saving it, run the following PowerShell command to check the file hash:

(Get-FileHash C:\inetpub\wwwroot\wsusscn2.cab).Hash

31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05
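If you want to cross-check that value from a machine without PowerShell, the same computation is easy to reproduce. Here is a minimal sketch in Python; Get-FileHash uses SHA-256 by default, so the digests should agree:

```python
# Compute a file's SHA-256 the same way Get-FileHash does, reading in
# chunks so a ~650 MB file never has to fit in memory at once.
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest().upper()  # Get-FileHash prints uppercase hex

# Example: compare against the hash taken on the IIS server
# assert sha256_of_file("wsusscn2.cab") == "31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05"
```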

Step 3. Run the check on a server

Next, you can use a PowerShell script to download the wsusscn2.cab file and scan a PC or server for missing updates. The script works on Windows Server 2008 and newer, so compatibility issues should not be a concern. To do this in a secure and effective manner over HTTP, we get the file hash of the downloaded wsusscn2.cab file and compare it with the file hash of the CAB file on the IIS server.

We can also use the file hash to see when Microsoft releases a new version of wsusscn2.cab.

Copy and save the following script as Get-MissingUpdates.ps1:

Param(
    [parameter(mandatory)]
    [string]$FileHash,

    [parameter(mandatory)]
    [string]$Wsusscn2Url
)


Function Get-Hash($Path){
    
    $Stream = New-Object System.IO.FileStream($Path,[System.IO.FileMode]::Open) 
    
    $StringBuilder = New-Object System.Text.StringBuilder 
    $HashCreate = [System.Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($Stream)
    $HashCreate | Foreach {
        $StringBuilder.Append($_.ToString("x2")) | Out-Null
    }
    $Stream.Close() 
    $StringBuilder.ToString() 
}

$DataFolder = "$env:ProgramData\WSUS Offline Catalog"
$CabPath = "$DataFolder\wsusscn2.cab"

# Create download dir
mkdir $DataFolder -Force | Out-Null

# Check if cab exists
$CabExists = Test-Path $CabPath


# Compare hashes to decide whether a download is needed
if($CabExists){
    Write-Verbose "Comparing hashes of wsusscn2.cab"

    $HashMismatch = $FileHash -ne (Get-Hash -Path $CabPath)

    if($HashMismatch){
        Write-Warning "Filehash of $CabPath did not match $($FileHash) - downloading"
        Remove-Item $CabPath -Force
    }
    Else{
        Write-Verbose "Hashes matched"
    }
}

# Download wsusscn2.cab if it doesn't exist or the hashes mismatch
if(!$CabExists -or $HashMismatch){
    Write-Verbose "Downloading wsusscn2.cab"
    # Works on Windows Server 2008 as well
    (New-Object System.Net.WebClient).DownloadFile($Wsusscn2Url, $CabPath)

    if($FileHash -ne (Get-Hash -Path $CabPath)){
        Throw "$CabPath did not match $($FileHash)"
    }

}

Write-Verbose "Checking digital signature of wsusscn2.cab"

$CertificateIssuer = "CN=Microsoft Code Signing PCA, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
$Signature = Get-AuthenticodeSignature -FilePath $CabPath
$SignatureOk = $Signature.SignerCertificate.Issuer -eq $CertificateIssuer -and $Signature.Status -eq "Valid"


If(!$SignatureOk){
    Throw "Signature of wsusscn2.cab is invalid!"
}


Write-Verbose "Creating Windows Update session"
$UpdateSession = New-Object -ComObject Microsoft.Update.Session
$UpdateServiceManager  = New-Object -ComObject Microsoft.Update.ServiceManager 

$UpdateService = $UpdateServiceManager.AddScanPackageService("Offline Sync Service", $CabPath, 1) 

Write-Verbose "Creating Windows Update Searcher"
$UpdateSearcher = $UpdateSession.CreateUpdateSearcher()  
$UpdateSearcher.ServerSelection = 3
$UpdateSearcher.ServiceID = $UpdateService.ServiceID.ToString()
 
Write-Verbose "Searching for missing updates"
$SearchResult = $UpdateSearcher.Search("IsInstalled=0")

$Updates = $SearchResult.Updates

$UpdateSummary = [PSCustomObject]@{

    ComputerName = $env:COMPUTERNAME    
    MissingUpdatesCount = $Updates.Count
    Vulnerabilities = $Updates | Foreach {
        $_.CveIDs
    }
    MissingUpdates = $Updates | Select Title, MsrcSeverity, @{Name="KBArticleIDs";Expression={$_.KBArticleIDs}}
}

Return $UpdateSummary

Run the script on one of the servers or computers to check for missing updates. To do this, copy the script to the machine and run it with the URL to the wsusscn2.cab file on the IIS server and the hash value from step two:

PS51> .\Get-MissingUpdates.ps1 -Wsusscn2Url "http://<server name>/wsusscn2.cab" -FileHash 31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

If there are missing updates, you should see output similar to the following:

ComputerName   MissingUpdatesCount Vulnerabilities                  MissingUpdates
------------   ------------------- ---------------                  --------------
UNSECURESERVER                  14 {CVE-2006-4685, CVE-2006-4686,   {@{Title=MSXML 6.0 RTM Security Updat...
                                    CVE-2019-1079, CVE-2019-1079...}

If the machine is not missing updates, then you should see this type of output:

ComputerName MissingUpdatesCount Vulnerabilities MissingUpdates
------------ ------------------- --------------- --------------
SECURESERVER                   0

The script gives a summary of the number of missing updates, what those updates are and the vulnerabilities they patch.

This process is a great deal faster than searching for missing updates online. But this manual method is not efficient when checking a fleet of servers, so let’s learn how to run the script on all systems and collect the output.

Step 4. Run the scanning script on multiple servers at once

The easiest way to collect missing updates from all servers with PowerShell is with a PowerShell job. The PowerShell jobs run in parallel on all computers, and you can fetch the results.

On a PC or server, save the file from the previous step to the C drive — or another directory of your choice — and run the following as a user with admin permissions on your systems:

# The servers you want to collect missing updates from
$Computers = @(
        'server1',
        'server2',
        'server3'
)

# These are the arguments that will be sent to the remote servers
$RemoteArgs = @(
    # File hash from step 2
    "31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05",
    "http://$env:COMPUTERNAME/wsusscn2.cab"
)

$Params = @{
    ComputerName = $Computers
    ArgumentList = $RemoteArgs
    AsJob        = $True
    # Filepath to the script on the server/computer you are running this command on
    FilePath = "C:\Scripts\Get-MissingUpdates.ps1"
    # Maximum number of active jobs
    ThrottleLimit = 20
}

$Job = Invoke-Command @Params

# Wait for all jobs to finish
$Job | Wait-Job

# Collect Results from the jobs
$Results = $Job | Receive-Job

# Show results
$Results

This runs the Get-MissingUpdates.ps1 script on all servers in the $Computers variable in parallel to save time and make it easier to collect the results.

You should run these PowerShell jobs regularly to catch servers with a malfunctioning Windows Update and to be sure important updates get installed.


Are containers on Windows the right choice for you?

It’s nearly the end of the road for Windows Server 2008/2008 R2. Some of the obvious migration choices are a newer version of Windows Server or moving the workload into Azure. But does a move to containers on Windows make sense?

After Jan. 14, 2020, Microsoft ends extended support for the Windows Server 2008 and 2008 R2 OSes, which also means no more security updates unless one enrolls in the Extended Security Update program. While Microsoft prefers that its customers move to the Azure cloud platform, another choice is to use containers on Windows.

Understand the two different virtualization technologies

If you are thinking about containerizing Windows Server 2008/2008 R2 workloads, then you need to consider the ways containers differ from a VM. The most basic difference is a container is much lighter than a VM. Whereas each VM has its own OS, containers share a base OS image. The container generally includes application binaries and anything else necessary to run the containerized application.

Containers share a common kernel, which has advantages and disadvantages. One advantage is containers can be extremely small in size. It is quite common for a container to be less than 100 MB, which enables them to be brought online very quickly. The low overhead of containers makes it possible to run far more containers than VMs on the host server.

The shared kernel is also the chief disadvantage. If the kernel fails, then all the containers that depend on it will also fail. Similarly, a poorly written application can destabilize the kernel, leading to problems with other containers on the system.

VMs vs. Docker

As a Windows administrator considering containerizing legacy Windows Server workloads, you need to consider the fundamental difference between VMs and containers. While containers do have their place, they are a poor choice for applications with high security requirements due to the shared kernel or for applications with a history of sporadic stability issues.

Another major consideration with containers is storage. Early on, containers were used almost exclusively for stateless workloads because containers could not store data persistently. Unlike a VM, shutting down a container deletes all data within the container.

Container technology has evolved to support persistent storage through the use of data volumes. Even so, it can be difficult to work with data volumes. Applications that have complex storage requirements usually aren’t a good fit for containerization. For example, database applications tend to be poor candidates for containerization due to their complex storage configurations.

If you are used to managing physical or virtual Windows Server machines, you might think of setting up persistent storage as a migration-specific task: there is a requirement to provide a containerized application with the persistent storage that it needs, and it’s a one-time task completed as part of the application migration. It is important to remember, though, that containers are designed to be completely portable. A containerized application can move from a development and test environment to a production server or to a cloud host without the need to repackage the application. Setting up complex storage dependencies can undermine container portability, so an organization will need to consider whether a newly containerized application will ever need to be moved to another location.

What applications are suited for containers?

As part of the decision-making process related to using containers on Windows, it is worth considering what types of applications are best suited for this type of deployment. Almost any application can be containerized, but the ideal candidate is a stateless application with varying scalability requirements. For example, a front-end web application is often an excellent choice for a containerized deployment for a few reasons. First, web applications tend to be stateless. Data is usually saved on a back-end database that is separate from the front-end application. Second, container platforms work well to meet an application’s scalability requirements. If a web application sees a usage spike, additional containers can instantly spin up to handle the demand. When the spike ebbs, it’s just a matter of deleting the containers.

Before migrating any production workloads to containers, the IT staff needs to develop the necessary expertise to deploy and manage containers. While container management is not usually overly difficult, it is completely different from VM management. Windows Server 2019 supports the use of Hyper-V containers, but you cannot use Hyper-V Manager to create, delete and migrate containers in the same way that you would perform these actions on a VM.

Containers are a product of the open source world and are therefore managed from the command line using Linux-style commands that are likely to be completely foreign to many Windows administrators. There are GUI-based container management tools, such as Kubernetes, but even these tools require some time and effort to understand. As such, having the proper training is essential to a successful container deployment.

Despite their growing popularity, containers are not an ideal fit for every workload. While some Windows Server workloads are good candidates for containerization, other workloads are better suited as VMs. As a general rule, organizations should avoid containerizing workloads that have complex, persistent storage requirements or require strict kernel isolation.


How to install and test Windows Server 2019 IIS

Transcript – How to install and test Windows Server 2019 IIS

In this video, I want to show you how to install Internet Information Services, or IIS, and prepare it for use.

I’m logged into a domain-joined Windows Server 2019 machine and I’ve got the Server Manager open. To install IIS, click on Manage and choose the Add Roles and Features option. This launches the Add Roles and Features wizard. Click Next on the welcome screen and choose role-based or feature-based installation for the installation type and click Next.

Make sure that My Server is selected and click Next. I’m prompted to choose the roles that I want to deploy. We have an option for web server IIS. That’s the option I’m going to select. When I do that, I’m prompted to install some dependency features, so I’m going to click on Add Features and I’ll click Next.

I’m taken to the features screen. All the dependency features that I need are already being installed, so I don’t need to select anything else. I’ll click Next, Next again, Next again on the Role Services — although if you do need to install any additional role services to service the IIS role, this is where you would do it. You can always enable these features later on, so I’ll go ahead and click Next.

I’m taken to the Confirmation screen and I can review my configuration selections. Everything looks good here, so I’ll click install and IIS is being installed.

Testing Windows Server 2019 IIS

The next thing that I want to do is test IIS to make sure that it’s functional. I’m going to go ahead and close this out and then go to local server. I’m going to go to IE Enhanced Security Configuration. I’m temporarily going to turn this off just so that I can test IIS. I’ll click OK and I’ll close Server Manager.

The next thing that I want to do is find this machine’s IP address, so I’m going to right-click on the Start button and go to Run and type CMD to open a command prompt window, and then from there, I’m going to type ipconfig.

Here I have the server’s IP address, so now I can open up an Internet Explorer window and enter this IP address and Internet Information Services should respond. I’ve entered the IP address, then I press enter and I’m taken to the Internet Information Services screen. IIS is working at this point.

I’ll go ahead and close this out. If this were a real-world deployment, one of the next things that you would probably want to do is begin uploading some of the content that you’re going to use on your website so that you can begin testing it on this server.

I’ll go ahead and open up File Explorer and I’ll go to This PC, the C drive, the inetpub folder and the wwwroot subfolder. This is where you would copy all of your files for your website. You can configure IIS to use a different folder, but this is the one used by default for IIS content. You can see the files right here that make up the page that you saw a moment ago.

How to work with the Windows Server 2019 IIS bindings

Let’s take a look at a couple of the configuration options for IIS. I’m going to go ahead and open up Server Manager and what I’m going to do now is click on Tools, and then I’m going to choose the Internet Information Services (IIS) Manager. The main thing that I wanted to show you within the IIS Manager is the bindings section. The bindings allow traffic to be directed to a specific website, so you can see that, right now, we’re looking at the start page and, right here, is a listing for my IIS server.

I’m going to go ahead and expand this out and I’m going to expand the site’s container and, here, you can see the default website. This is the site that I’ve shown you just a moment ago, and then if we look over here on the Actions menu, you can see that we have a link for Bindings. When I open up the Bindings option, you can see by default we’re binding all HTTP traffic to port 80 on all IP addresses for the server.

We can edit [the site bindings] if I select [the site] and click on it. You can see that we can select a specific IP address. If the server had multiple IP addresses associated with it, we could link a different IP address to each site. We could also change the port that’s associated with a particular website. For example, if I wanted to bind this particular website to port 8080, I could do that by changing the port number. Generally, you want HTTP traffic to flow on port 80. The other thing that you can do here is to assign a hostname to the site, for example www.contoso.com or something to that effect.

The other thing that I want to show you in here is how to associate HTTPS traffic with a site. Typically, you’re going to have to have a certificate to make that happen, but assuming that that’s already in place, you click on Add and then you would change the type to HTTPS and then you can choose an IP address; you can enter a hostname; and then you would select your SSL certificate for the site.

You’ll notice that the port number is set to 443, which is the default port that’s normally used for HTTPS traffic. So, that’s how you install IIS and how you configure the bindings for a website.



Can Windows Server Standard Really Only Run 2 Hyper-V VMs?

Q. Can Windows Server Standard Edition really only run 2 Hyper-V virtual machines?

A. No. Standard Edition can run just as many virtual machines as Datacenter Edition.

I see and field this particular question quite frequently. A misunderstanding of licensing terminology and a lot of tribal knowledge have created an image of an artificial limitation with Standard Edition. The two editions have licensing differences and a handful of Hyper-V-related functional differences; otherwise, they share functionality.

The True Limitation

The correct statement behind the misconception: a physical host with the minimum Windows Standard Edition license can operate two virtualized instances of Windows Server Standard Edition, as long as the physically-installed instance only operates the virtual machines. That’s a lot to say. But, anything less does not tell the complete story. Despite that, people try anyway. Unfortunately, they shorten it all the way down to, “you can only run two virtual machines,” which is not true.

Virtual Machines Versus Instances

First part: a “virtual machine” and an “operating system instance” are not the same thing. When you use Hyper-V Manager or Failover Cluster Manager or PowerShell to create a new virtual machine, that’s a VM. That empty, non-functional thing that you just built. Hyper-V has a hard limit of 1,024 running virtual machines. I have no idea how many total VMs it will allow. Realistically, you will run out of hardware resources long before you hit any of the stated limits. Up to this point, everything applies equally to Windows Server Standard Edition and Windows Server Datacenter Edition (and Hyper-V Server, as well).

The previous paragraph refers to functional limits. The misstatement that got us here sources from licensing limits. Licenses are legal things. You give money to Microsoft, they allow you to run their product. For this discussion, their operating system products concern us. The licenses in question allow us to run instances of Windows Server. Each distinct, active Windows kernel requires sufficient licensing.

Explaining the “Two”

The “two” is the most truthful part of the misconception. One Windows Server Standard Edition license pack allows for two virtualized instances of Windows Server. You need a certain number of license packs to reach a minimum level (see our eBook on the subject for more information). As a quick synopsis, the minimum license purchase applies to a single host and grants:

  • One physically-installed instance of Windows Server Standard Edition
  • Two virtualized instances of Windows Server Standard Edition

This does not explain everything — only enough to get through this article. Read the linked eBook for more details. Consult your license reseller. Insufficient licensing can cost you a great deal in fines. Take this seriously and talk to trained counsel.

What if I Need More Than Two Virtual Machines on Windows Server Standard Edition?

If you need to run three or more virtual instances of Windows Server, then you buy more licenses for the host. Each time you satisfy the licensing requirements, you have the legal right to run another two Windows Server Standard instances. Due to the per-core licensing model introduced with Windows Server 2016, the minimums vary based on the total number of cores in a system. See the previously-linked eBook for more information.
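The stacking rule above is easy to sketch in code. The numbers below (8 core licenses minimum per processor, 16 per host, two Standard instances per complete licensing of the host) reflect the common reading of the per-core model, but treat this as an illustration only and verify the actual minimums with your license reseller:

```python
import math

def standard_license_cores(physical_cores: int, processors: int, windows_vms: int) -> int:
    """Estimate how many Standard Edition core licenses a host needs.

    Assumptions (verify against current Microsoft terms):
    - every physical core must be licensed, with a minimum of 8 cores
      per processor and 16 cores per host;
    - each complete licensing of the host grants two Windows Server
      Standard virtualized instances, so the whole host must be
      re-licensed ("stacked") for each additional pair of instances.
    """
    base = max(physical_cores, 8 * processors, 16)
    stacks = max(1, math.ceil(windows_vms / 2))  # one stack per two instances
    return base * stacks

# A dual-socket, 16-core host running 6 Windows Server VMs:
print(standard_license_cores(16, 2, 6))  # 16 cores licensed 3 times = 48
```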

What About Other Operating Systems?

If you need to run Linux or BSD instances, then you simply run them (some distributions do have paid licensing requirements; the distribution manufacturer makes the rules). Linux and BSD instances do not count against the Windows Server instances in any way. If you need to run instances of desktop Windows, then you need at least one Windows license per instance. I do not like to discuss licensing desktop Windows, as it has complications and nuances. Definitely consult a licensing expert about those situations. In any case, the two virtualized instances granted by a Windows Server Standard license apply only to Windows Server Standard.

What About Datacenter Edition?

Mostly, people choose Datacenter Edition for the features. If you need Storage Spaces Direct, then only Datacenter Edition can help you. However, Datacenter Edition allows for an unlimited number of running Windows Server instances. If you run enough on a single host, then the cost for Windows Server Standard eventually meets or exceeds the cost of Datacenter Edition. The exact point depends on the discounts you qualify for. You can expect to break even somewhere around ten to twelve virtual instances.
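That break-even point can be estimated with simple arithmetic. The prices in the example below are hypothetical placeholders, not real Microsoft figures; substitute your own quotes:

```python
import math

def breakeven_instances(std_host_cost: float, dc_host_cost: float) -> int:
    """Smallest number of Windows Server VMs at which fully licensing a
    host with Datacenter costs no more than stacking Standard.

    Each Standard "stack" covers two virtualized instances, so running
    n VMs on Standard costs ceil(n / 2) * std_host_cost.
    """
    n = 1
    while math.ceil(n / 2) * std_host_cost < dc_host_cost:
        n += 1
    return n

# With hypothetical per-host prices (Standard ~1,000, Datacenter ~5,500):
print(breakeven_instances(1000, 5500))  # -> 11
```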

What About Failover Clustering?

Both Standard and Datacenter Edition can participate as full members in a failover cluster. Each physical host must have sufficient licenses to operate the maximum number of virtual machines it might ever run simultaneously. Consult with your license reseller for more information.

Go to Original Article
Author: Eric Siron

How to repair Windows Server using Windows SFC and DISM

Over time, system files in a Windows Server installation might require a fix. You can often repair the operating system without taking the server down by using Windows SFC or the more robust and powerful Deployment Image Servicing and Management commands.

Windows System File Checker (SFC) and Deployment Image Servicing and Management (DISM) are administrative utilities that can alter system files, so they must be run in an administrator command prompt window.

Start with Windows SFC

The Windows SFC utility scans and verifies version information, file signatures and checksums for all protected system files on Windows desktop and server systems. If the command discovers missing protected files or alterations to existing ones, Windows SFC will attempt to replace the altered files with a pristine version from the %systemroot%\System32\dllcache folder.

The system logs all activities of the Windows SFC command to the %windir%\Logs\CBS\CBS.log file. If the tool reports any nonrepairable errors, then you'll want to investigate further. Search for the word corrupt to find most problems.
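Searching the log by hand works, but a short script can pull out just the suspect lines. This is a sketch: the log path and the "corrupt" keyword follow the description above, and you may want to widen the search terms for your environment:

```python
from pathlib import Path

def find_corruption_lines(log_text: str) -> list:
    """Return log lines that mention corruption, case-insensitively."""
    return [line for line in log_text.splitlines()
            if "corrupt" in line.lower()]

if __name__ == "__main__":
    # Default SFC log location on a standard installation.
    cbs = Path(r"C:\Windows\Logs\CBS\CBS.log")
    if cbs.exists():
        for line in find_corruption_lines(cbs.read_text(errors="ignore")):
            print(line)
```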

Windows SFC command syntax

Open a command prompt with administrator rights and run the following command to start the file checking process:

C:\Windows\System32>sfc /scannow

The /scannow parameter instructs the command to run immediately. It can take some time to complete — up to 15 minutes on servers with large data drives is not unusual — and usually consumes 60%-80% of a single CPU for the duration of its execution. On servers with more than four cores, it will have a slight impact on performance.

The Windows SFC /scannow command examines protected system files for errors.

There are times Windows SFC cannot replace altered files. This does not always indicate trouble. For example, recent Windows builds have included graphics driver data that was reported as corrupt, but the problem is with Windows file data, not the files themselves, so no repairs are needed.

If Windows SFC can’t fix it, try DISM

The DISM command is more powerful and capable than Windows SFC. It also checks a different file repository — the %windir%\WinSxS folder, aka the “component store” — and is able to obtain replacement files from a variety of potential sources. Better yet, the command offers a quick way to check an image before attempting to diagnose or repair problems with that image.

Run DISM with the following parameters:

C:\Windows\System32>dism /Online /Cleanup-Image /CheckHealth

Even on a server with a huge system volume, this command usually completes in less than 30 seconds and does not tax system resources. Unless it finds some kind of issue, the command reports back “No component store corruption detected.” If the command finds a problem, this version of DISM reports only that corruption was detected, but no supporting details.
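Because /CheckHealth gives only a yes/no answer, scripts that wrap it typically just branch on the summary message. A sketch follows; the exact output strings are assumptions based on the messages quoted in this article, so match them loosely in real use:

```python
def next_dism_step(checkhealth_output: str) -> str:
    """Suggest the follow-up after 'dism /Online /Cleanup-Image /CheckHealth'.

    Branches on the summary message only; /CheckHealth reports no details.
    """
    text = checkhealth_output.lower()
    if "no component store corruption detected" in text:
        return "done"
    # Any mention of corruption means a deeper scan is warranted.
    if "corrupt" in text:
        return "dism /Online /Cleanup-Image /ScanHealth"
    return "unrecognized output: inspect manually"

print(next_dism_step("No component store corruption detected."))  # -> done
```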

Corruption detected? Try ScanHealth next

If DISM finds a problem, then run the following command:

C:\Windows\System32>dism /Online /Cleanup-Image /ScanHealth

This more elaborate version of the DISM image check will report on component store corruption and indicate if repairs can be made.

If corruption is found and it can be repaired, it’s time to fire up the /RestoreHealth directive, which can also work from the /online image, or from a different targeted /source.

Run the following commands using the /RestoreHealth parameter to replace corrupt component store entries:

C:\Windows\System32>dism /Online /Cleanup-Image /RestoreHealth

C:\Windows\System32>dism /Online /Cleanup-Image /RestoreHealth /Source:<path-to-known-good-source>

You can drive file replacement from the running online image with the same syntax as the preceding commands. Often, though, local copies aren't available, or they are no more correct than the contents of the local component store itself. In that case, use the /Source directive to point to a Windows image file (a .wim or .esd file), to a known-good WinSxS folder from an identically configured machine, or to a known-good backup of the same machine.

By default, the DISM command will also try downloading components from the Microsoft download pages; this can be turned off with the /LimitAccess parameter. For details on the /source directive syntax, the TechNet article “Repair a Windows Image” is invaluable.

DISM is a very capable tool well beyond this basic image repair maneuver. I’ve compared it to a Swiss army knife for maintaining Windows images. Windows system admins will find DISM to be complex and sometimes challenging but well worth exploring.


Windows Server 2008 end of life: Is Azure the right path?

As the Windows Server 2008 end of life inches closer, enterprises should consider which retirement plan to pursue before security updates run out.

As of Jan. 14, Microsoft will end security updates for Windows Server 2008 and 2008 R2 machines that run in the data center. Organizations that continue to use these server operating systems will be vulnerable because hackers will inevitably continue to look for weaknesses in them, but Microsoft will not — except in rare circumstances — provide fixes for those vulnerabilities. Additionally, Microsoft will not update online technical content related to these operating systems or give any free technical support.

Although there are benefits to upgrading to a newer version of Windows Server, there may be some instances in which this is not an option. For example, your organization might need an application that is not compatible with or supported on newer Windows Server versions. Similarly, there are situations in which it is possible to migrate the server to a new operating system, but not quickly enough to complete the process before the impending end-of-support deadline.

Microsoft has a few options for those organizations that need to continue running Windows Server 2008 or 2008 R2. Although the company will no longer give updates for the aging operating system through the usual channels, customers can purchase extended security updates.

You can delay Windows Server 2008 end of life — if you can afford it

Those who wish to continue using Windows Server 2008 or 2008 R2 on premises will need Software Assurance or a subscription license to purchase extended updates. The extended updates are relatively expensive, at about 75% of the cost of a current Windows Server license annually. This is likely Microsoft’s way of nudging customers toward a newer Windows Server version, because the extended security updates cost almost as much as a Windows Server license.
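Using the article's 75%-per-year figure, the arithmetic is easy to sketch. This is a simplified model — the real extended security update program prices each year separately — so treat it as illustration only:

```python
def esu_cost(license_cost: float, years: int, annual_rate: float = 0.75) -> float:
    """Total extended-security-update spend, assuming a flat annual rate
    expressed as a fraction of a current Windows Server license cost."""
    return license_cost * annual_rate * years

# Three years of updates on a hypothetical 1,000-unit license:
print(esu_cost(1000, 3))  # -> 2250.0
```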

The other option for those organizations that need to continue running Windows Server 2008 or 2008 R2 is to migrate those servers to the Azure cloud. Organizations that decide to switch those workloads to Azure will receive free extended security updates for three years.


Know what a move to Azure entails

Before migrating a Windows Server workload to the cloud, it is important to consider the pros and cons of making the switch to Azure. The most obvious benefit is financial: migrating gives you three years to run this OS without having to pay for extended security updates.

Another benefit to the migration to Azure is a reduction in hardware-related costs. Windows Server 2008 was the first Windows Server version to include Hyper-V, but many organizations opted to install Windows Server 2008 onto physical hardware rather than virtualizing it. If your organization runs Windows Server 2008/2008 R2 on a physical server, then this is a perfect opportunity to retire the aging server hardware.

If your Windows Server 2008/2008 R2 workloads are virtualized, then moving those VMs to Azure can free up some capacity on the virtualization hosts for other workloads.

Learn about the financial and technical impact

One disadvantage to operating your servers in Azure is the cost. You will pay a monthly fee to run Windows Server 2008 workloads in the cloud. However, it is worth noting that Microsoft offers a program called the Azure Hybrid Benefit, which gives organizations with Windows Server licenses up to 40% off the cost of running eligible VMs in the cloud. To get an idea of how much your workloads might cost, you can use the Azure pricing calculator.

Another disadvantage with moving a server workload to Azure is the increased complexity of your network infrastructure. This added complication isn’t limited just to the migrating servers. Typically, you will have to create a hybrid Active Directory environment and also create a VPN that allows secure communications between your on-premises network and the Azure cloud.

Factor in these Azure migration considerations

For organizations that decide to migrate their Windows Server 2008 workloads to Azure, there are a number of potential migration issues to consider.

Servers often have multiple dependencies, and you will need to address these as part of the migration planning. For instance, an application may need to connect to a database that is hosted on another server. In this situation, you will have to decide whether to migrate the database to Azure or whether it is acceptable for the application to perform database queries across a WAN connection.

Similarly, you will have to consider the migration’s impact on your internet bandwidth. Some of your bandwidth will be consumed by management traffic, directory synchronizations and various cloud processes. It’s important to make sure your organization has enough bandwidth available to handle this increase in traffic.

Finally, there are differences between managing cloud workloads and ones in your data center. The Azure cloud has its own management interface that you will need to learn. Additionally, you may find your current management tools either cannot manage cloud-based resources or may require a significant amount of reconfiguring. For example, a patch management product might not automatically detect your VM in Azure; you may need to either create a separate patch management infrastructure for the cloud or provide the vendor with a path to your cloud-based resources.
