Windows Server certifications retirement plan a blow to admins

If you’re a Windows admin, then you might still be reeling from the recent certification news from Microsoft.

It had been more than a year since the Windows Server 2019 release with no sign of a certification when Microsoft announced it was retiring the Microsoft Certified Solutions Associate (MCSA), Microsoft Certified Solutions Developer (MCSD) and Microsoft Certified Solutions Expert (MCSE) certifications on June 30 — a deadline subsequently pushed out to Jan. 31 due to the COVID-19 pandemic — with no plans for Windows Server 2019 certifications before that date. In a blog post released Feb. 27, the company said, “Windows Server 2019 and SQL Server 2019 content will be included in role-based certifications on an as-needed basis for certain job roles in the Azure Apps & Infrastructure and Data & AI solution areas.”

This is going to be painful for server admins who have relied on Windows Server certifications to help validate their knowledge and skills in the data center. These accreditations just went extinct in favor of the Azure-based certifications. This decision aligns with Microsoft’s marketing strategy to put the cloud in the spotlight and shift on-premises workloads to the background.

The problem: Not every customer has moved to Azure. With a little digging, it’s easy to find that many companies still have a substantial number of Windows Server workloads in the data center that are most likely not going to budge anytime soon. Microsoft is notoriously guarded about how many Windows Server deployments exist on premises, but at the company’s Inspire event in July 2019, a company vice president said there were about 24 million Windows Server 2008 instances, which represented about 60% of total Windows Server deployments.

If you’re a Windows admin, then you should feel some anger, as well as concern, with this move. Despite the hype, not everything can be put in a container and into the cloud. The effort to redesign applications for this type of virtualization is not as simple as running an installation wizard. And these workloads often require a lot more resources than expected.

Administrators need to evolve and adapt with the times

Windows Server won’t be going anywhere soon, but how it is used and supported has changed. For example, it’s often not efficient to investigate an issue and try to troubleshoot it; it’s easier to redeploy the server with the application than to correct the problem. In distributed application design, there is less focus on correction and more on replacement, for both speed and reliability.

The same holds true with infrastructure components such as Dynamic Host Configuration Protocol, DNS and Active Directory. It’s not ideal to simply replace these, but with solid VM templates and some scripting, the process is quick and relatively painless. This mindset helps explain the loss of these Windows Server certifications: if it’s easier to replace, then why focus on the repair aspect? And without the need for troubleshooting, why have a certification?

This might be a bit disturbing for some admins who don’t use the “pets versus cattle” model, but virtualization and templates make this a viable option. However, just because something can be done doesn’t mean it will work in every situation. The application still needs to support this type of arrangement, and that support has been pretty slow in coming. Applications used in manufacturing, computer-aided design, electronic health record systems and education systems will have difficulty moving to a cloud-based version due to the challenges of shifting the data away from the data center, as well as the loss of customization offered by the on-premises version. Also, a move to Azure or quick replacement won’t work for companies that keep resources on premises due to regulation or preference.

While Microsoft sells and supports its on-premises products of Windows Server 2019 and Exchange Server 2019, it is doing its best to bury them in the product stack so only a Google search will unearth them. It’s difficult to see, but these products are a shrinking island. They might not go away completely, but the footprint will be a shadow of what it once was. In the world of IT, nothing ever goes away as quickly or cleanly as a vendor might hope — and Windows Server is no exception.

By necessity, the Azure certifications will be where most in IT now focus their attention. However, many companies still have legacy Windows Server workloads on site, and many of them will need to migrate to 2019. If you’re an expert in Windows Server technology, then you need to state that on your resume and, in the absence of an official certification, be prepared to cite specific examples that show your proficiency when you apply.

What you can do in the absence of Windows Server certifications

Microsoft, not its customers, should be carrying the torch for Windows Server 2019. But it’s clear the company has moved on — and I think that is a huge mistake. Microsoft education and certification are driven by its marketing and sales groups. This move doesn’t just water down some of the premier industry certifications, it lowers their relevance since they no longer cover all aspects of the core products.

Remember that you’re not the only one who sees what Microsoft is doing with Windows Server; IT managers and staff also see this trend. Don’t be surprised when these skill sets remain in demand. This could open a few more doors if you’re looking for that new opportunity, provided you can work to self-validate some of your supposedly outdated skills. You only have to look at the COBOL programming language and the continued longevity of the mainframe to see there are jobs supporting legacy technologies that pay exceptionally well for those who know them. It’s too bad Microsoft chose to alienate an entire group of IT professionals in favor of a marketing strategy.

You should never say never to a cloud migration, but it won’t happen overnight. You shouldn’t walk away from Server 2019, despite what the marketing machine says you need to do. You should look at Azure certifications and put in effort to expand your skills portfolio, but don’t ignore your core skill sets.


RSA teams up with Yubico for passwordless authentication

The world might be one step closer to the passwordless future that security enthusiasts dream of.

On Dec. 10, RSA Security announced a strategic partnership with Yubico, the company known for its USB authentication keys, to drive passwordless authentication in the enterprise. The partnership combines Yubico’s YubiKey technology with RSA’s FIDO-powered SecurID authentication to eliminate passwords for enterprise employees, particularly those in use cases where mobile phones may not be appropriate or permitted for authentication. The combined offering, YubiKey for RSA SecurID Access, will launch in March.

Jim Ducharme

In this Q&A, Jim Ducharme, vice president of identity and fraud and risk intelligence products at RSA, discusses the new Yubico partnership, FIDO as a standard and how close we are to the so-called “death of passwords.”

Editor’s note: This interview has been edited for length and clarity.

Tell me how the Yubico partnership came to be.

Jim Ducharme: I was talking to a customer and they mentioned how customers are struggling with the various use cases out there for people to prove to be who they say they are. A few years ago, I think that everybody thought that the world was just going to be taken over by mobile phone authentication and that’s all they’d ever need and they’d never need anything else. But they’re quickly realizing that they need multiple options to support the various use cases. Mobile authentication is certainly a new modern method that is convenient, given that everybody is walking around with a mobile phone, but there are a number of use cases, like call centers, remote workers and even folks who, believe it or not, don’t have a smartphone, that they still need to care for and make sure that they are who they say they are.

At RSA, we’ve had our SecurID tokens for quite a while now, but there are other use cases that we’ve found. FIDO-compliant devices were looked at as something that customers wanted to deploy. Particularly hardware-based ones like a Yubico security key. And RSA was the founding member of the FIDO subcommittee on enterprise application, but largely the uptick has been on the consumer identity side of it. We wanted to figure out how we can help the enterprise with their employee use cases, leveraging FIDO and these standards, coupled with these other use cases like call centers or areas where there is a particular device that a user needs to use and they need to prove they are who they say they are.

This customer sent me on this sort of tour of asking my customers what they thought about these use cases and I was amazed at how many customers were already looking at this solution yet finding themselves having to purchase Yubico keys from Yubico and purchase RSA from us for the FIDO backend. It’s only natural for us to bring these two strong brands together to give customers what they need sort of all-in-one box, virtually if you will. Now what we offer is more choice in how users authenticate themselves, allowing them to transform as maybe they get more comfortable with adopting mobile authentication. A lot of users don’t want to use their mobile phone for corporate authentication, but that’s slowly increasing. We wanted to make sure we were providing a platform that can allow users that flexibility of choice, but at the same time, allow our customers and the identity teams to have a single structure to support those different use cases and allow that transformation to happen over time, whether it be from hardware devices, hardware tokens, to mobile authenticators to desktop authenticators to new biometrics, et cetera.

How does this partnership with Yubico fit into RSA’s overall strategy?

Ducharme: Obviously things like a Yubico device is just another form of a passwordless authenticator. But there are plenty of passwordless authenticators out there right now — most people have them in their hands now with [Apple] Face ID and Touch ID, but that’s only part of the solution. Our focus is an identity ecosystem that surrounds the end user and their authenticator where passwords still exist. Despite these new passwordless authenticators, we still haven’t managed to get rid of the password. The help desk is still dealing with password resets, and the support costs associated with passwords are actually going up instead of down. If we’re implementing more and more passwordless authentication, why is the burden on the help desk actually going up? The reality is, most of these passwordless authentication methods are actually not passwordless at all. These biometric methods are nothing more than digital facades on top of a password, so the underlying password is still there. They’re allowing a much more passwordless experience, which is great for the end user, but the password is still there. We’re actually finding that in many cases, the help desk calls are going up because you’re not using that password as frequently as you used to, and now once a month or once a week when people have to use it, they are more apt to forget it than the password they use every single day. We’re actually seeing an increase in forgotten passwords because the more we’re taking passwords out of users’ hands, the more they’re actually forgetting it. We really have to go that last mile to truly get rid of the passwords.

Strategically, our goal is not only to have a spectrum of passwordless authentication and experiences for end users, but we also have to look at all of these other places where the password hides and eliminate those [uses]. Until we do, the burden on the help desk, the costs on the IT help desk are not going to go down, and that’s one of the important goals of moving towards the passwordless world, and that’s where our focus is.

Do you think companies are worried about lost keys and having that negatively impact availability?

Ducharme: Yes. As a matter of fact, we had a customer dinner last night and that is probably one of the number one [concerns], the notion of lost keys. The thing that’s nice about the YubiKey devices is that they sit resident within the device, so the odds of losing one are lower. But it absolutely is still an issue. Whenever you have anything that you have to have, you could potentially lose it.

We need to make sure they’re easily replaceable, not just logistically but cost-wise as well, and couple that with credential enrollment and recovery. When they lose those devices, make sure that they still have access to the systems they need. Even if you don’t lose it permanently, you forget it on your desk at home and when you arrive at work, well, you can’t be out of business for the day because you left your key at home. That’s what we’re working on — what do you do when the user doesn’t have their key? We still need to be able to provide them access very securely while not reverting back to a password. What we’re trying to do is surround these physical devices and mobile phones with these recovery mechanisms for when the user doesn’t have access to their authenticator, whatever form it takes.

How much progress do you think FIDO has made in the last couple years?

Ducharme: FIDO has gotten a lot of good brand recognition. We’re seeing some uptick in it, but we think with this announcement we’re hoping to really increase the pace of adoption. The great news is we’re seeing the support in things like the browsers. It was a huge milestone when Microsoft announced its support with Windows Hello. We’re starting to get the plumbing in all the right places so we’re very optimistic about it. But the actual application, it’s still a vast minority of a lot of customers in the enterprise use case, and a lot of that has to do with the technology refresh cycles. Are they getting the browsers on the end users’ laptops? Are they using [Windows] Hello for Business? But honestly, these upgrade cycles to Windows Hello are happening faster than the previous Windows cycles, so I’m optimistic about it. But what we’re encouraged by is the adoption of the technology like FIDO, like WebAuthn, within the browsers and the operating systems; the end user adoption, by which I mean the companies deploying these technologies to their end users, isn’t quite there yet. This is what we’re hoping to bring out.

Do you think we’re going to see the death of passwords sometime in the next several years?

Ducharme: I’ve been in the identity space now for about 20 years. During a lot of that, I would say to myself the password will never die. But I actually think we’re on the cusp of really being able to get rid of the password. I’ve seen the market understand what it’s truly going to take to get rid of the password from all facets. We have the technology now that it’s accessible with people every day with their mobile phones, wrist-based technologies and all of that. We have the ability to do so. It’s within reach. The question will be, how do we make this technology successful, and how do we make it a priority? So I really am optimistic. What we’ll have to do is push through people using passwordless experiences to help people understand that we really have to get rid of the underlying passwords. The industry’s going to have to do the work to flush out the password for the last mile. I believe the technologies and the standards exist to do so, but until we start looking at the security implications and the costs associated with those passwords and really take it seriously, we won’t do it. But I do believe we have the best opportunity to do it now.


Turning the next generation into everyday superheroes thanks to Hour of Code 2019 – Microsoft News Centre Europe

When you think of coding, your first thoughts might be about highly specialized technical know-how. But did you know that effective coding requires skills like creativity, innovation and collaboration too – all of which will be hugely important for the workforce of tomorrow?

According to Microsoft research with McKinsey, the fastest growing occupations, such as technology professionals and healthcare providers, will require a combination of digital and cognitive skills such as digital literacy, problem solving and critical thinking. Young people having access to learning tools to improve both these sets of skills is crucial – a fact non-profit organizations like JA Europe recognize through their work to get young people ready for the future of work. If young people are given the opportunity to develop their digital skills, the European Labor Market will see significant benefits when they move into the workforce. According to a LinkedIn Economic Graph report, AI Talent in the European Labour Market, training and upskilling ‘near-AI’ talent could double the size of the current AI workforce in the EU. It also found that AI skills are concentrated in a small number of countries and that this must be addressed to reduce the digital skills gap in Europe.

In conjunction with Computer Science Education Week, which began yesterday and runs to December 15, Microsoft continues its multi-year commitment to Hour of Code, a global movement that introduces students to computer science and demystifies what coding is all about. Activities are running across Europe to fuel imagination and demonstrate how these skills could be used to solve some of the world’s biggest problems. As such, code has the power to turn anyone into an everyday superhero.

To bring this to life, Microsoft is inviting young people to ‘save the day’ through Computer Science. Created in partnership with MakeCode, a new Minecraft tutorial combines code, Artificial Intelligence and problem solving skills. It is inspired by various Microsoft AI for Earth projects and encourages students to use their critical thinking skills to plot where forest fires could happen, put plans in place to stop them with AI and ultimately save the Minecraft village!

Since 2012, Microsoft has helped more than 137,000 young people and educators in Europe through Hour of Code events and programs. And, as the end of the decade draws near, we are keen to support even more people to get into coding and show how it can change the world. If you’re looking to help your children or students become coding superheroes, we have developed two training guides – one for students and one aimed at educators – no cape needed!

Go forth and code!


How to repair Windows Server using Windows SFC and DISM

Over time, system files in a Windows Server installation might require a fix. You can often repair the operating system without taking the server down by using Windows SFC or the more robust and powerful Deployment Image Servicing and Management commands.

Windows System File Checker (SFC) and Deployment Image Servicing and Management (DISM) are administrative utilities that can alter system files, so they must be run in an administrator command prompt window.

Start with Windows SFC

The Windows SFC utility scans and verifies version information, file signatures and checksums for all protected system files on Windows desktop and server systems. If the command discovers missing protected files or alterations to existing ones, Windows SFC will attempt to replace the altered files with a pristine version from the %systemroot%\System32\dllcache folder.

The system logs all activities of the Windows SFC command to the %Windir%\Logs\CBS\CBS.log file. If the tool reports any nonrepairable errors, then you’ll want to investigate further. Search for the word corrupt to find most problems.
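
If you’d rather not scroll through that log by hand, a quick PowerShell search does the trick; a minimal sketch, assuming the default log location:

# Pull only the lines the servicing stack flagged as corrupt from the CBS log
Select-String -Path "$env:windir\Logs\CBS\CBS.log" -Pattern 'corrupt' | Select-Object LineNumber, Line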

Windows SFC command syntax

Open a command prompt with administrator rights and run the following command to start the file checking process:

C:\Windows\System32>sfc /scannow

The /scannow parameter instructs the command to run immediately. It can take some time to complete — up to 15 minutes on servers with large data drives is not unusual — and usually consumes 60%-80% of a single CPU for the duration of its execution. On servers with more than four cores, it will have a slight impact on performance.

Windows SFC scannow command
The Windows SFC /scannow command examines protected system files for errors.

There are times Windows SFC cannot replace altered files. This does not always indicate trouble. For example, recent Windows builds have included graphics driver data that was reported as corrupt, but the problem is with Windows file data, not the files themselves, so no repairs are needed.

If Windows SFC can’t fix it, try DISM

The DISM command is more powerful and capable than Windows SFC. It also checks a different file repository — the %windir%\WinSxS folder, aka the “component store” — and is able to obtain replacement files from a variety of potential sources. Better yet, the command offers a quick way to check an image before attempting to diagnose or repair problems with that image.

Run DISM with the following parameters:

C:\Windows\System32>dism /Online /Cleanup-Image /CheckHealth

Even on a server with a huge system volume, this command usually completes in less than 30 seconds and does not tax system resources. Unless it finds some kind of issue, the command reports back “No component store corruption detected.” If the command finds a problem, this version of DISM reports only that corruption was detected, but no supporting details.

Corruption detected? Try ScanHealth next

If DISM finds a problem, then run the following command:

C:\Windows\System32>dism /Online /Cleanup-Image /ScanHealth

This more elaborate version of the DISM image check will report on component store corruption and indicate if repairs can be made.

If corruption is found and it can be repaired, it’s time to fire up the /RestoreHealth directive, which can also work from the /online image, or from a different targeted /source.

Run the following commands using the /RestoreHealth parameter to replace corrupt component store entries:

C:\Windows\System32>dism /Online /Cleanup-Image /RestoreHealth

C:\Windows\System32>dism /source: /Cleanup-Image /RestoreHealth

You can drive file replacement from the running online image easily with the same syntax as the preceding commands. But it often happens that local copies aren’t available or are no more correct than the contents of the local component store itself. In that case, use the /source directive to point to a Windows image file — a .wim file or an .esd file — or a known, good, working WinSXS folder from an identically configured machine — or a known good backup of the same machine to try alternative replacements.

By default, the DISM command will also try downloading components from the Microsoft download pages; this can be turned off with the /LimitAccess parameter. For details on the /source directive syntax, the TechNet article “Repair a Windows Image” is invaluable.
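
As a rough illustration of those two options together, the following sketch repairs the running image from a local source only; the repair source path is a hypothetical placeholder for a known-good Windows folder, such as one from a mounted image of the same build:

# Repair the component store from a local source and skip the Windows Update fallback
# (C:\RepairSource\Windows is a placeholder path)
dism /Online /Cleanup-Image /RestoreHealth /Source:C:\RepairSource\Windows /LimitAccess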

DISM is a very capable tool well beyond this basic image repair maneuver. I’ve compared it to a Swiss army knife for maintaining Windows images. Windows system admins will find DISM to be complex and sometimes challenging but well worth exploring.


Tips for ransomware protection on Windows systems

Ransomware. Just the word quickens the pulse of every Windows administrator who might have lingering doubts about the effectiveness of their security approach.

Many IT folks lose sleep over the effectiveness of their ransomware protection setup, and for good reason. Vital Windows systems keep most companies running, and the thought of them going offline will have many IT pros staring at the clock at 3 a.m.

Unfortunately, ransomware will hit you in some capacity, despite any measures you take, but it’s not a futile effort to shore up your defenses. The key is to fortify your systems with layers of security and then to follow best practices for both Windows and your backup products to minimize the damage.

Take a closer look at your backup setup

Backups are something companies make with the hope that they are never needed. Oftentimes, backups are a secondary duty shuttled off to an ops group, a daily task that amounts to a checkbox on some form somewhere. This is how trouble starts.

You need to make backups, but another part of the job is to secure those backups. A backup server or appliance is a very tempting target for attackers who want to plant ransomware. These servers or appliances have network access to pretty much everything in your data center. It’s your company’s safety net. If this massive repository of data got encrypted, it’s likely the company would pay a significant amount to free up those files.


Most backup products are publicly documented, which means ransomware creators know how they work, such as how the agents behave and which paths they use. With all that information, an attacker can write software tailored to your vendor’s backup product.

Now, most backup offerings have some level of ransomware protection, but you have to enable it. Too many people only find the setting, or the steps to protect their data, after the backups have been wiped. Don’t wait to verify your backup product is secured against ransomware; do it today.

An old security standby comes to the fore

This also brings up a secondary practice: air-gapping.

This methodology was popular in the days of tape backup but fell out of favor with the introduction of replication.

Some would argue that data that is several weeks or several months old has little value, but is the alternative — no data — any better? If you feel old data is not worth having in an emergency, anyone with IT experience who has seen organizations wiped out by a ransomware attack might change your mind.

Windows Server 2019 ransomware protection settings.

A small network-attached storage product that you use for a data dump every six months and then lock away suddenly doesn’t sound like such a bad idea when the alternative is zero data. It’s a relatively inexpensive addition to the data center that serves as an extra repository of your data.

Think of it this way: Would you rather get hit with ransomware and lose a few months’ worth of data or all 15 years? Neither is a great situation, but one is much preferred over the other. These cold backups won’t replace your backup strategy, but rather supplement it as a relatively economical air gap. When it comes to ransomware, more layers of safeguards should be the rule.

Air-gapping is a practice that is not followed as closely now with the pervasiveness of online deduplication backup products. For organizations that can afford them, these offerings often replicate to online backup appliances in remote locations to make the data accessible.
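
If you’d rather script that occasional cold copy than drag folders around by hand, a minimal sketch follows. The source folder, NAS share and log path are hypothetical placeholders, and the NAS should only be attached while the job runs:

# Mirror critical data to a NAS share that stays disconnected between runs (all paths are placeholders)
$source = 'D:\CriticalData'
$coldTarget = '\\nas01\coldbackup'
$logFile = "C:\Logs\coldcopy-$(Get-Date -Format yyyyMMdd).log"
robocopy $source $coldTarget /MIR /R:2 /W:5 /LOG:$logFile
# Detach or power off the NAS afterward to restore the air gap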

Don’t overlook built-in ransomware protection

There are more than a few ways to mitigate the ransomware threat, but using a layered approach is recommended.

These malicious applications quickly move east-west across flat networks. Internal firewalls, whether physical or virtual, can do a lot to stop these types of attacks.

An often-overlooked option is the Windows firewall. When it first came out, the Windows firewall had a few stumbles, but Microsoft continued to develop and improve it into a solid software firewall. It costs nothing but does require some administration work. The Windows firewall is not going to stop all possible ransomware, but very few products can.

Looking at the big picture, the Windows firewall gives an additional layer of protection against ransomware. It’s already there and should have little performance impact.
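
One hedged sketch of that layer, using the built-in NetSecurity cmdlets, keeps every firewall profile enabled and blocks inbound SMB from a subnet that has no business talking to the server; the address range is a hypothetical placeholder:

# Keep every firewall profile turned on
Set-NetFirewallProfile -Profile Domain,Private,Public -Enabled True
# Block inbound SMB from a user subnet (placeholder range) to slow east-west movement
New-NetFirewallRule -DisplayName 'Block SMB from user VLAN' -Direction Inbound -Protocol TCP -LocalPort 445 -RemoteAddress 192.168.50.0/24 -Action Block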


How to manage Server Core with PowerShell

After you first install Windows Server 2019 and reboot, you might find something unexpected: a command prompt.

While you’re sure you didn’t select the Server Core option, Microsoft now makes it the default Windows Server OS deployment for its smaller attack surface and lower system requirements. While you might remember DOS commands, those are only going to get you so far. To deploy and manage Server Core, you need to build your familiarity with PowerShell to operate this headless flavor of Windows Server.

To help you on your way, you will want to build your knowledge of PowerShell and might start with the PowerShell Integrated Scripting Environment (ISE). PowerShell ISE offers a wealth of features for the novice PowerShell user, from autocompletion of commands to context-colored syntax, to step you through the scripting process. The problem is PowerShell ISE requires a GUI, or the “full” Windows Server. To manage Server Core, you have the command window and PowerShell in its raw form.

Start with the PowerShell basics

To start, type in powershell to get into the environment, denoted by the PS before the C: prompt. A few basic DOS commands will work, but PowerShell is a different language. Before you can add features and roles, you need to set your IP and domain. It can be done in PowerShell, but this is laborious and requires a fair amount of typing. Instead, we can take a shortcut and use sconfig to complete the setup. After that, we can use PowerShell for additional administrative work.
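
For reference, the manual route that sconfig wraps up for you looks roughly like the sketch below; the interface alias, addresses, computer name and domain are hypothetical placeholders:

# Set a static IP, DNS server, computer name and domain membership by hand (all values are placeholders)
New-NetIPAddress -InterfaceAlias 'Ethernet' -IPAddress 192.168.1.50 -PrefixLength 24 -DefaultGateway 192.168.1.1
Set-DnsClientServerAddress -InterfaceAlias 'Ethernet' -ServerAddresses 192.168.1.10
Rename-Computer -NewName 'CORE01'
Add-Computer -DomainName 'corp.example.com' -Credential (Get-Credential) -Restart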

PowerShell uses a verb-noun format, called cmdlets, for its commands, such as Install-WindowsFeature or Get-Help. The verbs have predefined categories that are generally clear on their function. Some examples of PowerShell cmdlets are:

  • Install: Use this PowerShell verb to install software or some resource to a location or initialize an install process. This would typically be done to install a Windows feature such as Dynamic Host Configuration Protocol (DHCP).
  • Set: This verb modifies existing settings in Windows resources, such as adjusting networking or other existing settings. It also works to create the resource if it did not already exist.
  • Add: Use this verb to add a resource or setting to an existing feature or role. For example, this could be used to add a scope onto the newly installed DHCP service.
  • Get: This is a resource retriever for data or contents of a resource. You could use Get to present the resolution of the display and then use Set to change it.

To install DHCP to a Server Core deployment with PowerShell, use the following commands.

Install the service:

Install-WindowsFeature -Name 'DHCP' -IncludeManagementTools

Add a scope for DHCP:

Add-DhcpServerV4Scope -Name "Office" -StartingRange 192.168.1.100 -EndRange 192.168.1.200 -SubnetMask 255.255.255.0

Set the lease time:

Set-DhcpServerv4Scope -ScopeId 192.168.1.0 -LeaseDuration 1.00:00:00

Check the DHCP IPv4 scope:

Get-DhcpServerv4Scope

Additional pointers for PowerShell newcomers

Each command has a purpose, which means you have to know the syntax, and that is the hardest part of learning PowerShell. Not knowing what you’re looking for can be very frustrating, but there is help. The Get-Help cmdlet displays the related commands for use with that function or role.

Part of the trouble for new PowerShell users is that memorizing all the commands can still be overwhelming, but there is a shortcut. As you start to type a command, the Tab key auto-completes PowerShell commands. For example, if you type Get-Help R and press the Tab key, PowerShell will cycle through the matching commands, such as Remove-DhcpServerInDC; see Figure 1. When you find the command you want and hit Enter, PowerShell presents additional information for using that command. Get-Help even supports wildcards, so you could type Get-Help *dhcp* to get results for commands that contain that phrase.

Get-Help command
Figure 1. Use the Get-Help command to see the syntax used with a particular PowerShell cmdlet.
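
A couple of discovery commands along those lines are worth keeping handy; the module name here assumes the DHCP role and its management tools from the earlier example are installed:

# List every Add-* cmdlet the DHCP module provides
Get-Command -Module DhcpServer -Verb Add
# Show worked examples for a single cmdlet
Get-Help Add-DhcpServerv4Scope -Examples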

The tab function in PowerShell is a savior. While this approach is a little clumsy, it is a valuable asset in a pinch due to the sheer number of commands to remember. For example, a base install of Windows 10 includes Windows PowerShell 5.1, which features more than 1,500 cmdlets. As you install additional PowerShell modules, you make more cmdlets available.

There are many PowerShell books, but do you really need them? There are extensive libraries of PowerShell code that are free to manipulate and use. Even walking through a Microsoft wizard often gives you the option to generate the PowerShell code for the wizard you just ran. As you learn where to find PowerShell code, writing a script becomes less a matter of starting from scratch and more one of modifying existing code. You don’t have to be an expert; you just need to know how to manipulate the proper fields and areas.

Outside of typos, the biggest stumbling block for most beginners is not reading the screen. PowerShell does a mixed job with its error messages. The type is red when something doesn’t work, and PowerShell will give the line and character where the error occurred.

In the example in Figure 2, PowerShell threw an error due to the extra letter s at the end of the command Get-WindowsFeature. The system didn’t recognize the command, so it tagged the entire command rather than the individual letter, which can be frustrating for beginners.

PowerShell error message
Figure 2. When working with PowerShell on the command line, you don’t get precise locations of where an error occurred if you have a typo in a cmdlet name.

The key is to review your code closely, then review it again. If the command doesn’t work, you have to fix it to move forward. It helps to stop and take a deep breath, then slowly reread the code. Copying and pasting a script from the web isn’t foolproof and can introduce an error. With some time and patience, and some fundamental PowerShell knowledge of the commands, you can get moving with it a lot quicker than you might have thought.


Recreating ancestral worlds with virtual blocks – New Zealand News Centre

Whetu Paitai’s always been good at building. In fact, he might still be a builder in Australia if it weren’t for two things: a broken leg and a promise kept. Thanks to life’s strange twists he’s back home in the Coromandel, but instead of putting up houses, he’s reconstructing the world of his Tīpuna (ancestors). 

The path was laid almost a decade ago, when Whetu was in hospital with a broken leg. A university lecturer in the same ward raved to him about Minecraft. When Whetu suggested his daughter try the game she was instantly hooked – and so was he. The pair bonded over their shared passion for creating digital worlds. 

Fast forward a few years, and Whetu, now a father of four, often wished his tamariki (children) could connect with their culture by learning to speak te reo Māori, something he’d never learned to do. A promise to his wife saw them return to New Zealand and he’s never looked back since.   

Now Whetu has not only reconnected with his culture and heritage, immersing his children in te ao Māori (the Māori world), he’s found a new calling: designing games that introduce his culture to countless other children. His latest is a brand-new world built for Minecraft: Education Edition, Ngā Motu (The Islands), giving students a taste of what life was like in a traditional Māori pā (fortified village).

There’s great value in little things 

Whetu is the founder of Piki Studios, a game design company he runs while home-schooling his children on the remote Coromandel Peninsula. The leap from builder to educational games developer may seem like a big one, but Whetu remembers being drawn to technology from an early age. 

“When I was a kid I enjoyed computers, but the geeky stereotype didn’t fit with the Kiwi view of being a boy. I grew up in Harataunga (Kennedy Bay), surrounded by bush. Computers went on the back-burner.” 

When he returned to New Zealand, and still a massive Minecraft fan, Whetu was seduced afresh by digital technology, so he retrained. Armed with new digital skills, he found himself helping out with the admin at his children’s kohanga reo (Māori-language preschool), and a lightbulb went on: “If I could be involved that much in my kids’ education, how much more involved could I be?” 

Whetu realised that by marrying his passion for IT with education, he could help other children learn the language and culture too by creating fun new resources.

And so his game building began. He started by creating an online game, Mahimaina (Minecraft in te reo Māori), to help children learn the language, joined by around 100 students. More games are set to follow, both online and traditional board games, which Whetu hopes will be used by schools and whānau (families) around the country.  

“There’s great value in little things,” he says. “For a child, seeing their culture represented on major global platforms is incredibly empowering.”

It was exactly what one of the world’s largest tech companies was looking for.  

Last year, Microsoft came knocking. Would Whetu like to create a uniquely Aotearoa (New Zealand) resource for Minecraft: Education Edition?  

A voyage through Aotearoa 

“It blew our minds,” says Whetu. “I knew Minecraft, but it wasn’t till we explored Minecraft: Education Edition, tweaked it, played with it and saw all the additional things it could do that we realised all the potential. This will open up so much more space for Māori and all Kiwis to learn and play in the Māori world.”  

Minecraft: Education Edition brings the world of Minecraft to classrooms around the world, offering hundreds of free lessons as well as a global educator community. Immersive game-based learning helps students build key 21st century skills including creativity, collaboration and STEM. Educators across New Zealand are already using Minecraft to transform learning, from learning programming with Hour of Code to designing sustainable villages and even reconstructing Gallipoli in-game. 

Whetu is the first to create a brand-new world immersed in te ao Māori. Characters based on his children and their friends guide young players as they walk through Ngā Motu, from the impressive waka hourua (sea-going canoe) at the beach to the pā with its wharenui (large meeting house) decorated with kōwhaiwhai (painted panels) and tukutuku (woven lattice). Pātaka, rua (food storage areas) and a hāngī pit for cooking can also be found in the pā.

Whetu has gone into painstaking detail to make sure everything has a uniquely Aotearoa flavour, right down to the kumara (sweet potato) gardens where children can create new buildings. The resource packs swap typical swords for more appropriate patu (clubs) and even the mobs will have Kiwi kids feeling right at home.  

Whetu’s younger daughter requested her favourite bird, a pīwaiwaka (fantail); you can also interact with a native kunekune pig and even an extinct moa, New Zealand’s famous giant bird, complete with sound recreated by the experts at national museum Te Papa. Children can learn words in te reo Māori from the guides, or via in-game exercises.

In future iterations, intrepid voyagers will be able to visit the taniwha (guardian) in the harbour and collect kaimoana (seafood) near some pink terraces that may remind New Zealanders of the long-lost Pink Terraces, destroyed by a volcanic eruption more than 100 years ago. All of these will add to children’s glossary of Māori words and understanding of Māori history and narratives. 

“I would love the kura (schools) to build their own pā or wharenui, explore the world on their own and learn how to care for the moa,” Whetu says.

“We’re believers in learning being organic, being able to explore all the elements, because nothing in our lives exists in isolation. Our mission is for everyone to be able to play these games and see more than just what a waka is – they’ll be able to see how it fits into that whole world,” Whetu explains. 

A “serendipitous” opportunity 

This philosophy is exactly why Microsoft New Zealand’s Sam McNeill and Anne Taylor came to Piki Studios. 

“Whetu’s so passionate about education and helping all kids, not just his own, understand our indigenous culture and that really shines through when you speak to him. He’s a natural teacher,” says Anne, Education Lead for Microsoft New Zealand.

“The creativity and attention to detail with which Whetu has approached this project just blew us away. What he’s created goes way beyond what we could ever have expected.”

Whetu acknowledges getting the call from Microsoft was daunting for a small family business dealing with a large multinational corporation. It was a relief to find he was working with people who shared the same values and goals.

“A better opportunity couldn’t have presented itself. Straight off the bat, Sam and Anne knew te ao Māori, believing in dealing honestly and genuinely with indigenous people, and we never lost any of that closeness that is so important. It was truly serendipitous.”

The group were determined to ensure all the translations were accurate. Two professional translators, Hemi Kelly and Piripi Walker, worked with Whetu and the team to translate the language pack for the game, including the instructions. There were even some new words coined for some of the in-game Minecraft items.

“It was important to make sure te ao Māori was respected as its own being, the mana (status) and cultural IP of each artefact upheld and maintained throughout the process,” Whetu says.  

The most difficult part was the timeframe, just five short weeks. Luckily Whetu was supported by other Māori working in the tech space, making it a truly collaborative process. And of course Whetu’s children acted as in-house quality assurance keeping Dad on top of his game. 

First Harataunga, then the world? 

Soon Ngā Motu will reach an audience beyond New Zealand, as Piki Studios is now an official member of the Minecraft Partner Program, enabling it to add to the resources available in the global Minecraft Marketplace. For now, the game will be available to classrooms in New Zealand, as part of Microsoft’s Schools Agreement that provides resources such as Minecraft: Education Edition to every State and State-Integrated school. 

 “Ngā Motu is a truly amazing resource for Kiwi students and teachers and we know they’re going to absolutely love exploring and building on this world,” says Anne. 

“It’s not just Whetu’s children. We showed it to some of our global colleagues and the excitement in the room was just palpable.”

Not bad for a boy from Harataunga. 

For more information on Minecraft: Education Edition in New Zealand please visit: http://elearning.tki.org.nz/Teaching/Future-focused-learning/Minecraft or visit Piki Studios https://www.pikistudios.com/ 


Te waihanga anō i ngā ao tuku iho mā ngā poraka mariko 

Mai anō e taunga ana a Whetu Paitai ki ngā mahi waihanga. Ina, tērā pea e mahi waihanga tonu ana ia i Ahitereiria engari nā ēnei mea e rua: i whati te waewae me te ū ki te kupu oati. Nā ngā āhuatanga o te ao kua tau atu ki te wā kāinga i Hauraki, engari kāore ia i whakatū whare, kei te whakatūtū kē anō ia i te ao o ōna tīpuna. 

He mea whakatakoto tēnei ara i tōna tekau tau ki mua, i te wā i rō hōhipera a Whetu nā te whatinga o tōna waewae. I te waha pakaru haere tētahi kaiako whare wānanga ki a ia mō Minecraft. I te meatanga a Whetu ki tana tamāhine kia whakamātauria e ia te kēmu i tino rawe rawa atu ki tana tamahine – me ia anō. Ka hono tahi rāua i runga i tō rāua kaingākau ki te waihanga ao matihiko. 

Ka huri ngā tau, kua whā ngā tamariki ināianei a Whetu, me tana manako kia hono ana tamariki ki tō rātau ao Māori mā te ako ki te kōrero i te reo Māori, kāore hoki a Whetu i mōhio ki tōna reo. Nā tana kupu oati ki tana hoa wahine ka hoki mai rātau ki Aotearoa, kāore mo te hoki whakamuri.  

Nā, kua hono atu a Whetu ki tōna ao Māori, koinei hoki te ao o ana tamariki, kua whai ia te tino oranga mōna: te waihanga kēmu me te tūhono atu i tōna ahurea ki ngā tamariki huhua. Ko tāna mea hou rawa ko tētahi ao tino hou mō Minecraft: Te Putanga Mātauranga, Ngā Motu, e pā atu ai ngā ākonga ki te āhua o te ao i roto i tētahi pā Māori.  

He mea nui kei roto i ngā mea iti 

Nā Whetu i whakaara ake a Piki Studios, he kamupene waihanga kēmu e whakahaerehia ana e ia i a ia e kura ana i ana tamariki i te kāinga i te takiwā mamao o Hauraki. Ko te whakaaro pea he tino nui te neke mai i te mahi waihanga ki te waihanga kēmu mātauranga, engari i maumahara a Whetu ki tana kaingākau ki te hangarau i a ia e paku ana. 

“I ahau e tamariki ana he rawe ki ahau te raweke rorohiko, engari kāore i ū te āhua o te ihu rorohiko ki te whakaaro o te iwi o Aotearoa mō te āhua o te tama. I pakeke mai ahau i Harataunga, i waenganui o te ngahere. Ka whakarerea ngā rorohiko.” 

I tana hokinga mai ki Aotearoa, ā, ka mutu e kaingākau tonu ana ki a Minecraft, i riro te wairua o Whetu ki ngā hangarau matihiko, nā ka hoki ia ki te ako. Ka riro mai i a ia ōna pūkenga matihiko hou, ka huri ia ki te āwhina i te kōhanga reo o ana tamariki me ngā mahi tari, ā, i reira ka taka mai he whakaaro ki a ia. “Mēnā e pēnei rawa te nui o taku uru ki te mātauranga o aku tamariki, kia pēhea te whakawhānui kē atu i tōku uru atu? 

I kite a Whetu mā te hono i tōna kaingākau mō te ao rorohiko ki te mātauranga, ka taea e ia te āwhina i ētahi atu tamariki ki te ako i te reo me ngā tikanga mā te waihanga i ngā rauemi pārekareka hou.  

Nā, ka tīmata tana mahi waihanga kēmu. I tīmata ia mā te waihanga i tētahi kēmu tuihono, Mahimaina (ko Maincraft i roto i te reo Māori), hei āwhina i ana tamariki ki te ako i te reo, me ngā ākonga tata ki te 100. He nui atu anō ngā kēmu kei te whai mai, ngā kēmu tuihono me aua kēmu papa anō, ā, ko te tūmanako o Whetu ka whakamahia e ngā kura me ngā whānau puta noa i te motu.  

Hei tāna, “He mea nui kei roto i ngā mea iti.” “Mō tētahi tamaiti, ka nui te whakamana i te kite i tō rātau ao e whakaaturia ana ki ngā pūhara ā-ao nui.”  

Koinei tonu te mea e kimihia ana e tētahi o ngā kamupene hangarau nui rawa o te ao.  

I tērā tau i whakapā mai a Microsoft. Kei te hiahia a Whetu ki te waihanga i tētahi rauemi ahurei ki Aotearoa mā Minecraft: Education Edition?  

Te hīkoi i Aotearoa 

“I tino mīharo mātau,” te kī a Whetu. “I te mōhio ahau mō Minecraft, engari nō te hōpara haere i a Minecraft: Education Edition, i rāwekewekehia, i pureihia e mātau, ā, ka kite i ngā mea tāpiri ka taea kātahi ka mārama ki tōna kaha ka taea. He nui ake te wāhanga ka tuwhera mai i tēnei mō te Māori me ngā tāngata katoa o Aotearoa ki te ako me te purei i roto i te ao Māori.”  

Ka heria mai e Minecraft: Education Edition te ao o Minecraft ki ngā akomanga puta noa i te ao, e tuku ana i ngā akoranga koreutu maha rawa me tētahi hapori whakaako ā-ao. Ka āwhina ngā akoranga ā-kēmu rumaki i ngā ākonga ki te whakapakari i ngā pūkenga rau tau 21, tae atu ki te auahatanga, mahi tahi me te STEM. Kei te whakamahia kētia e ngā kaiwhakaako hei takahuri i ngā akoranga, mai i te ako i te papatono mā te Hour of Code mai i te waihanga i ngā pā toitū, me te aha me te waihanga anō i a Karipori i rō kēmu. 

Ko Whetu te mea tuatahi ki te waihanga i tētahi ao hou rawa i roto katoa i te ao Māori. Ka ārahina ngā kaitākaro tamariki e ngā kiripuaki, i takea mai ēnei i ana tamariki me ō rātau hoa, i a rātau e hīkoi haere ana i Ngā Motu, mai i te waka hourua ātaahua i tātahi ki te pā me te wharenui me ōna kōwhaiwhai, tukutuku hoki. Ka kitea anō i roto i te pā ko ngā pātaka, rua me rua hāngī mō te tunu kai.  

He tino hōhonu rawa te āhua o ngā iroirotanga i oti i a Whetu kia mau ai te āhuatanga ake o Aotearoa, tae atu ki ngā māra kūmara e taea ai e ngā tamariki te waihanga whare hou. Kua whakakapia i roto i ngā kete rauemi ngā hoari mō ngā patu, ka mutu ka tino taunga ngā tamariki o Aotearoa ki ngā māpu.  

I tono a te pekepoho kōtiro a Whetu i tana tino manu te pīwaiwaka, ka taea e koe te pāhekoheko me tētahi poaka kunekune me tētahi moa kua korehāhā me ngā tangi i hangaia anō e ngā mātanga o Te Papa. Ka taea e ngā tamariki te ako i te reo Māori mai i ngā kaiārahi, mā ngā tūmahi i rō kēmu rānei. 

I roto i ngā auau o muri mai ka taea e ngā kaumoana te toro ki ngā taniwha i roto i te whanga me te kohikohi kaimoana e tūtata ana ki ngā parehua, ā, ka hoki pea ngā whakaaro o ngā tāngata ki Ōtūkapuārangi kua ngaro noa atu, i riro atu i roto i tō pahūtanga puia neke atu i te 100 tau ki mua. Ka tāpiri ēnei mea katoa ki te kete kupu a ngā tamariki me te whakawhānui i tō rātau mōhio ki te hītori me ngā kōrero a te Māori. 

“Ka nui taku hiahia kia hangaia e ngā kura ā rātau ake pā, wharenui rānei, te hōpara i te ao me te ako me pēhea te tiaki i te moa,” te kī a Whetu. 

“E whakapono ana mātau ka pā noa mai ngā akoranga, mā te āhei ki te hōpara i ngā āhuatanga katoa, i te mea e kore e noho wehe motuhake tētahi mea i roto i ō tātau ao. Ko tā mātau whāinga kia taea e ngā tāngata katoa ēnei kēmu, ā, ka whānui ake tā rātau kite i tētahi waka – ka kite kē rātau i te urunga atu ki roto i te ao whānui,” te whakamārama a Whetu. 

He whai wāhitanga i “tūpono noa atu 

Koinei tonu te tikanga whakaaro i haere atu a Sam McNeill rāua ko Anne Taylor o Microsoft New Zealand ki a Piki Studios. 

“He tino ngākaunui a Whetu ki te mātauranga me te āwhina i ngā tamariki, kaua ko āna anake, kia mārama ki tō tātau ahurea taketake, ā, ka kitea puta tēnei i ā koe e kōrero ana ki a ia. He tino kaiako ia,” te kī a Anne, Kaiārahi Mātauranga mō Microsoft New Zealand.  

“I tino mīharo rawa atu mātau ki te auaha me ngā iroirotanga katoa i roto i ngā mahi a Whetu mō tēnei kaupapa. Kua eke kē atu ia ki āna mahi ki te taumata i manakohia e mātau.” 

I kī ia i āhua wehi ia i te waeatanga mai a Microsoft, ina he pakihi whānau iti mātau e whakariterite ana me tētahi kāporeihana nui nō te ao whānui. I tau ia i tana kite i te mahi tahi ia me ngā tāngata he ōrite ngā uara me ngā whāinga.  

“Kāore i tua atu i tēnei whai wāhitanga. Mai i te tīmatanga, i mōhio a Sam rāua ko Anne ki te ao Māori, e whakapono ana me pono me tika te mahi me te iwi taketake, ā, he mea nui kāore i ngaro taua āhuatanga piri tata. He āhuatanga tino tūpono noa.”  

I tino nganga te rōpū ki te whakarite kia tika ngā whakamāoritanga. E rua ngā kaiwhakamāori ngaio, ko Hemi Kelly rāua ko Piripi Walker i te taha o Whetu me te rōpū ki te whakamāori i te kete reo mō te kēmu, tae noa atu ki ngā tohutohu. I puta hoki ētahi kupu hou mō ētahi tuemi a MineCraft ake. 

“He mea kia manaakitia te ao Māori, te pupuri i te mana me ngā rawa hinengaro ahurea o ia taonga puta noa i te tukanga,” te kī a Whetu.  

Ko te mea uaua rawa ko te wā, e rima noa iho ngā wiki. I waimarie a Whetu i tautokona ia e ētahi atu Māori e mahi ana i roto i ngā mahi hangarau, ā, he tukanga tino mahi tēnei. Me te aha, nā ngā tamariki a Whetu i whakaū i te kounga – kia eke ai a Whetu ki runga rawa atu. 

Ko Heretaunga i te tuatahi, ā, ko te ao ā muri ake? 

Kāore e roa ka tae a Ngā Motu ki ngā minenga o te ao i tua o Aotearoa, ina kua uru a Piki Studios hei mema whaimana nō te Kaupapa Pātui a Minecraft, e taea e ia ngā rauemi te tāpiri tuihono i roto i te Wāhi Tauhokohoko o Minecraft i te ao whānui. I tēnei wā, ka wātea mātau ki ngā akomanga puta noa i Aotearoa, he wāhanga tēnei nō te Whakaaetanga Kura a Microsoft e tuku ana i ngā rauemi pērā i a Minecraft: Education Edition ki ia kura Kāwanatanga, Kura Tāuke hoki. 

“He rauemi tino whakamīharo a Ngā Motu mā ngā ākonga o Aotearoa me ngā kaiako me tō mātau mōhio anō ka tino rawe rawa atu ki a rātau te te hōpara me te waihanga i roto i tēnei ao,” te kī a Anne. 

“Ehara ko ngā tamariki anake a Whetu. I whakaaturia e mātau ki ētahi o ō mātau hoa o te ao, ā, i pupū mai te whakaongaonga i a rātau.” 

Tau kē tēnei tama nō Harataunga. 

Mō ētahi atu mōhiohio mō Minecraft: Education Edition i Aotearoa haere ki: http://elearning.tki.org.nz/Teaching/Future-focused-learning/Minecraft haere rānei ki Piki Studios https://www.pikistudios.com/ 

Hōpara i te ao o Ngā Motu me ngā akoranga a Minecraft: Education Edition i: http://aka.ms/ngamotu 



Acronis automates ‘3-2-1’ rule for small business backup

Home office and small business owners might not always follow the “3-2-1” rule of backup: three copies of the data, on two different media, with one copy off-site. Acronis True Image 2020 is trying to make observing that rule a little easier.

True Image is Acronis’ small business backup software tool, aimed at consumers, home offices and SMBs. The 2020 version introduced a feature called Dual Protection, which automatically replicates a backup copy to the Acronis cloud when it’s creating the local copy. Aside from satisfying the 3-2-1 rule, it allows customers to have both a local copy for fast recovery and one more copy off-site as an extra failsafe.

True Image currently has 5.5 million users, according to Gaidar Magdanurov, chief marketing officer at Acronis. Magdanurov said approximately 70% of them are small businesses with fewer than five employees and “prosumers” — single-person businesses like consultants and lawyers who have a decent level of IT skill. The remaining 30% are consumers with limited IT skills who want to protect files such as music, photos and movies.

Magdanurov said businesses of all sizes understand the importance of protecting their data, and cyberattacks are not limited to large organizations. However, in the world of small business backup, cost can be a much bigger factor than in the enterprise space.

“Backup is like insurance — and nobody likes to pay for insurance,” Magdanurov said.

True Image 2020’s price didn’t change from its 2019 version. It is still split into three versions: Standard, a one-time purchase for $49.99 with no cloud backup component; Advanced, a one-year subscription for $49.99 that includes cloud features and 250 GB of Acronis Cloud Storage; and Premium, a one-year subscription for $99.99 that adds blockchain-based data certification and electronic signature capabilities.

The 2020 version also added some quality-of-life features for end users. Backup-related notifications can be pushed to the desktop tray; power management settings will suspend all scheduled backup tasks when a laptop’s battery drops below a chosen threshold; and there’s an option to upload backups to the cloud only over Wi-Fi networks the user trusts.

screenshot of Acronis True Image 2020
Acronis True Image 2020 simultaneously backs up to a local drive and the Acronis cloud.

All versions of Acronis True Image include Acronis Active Protection, a malware detection tool that uses AI to detect anomalous data changes, which could be indicators of unauthorized encryption. Rubrik Polaris Radar works similarly. Phil Goodwin, research director at IDC, said this is a better approach to combating ransomware because it thwarts attacks before they happen instead of rebuilding afterward.

“Acronis has some of the most advanced technology in the industry at this level,” Goodwin said. “It’s proactive rather than simply trying to restore data after it’s been lost.”

Acronis True Image competes against Carbonite, CrashPlan and Backblaze in the small business backup market. Goodwin said Carbonite, thanks to its acquisition of Mozy and Code42’s cloud backup customers, controls the lion’s share of this market. Magdanurov claimed a lot of new True Image customers switched from Carbonite specifically for ransomware detection or data protection for mobile devices.

They’re not backup experts, so the system has to be simple enough for them that once they set it up, it just runs.
Phil Goodwin, research director, IDC

Other than cost, simplicity is another big concern among small business backup customers. These customers might not have the IT resources to implement or maintain complex infrastructure, even if they fully understand the benefits of having a hybrid environment or multiple clouds. Goodwin said even something as conceptually simple as the 3-2-1 rule can be more steps than a small business or home office is willing to deal with, which is why the True Image update is so significant.

“They’re not backup experts, so the system has to be simple enough for them that once they set it up, it just runs,” Goodwin said, referring to most small businesses.

For the most part, the more set-and-forget, one-and-done a backup product is, the better it is for these types of users. However, Goodwin added that Acronis made a good move adding a few customization options for the more IT-savvy power users — namely, the trusted Wi-Fi security settings and battery-saving power management feature.

Acronis True Image is aimed at small businesses of no more than five computers. Acronis Backup is geared for customers looking to protect more than that.


How to transfer FSMO roles with PowerShell

In the process of migrating to a new server or spreading the workload around, you might need to transfer FSMO roles in Active Directory from one domain controller to another.

AD relies on a concept called flexible single master operation roles, commonly referred to as FSMO roles. Domain controllers in an AD forest and domain hold one or more of these roles that handle different duties, such as keeping the AD schema in sync and synchronizing passwords across all domain controllers. You might need to spread these roles to other domain controllers to make AD operate more efficiently. As is the case when managing a Windows shop, you can manage much of your infrastructure either through the GUI or with PowerShell. There is no right or wrong way, but a script can be customized and reused, which saves some time and effort.

It’s not always easy to figure out which domain controller holds a particular role since FSMO roles tend to get spread out among various domain controllers. Then, once you’ve found the FSMO role, you need to juggle multiple windows if you try to manage them with the GUI. However, if you use PowerShell, we can both find where these FSMO roles live and easily move them to any domain controller with a script.

Before you get started

Before you can find and move FSMO roles with PowerShell, be sure to install the Remote Server Administration Tools (RSAT) package, which also includes the AD module. The computer you use PowerShell on should be on the domain, and you should have the appropriate permissions to move FSMO roles.
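
A quick sanity check, before going further, confirms the AD module that RSAT provides is actually present and loads it for the session:

# Confirm the ActiveDirectory module is available, then load it
Get-Module -ListAvailable ActiveDirectory
Import-Module ActiveDirectory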

Use PowerShell to find FSMO roles

It’s not necessary to find the FSMO role holders before moving them, but it’s helpful to know the state before you make these types of significant changes.

There are two PowerShell commands we’ll use first to find the FSMO roles: Get-AdDomain and Get-AdForest. You need to use both commands since some FSMO holders reside at the forest level and some at the domain level. The AD module contains these cmdlets, so if you have that installed, you’re good to go.

First, you can find all the domain-based FSMO roles with the Get-AdDomain command. Since the Get-AdDomain returns a lot more than just FSMO role holders, you can reduce the output a bit with Select-Object:

Get-ADDomain | Select-Object InfrastructureMaster,PDCEmulator,RIDMaster | Format-List

Migrating an AD domain controller

This command returns all the domain-based roles, including the Primary Domain Controller (PDC) emulator and Relative Identifier (RID) master, but we need to find the forest-level FSMO roles called domain naming master and schema master. For these FSMO roles, you need to use the Get-ADForest command.

Since Get-AdForest returns other information besides the FSMO role holders, limit the output using Select-Object to find the FSMO role holders we want.

Get-ADForest | Select-Object DomainNamingMaster,SchemaMaster | Format-List

How to transfer FSMO roles


To save some time in the future, you can write a PowerShell function called Get-ADFSMORole that returns the FSMO role holders at the domain and the forest level in one shot.

function Get-ADFSMORole {
[CmdletBinding()]
param
()

Get-ADDomain | Select-Object InfrastructureMaster,PDCEmulator,RIDMaster
Get-ADForest | Select-Object DomainNamingMaster,SchemaMaster
}

Now that you have a single function to retrieve all the FSMO role holders, you can get to the task of moving them. To do that, call the function you made and assign a before state to a variable.

$roles = Get-ADFSMORole

With all the roles captured in a variable, you can transfer FSMO roles with a single command called Move-ADDirectoryServerOperationMasterRole. This command just handles moving FSMO roles. You can move each role individually by looping over each role name and calling the command, or you could do them all at once. Both methods work depending on how much control you need.

$destinationDc = 'DC01'
## Method 1
'DomainNamingMaster','PDCEmulator','RIDMaster','SchemaMaster','InfrastructureMaster' | ForEach-Object {
Move-ADDirectoryServerOperationMasterRole -OperationMasterRole $_ -Identity $destinationDc
}

## Method 2
Move-ADDirectoryServerOperationMasterRole -OperationMasterRole DomainNamingMaster,PDCEmulator,RIDMaster,SchemaMaster,InfrastructureMaster -Identity $destinationDc

After you run the command, use the custom Get-ADFSMORole function created earlier to confirm the roles now reside on the new domain controller.
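
A quick way to eyeball that the transfer took, using the variable and function from earlier:

# Compare the captured "before" state with the current role holders
$roles            # role holders captured before the move
Get-ADFSMORole    # role holders now; each role should list DC01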
