
Wanted – PoE Switch / Unifi AC AP LR / NAS

1 x AC AP LR

How much are you looking to pay for one of these?


How to install the Windows Server 2019 VPN

Many organizations rely on a virtual private network, particularly those with a large number of remote workers who need access to resources.

While numerous vendors sell VPN products, Windows administrators also have the option to use the VPN functionality built into Windows Server. One benefit of using Windows Server 2019 VPN technology is that there is no additional cost to your organization beyond the Windows Server license.

Another perk of using a Windows Server 2019 VPN is that integrating the VPN with the server operating system reduces the number of infrastructure components that can break. An organization that uses a third-party VPN product has one more component the IT staff must troubleshoot when remote users can't connect to the VPN and lose access to the network resources they need to do their jobs.

One relatively new feature in Windows Server 2019 VPN functionality is Always On VPN, which some users on various message boards and blogs have speculated will eventually replace DirectAccess; DirectAccess remains supported in Windows Server 2019. Microsoft cites several advantages of Always On VPN, including granular app- and traffic-based rules to restrict network access, support for both RSA and elliptic curve cryptography algorithms, and native Extensible Authentication Protocol support to enable the use of a wider variety of advanced authentication methods.

Microsoft documentation recommends that organizations currently using DirectAccess evaluate Always On VPN functionality before migrating their remote access processes.
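If you want to confirm which remote access components are present on a server before you start, PowerShell can list them. The following one-liner is a minimal sketch that assumes the ServerManager module available on Windows Server 2019:

# List the Remote Access role services and their install state
Get-WindowsFeature -Name RemoteAccess, DirectAccess-VPN, Routing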

The following transcript of the video tutorial by contributor Brien Posey explains how to install the Windows Server 2019 VPN role.

In this video, I want to show you how to configure Windows Server 2019 to act as a VPN server.

Right now, I'm logged into a domain-joined Windows Server 2019 machine, and I'll open Server Manager so we can get started.

The first thing that I’m going to do is click on Manage and then I’ll click on Add Roles and Features.

This is going to launch the Add Roles and Features wizard.

I’ll go ahead and click Next on the Before you begin screen.

For the installation type, I’m going to choose Role-based or feature-based installation and click Next. From there I’m going to make sure that my local server is selected. I’ll click Next.

Now I'm prompted to choose the server role that I want to deploy. You'll notice that right here we have Remote Access. I'll go ahead and select that now. Incidentally, in the past, this was listed as Routing and Remote Access, but now it's just listed as Remote Access. I'll go ahead and click Next.

I don't need to install any additional features, so I'll click Next again, and I'll click Next [again].

Now I'm prompted to choose the Role Services that I want to install. In this case, my goal is to turn the server into a VPN server, so I'm going to choose DirectAccess and VPN (RAS).

There are some additional features that are going to need to be installed to meet the various dependencies, so I’ll click Add Features and then I’ll click Next. I’ll click Next again, and I’ll click Next [again].

I’m taken to a confirmation screen where I can make sure that all of the necessary components are listed. Everything seems to be fine here, so I’ll click Install and the installation process begins.

So, after a few minutes the installation process completes. I’ll go ahead and close this out and then I’ll click on the Notifications icon. We can see that some post-deployment configuration is required. I’m going to click on the Open the Getting Started Wizard link.

I’m taken into the Configure Remote Access wizard and you’ll notice that we have three choices here: Deploy both DirectAccess and VPN, Deploy DirectAccess Only and Deploy VPN Only. I’m going to opt to Deploy VPN Only, so I’ll click on that option.

I’m taken into the Routing and Remote Access console. Here you can see our VPN server. The red icon indicates that it hasn’t yet been configured. I’m going to right-click on the VPN server and choose the Configure and Enable Routing and Remote Access option. This is going to open up the Routing and Remote Access Server Setup Wizard. I’ll go ahead and click Next.

I'm asked how I want to configure the server. You'll notice that the very first option on the list is Remote access (dial-up or VPN). That's the option that I want to use, so I'm just going to click Next since it's already selected.

I’m prompted to choose my connections that I want to use. Rather than using dial-up, I’m just going to use VPN, so I’ll select the VPN checkbox and click Next.

The next thing that I have to do is tell Windows which interface connects to the internet. In my case it’s this first interface, so I’m going to select that and click Next.

I have to choose how I want IP addresses to be assigned to remote clients. I want those addresses to be assigned automatically, so I’m going to make sure Automatically is selected and click Next.

The next prompt asks me if I want to use a RADIUS server for authentication. I don’t have a RADIUS server in my own organization, so I’m going to choose the option No, use Routing and Remote Access to authenticate connection requests instead. That’s selected by default, so I can simply click Next.

I’m taken to a summary screen where I have the chance to review all of the settings that I’ve enabled. If I scroll through this, everything appears to be correct. I’ll go ahead and click Finish.

You can see that the Routing and Remote Access service is starting and so now my VPN server has been enabled.
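If you prefer scripting to the GUI, you can reach the same result from PowerShell. The commands below are a minimal sketch of the equivalent steps, assuming an elevated session on the target server; adjust them for your environment before relying on them.

# Install the Remote Access role with the DirectAccess and VPN (RAS) role service
Install-WindowsFeature -Name DirectAccess-VPN -IncludeManagementTools

# Configure the server as a VPN-only deployment (the Deploy VPN Only option above)
Install-RemoteAccess -VpnType Vpn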



For Sale – Ubiquiti Unifi USG 3

Would you take £50 inc postage?


Digital transformation projects are an opportunity for healthcare CIOs

IT departments are central to digital transformation projects in healthcare. But for those projects to be successful, healthcare CIOs will need to ensure they’re ticking off the basic IT checklist while pushing their departments into new territory.

John Kravitz, CIO at Geisinger Health System in Danville, Penn., said digital transformation, or the use of digital technology to change how healthcare operates and delivers care, requires healthcare CIOs to think outside the box and consider new, digital ways to make IT and the overall health system operate more efficiently.

“Looking at transformation and how we’re about to approach that in IT, it’s extremely important that we take off the blinders and we look at things in a different way,” Kravitz said.

Before pursuing a digital transformation project, healthcare CIOs should start with the fundamentals such as making sure the healthcare organization has a solid IT infrastructure in place, according to Kravitz. At the 2019 CHIME Fall CIO Forum, Kravitz and Judy Kirby, CEO of executive search firm Kirby Partners in Heathrow, Fla., talked about why that strong IT foundation is so important and how healthcare CIOs can successfully lead digital transformation projects.

Building a strong foundation

Today’s healthcare CIOs are expected to be experts on emerging technology, yet they’re also tasked with IT basics like keeping the lights on.


“Organizations are saying, ‘We’ve got to be digital; we’ve got to be transformational,'” Kirby said. “Yet they’re really confused on what that means and how to get there.”

For healthcare CIOs to lead digital transformation projects, Kirby said it’s necessary to get four things right first:

  1. Focus on the fundamentals

To get started, Kirby said it’s vital healthcare CIOs take stock of how the IT infrastructure is performing. Having an IT system that functions “exceptionally” can provide a strong foundation for digital transformation projects, she said.

“If you don’t have the IT train on the track, you can’t transform,” Kirby said. “So, you’ve got to do that first, you’ve got to do it well, you’ve got to do it exceptionally.”

She recommended CIOs use key performance indicators to set expectations for IT employees and to provide transparent metrics on what they need to deliver.

  2. Build up health IT leaders

Building a successful IT team means identifying weak links and finding ways to make the entire team stronger, Kirby said. Healthcare CIOs will need strong leaders to be digital transformation ambassadors, and their success will hinge on relationships within the healthcare organization. CIOs can lead by example to demonstrate how to build those relationships and provide good service, she said.

Kirby gave the example of a successful CIO who “insists on rounding,” or going out into the healthcare organization to assess employee needs and to foster relationships between IT and the clinical staff.

“When he sends his CTO out there to round, they don’t go by themselves,” she said. “They go with one of their technicians who has a cartful of goodies — monitors, cables — so that when [they encounter] an issue, they try to fix it right there.”

  3. Keep the IT team engaged

Healthcare CIOs should engage their teams not just by setting expectations but by helping them meet realistic goals and celebrating the victories along the way. Celebrating success can go a long way in keeping the team engaged, she said.

“Don’t just make it when something large is going on, celebrate a lot,” Kirby said. “It keeps them happy, it keeps them successful, it keeps them wanting to do better and wanting to do more. I know you’re busy, but take the time.”

For Kirby, engagement also means taking the time to help the IT team grow and develop.

  4. Communicate

Lastly, healthcare CIOs need to communicate frequently, in detail and in a way that is easy to understand, Kirby said.

“If there’s one thing we hear when we’re out there doing site visits, it’s, ‘We want a great communicator,'” Kirby said.

Leading digital transformation

Geisinger’s Kravitz comes at digital transformation from firsthand experience.


While Kirby talked about the importance of building a strong foundation to support digital transformation projects, Kravitz spoke about how healthcare CIOs can then drive that transformation within their healthcare organizations.

He said successful digital transformation projects need executive leadership support. CIOs charged with leading the effort not just across IT but across the whole organization should make sure the IT and executive leadership teams are in sync on goals. Doing so presents a vision to employees and sets clear priorities.


Kravitz said a good place to start is to identify three to five processes critical to the organization and then find ways to change and enhance those processes through digitization, such as making it easier for low-acuity patients in emergency rooms to receive care via telemedicine visits instead of waiting hours for an in-person visit.

“Look at those types of things where you make it a lot simpler, a lot cleaner,” Kravitz said. “Look at all the opportunities within your health system for faster service.”

Digital transformation isn’t just a top-down project, according to Kravitz. He said healthcare CIOs need to also start at the bottom by establishing performance targets for employees. Here, it’s important to assess and measure productivity, set clear goals and benchmark those goals, Kravitz said.

Kravitz said healthcare CIOs should also help to create a governance committee of executive and IT leaders from across the organization. The committee is charged with keeping the healthcare organization on the same page during the digital transformation effort. It is also responsible for establishing a communication program that provides regular progress updates and includes meetings for the project. Finally, it should work to develop what Kravitz called a “digital narrative” that will be used to explain the project and get buy-in from employees.


Wanted – Simple ATX Tower Case

Fractal Define R3


Implement automated employee onboarding with PowerShell

One of the most common tasks help desk technicians and system administrators handle is provisioning all the resources needed to onboard new employees.

Depending on the organization, these tasks may include creating Active Directory user accounts, creating home folders, provisioning new Office 365 mailboxes and setting up a VoIP extension in the phone system. With a little PowerShell coding, you can put together an automated employee onboarding script that does a majority of this work in little to no time.

To automate this process, it’s essential to define all the tasks. Many companies have a document that outlines the steps to onboard a new employee, such as the following:

  • create an Active Directory user account;
  • create a user folder on a file share; and
  • assign a mobile device.

When building a PowerShell script, start by researching whether the basics required for automation are available. For example, does the system you're using to assign a mobile device have an API? If not, then that task can't be completely automated. Rather than bypass the step, you can still add something to your script that sends an email to the appropriate person to complete the setup.

For other tasks that PowerShell can automate, you can start by scaffolding out some code in your script editor.

Build the framework for the script

Start by adding some essential functions to encapsulate each task. The following code builds stub functions for each of the tasks we want to automate.

param(
    [Parameter()]
    [ValidateNotNullOrEmpty()]
    [string]$CsvFilePath
)

function New-CompanyAdUser {
    [CmdletBinding()]
    param
    (

    )

}

function New-CompanyUserFolder {
    [CmdletBinding()]
    param
    (

    )

}

function Register-CompanyMobileDevice {
    [CmdletBinding()]
    param
    (

    )

}

function Read-Employee {
    [CmdletBinding()]
    param
    (

    )

}

This isn’t our final code. This is just a brainstorming exercise.

Add the code to receive input

Notice the param block at the top and the Read-Employee function. This function can receive input from any source, such as a CSV file or a database. Because the input logic is encapsulated in a function, it's easy to modify the code if the input method changes.

For now, we are reading from a CSV file, as the Read-Employee function below shows. By default, the function takes the CSV file path passed to the script when it runs.

function Read-Employee {
    [CmdletBinding()]
    param
    (
        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [string]$CsvFilePath = $CsvFilePath
    )

    Import-Csv -Path $CsvFilePath

}

Add a Read-Employee reference below this function.

We have a CSV file from human resources that looks like this:

FirstName,LastName,Department
Adam,Bertram,Accounting
Joe,Jones,HR

We'll name the script New-Employee.ps1 and call it with the CsvFilePath parameter:

./New-Employee.ps1 -CsvFilePath './Employees.csv'

Developing the functions

Next, fill in the other functions. This is just an example, but it should give you a good idea of how to build the code for the specific tasks your automated employee onboarding script handles. For more background on the creation of the New-CompanyAdUser function, see this blog post.

param(
    [Parameter()]
    [ValidateNotNullOrEmpty()]
    [string]$CsvFilePath
)

function New-CompanyAdUser {
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [pscustomobject]$EmployeeRecord
    )

    ## Load System.Web so the GeneratePassword method is available in Windows PowerShell
    Add-Type -AssemblyName System.Web

    ## Generate a random password
    $password = [System.Web.Security.Membership]::GeneratePassword((Get-Random -Minimum 20 -Maximum 32), 3)
    $secPw = ConvertTo-SecureString -String $password -AsPlainText -Force

    ## Generate a first initial/last name username
    $userName = "$($EmployeeRecord.FirstName.Substring(0,1))$($EmployeeRecord.LastName)"

    ## Create the user
    $NewUserParameters = @{
        GivenName       = $EmployeeRecord.FirstName
        Surname         = $EmployeeRecord.LastName
        Name            = $userName
        AccountPassword = $secPw
    }
    New-AdUser @NewUserParameters

    ## Add the user to the department group
    Add-AdGroupMember -Identity $EmployeeRecord.Department -Members $userName
}

function New-CompanyUserFolder {
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [pscustomobject]$EmployeeRecord
    )

    $fileServer = 'FS1'

    $null = New-Item -Path "\\$fileServer\Users\$($EmployeeRecord.FirstName)$($EmployeeRecord.LastName)" -ItemType Directory

}

function Register-CompanyMobileDevice {
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [pscustomobject]$EmployeeRecord
    )

    ## Send an email for now. If we ever can automate this, we'll do it here.
    $sendMailParams = @{
        'From'       = '[email protected]'
        'To'         = '[email protected]'
        'Subject'    = 'A new mobile device needs to be registered'
        'Body'       = "Employee: $($EmployeeRecord.FirstName) $($EmployeeRecord.LastName)"
        'SMTPServer' = 'smtpserver.something.local'
        'Port'       = '587'
    }

    Send-MailMessage @sendMailParams

}

function Read-Employee {
    [CmdletBinding()]
    param
    (
        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [string]$CsvFilePath = $CsvFilePath
    )

    Import-Csv -Path $CsvFilePath

}

Read-Employee

Calling the functions

Once you build the functions, pass each of the employee records returned from Read-Employee to each function, as shown below.

$functions = 'New-CompanyAdUser','New-CompanyUserFolder','Register-CompanyMobileDevice'
foreach ($employee in (Read-Employee)) {
    foreach ($function in $functions) {
        & $function -EmployeeRecord $employee
    }
}

By standardizing the function parameters to a single parameter, EmployeeRecord, which corresponds to a row in the CSV file, you can define the functions you want to call in an array and loop over each of them.
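Because each onboarding step can fail independently, it's worth considering a try/catch around the dispatch so one failed task doesn't halt the rest of the run. This variation is a minimal sketch layered on the loop above, not part of the original script:

$functions = 'New-CompanyAdUser','New-CompanyUserFolder','Register-CompanyMobileDevice'
foreach ($employee in (Read-Employee)) {
    foreach ($function in $functions) {
        try {
            ## -ErrorAction works here because each function uses CmdletBinding
            & $function -EmployeeRecord $employee -ErrorAction Stop
        } catch {
            ## Record the failure and move on to the next task or employee
            Write-Warning "$function failed for $($employee.FirstName) $($employee.LastName): $_"
        }
    }
}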



Gears 5 Versus Multiplayer Tech Test Now Open to All Xbox Live Gold Members – Xbox Wire

The Gears 5 Versus Tech Test began last weekend, inviting both players who have pre-ordered Gears 5 and Xbox Game Pass members to rev up their Lancers and roadie run through their opponents on the way to victory. Thanks to the incredible Gears fans, both new and veteran, who made our initial Gears 5 Versus Tech Test weekend a smashing success. This weekend, we're extending the invitation to join the action to all Xbox Live Gold members as part of Xbox Live Free Play Days.

Starting at 10 a.m. PDT tomorrow, COG hopefuls who are new to the Gears 5 fight can enlist in the Boot Camp training mode and then jump right into Arcade, the new game type designed for over-the-top fun. If you're returning for another round following last week's Tech Test, be sure to check out fan-favorite King of the Hill and the updated competitive game type Escalation, and continue your Tour of Duty, a series of challenges that grant sweet rewards. A new reward has also been added – the Wreath Bloodspray, which can be earned by completing one match during the Tech Test from 5:30 to 6:30 p.m. PDT on Friday, July 26. No matter your level of Gears familiarity, Gears 5 offers a game type for you.

To ensure you're ready for battle, search the Microsoft Store for Gears 5 Tech Test and download it to your library. The Tech Test is available for download right now, but servers will be offline until tomorrow at 10 a.m. PDT. We also want to remind you that online multiplayer will require Xbox Live Gold for Xbox Game Pass for Console members, and that because this Tech Test is designed to help test our servers, you might encounter some queueing as you start to play. We hope to make Gears 5 a great online experience for all players and Xbox Game Pass members at launch, and this Tech Test will help.

Thanks for your support and we look forward to seeing you online! Be sure to visit www.gears5.com and follow @GearsofWar on Facebook and Twitter to keep up-to-date with the latest on Gears 5 ahead of its worldwide launch on September 10.

Gears 5 will release on September 10 for Xbox One, Windows 10 PC and Xbox Game Pass. Early access starts on September 6 for Xbox Game Pass Ultimate members and Gears 5 Ultimate Edition purchasers. Pre-order details can be found on the Microsoft Store.


Enterprises that use data will thrive; those that don’t, won’t

There's a growing chasm between enterprises that use data and those that don't.

Wayne Eckerson, founder and principal consultant of Eckerson Group, calls it the data divide, and according to Eckerson, the companies that will thrive in the future are the ones that are already embracing business intelligence no matter the industry. They’re taking human bias out of the equation and replacing it with automated decision-making based on data and analytics.

Those that are data laggards, meanwhile, are already in a troublesome spot, and those that have not embraced analytics as part of their business model at all are simply outdated.

Eckerson has more than 25 years of experience in the BI industry and is the author of two books — Secrets of Analytical Leaders: Insights from Information Insiders and Performance Dashboards: Measuring, Monitoring, and Managing Your Business.  

In the first part of a two-part Q&A, Eckerson discusses the divide between enterprises that use data and those that don’t, as well as the importance of DataOps and data strategies and how they play into the data divide. In the second part, he talks about self-service analytics, the driving force behind the recent merger and acquisition deals, and what intrigues him about the future of BI.

How stark is the data divide, the gap between enterprises that use data and those that don’t?

Wayne Eckerson: It’s pretty stark. You’ve got data laggards on one side of that divide, and that’s most of the companies out there today, and then you have the data elite, the companies [that] were born on data, they live on data, they test everything they do, they automate decisions using data and analytics — those are the companies [that] are going to take the future. Those are the companies like Google and Amazon, but also companies like Netflix and its spinoffs like Stitch Fix. They’re heavily using algorithms in their business. Humans are littered with cognitive biases that distort our perception of what’s going on out there and make it hard for us to make objective, rational, smart decisions. This data divide is a really interesting thing I’m starting to see happening that’s separating out the companies [that] are going to be competitive in the future. I think companies are really racing, spending money on data technologies, data management, data analytics, AI.

How does a DataOps strategy play into the data divide?


Eckerson: That’s really going to be the key to the future for a lot of these data laggards who are continually spending huge amounts of resources putting out data fires — trying to fix data defects, broken jobs, these bottlenecks in development that often come from issues like uncoordinated infrastructure for data, for security. There are so many things that prevent BI teams from moving quickly and building things effectively for the business, and a lot of it is because we’re still handcrafting applications rather than industrializing them with very disciplined routines and practices. DataOps is what these companies need — first and foremost it’s looking at all the areas that are holding the flow of data back, prioritizing those and attacking those points.

What can a sound DataOps strategy do to help laggards catch up?

Eckerson: It’s improving data quality, not just at the first go-around when you build something but continuous testing to make sure that nothing is broken and users are using clean, validated data. And after that, once you’ve fixed the quality of data and the business becomes more confident that you can deliver things that make sense to them, then you can use DataOps to accelerate cycle times and build more things faster. This whole DataOps thing is a set of development practices and testing practices and deployment and operational practices all rolled into a mindset of continuous improvement that the team as a whole has to buy into and work on. There’s not a lot of companies doing it yet, but it has a lot of promise.

Data strategy differs for each company given its individual needs, but as BI evolves and becomes more widespread, more intuitive, more necessary no matter the size of the organization and no matter the industry, what will be some of the chief tenets of data strategy going forward?

Eckerson: Today, companies are racing to implement data strategies because they realize they’re … data laggard[s]. In order to not be disrupted in this whole data transformation era, they need a strategy. They need a roadmap and a blueprint for how to build a more robust infrastructure for leveraging data, for internal use, for use with customers and suppliers, and also to embed data and analytics into the products that they build and deliver. The data strategy is a desire to catch up and avoid being disrupted, and also as a way to modernize because there’s been a big leap in the technologies that have been deployed in this area — the web, the cloud, big data, big data in the cloud, and now AI and the ability to move from reactive reporting to proactive predictions and to be able to make recommendations to users and customers on the spot. This is a huge transformation that companies have to go through, and so many of them are starting at zero.

So it’s all about the architecture?

Eckerson: A fundamental part of the data strategy is the data architecture, and that’s what a lot of companies focus on. In fact, for some companies the data strategy is synonymous with the data architecture, but that’s a little shortsighted because there are lots of other elements to a data strategy that are equally important. Those include the organization — the people and how they work together to deliver data capabilities and analytic capabilities — and the culture, because you can build an elegant architecture, you can buy and deploy the most sophisticated tools. But if you don’t have a culture of analytics, if people don’t have a mindset of using data to make decisions, to weigh options to optimize processes, then it’s all for naught. It’s the people, it’s the processes, it’s the organization, it’s the culture, and then, yes, it’s the technology and the architecture too.

Editors’ note: This interview has been edited for clarity and conciseness.


Azure preparedness for Hurricane Florence

As Hurricane Florence continues its journey to the mainland, our thoughts are with those in its path. Please stay safe. We’re actively monitoring Azure infrastructure in the region. We at Microsoft have taken all precautions to protect our customers and our people.

Our datacenters (US East, US East 2, and US Gov Virginia) have been reviewed internally and externally to ensure that we are prepared for this weather event. Our onsite teams are prepared to switch to generators if utility power is unavailable or unreliable. All our emergency operating procedures have been reviewed by our team members across the datacenters, and we are ensuring that our personnel have all necessary supplies throughout the event.

As a best practice, all customers should review their disaster recovery plans, and all mission-critical applications should take advantage of geo-replication.

Rest assured that Microsoft is focused on the readiness and safety of our teams, as well as our customers’ business interests that rely on our datacenters. 

You can reach us at our handle @AzureSupport on Twitter; we are online 24/7. Any business impact to customers will be communicated through Azure Service Health in the Azure portal.

If there is any change to the situation, we will keep customers informed of Microsoft’s actions through this announcement.


Windows troubleshooting tools to improve VM performance

Whether virtualized workloads stay on premises or move to the cloud, support for those VMs remains in the data center with the administrator.

When virtualized workloads don’t perform as expected, admins need to roll up their sleeves and break out the Windows troubleshooting tools. Windows has always had some level of built-in diagnostic ability, but it only goes so deep.

Admins need to stay on top of ways to analyze ailing VMs, but they also need to find ways to trim deployments to control resource use and costs for cloud workloads.

VM Fleet adds stress to your storage

VM Fleet tests the performance of your storage infrastructure by simulating virtual workloads. VM Fleet uses PowerShell to create a collection of VMs and run a stress test against the allocated storage.

This process verifies that your storage meets expectations before deploying VMs to production. VM Fleet doesn’t help troubleshoot issues, but it helps confirm the existing performance specifications before you ramp up your infrastructure. After the VMs are in place, you can use VM Fleet to perform controlled tests of storage auto-tiering and other technologies designed to adjust workloads during increased storage I/O.
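Under the hood, VM Fleet drives Microsoft's DISKSPD tool inside each guest VM to generate the I/O. To get a feel for the kind of load involved, you can run DISKSPD directly against a test file; the parameters below are illustrative rather than a tuned benchmark profile.

# Illustrative DISKSPD run: 60 seconds of 8 KB random I/O, 30% writes,
# 4 threads with 8 outstanding I/Os each, caching disabled
.\diskspd.exe -c1G -d60 -r -w30 -t4 -o8 -b8K -Sh C:\Test\testfile.dat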


Sysinternals utilities offer deeper insights

Two Windows troubleshooting tools from the Microsoft Sysinternals collection, Process Explorer and Process Monitor, should be staples for any Windows admin.

Process Explorer gives you in-depth detail, including the dynamic link library and memory mapped files loaded by a process. Process Explorer also lets you dig in deep to uncover issues rather than throwing more resources at an application and, thus, masking the underlying problem.

Process Explorer lets administrators do a technical deep dive into Windows processes that the Task Manager can't provide.
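For quick, scriptable triage before reaching for the GUI, PowerShell's built-in process cmdlets cover some of the same ground. A minimal sketch:

# Show the five processes consuming the most CPU time, with memory figures
Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, Id, CPU, WorkingSet64, PrivateMemorySize64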

Process Monitor captures real-time data on process activity, registry changes and file system changes on Windows systems. It also provides detailed information on process trees.

Administrators can use Process Monitor's search and filtering functions to focus on particular events that occur over a longer period of time.
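Process Monitor can also be started from a script for unattended captures. The switches below reflect Procmon's documented command-line options, but treat this as a sketch and check your version's help before relying on it.

# Start a minimized, quiet capture that logs to a backing file
.\Procmon.exe /AcceptEula /Quiet /Minimized /BackingFile C:\Traces\trace.pml

# ... reproduce the issue, then stop the capture ...
.\Procmon.exe /Terminate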

VMMap and RAMMap detail the memory landscape

Another Sysinternals tool, VMMap, shows what types of virtual memory are assigned to a process, along with its committed memory, which is the virtual memory reserved by the operating system. The tool presents a visual breakdown of where allocated memory is used.

VMMap shows how the operating system maps physical memory and uses memory in the virtual space to help administrators analyze how applications work with memory resources.

VMMap doesn’t check the hypervisor layer, but it does detail virtual memory use provided by the OS. Combined with other tools that view the hypervisor, VMMap gives a complete picture of the applications’ memory usage.

Another tool, RAMMap, is similar to VMMap, but it works at the operating system level rather than the process level. Administrators can use both tools to get a complete picture of how applications obtain and use memory.
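Before opening either tool, a couple of built-in PowerShell commands can indicate whether memory pressure is system-wide or confined to a single process. A minimal sketch:

# System-wide view: available memory and committed bytes
Get-Counter '\Memory\Available MBytes', '\Memory\Committed Bytes'

# Per-process view: the five largest working sets
Get-Process |
    Sort-Object WorkingSet64 -Descending |
    Select-Object -First 5 Name, Id, WorkingSet64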

BgInfo puts pertinent information on display

BgInfo is a small Sysinternals utility that displays selected system information on the desktop, such as the machine name, IP address, patch version and storage information.

While it's not difficult to find these settings, making them more visible can help when you log into multiple VMs in a short amount of time. It also helps you avoid installing software on, or even rebooting, the wrong VM.
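BgInfo is typically applied at logon so the displayed details stay current. The command line below is a sketch using BgInfo's documented switches; the .bgi path is a placeholder for your own saved configuration file.

# Apply a saved BgInfo configuration silently, with no countdown timer
.\Bginfo64.exe C:\Tools\servers.bgi /timer:0 /silent /nolicprompt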