Tag Archives: complex

Find and lock down lax Windows share permissions

Keeping your data secure and away from unauthorized users is a complex task, which can be even more difficult if a default setting in Windows gets in your way.

Trying to secure Windows share permissions is a big challenge due to a setting called bypass traverse checking, which the OS enables by default. This setting lets a user reach a folder even if they have no access rights to any of its parent folders.

We can remove this authorization with a Group Policy Object (GPO) setting, but it’s there for a reason. Without this setting enabled, you will see a big drop in performance, because Windows will check every parent folder to see whether the user is allowed to reach the target.
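As a quick sanity check, bypass traverse checking corresponds to the SeChangeNotifyPrivilege user right, so you can verify whether the account you are testing with currently holds it:

# "Bypass traverse checking" is granted through the SeChangeNotifyPrivilege user right
# If the privilege is listed, the account can traverse folders it has no rights to
whoami /priv | Select-String 'SeChangeNotifyPrivilege'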

This article explains how to create a report on Windows share permissions to determine which users have excessive permissions and how to remediate them using PowerShell and Sysinternals.

Gathering file shares and their authorized users

First, we need to find the file shares on the servers and client systems. We can do this either with the Get-SmbShare command or by querying the Win32_Share WMI class with either Get-CimInstance or Get-WmiObject.
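For reference, on systems where the SmbShare module and the CIM cmdlets are available, either of the following returns the same list of shares:

# Newer alternatives to Get-WmiObject for listing shares
Get-SmbShare
Get-CimInstance -ClassName Win32_Share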

For this example, Get-WmiObject is the preferred way to fetch the shares because it’s a more streamlined approach. Launch a PowerShell console as an administrator on a file server and enter the following command:

Get-WMIObject -Class win32_share

Name    Path                               Description
----    ----                               -----------
MyShare C:\demoshare                       Demo share
ADMIN$  C:\WINDOWS                         Remote Admin
C       C:\
C$      C:\                                Default share
D$      D:\                                Default share
E$      E:\                                Default share
IPC$                                       Remote IPC
print$  C:\WINDOWS\system32\spool\drivers  Printer Drivers
scripts C:\scripts

The PowerShell command outputs all the shares, but it doesn’t show the users with access to them. That’s because the Windows share permissions reside in another WMI class called Win32_LogicalShareSecuritySetting:

Get-WmiObject -Class Win32_LogicalShareSecuritySetting

The resulting output doesn’t tell us much either. We need a more comprehensive PowerShell script to generate something useful:

# Get all shares on the computer
$Shares = Get-WmiObject -Class Win32_Share

# Variable to add the processed shares to
$NetworkShares = [System.Collections.Generic.List[PSCustomObject]]::new()

# Ignore default shares by filtering out type '2147483648'
foreach ($Share in $Shares | ? {$_.Type -ne '2147483648' -and $_.Name -ne 'print$'}) {

    # Create an object that we'll return
    $ShareObject = [PSCustomObject]@{
        Name        = $Share.Name
        Description = $Share.Description
        LocalPath   = $Share.Path
        ACL         = [System.Collections.ArrayList]::new()
    }

    # Get the security settings for the share
    $ShareSecurity = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -Filter "name='$($Share.Name)'"

    # If security settings exist, build a list of ACLs
    if ($Null -ne $ShareSecurity) {
        Try {
            $SecurityDescriptor = $ShareSecurity.GetSecurityDescriptor().Descriptor

            foreach ($AccessControl in $SecurityDescriptor.DACL) {

                $UserName = $AccessControl.Trustee.Name
                $Trustee  = $AccessControl.Trustee

                If ($Trustee.Domain -ne $Null) {
                    $UserName = "$($Trustee.Domain)\$UserName"
                }

                If ($Trustee.Name -eq $Null) {
                    $UserName = $Trustee.SIDString
                }

                $ShareObject.ACL.Add(
                    [System.Security.AccessControl.FileSystemAccessRule]::new(
                        $UserName,
                        $AccessControl.AccessMask,
                        $AccessControl.AceType
                    )
                ) | Out-Null
            }

            # Add the share object with its ACLs to the result list
            $NetworkShares.Add($ShareObject)
        }
        Catch {
            Write-Error $Error[0]
        }
    }
    Else {
        Write-Information "No permissions found for $($Share.Name)"
    }
}

The content of the $NetworkShares variable should end up looking similar to the following:

PS51> $NetworkShares

Name      Description LocalPath    ACL
----      ----------- ---------    ---
DemoShare Demo share  C:\demoshare {System.Security.AccessControl.FileSystemAccessRule}
scripts               C:\scripts   {System.Security.AccessControl.FileSystemAccessRule, System.Security.AccessControl.FileSystemAccessRule}

PS51> $NetworkShares[0].ACL

FileSystemRights  : FullControl
AccessControlType : Allow
IdentityReference : Everyone
IsInherited       : False
InheritanceFlags  : None
PropagationFlags  : None

We’ve successfully gathered data about our Windows share permissions, showing who has access to what at the share level. That might not be enough, because administrators usually grant access through NTFS permissions rather than the network share permissions themselves.

We also need to check whether the files and folders inside the share grant excessive permissions to broad groups, such as Everyone or Domain Users.
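Share permissions and NTFS permissions are evaluated separately, and the NTFS ACL lives on the folder itself. As a quick spot check on a single folder, using the demo path from earlier, Get-Acl shows the NTFS permissions:

# NTFS permissions are stored on the file system object itself
(Get-Acl -Path 'C:\demoshare').Access |
    Format-Table IdentityReference, FileSystemRights, AccessControlType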

Scanning file permissions using AccessChk

We have a list of our file shares. Next, we need to get all the file permissions. The fastest way to do this is to use the AccessChk utility from the Sysinternals suite and parse its output with PowerShell.

Next, put AccessChk on your file server by copying the AccessChk64.exe file to the System32 folder. You can either download the utility from the Sysinternals site and copy it manually or use the following PowerShell code to do both:

Invoke-WebRequest -OutFile $env:TEMP\AccessChk.zip -Uri https://download.sysinternals.com/files/AccessChk.zip
Expand-Archive -Path $env:TEMP\AccessChk.zip -DestinationPath $env:TEMP -Force
Copy-Item -Path $env:TEMP\AccessChk64.exe C:\Windows\System32\AccessChk64.exe

We can use PowerShell to create a wrapper function around AccessChk for use in a script:

Function Invoke-AccessChk {
    param(
        $Path,
        $Principals,
        $AccessChkPath = "$env:windir\system32\accesschk64.exe",
        [switch]$DirectoriesOnly,
        [switch]$AcceptEula
    )

    # Accept EULA
    if ($AcceptEula) {
        & $AccessChkPath /accepteula | Out-Null
    }

    $Argument = "uqs"
    if ($DirectoriesOnly) {
        $Argument = "udqs"
    }

    $Output = & $AccessChkPath -nobanner -$Argument $Path

    Foreach ($Row in $Output) {

        # If it's a row with a file path, output the previous object and create a new one
        if ($Row -match "^\S") {
            If ($Null -ne $Object) {
                if ($Object.Access.Keys.Count -gt 0) {
                    $Object
                }
            }
            $Object = [PSCustomObject]@{
                Path   = $Row
                Access = @{}
            }
        }

        # If it's a row with permissions
        if ($Row -match "^ [R ][W ]") {
            If ($Row -match ($Principals -replace '\\', '\\' -join "|")) {

                $Row -match "^ (?<Read>[R ])(?<Write>[W ]) (?<Principal>.*)" | Out-Null

                $Object.Access[$Matches.Principal] = @{
                    Read  = $Matches.Read -eq 'R'
                    Write = $Matches.Write -eq 'W'
                }
            }
        }
    }
    # After the last row, output the final object
    if ($Object.Access.Keys.Count -gt 0) {
        $Object
    }
}
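For context, the wrapper assumes AccessChk output shaped roughly like the following hypothetical excerpt: an unindented line for each file or folder path, followed by indented rows whose first two columns are the R and W flags and whose remainder is the account name (exact indentation can vary between AccessChk versions):

C:\demoshare\File1.txt
 RW BUILTIN\Administrators
 R  NT AUTHORITY\Authenticated Users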

We can now run Invoke-AccessChk with the network shares stored in the $NetworkShares variable from the previous step. We also define a list of the security principals, without the domain prefix, that we want to find:

# Invoke-AccessChk will only output files/folders where the following principals have permission:
$RiskPrincipals = @(
'Everyone',
'Domain Users',
'Domain Computers',
'Authenticated Users',
'Users'
)

$RiskyPermissions = Foreach ($NetworkShare in $NetworkShares | Select -First 1) {

    # Only scan the directory if it's shared to one of the principals in $RiskPrincipals
    $RiskPrincipalExist = $Null -ne ($NetworkShare.ACL.IdentityReference.Value -replace ".*\\" | ? {$_ -in $RiskPrincipals})

    if ($RiskPrincipalExist) {
        Invoke-AccessChk -Path $NetworkShare.LocalPath -Principals $RiskPrincipals
    }
}

The $RiskyPermissions variable will give output similar to this:

PS51> $RiskyPermissions

Path                             Access
----                             ------
C:\demoshare\File1.txt           {BUILTIN\Users, NT AUTHORITY\Authenticated Users}
C:\demoshare\Folder1\picture.png {NT AUTHORITY\Authenticated Users}
C:\demoshare\Folder1\Folder2     {NT AUTHORITY\Authenticated Users}

PS51> $RiskyPermissions[0].Access
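If a single-server report is all you need at this point, a minimal sketch for flattening these results into a CSV (the output file name is just an example) could look like this:

# Flatten the single-server scan into a CSV for review
$RiskyPermissions |
    Select-Object Path, @{Name='Principals'; Expression={$_.Access.Keys -join ','}} |
    Export-Csv -Path .\LocalShareScan.csv -NoTypeInformation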

Creating a report from several computers and servers

Thus far, you can get a list of all the file shares and check all the files with the Invoke-AccessChk wrapper. One of PowerShell’s many strengths is its ability to scale. PowerShell remoting takes the code we’ve produced to the next level by gathering the information from several computers at once.

First, we need a list of computers and servers to scan. If possible, the easiest way is through the Active Directory module from RSAT:

$Computers = (Get-ADComputer -Filter *).dnsHostName

This method might not be an option in larger environments that are heavily segmented. Another approach is to pull the data from your configuration management database or to enter it manually, as in the following example:

$Computers = @(
'Server1',
'Server2',
'Server3',
'Server4',
'Server5',
'PC1'
# etc
)
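If you keep a server inventory export instead, importing it is usually less error-prone than typing names by hand. A small sketch, assuming a hypothetical servers.csv file with a Name column:

# Hypothetical inventory export with one computer name per row in a "Name" column
$Computers = (Import-Csv -Path .\servers.csv).Name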

Now it’s time to tie all these components together in a script that uses PowerShell jobs to do the following actions on the machines specified in the $Computers variable:

  • Get all shares that are shared out to one of the principals in $RiskPrincipals.
  • Download AccessChk if it does not already exist.
  • Check the NTFS permissions of all gathered shares with AccessChk.
  • Return an object listing all files where the security principals in $RiskPrincipals have either read or write permissions.

The computer running the script then collects the results of all jobs and outputs them to a CSV file named ShareAccessReport.csv.

Remember to run the following as an admin on a computer that has network access to those machines and to accept the EULA for AccessChk by changing $AcceptEula to $true:

$Computers = @(
'Server-1',
'Server-2',
'PC-1'
)

# Accept EULA for AccessChk
# CHANGE TO TRUE
$AcceptEula = $false

if (!$AcceptEula) {
    Write-Warning "Did not accept EULA for AccessChk, can't continue"
    break
}

# Principals that we want to scan for
$RiskPrincipals = @(
'Everyone',
'Domain Users',
'Domain Computers',
'Authenticated Users',
'Users'
)

# List of shares that we want to ignore (matched by share name)
$IgnoreShares = @(
    'print$'
)

# Scriptblock that we'll send with Invoke-Command
$Scriptblock = {

    # The parameter hashtable is passed as a single argument
    $RiskPrincipals = $args[0].RiskPrincipals
    $IgnoreShares   = $args[0].IgnoreShares
    $AcceptEula     = $args[0].AcceptEula

    # Function to download and extract AccessChk
    # It utilizes a Shell.Application COM object instead of Expand-Archive for backward compatibility
    Function Download-AccessChk {
        param(
            $Url = "https://download.sysinternals.com/files/AccessChk.zip",
            $Dest = $env:temp
        )
        if (Test-Path "$Dest\AccessChk.zip") {
            rm "$Dest\AccessChk.zip" -Force
        }
        (New-Object System.Net.WebClient).DownloadFile($Url, "$Dest\AccessChk.zip")
        $Shell       = New-Object -ComObject Shell.Application
        $Zip         = $Shell.NameSpace("$Dest\AccessChk.zip")
        $Destination = $Shell.NameSpace("$env:windir\system32")

        # 0x04 = no progress dialog, 0x10 = respond "Yes to All" to any prompts
        $copyFlags = 0x00
        $copyFlags += 0x04
        $copyFlags += 0x10

        $Destination.CopyHere($Zip.Items(), $copyFlags)
    }

    # The function that utilizes AccessChk from part 2
    Function Invoke-AccessChk {
        param(
            $Path,
            $Principals,
            $AccessChkPath = "$env:windir\system32\accesschk64.exe",
            [switch]$DirectoriesOnly,
            [switch]$AcceptEula
        )

        if (!(Test-Path "$env:windir\system32\accesschk64.exe")) {
            Download-AccessChk
        }

        # Accept EULA
        if ($AcceptEula) {
            & $AccessChkPath /accepteula | Out-Null
        }

        $Argument = "uqs"
        if ($DirectoriesOnly) {
            $Argument = "udqs"
        }

        $Output = & $AccessChkPath -nobanner -$Argument $Path

        Foreach ($Row in $Output) {

            # If it's a row with a file path, output the previous object and create a new one
            if ($Row -match "^\S") {
                If ($Null -ne $Object) {
                    if ($Object.Access.Keys.Count -gt 0) {
                        $Object
                    }
                }
                $Object = [PSCustomObject]@{
                    Path   = $Row
                    Access = @{}
                }
            }

            # If it's a row with permissions
            if ($Row -match "^ [R ][W ]") {
                If ($Row -match ($Principals -replace '\\', '\\' -join "|")) {

                    $Row -match "^ (?<Read>[R ])(?<Write>[W ]) (?<Principal>.*)" | Out-Null

                    $Object.Access[$Matches.Principal] = @{
                        Read  = $Matches.Read -eq 'R'
                        Write = $Matches.Write -eq 'W'
                    }
                }
            }
        }
        # After the last row, output the final object
        if ($Object.Access.Keys.Count -gt 0) {
            $Object
        }
    }

    # Get all the shares by using WMI
    $Shares = Get-WmiObject -Class Win32_Share

    # Create an object that we will return when we're done
    $ReturnObject = [PSCustomObject]@{
        ComputerName      = $env:COMPUTERNAME
        NetworkShares     = [System.Collections.Generic.List[PSCustomObject]]::new()
        AccessibleObjects = @{}
    }

    # Ignore default shares by filtering out type '2147483648'
    # Ignore shares in $IgnoreShares
    foreach ($Share in $Shares | ? {$_.Type -ne '2147483648'} | ? {$_.Name -notin $IgnoreShares}) {
        $ShareObject = [PSCustomObject]@{
            Name        = $Share.Name
            Description = $Share.Description
            LocalPath   = $Share.Path
            ACL         = [System.Collections.ArrayList]::new()
        }

        $ShareSecurity = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -Filter "name='$($Share.Name)'"
        if ($Null -ne $ShareSecurity) {
            Try {
                $SecurityDescriptor = $ShareSecurity.GetSecurityDescriptor().Descriptor

                foreach ($AccessControl in $SecurityDescriptor.DACL) {

                    $UserName = $AccessControl.Trustee.Name
                    $Trustee  = $AccessControl.Trustee

                    If ($Trustee.Domain -ne $Null) {
                        $UserName = "$($Trustee.Domain)\$UserName"
                    }

                    If ($Trustee.Name -eq $Null) {
                        $UserName = $Trustee.SIDString
                    }

                    $ShareObject.ACL.Add(
                        [System.Security.AccessControl.FileSystemAccessRule]::new(
                            $UserName,
                            $AccessControl.AccessMask,
                            $AccessControl.AceType
                        )
                    ) | Out-Null
                }

                # Only add the network share if it contains a risky user/group
                $Match = $False
                Foreach ($IdentityReference in $ShareObject.ACL.IdentityReference.Value) {
                    Foreach ($Pattern in $RiskPrincipals) {
                        if ($IdentityReference -Match $Pattern) {
                            $Match = $True
                        }
                    }
                }
                if ($Match) {
                    $ReturnObject.NetworkShares.Add($ShareObject)
                }
                Else {
                    Write-Verbose "No match for risky groups, not adding"
                }
            }
            Catch {
                Write-Error $Error[0]
            }
        }
        Else {
            Write-Information "No permissions found for $($Share.Name) on $env:COMPUTERNAME"
        }
    }

    # Get all files from NetworkShares where a principal from $RiskPrincipals has either read or write access
    $ReturnObject.NetworkShares | Foreach {
        $ReturnObject.AccessibleObjects[$_.Name] = Invoke-AccessChk -Path $_.LocalPath -Principals $RiskPrincipals -AcceptEula:$AcceptEula
    }

    # Done! Let's return the object:
    $ReturnObject
}

# Arguments for Invoke-Command, because the remote PowerShell job doesn't have access to our local variables
$InvokeParam = @{
    RiskPrincipals = $RiskPrincipals
    IgnoreShares   = $IgnoreShares
    AcceptEula     = $AcceptEula
}

# Start jobs
$Job = Invoke-Command -AsJob -ComputerName $Computers -ArgumentList $InvokeParam -ScriptBlock $Scriptblock

# Wait for jobs to finish
$Job | Wait-Job

# Collect data from all jobs
$Output = $Job | Receive-Job

# Flatten the results into rows for CSV export
$ToCSV = Foreach ($Result in $Output) {

    Foreach ($Key in $Result.AccessibleObjects.Keys) {

        # Select-Object expressions to pull the data out of the Access hashtables
        $ReadAccess = @{
            Name       = 'ReadAccess'
            Expression = {
                $Base = $_.Access
                ($Base.Keys | ? {$Base[$_].Read}) -join ","
            }
        }

        $WriteAccess = @{
            Name       = 'WriteAccess'
            Expression = {
                $Base = $_.Access
                ($Base.Keys | ? {$Base[$_].Write}) -join ","
            }
        }

        # Select from AccessibleObjects and add computer name, share name and the principals with read/write access
        $Result.AccessibleObjects[$Key] |
            Select @{Name='ComputerName';Expression={$Result.ComputerName}},
                   @{Name='ShareName';Expression={$Key}},
                   Path, $ReadAccess, $WriteAccess
    }
}
# Export the CSV
$ToCSV | Export-Csv -Path .\ShareAccessReport.csv -NoTypeInformation

When the PowerShell jobs finish, the script creates a full report of what the principals in the $RiskPrincipals variable can access.

Fixing Windows share permissions

After you review the CSV and find the permissions that need adjusting, there are two ways to correct them. If there are only a few, the best way is through the GUI. But if there are thousands, the following script uses the CSV output to speed things along:

# This needs to run locally on the server with the file share.

$UserToRemove = 'Guest'
$CSV = Import-Csv -Path .\ShareAccessReport.csv
$CSV | ? {$_.ComputerName -eq $env:COMPUTERNAME} | Foreach {
    $Path = $_.Path
    $ACL = Get-Acl -Path $Path
    $ACL.Access | ? {($_.IdentityReference.Value -replace '.*\\') -eq $UserToRemove} | Foreach {
        $ACL.RemoveAccessRule($_) | Out-Null
    }
    Set-Acl -Path $Path -AclObject $ACL
}

This PowerShell script will remove all permissions for the Guest security principal.

The first report will usually bring a lot of work, though, because it will uncover many oddities and risks in your Windows share permissions. But running a solution like this regularly, especially targeted at shares with sensitive information, will pay off in the end.


Microsoft for Healthcare: Empowering our customers and partners to provide better experiences, insights and care – The Official Microsoft Blog

At Microsoft, our goal within healthcare is to empower people and organizations to address the complex challenges facing the healthcare industry today. We help do this by co-innovating and collaborating with our customers and partners as a trusted technology provider. Today, we’re excited to share progress on the latest innovations from Microsoft aimed at helping address the most prevalent and persistent health and business challenges:

  • Empower care teams with Microsoft 365: Available in the coming weeks, the new Bookings app in Microsoft Teams will empower care teams to schedule, manage and conduct virtual visits with remote patients via video conference. Also coming soon, clinicians will be able to target Teams messages to recipients based on the shift they are working. Finally, healthcare customers can support their security and compliance requirements with the HIPAA/HITECH assessment in Microsoft Compliance Score.
  • Protect health information with Azure Sphere: Microsoft’s integrated security solution for IoT (Internet of Things) devices and equipment – is now widely available for the development and deployment of secure, connected devices. Azure Sphere helps securely personalize patient experiences with connected devices and solutions. And, to make it easier for healthcare leaders to develop their own IoT strategies, today we’re launching a new IoT Signals report focused on the healthcare industry that provides an industry pulse on the state of IoT adoption and helpful insights for IoT strategies. Learn more about Microsoft’s IoT offerings for healthcare here.
  • Enable personalized virtual care with Microsoft Healthcare Bot: Today, we’re pleased to announce that Microsoft Healthcare Bot, our HITRUST-certified platform for creating virtual health assistants, is enriching its healthcare intelligence with new built-in templates for healthcare-specific use cases, and expanding its integrated medical content options. With the addition of Infermedica, a cutting-edge triage engine based on advanced artificial intelligence (AI) that enables symptom checking in 17 languages, Healthcare Bot is empowering providers to offer global access to care.
  • Reimagine healthcare using new data platform innovations: With the 2019 release of Azure API for FHIR, Microsoft became the first cloud provider with a fully managed, enterprise-grade service for health data in the Fast Healthcare Interoperability Resources (FHIR) format. We’re excited to expand those offerings with several new innovations around connecting, converting and transforming data. The first is Power BI FHIR Connector, which makes it simple and easy to bring FHIR data into Power BI for analytics and insights. The second, IoMT (Internet of Medical Things) FHIR Connector, is now available as open source software (OSS) and allows for seamless ingestion, normalization and transformation of Protected Health Information data from health devices into FHIR. Another new open source project, FHIR Converter, provides an easy way to convert healthcare data from legacy formats (i.e., HL7v2) into FHIR. And lastly, FHIR Tools for Anonymization is now offered via OSS and enables anonymization and pseudonymization of data in the FHIR format, including capabilities for redaction and date shifting in accordance with the HIPAA privacy rule.

Frictionless exchange of health information in FHIR makes it easier for researchers and clinicians to collaborate, innovate and improve patient care. As we move forward working with our customers and partners and others across the health ecosystem, Microsoft is committed to enabling and improving interoperability and required standards to make it easier for patients to manage their healthcare and control their information. At the same time, trust, privacy and compliance are a top priority – making sure Protected Health Information (PHI) remains under control and custodianship of healthcare providers and their patients.

We’ve seen a growing number of healthcare organizations not only deploy new technologies, but also begin to develop their own digital capabilities and solutions that use data and AI to transform and innovate healthcare and life sciences in profoundly positive ways. Over the past year, together with our customers and partners, we’ve announced new strategic partnerships aimed at empowering this transformation.

For example, to enable caregivers to focus more on patients by dramatically reducing the burden of documenting doctor-patient visits, Nuance has released Nuance Dragon Ambient eXperience (DAX). This ambient clinical intelligence (ACI) technology is enriched by AI and cloud capabilities from Microsoft, including the ambient intelligence technology EmpowerMD, which is coming to market as part of Nuance’s DAX solution. The solution aims to transform the exam room by deploying ACI to capture, with patient consent, interactions between clinicians and patients so that clinical documentation writes itself.

Among health systems, Providence St. Joseph Health is using Microsoft’s cloud, AI, productivity and collaboration technologies to deploy next-generation healthcare solutions while empowering their employees. NHS Calderdale is enabling patients and their providers to hold appointments virtually via Microsoft Teams for routine and follow-up visits, which helps lower costs while maintaining the quality of care. The U.S. Veterans Affairs Department is embracing mixed reality by working with technology providers Medivis, Microsoft and Verizon to roll out its first 5G-enabled hospital. And specifically for health consumers, Walgreens Boots Alliance will harness the power of our cloud, AI and productivity technologies to empower care teams and deliver new retail solutions to make healthcare delivery more personal, affordable and accessible.

Major payor, pharmaceutical and health technology platform companies are also transforming healthcare in collaboration with us. Humana will develop predictive solutions for personalized and secure patient support, and by using Azure, Azure AI and Microsoft 365, they’ll also equip home healthcare workers with real-time access to information and voice technology to better understand key factors that influence patient health. In pharmaceuticals, Novartis will bring Microsoft AI capabilities together with its deep expertise in life sciences to address specific challenges that make the process of discovering, developing and delivering new medicines so costly and time-consuming.

We’re pleased to showcase how together with our customers and partners, we’re working to bring healthcare solutions to life and positively impact the health ecosystem.

To keep up to date with the latest announcements visit the Microsoft Health News Room.

About the authors:
As Corporate Vice President of Health Technology and Alliances, Dr. Greg Moore leads the dedicated research and development collaborations with our strategic partners, to deliver next-generation technologies and experiences for healthcare.

Vice President and Chief Medical Officer Dr. David Rhew recently joined Microsoft’s Worldwide Commercial Business Healthcare leadership team and provides executive-level support, engaging in business opportunities with our customers and partners.

As Corporate Vice President of Healthcare, Peter Lee leads the Microsoft organization that works on technologies for better and more efficient healthcare, with a special focus on AI and cloud computing.



Developing a quantum-ready global workforce – Microsoft Quantum

At Microsoft Quantum, our ambition is to help solve some of the world’s most complex challenges through the world’s most scalable quantum system. Recently, we introduced Azure Quantum to unite a diverse and growing quantum community and accelerate the impact of this technology. Whether it’s algorithmic innovation that improves healthcare outcomes or breakthroughs in cryogenic engineering that enable more sustainable systems design, these recent advancements across the stack are bringing the promise of quantum to our world, right now.

In December 2018, the United States Congress signed the National Quantum Initiative Act – an important milestone for investing the resources needed to continue advancing the field. As recognized by the Act, education on quantum information science and engineering needs to be an area of explicit focus, as the shortage of quantum computing talent worldwide poses a significant challenge to accelerating innovation and fully realizing the impact quantum can have on our world.

Leaders across both public and private sectors need to continue working together to develop a global workforce of quantum engineers, researchers, computer and materials scientists, and other industry experts who will be able to carry quantum computing into the future. Microsoft has been collaborating with academic institutions and industrial entities around the world to grow this quantum generation and prepare the workforce for this next technological revolution.

Empowering the quantum generation through education

Earlier this year, Microsoft partnered with the University of Washington to teach an introductory course on quantum computing and programming. The course, led by Dr. Krysta Svore, General Manager of Microsoft Quantum Systems, focused on the practical implementation of quantum algorithms.

Students were first introduced to quantum programming with Q# through a series of coding exercises followed by programming assignments. For their final project, student teams developed quantum solutions for specified problems – everything from entanglement games and key distribution protocols to quantum chemistry and a Bitcoin mining algorithm. Several students from this undergraduate course joined the Microsoft Quantum team for a summer internship, further developing their new skillsets and delivering quantum impact to organizations around the world.

On the heels of this hands-on teaching engagement, Microsoft has established curriculum partnerships with more than 10 institutions around the world to continue closing the skills gap in quantum development and quantum algorithm design. This curriculum is circling the globe, from the University of California, Los Angeles (UCLA) to the Indian Institute of Technology (IIT) in Roorkee and Hyderabad, India.

Partner universities leverage Q#, Microsoft’s quantum programming language and associated Quantum Development Kit, to teach the principles of quantum computing to the next generation of computer engineers and scientists.

“The course material extended to us by Microsoft is concise and challenging. It covers the necessary mathematical foundations of Quantum Computing. Simulation on Q# is quite straightforward and easy to interpret. Collaboration with Microsoft has indeed captivated students of IIT Roorkee to get deeper insights into Quantum Technology.”

– Professor Ajay Wasan of IIT Roorkee, Department of Physics

Q# integrates with familiar tools like Visual Studio and Python, making it a very approachable entry point for undergraduate and graduate students alike.

 “I integrated Microsoft’s Q# into my UCLA graduate course called Quantum Programming.  My students found many aspects of Q# easy to learn and used the language to program and run four quantum algorithms. Thus, the curriculum partnership with Microsoft [has] helped me teach quantum computing to computer science students successfully.”

– Professor Jens Palsberg of UCLA, Computer Science Department

Microsoft has also partnered with Brilliant to bring quantum computing to students and professionals around the world via a self-serve e-learning environment.

[GIF: Microsoft’s Brilliant quantum computing curriculum]

This interactive Quantum Computing course introduces students to quantum principles and uses Q# to help people learn to build quantum algorithms, simulating a quantum environment in their browsers. In the last six months, more than 40,000 people have interacted with the course and started building their own quantum solutions.

Accelerating quantum innovation through cross-industry collaboration

Recently, Microsoft joined the Quantum Economic Development Consortium (QED-C), which aims to enable and grow the United States quantum industry.

QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the federal strategy for advancing quantum information science. Through the QED-C, Microsoft partners with a diverse set of business and academic leaders to identify and address gaps in technology, standards, and workforce readiness facing the quantum industry.

We look forward to continuing our academic and cross-industry collaborations in developing a quantum workforce to tackle real-world scenarios and bring this revolutionary technology to fruition.

Request to be an early adopter of Azure Quantum and incorporate Q# and the QDK in your quantum curriculum.

Are you currently a student interested in joining Microsoft Quantum as an intern? Apply to our open research intern positions today!



Microsoft’s new approach to hybrid: Azure services when and where customers need them | Innovation Stories

As business computing needs have grown more complex and sophisticated, many enterprises have discovered they need multiple systems to meet various requirements – a mix of technology environments in multiple locations, known as hybrid IT or hybrid cloud.

Technology vendors have responded with an array of services and platforms – public clouds, private clouds and the growing edge computing model – but there hasn’t necessarily been a cohesive strategy to get them to work together.

“We got here in an ad hoc fashion,” said Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise. Customers didn’t have a strategic model to work from.

Instead, he said, various business owners in the same company may have bought different software as a service (SaaS) applications, or developers may have independently started leveraging Amazon Web Services, Azure or Google Cloud Platform to develop a set of applications.

At its Ignite conference this week in Orlando, Florida, Microsoft announced its solution to such cloud sprawl. The company has launched a preview of Azure Arc, which offers Azure services and management to customers on other clouds or infrastructure, including those offered by Amazon and Google.

John JG Chirapurath, general manager for Azure data, blockchain and artificial intelligence at Microsoft, said the new service is both an acknowledgement of, and a response to, the reality that many companies face today. They are running various parts of their businesses on different cloud platforms, and they also have a lot of data stored on their own new or legacy systems.

In all those cases, he said, these customers are telling Microsoft they could use the benefits of Azure cloud innovation whether or not their data is stored in the cloud, and they could benefit from having the same Azure capabilities – including security safeguards – available to them across their entire portfolio.

“We are offering our customers the ability to take their services, untethered from Azure, and run them inside their own datacenter or in another cloud,” Chirapurath said.

Microsoft says Azure Arc builds on years of work the company has done to serve hybrid cloud needs. For example, Azure Resource Manager, released in 2014, was created with the vision that it would manage resources outside of Azure, including in companies’ internal servers and on other clouds.

That flexibility can help customers operate their services on a mix of clouds more efficiently, without purchasing new hardware or switching among cloud providers. Companies can use a public cloud to obtain computing power and data storage from an outside vendor, but they can also house critical applications and sensitive data on their own premises in a private cloud or server.

Then there’s edge computing, which stores data where the user is, in between the company and the public cloud: for example, on their customers’ mobile devices or on sensors in smart buildings like hospitals and factories.


That’s compelling for companies that need to run AI models on systems that aren’t reliably connected to the cloud, or to make computations more quickly than if they had to send large amounts of data to and from the cloud. But it also must work with companies’ cloud-based, internet-connected systems.

“A customer at the edge doesn’t want to use different app models for different environments,” said Mark Russinovich, Azure chief technology officer. “They need apps that span cloud and edge, leveraging the same code and same management constructs.”

Streamlining and standardizing a customer’s IT structure gives developers more time to build applications that produce value for the business instead of managing multiple operating models. And enabling Azure to integrate administrative and compliance needs across the enterprise, automating system updates and security enhancements, brings additional savings in time and money.

“You begin to free up people to go work on other projects, which means faster development time, faster time to market,” said HPE’s Vogel. HPE is working with Microsoft on offerings that will complement Azure Arc.

Arpan Shah, general manager of Azure infrastructure, said Azure Arc allows companies to use Azure’s governance tools for their virtual machines, Kubernetes clusters and data across different locations, helping ensure companywide compliance on things like regulations, security, spending policies and auditing tools.

Azure Arc is underpinned in part by Microsoft’s commitment to technologies that customers are using today, including virtual machines, containers and Kubernetes, an open source system for organizing and managing containers. That makes clusters of applications easily portable across a hybrid IT environment – to the cloud, the edge or an internal server.

“It’s easy for a customer to put that container anywhere,” Chirapurath said. “Today, you can keep it here. Tomorrow, you can move it somewhere else.”

Microsoft says these latest Azure updates reflect an ongoing effort to better understand the complex needs of customers trying to manage their Linux and Windows servers, Kubernetes clusters and data across environments.

“This is just the latest wave of this sort of innovation,” Chirapurath said. “We’re really thinking much more expansively about customer needs and meeting them according to how they’d like to run their applications and services.”

Top image: Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise, with a prototype of memory-driven computing. HPE is working with Microsoft on offerings that will complement Azure Arc. Photo by John Brecher for Microsoft.



Helping customers shift to a modern desktop – Microsoft 365 Blog

IT is complex. And that means it can be difficult to keep up with the day-to-day demands of your organization, let alone deliver technological innovation that drives the business forward. In desktop management, this is especially true: the process of creating standard images, deploying devices, testing updates, and providing end user support hasn’t changed much in years. It can be tedious, manual, and time consuming. We’re determined to change that with our vision for a modern desktop powered by Windows 10 and Office 365 ProPlus. A modern desktop not only offers end users the most productive, most secure computing experience—it also saves IT time and money so you can focus on driving business results.

Today, we’re pleased to make three announcements that help you make the shift to a modern desktop:

  • Cloud-based analytics tools to make modern desktop deployment even easier.
  • A program to ensure app compatibility for upgrades and updates of Windows and Office.
  • Servicing and support changes to give you additional deployment flexibility.

Analytics to make modern desktop deployment easier

Collectively, you’ve told us that one of your biggest upgrade and update challenges is application testing. A critical part of any desktop deployment plan is analysis of existing applications—and the process of testing apps and remediating issues has historically been very manual and very time consuming. Microsoft 365 offers incredible tools today to help customers shift to a modern desktop, including System Center Configuration Manager, Microsoft Intune, Windows Analytics, and Office Readiness Toolkit. But we’ve felt like there’s even more we could do.

Today, we’re announcing that Windows Analytics is being expanded to Desktop Analytics—a new cloud-based service integrated with ConfigMgr and designed to create an inventory of apps running in the organization, assess app compatibility with the latest feature updates of Windows 10 and Office 365 ProPlus, and create pilot groups that represent the entire application and driver estate across a minimal set of devices.

The new Desktop Analytics service will provide insight and intelligence for you to make more informed decisions about the update readiness of your Windows and Office clients. You can then optimize pilot and production deployments with ConfigMgr. Combining data from your own organization with data aggregated from millions of devices connected to our cloud services, you can take the guess work out of testing and focus your attention on key blockers. We’ll share more information about Desktop Analytics and other modern desktop deployment tools at Ignite.

Standing behind our app compatibility promise

We’re also pleased to announce Desktop App Assure—a new service from Microsoft FastTrack designed to address issues with Windows 10 and Office 365 ProPlus app compatibility. Windows 10 is the most compatible Windows operating system ever, and using millions of data points from customer diagnostic data and the Windows Insider validation process, we’ve found that 99 percent of apps are compatible with new Windows updates. So you should generally expect that apps that work on Windows 7 will continue to work on Windows 10 and subsequent feature updates. But if you find any app compatibility issues after a Windows 10 or Office 365 ProPlus update, Desktop App Assure is designed to help you get a fix. Simply let us know by filing a ticket through FastTrack, and a Microsoft engineer will follow up to work with you until the issue is resolved. In short, Desktop App Assure operationalizes our Windows 10 and Office 365 ProPlus compatibility promise: We’ve got your back on app compatibility and are committed to removing it entirely as a blocker.

Desktop App Assure will be offered at no additional cost to Windows 10 Enterprise and Windows 10 Education customers. We’ll share more details on this new service at Ignite and will begin to preview this service in North America on October 1, 2018, with worldwide availability by February 1, 2019.

Servicing and support flexibility

Longer Windows 10 servicing for enterprises and educational institutions
In April 2017, we aligned the Windows 10 and Office 365 ProPlus update cadence to a predictable semi-annual schedule, targeting September and March. While many customers—including Mars and Accenture—have shifted to a modern desktop and are using the semi-annual channel to take updates regularly with great success, we’ve also heard feedback from some of you that you need more time and flexibility in the Windows 10 update cycle.

Based on that feedback, we’re announcing four changes:

  • All currently supported feature updates of Windows 10 Enterprise and Education editions (versions 1607, 1703, 1709, and 1803) will be supported for 30 months from their original release date. This will give customers on those versions more time for change management as they move to a faster update cycle.
  • All future feature updates of Windows 10 Enterprise and Education editions with a targeted release month of September (starting with 1809) will be supported for 30 months from their release date. This will give customers with longer deployment cycles the time they need to plan, test, and deploy.
  • All future feature updates of Windows 10 Enterprise and Education editions with a targeted release month of March (starting with 1903) will continue to be supported for 18 months from their release date. This maintains the semi-annual update cadence as our north star and retains the option for customers that want to update twice a year.
  • All feature releases of Windows 10 Home, Windows 10 Pro, and Office 365 ProPlus will continue to be supported for 18 months (this applies to feature updates targeting both March and September).

In summary, starting in September 2018, feature updates of Windows 10 Enterprise and Education editions that target September are supported for 30 months, while March-targeted feature updates and all releases of Windows 10 Home, Windows 10 Pro, and Office 365 ProPlus remain supported for 18 months.

Windows 7 Extended Security Updates
As previously announced, Windows 7 extended support is ending January 14, 2020. While many of you are already well on your way in deploying Windows 10, we understand that everyone is at a different point in the upgrade process.

With that in mind, today we are announcing that we will offer paid Windows 7 Extended Security Updates (ESU) through January 2023. The Windows 7 ESU will be sold on a per-device basis and the price will increase each year. Windows 7 ESUs will be available to all Windows 7 Professional and Windows 7 Enterprise customers in Volume Licensing, with a discount to customers with Windows software assurance, Windows 10 Enterprise or Windows 10 Education subscriptions. In addition, Office 365 ProPlus will be supported on devices with active Windows 7 Extended Security Updates (ESU) through January 2023. This means that customers who purchase the Windows 7 ESU will be able to continue to run Office 365 ProPlus.

Please reach out to your partner or Microsoft account team for further details.

Support for Office 365 ProPlus on Windows 8.1 and Windows Server 2016
Office 365 ProPlus delivers cloud-connected and always up-to-date versions of the Office desktop apps. To support customers already on Office 365 ProPlus through their operating system transitions, we are updating the Windows system requirements for Office 365 ProPlus and revising some announcements that were made in February. We are pleased to announce the following updates to our Office 365 ProPlus system requirements:

  • Office 365 ProPlus will continue to be supported on Windows 8.1 through January 2023, which is the end of support date for Windows 8.1.
  • Office 365 ProPlus will also continue to be supported on Windows Server 2016 until October 2025.

Office 2016 connectivity support for Office 365 services
In addition, we are modifying the Office 365 services system requirements related to service connectivity. In February, we announced that starting October 13, 2020, customers will need Office 365 ProPlus or Office 2019 clients in mainstream support to connect to Office 365 services. To give you more time to transition fully to the cloud, we are now modifying that policy and will continue to support Office 2016 connections with the Office 365 services through October 2023.

Shift to a modern desktop

You’ve been talking, and we’ve been listening. Specifically, we’ve heard your feedback on desktop deployment, and we’re working hard to introduce new capabilities, services, and policies to help you on your way. The combination of Windows 10 and Office 365 ProPlus delivers the most productive, most secure end user computing experience available. But we recognize that it takes time to both upgrade devices and operationalize new update processes. Today’s announcements are designed to respond to your feedback and make it easier, faster, and cheaper to deploy a modern desktop. We know that there is still a lot of work to do. But we’re committed to working with you and systematically resolving any issues. We’d love to hear your thoughts and look forward to seeing you and discussing in more detail in the keynotes and sessions at Ignite in a few weeks!

How API-based integration dissolves SaaS connectivity limits

As businesses introduce an ever-growing, complex IT ecosystem of on-premises and SaaS applications, APIs, blockchain and other technologies, how can they possibly tie them together?

For many DevOps teams, the answer is API-based integration to enable communication between applications and platforms. Integration projects, however, pose challenges in security, runtime and management.

In this Q&A, Oracle’s Vikas Anand explores industry trends that drive rapid adoption of API-based integrations. He also lays out the hybrid cloud connectivity integration challenges for DevOps teams and ways to bypass those issues.

Anand is vice president of product management for integration, process and API management cloud services at Oracle.

Which technologies and use cases drive use of API-based integration?

Vikas Anand: SaaS connectivity limitations are the No. 1 reason enterprises adopt and then expand API integration programs. When SaaS is not integrated, it quickly changes to silo as a service. Customers can only derive limited value if their SaaS system is not working well in a very heterogeneous enterprise IT environment.

Vikas Anand, vice president of integration, Oracle

APIs power new technologies that create better experiences for customers, such as chatbots and many mobile user experiences. APIs provide information from on-premises and cloud back-end systems, such as CRM [customer relationship management] or ERP.

Those new technologies have to be integrated into the existing IT environments and then extended to customers. For example, adoption is growing in API-driven B2B technologies, which provide a nimbler transaction option than traditional EDI [electronic data interchange] X12-based transactions. Another example is growing use of smart contracts with blockchain to do transactions in a trusted way. API integration provides the pathways for these transactions.

What problems do DevOps teams encounter in API-based integration implementation and management?

Anand: The No. 1 challenge is how to secure their APIs. APIs are exposed on the edge, and they are available for everyone to use. A thought-through security model is important. My advice is to focus on using security standards, such as OAuth. Then, you’ll be on the same security level as partners and customers.

Another challenge is documentation of how you define, build and share APIs. Look at standards such as OpenAPI [that] support moving APIs across teams and across API developers.

A third challenge is optimizing API runtime, which relates to monitoring, testing and management. This calls for preproduction work in API interface testing and validating API functionality. In operations, ensure that APIs are not only secured and protected from anomalies, but also can be scaled up as more APIs are consumed by the partners in an ever-growing hybrid environment.

Consider that APIs run not just behind the customer firewall in a data center, but also across multiple clouds and devices. At runtime, you need your APIs to be close to the back-end applications to deliver the timely response, scale and experiences customers want.

Why isn’t integration built into SaaS offerings?

Anand: SaaS only allows you to configure and customize. If you need to extend the applications, API-based integration is a lightweight alternative to legacy, on-premises ESB [enterprise service bus] integration suites.

For example, say you have a CRM application with a coding system, and you might need to have an extension of the business logic to support new discounting rules. Unfortunately, it may not be possible to configure or change the SaaS environment. The vendor will not allow you to do it, because the SaaS product would then be upgrade-unfriendly. So, in such cases, DevOps can use business process automation in alignment with API-based application integration to deliver those extensions.

In hybrid compute environments, how does the business value of APIs and API-based integration play out?


Anand: API integration supports multichannel experiences that improve customer engagement. An example is how integration helps businesses partner with other service providers to offer new capabilities, such as an API model that makes Uber services available in a United Airlines application.

APIs also spur revenue growth. For instance, a business’s IP [intellectual property] that lies behind firewalls can be exposed as an API to create new revenue channels. Many new-age companies, such as Airbnb and Lyft, leverage the API model to deliver revenue. Traditional companies [in] manufacturing and other [industries] are really applying this to their domain.

API-first design provides modernized back-end interfaces that speed integrations. For back-end integrations, you can run the APIs within the data center to integrate SaaS and on-premises applications. A well-designed API can actually reduce the cost of integration by 50%.

Which best practices do you suggest for API-based integration project success?

Anand: Developers need to transform and route data and apply process automation capabilities. To do integration efficiently, enterprises have to automate data flow, business processes and whatever repeatable, error-prone tasks IT does. This calls for support from automation models, such as robotic process automation, to create a single pane of glass for analytics.

Enterprise-level application integration projects used to take a year or two. Now that SaaS applications can be deployed in a matter of months, that won’t do. Fortunately, APIs themselves are now designed so that the integrations can be done more effectively, more efficiently and with better time to market than ever before. For API integration, there are automated, prebuilt connections that can be applied. Also, automated API integrated features are available in some iPaaS offerings now and coming to others soon.

Users want hybrid cloud networking to be simple

As American jazz great Duke Ellington once said, “Simplicity is a most complex form.” And IT doesn’t get much more complex than hybrid cloud networking.

I take pictures — lots of pictures. To make them frame-worthy, I edit them in Lightroom and Photoshop, two Adobe Systems programs. In the old days, if I wanted to use them, I’d have to cough up more than a thousand dollars to buy and install the software on my desktop. Now the applications live in Adobe Creative Cloud, and my inexpensive monthly subscription makes editing simple and affordable.

All the networking Adobe has done to make my interactions with Lightroom and Photoshop seamless is invisible to me, but what goes on behind the scenes is anything but simple. Hybrid cloud networking — or in Adobe’s case, multicloud networking, as described in our cover story — is complex. When it simply works, it’s a beautiful thing.

Users prefer that complexity remain invisible. We like clicking an icon and having our work pop up. Have a look at The Subnet Q&A for more about how Adobe strives for simplicity in its hybrid cloud networking among all its services.

The world continues to move toward ensuring the user experience with the network is simple and engaging. In “IT pros seek better methods to manage application performance” we look beyond the need to integrate the cloud, to the onslaught of mobile devices accessing servers, which has IT managers struggling for the best way to add application performance management to overall network management.

Those mobile devices also present their own level of complexity. More and more employees rely on their smartphones to communicate — whether that’s calling, messaging or video conferencing. This puts pressure on unified communications vendors to make their UC applications as simple to use as a smartphone’s native applications but with added security, features and reporting capabilities for the enterprise. In “Mobile unified communications market has growing pains” we look at what vendors are doing and who might win the market. 

HR metrics and analytics make workforces more valuable

Even the most complex enterprise needs metrics and analytics to stay on course and compete in the global marketplace. Of all the parts of a company that can be optimized with analytics, the most valuable asset is the workforce itself.

Gathering good data to describe and define employee performance with HR metrics and analytics isn’t as simple as deriving metrics from warehousing or marketing campaign performance data. Descriptive analytics can expose subtle traits in employees and define their patterns of success or failure, and predictive analytics can help managers choose the right employees for upcoming team projects.

But it’s all for naught if the data is bad.

Objective vs. subjective measures

When it comes to human performance, having a handle on not only what is being measured but how can make a huge difference in the ultimate value of the metric. Management appraisal of employee performance, for example, is necessarily subjective, and that’s a good thing — it’s where thoughtful opinion should prevail. But too often, the performance data gathered via management evaluation is both static and linear; it only captures what’s happening at the moment. Giving an employee a score of seven out of 10 for performance says almost nothing about the employee and is of little value to an organization.

Adding another dimension to these subjective evaluations greatly enhances their usefulness for HR metrics and analytics. If a manager adds a rating for employee potential to the score for employee performance that forms a unified two-axis metric, then the value of the metric in analytics is greatly multiplied. An employee with low performance but high potential is remediated differently than an employee with high performance but low potential: The former needs mentoring, while the latter needs training. And the application of analytics to these metrics can result in more focused, effective mentoring and training programs for the entire workforce. Moreover, HR can potentially identify an employee who is not doing well in a current job but may flourish in a different position.

Traditionally, quantitative measures have held sway over the evaluation of employee effectiveness. Salespeople live and die by their numbers, as do assembly line workers and delivery people. But traditional metrics offer no analytical insight into why raw productivity data might be rising or falling, and that’s the information the enterprise truly needs.

Descriptive analytics can make a difference by enhancing an otherwise static metric. Imagine a sales team exceeding quotas for most of the year, then experiencing a collective drop-off in numbers. Upper management may speak to the sales manager and get a top-down view of the team’s performance lapse, but HR metrics and analytics can open windows into how an individual’s performance is affecting group performance.

Retaining top employees


In an increasingly migratory business ecosphere, people move from one company to another far more frequently than in past decades. Keeping employees happy is more important than ever. And when an employee decides to move on, money is seldom the most important factor. Deciding factors can include upward mobility, benefits, education and training support, a family-friendly environment and a positive corporate culture.

It’s often combinations of these factors that keep employees stable and satisfied. Upward mobility and training programs are likely to be shared values for some employees, while a good health plan and a family-friendly atmosphere may appeal to others. Digging into the data and using HR metrics and analytics to detect and resolve these patterns of employee values leads to more comprehensive HR initiatives to retain employees.

Additionally, isolating these factors can help HR create more reliable hiring practices. While there’s certainly merit in following industry recommendations on selecting new employees, it’s even more effective to analyze in-house employee data to establish what work-related experiences and personal traits make for a good fit. Relying on dynamics that have proved to be successful in the workplace is usually a better bet than spinning the roulette wheel on a prospect sporting a well-written résumé.