Tag Archives: process

Wanted – LGA 1150 Motherboard

I have an MSI mini-ITX board (MSI H97 AC) which I am in the process of removing from my small PC.
I have the box and most of the gubbins that came with it.

One of the tabs to remove the RAM is broken, but it does not stop the RAM being removed or re-seated.
One of the Wi-Fi antennas may be missing – I’ll have to check the other box, as I have two of these PCs.
How does £35 inc delivery sound?
Go to Original Article
Author:

Canon breach exposes General Electric employee data

Canon Business Process Services, which processes current and former employees’ documents and beneficiary-related documents for General Electric, suffered a security incident, according to a data breach disclosure by GE.

GE systems were not impacted by the cyberattack, according to the company’s disclosure, but personally identifiable information for current and former employees as well as their beneficiaries was exposed in the Canon breach. The breach, which was first reported by BleepingComputer, took place between Feb. 3 and Feb. 14 of this year, and GE was notified of the breach on Feb. 28. According to the disclosure, “an unauthorized party gained access to an email account that contained documents of certain GE employees, former employees and beneficiaries entitled to benefits that were maintained on Canon’s systems.”

Said documents included “direct deposit forms, driver’s licenses, passports, birth certificates, marriage certificates, death certificates, medical child support orders, tax withholding forms, beneficiary designation forms and applications for benefits such as retirement, severance and death benefits with related forms and documents.” Personal information stolen “may have included names, addresses, Social Security numbers, driver’s license numbers, bank account numbers, passport numbers, dates of birth, and other information contained in the relevant forms.”

GE’s disclosure also said Canon retained “a data security expert” to conduct a forensic investigation. At GE’s request, Canon is offering two years of free identity protection and credit monitoring services.

GE shared the following statement with SearchSecurity regarding the Canon breach.

“We are aware of a data security incident experienced by one of GE’s suppliers, Canon Business Process Services, Inc. We understand certain personal information on Canon’s systems may have been accessed by an unauthorized individual. Protection of personal information is a top priority for GE, and we are taking steps to notify the affected employees and former employees,” the statement read.

Canon did not return SearchSecurity’s request for comment and, at press time, had not released a public statement.


Using automated machine learning for AI in insurance

Mitsui Sumitomo Insurance, one of the largest insurance firms in Japan, began the process of digital transformation several years ago. The company launched multiple projects, and continues to start new projects, to send it further into the digital age.

One of MSI’s more ambitious undertakings is the MS1 Brain platform, an AI in insurance project to create a more personalized experience for customers.

AI in insurance

Released earlier this year, the MS1 Brain platform uses machine learning and predictive analytics, along with customer data, including contract details, accident information and lifestyle changes, to recommend products and services to customers based on their predicted needs.

The platform also generates personalized communications for customers.

“Our business model is B to B to C [business to business to consumer]. We provide our products through agencies,” said Teruki Yokoyama, deputy manager of digital strategy in the department of digital business at MSI. “Until now, we have provided products to customers, both individuals and corporations mostly by leveraging experienced agents’ intimate knowledge of client needs.”

“By providing the needs analysis outcomes of each customer to the agency by MS1Brain, now even an inexperienced agency can make optimal proposals to customers with higher demands,” he continued.

To build the platform, MSI chose dotData, a startup automated machine learning vendor based in San Mateo, Calif.

Mitsui Sumitomo Insurance used automated machine learning to build out a machine learning platform.

Automated machine learning

MSI first connected with dotData in 2017, when MSI's CIO visited Silicon Valley for a technical survey, Yokoyama said.

At that time, dotData was just getting started, and it hadn’t released a product. Still, MSI was intrigued by its automated machine learning platform, which claims to provide full-cycle machine learning automation. DotData competitors include DataRobot, H2O.ai and Auger.ai.


“When it comes to data analysis, model accuracy often gets the most attention; dotData, on the other hand, focuses on how quickly you can move from raw data to working models — the AI-based feature engineering is what stood out,” Yokoyama said.

MSI had to build a lot of intelligent models, said Ryohei Fujimaki, CEO and founder of dotData. But the firm didn’t have the data science team to build them.

DotData’s platform was scalable and enabled MSI to automate the entire AI building process, from feature generation to model implementation, Yokoyama said.

“Everyone should embrace this approach,” said Yokoyama of the automated machine learning approach.

“Automation of the data science process is the only way a company can truly deliver value from AI/ML investments and provide competitive differentiation by investing in predictive analytics,” he said.


SMBs grappling with digital transformation initiatives

Small and medium-sized businesses across the board have either recently launched or are in the process of launching digital transformation initiatives, but many are running up against obstacles.

The results of the inaugural Insight 2020 Technology Report: IT Trends for Midmarket and Small Business, a study based on input from more than 400 North American IT professionals at independent and emerging businesses, highlighted that dilemma. Insight Enterprises conducted the research.

Technology has become a corporate cornerstone, even for small businesses, according to the report. “Enterprises recognize that they need to embrace leading-edge technology in order to remain competitive,” said Joseph Clinton, central region sales director at Insight Enterprises.

Organizations expect that embracing digital transformation will positively impact their operations, the report said. The top three business areas that respondents said digital transformation initiatives would help improve are customer experience, cited by 43%; operational efficiency, 42%; and workforce productivity and collaboration, 42%.

Digital transformation challenges

Realizing the potential benefits of digital transformation has been vexing, however. Close to half of the study’s respondents, 49%, said integrating new technology with legacy systems is very or extremely challenging when dealing with IT service providers. “Many companies perform a lift and shift. They simply take their existing applications and port them to the cloud,” Clinton explained. “In those cases, they do not take advantage of any of the modernization features available in the new environment.”

Four essential steps to digital transformation

Funding also poses issues for these initiatives. The study found that 44% of respondents pointed to budget constraints as an inhibitor to embracing digital transformation.

Additionally, when attempting to equip their businesses with the latest technology, 45% of SMBs said understanding which new technologies to invest in is an area of concern. Many new systems are cloud-based, and cost is a key obstacle here, as well. Comparing cloud pricing to current expenditures is the most frequently cited barrier to migration, named by 56% of SMB respondents.

Building a digital transformation framework

So, how can a company address its digital transformation challenges? “Businesses need a framework, a plan that aligns where they are today to where they want to be in the future,” Clinton said.


But in many cases, these organizations lack the technical depth to create the plan — a task that channel partners can help with. “There are a lot of potential pitfalls,” Clinton said. “Businesses need to get their applications and employees ready for the change. The bursting that cloud offers can be helpful, but it can also be expensive. In some cases, modern applications cost much more than legacy systems.”

As a result, corporations should not automatically assume the cloud offers them a better deployment model than legacy infrastructure. They need to create a methodology to see which workloads should move. Channel partners with deep cloud experience are in prime position to help them make the right call, Clinton said.


RPA in manufacturing increases efficiency, reduces costs

Robotic process automation software is increasingly being adopted by organizations to improve processes and make operations more efficient.

In manufacturing, the use cases for RPA range from reducing errors in payroll processes to eliminating unneeded processes before undergoing a major ERP system upgrade.

In this Q&A, Shibaji Das, global director of finance and accounting and supply chain management for UiPath, discusses the role of RPA in manufacturing ERP systems and how it can help improve efficiency in organizations.

UiPath, based in New York, got its start in 2005 as DeskOver, which made automation scripts. In 2012, the company relaunched as UiPath and shifted its focus to RPA. UiPath markets applications that enable organizations to examine processes and create bots, or software robots that automate repetitive, rules-based manufacturing and business processes. RPA bots are usually infused with AI or machine learning so that they can take on more complex tasks and learn as they encounter more processes.

What is RPA and how does it relate to ERP systems?

Shibaji Das: When you’re designing a city, you put down the freeways. Implementing RPA is a little like putting down those freeways with major traffic flowing through, with RPA as the last mile automation. Let’s say you’re operating on an ERP, but when you extract information from the ERP, you still do it manually to export to Excel or via email. A platform like [UiPath] forms a glue between ERP systems and automates repetitive rules-based stuff. On top of that, we have AI, which gives brains to the robot and helps it understand documents, processes and process-mining elements.

Why is RPA important for the manufacturing industry?


Das: When you look at the manufacturing industry, the challenges are always the cost pressure of having lower margins or the resources to get innovation funds to focus on the next-generation of products. Core manufacturing is already largely automated with physical robots; for example, the automotive industry where robots are on the assembly lines. The question is how can RPA enable the supporting functions of manufacturing to work more efficiently? For example, how can RPA enable upstream processes like demand planning, sourcing and procurement? Then for downstream processes when the product is ready, how do you look at the distribution channel, warehouse management and the logistics? Those are the two big areas where RPA plus AI play an important role.

What are some steps companies need to take when implementing RPA for manufacturing?

Das: Initially, there will be a process mining element or process understanding element, because you don’t want to automate bad processes. That’s why having a thought process around making the processes efficient first is critical for any bigger digital transformation. Once that happens, and you have more efficient processes running, which will integrate with multiple ERP systems or other legacy systems, you could go to task automation. 

What are some of the ways that implementing RPA will affect jobs in manufacturing? Will it lead to job losses if more processes are automated?

Das: Will there be a change in jobs as we know them? Yes, but at the same time, there’s a very positive element that will create a net positive impact from a jobs perspective, experience perspective, cost, and the overall quality of life perspective. For example, the moment computers came in, someone’s job was to push hundreds of piles of paper, but now, because of computing, they don’t have to do that. Does that mean there was a negative effect? Probably not, in the long run. So, it’s important to understand that RPA — and RPA that’s done in collaboration with AI — will have a positive impact on the job market in the next five to 10 years.

Can RPA help improve operations by eliminating mistakes that are common in manual processes?

Das: Robots do not make mistakes unless you code it wrong at the beginning, and that’s why governance is so important. Robots are trained to do certain things and will do them correctly every time — 100%, 24/7 — without needing coffee breaks.

What are some of the biggest benefits of RPA in manufacturing?

Das: From an ROI perspective, one benefit of RPA is the cost element because it increases productivity. Second is revenue; for example, at UiPath, we are using our own robots to manage our cash application process, which has impacted revenue collection [positively]. Third is around speed, because what an individual can do, a robot can do much faster. However, this depends on the system, as a robot will only operate as fast as the mother system of the ERP system works — with accuracy, of course. Last, but not least, the most important part is experience. RPA plus AI will enhance the experience of your employees, of your customers and vendors. This is because the way you do business becomes easier, more user-friendly and much more nimble as you get rid of the most common frustrations that keep coming up, like a vendor not getting paid.

What’s the future of RPA and AI in organizations?

Das: The vision of Daniel Dines [UiPath’s co-founder and CEO] is to have one robot for every individual. It’s similar to every individual having access to Excel or Word. We know the benefits of the usage of Excel or Word, but RPA access is still a little technical and there’s a bit of coding involved. But UiPath is focused on making this as code free as possible. If you can draw a flowchart and define a process clearly through click level, our process mining tool can observe it and create an automation for you without any code. For example, I have four credit cards, and every month, I review it and click the statement for whatever the amount is and pay it. I have a bot now that goes in at the 15th of the month and logs into the accounts and clicks through the process. This is just a simple example of how practical RPA could become.


How to transfer FSMO roles with PowerShell

In the process of migrating to a new server or spreading the workload around, you might need to transfer FSMO roles in Active Directory from one domain controller to another.

AD relies on a concept called flexible single master operation roles, commonly referred to as FSMO roles. Domain controllers in an AD forest and domain hold one or more of these roles that handle different duties, such as keeping the AD schema in sync and synchronizing passwords across all domain controllers. You might need to spread these roles to other domain controllers to make AD operate more efficiently. As is the case when managing a Windows shop, you can manage much of your infrastructure either through the GUI or with PowerShell. There is no right or wrong way, but a script can be customized and reused, which saves some time and effort.

It’s not always easy to figure out which domain controller holds a particular role since FSMO roles tend to get spread out among various domain controllers. Then, once you’ve found the FSMO role, you need to juggle multiple windows if you try to manage them with the GUI. However, if you use PowerShell, you can both find where these FSMO roles live and easily move them to any domain controller with a script.

Before you get started

Before you can find and move FSMO roles with PowerShell, be sure to install the Remote Server Administration Tools (RSAT), which include the AD module. The computer you use PowerShell on should be joined to the domain, and you should have the appropriate permissions to move FSMO roles.
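On Windows 10 version 1809 and later, RSAT ships as a feature on demand rather than a separate download. A minimal sketch for pulling in just the AD component follows; the capability name is current as of recent Windows builds, so verify it with Get-WindowsCapability first:

```
# Install the Active Directory RSAT component as a feature on demand
Add-WindowsCapability -Online -Name 'Rsat.ActiveDirectory.DS-LDS.Tools~~~~0.0.1.0'

# Confirm the AD module and the FSMO-related cmdlet are available
Import-Module ActiveDirectory
Get-Command -Module ActiveDirectory -Name '*OperationMasterRole*'
```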

Use PowerShell to find FSMO roles

It’s not necessary to find the FSMO role holders before moving them, but it’s helpful to know the state before you make these types of significant changes.

There are two PowerShell commands you’ll use first to find the FSMO roles: Get-ADDomain and Get-ADForest. You need to use both commands since some FSMO holders reside at the forest level and some at the domain level. The AD module contains these cmdlets, so if you have that installed, you’re good to go.

First, you can find all the domain-based FSMO roles with the Get-ADDomain command. Since Get-ADDomain returns a lot more than just FSMO role holders, you can reduce the output a bit with Select-Object:

Get-ADDomain | Select-Object InfrastructureMaster,PDCEmulator,RIDMaster | Format-List


This command returns all the domain-based roles, including the Primary Domain Controller (PDC) emulator and Relative Identifier (RID) master, but you still need to find the forest-level FSMO roles: the domain naming master and schema master. For these FSMO roles, use the Get-ADForest command.

Since Get-ADForest returns other information besides the FSMO role holders, limit the output using Select-Object to find the role holders you want.

Get-ADForest | Select-Object DomainNamingMaster,SchemaMaster | Format-List

How to transfer FSMO roles


To save some time in the future, you can write a PowerShell function called Get-ADFSMORole that returns the FSMO role holders at the domain and the forest level in one shot.

function Get-ADFSMORole {
    [CmdletBinding()]
    param ()

    Get-ADDomain | Select-Object InfrastructureMaster,PDCEmulator,RIDMaster
    Get-ADForest | Select-Object DomainNamingMaster,SchemaMaster
}

Now that you have a single function to retrieve all the FSMO role holders, you can get to the task of moving them. To do that, call the function you made and assign a before state to a variable.

$roles = Get-ADFSMORole

With all the roles captured in a variable, you can transfer FSMO roles with a single command called Move-ADDirectoryServerOperationMasterRole. This command just handles moving FSMO roles. You can move each role individually by looping over each role name and calling the command, or you could do them all at once. Both methods work depending on how much control you need.

$destinationDc = 'DC01'

## Method 1: move each role individually
'DomainNamingMaster','PDCEmulator','RIDMaster','SchemaMaster','InfrastructureMaster' | ForEach-Object {
    Move-ADDirectoryServerOperationMasterRole -OperationMasterRole $_ -Identity $destinationDc
}

## Method 2: move all of the roles at once
Move-ADDirectoryServerOperationMasterRole -OperationMasterRole DomainNamingMaster,PDCEmulator,RIDMaster,SchemaMaster,InfrastructureMaster -Identity $destinationDc

After you run the command, use the custom Get-ADFSMORole function created earlier to confirm the roles now reside on the new domain controller.
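Assuming the $roles snapshot and the Get-ADFSMORole function from above, a quick before-and-after comparison is one way to sketch that confirmation:

```
# Capture the state after the move and compare it with the earlier snapshot
$after = Get-ADFSMORole

$roles | Format-List   # before: roles possibly spread across several DCs
$after | Format-List   # after: every role should list the destination DC
```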


Azure publishes guidance for secure cloud adoption by governments

Governments around the world are in the process of a digital transformation, actively investigating solutions and selecting architectures that will help them transition many of their workloads to the cloud. There are many drivers behind the digital transformation, including the need to engage citizens, empower employees, transform government services, and optimize government operations. Governments across the world are also looking to improve their cybersecurity posture to secure their assets and counter the evolving threat landscape.

To help governments worldwide get answers to common cloud security related questions, Microsoft published a white paper, titled Azure for Secure Worldwide Public Sector Cloud Adoption. This paper addresses common security and isolation concerns pertinent to worldwide public sector customers. It also explores technologies available in Azure to help safeguard unclassified, confidential, and sensitive workloads in the public multi-tenant cloud in combination with Azure Stack and Azure Data Box Edge deployed on-premises and at the edge for fully disconnected scenarios involving highly sensitive data. The paper addresses common customer concerns, including:

  • Data residency and data sovereignty
  • Government access to customer data, including CLOUD Act related questions
  • Data encryption, including customer control of encryption keys
  • Access to customer data by Microsoft personnel
  • Threat detection and prevention
  • Private and hybrid cloud options
  • Cloud compliance and certifications
  • Conceptual architecture for classified workloads

Azure can be used by governments worldwide to meet rigorous data protection requirements.

For governments and the public sector industry worldwide, Microsoft provides Azure – a public multi-tenant cloud services platform that government agencies can use to deploy a variety of solutions. A multi-tenant cloud platform implies that multiple customer applications and data are stored on the same physical hardware. Azure uses logical isolation to segregate each customer’s applications and data from those of others. This approach provides the scale and economic benefits of multi-tenant cloud services while rigorously helping prevent customers from accessing one another’s data or applications.

A hyperscale public cloud provides resiliency in times of natural disaster or other disturbances. The cloud provides capacity for failover redundancy and empowers sovereign nations with flexibility regarding global resiliency planning. A hyperscale public cloud also offers a feature-rich environment incorporating the latest cloud innovations such as artificial intelligence, machine learning, Internet of Things (IoT) services, intelligent edge, and more. This rich feature set helps government customers increase efficiency and unlock insights into their operations and performance.

Using Azure’s public cloud capabilities, customers benefit from rapid feature growth, resiliency, and the cost-effective operation of the hyperscale cloud while still obtaining the levels of isolation, security, and confidence required to handle workloads across a broad spectrum of data classifications, including unclassified and classified data. Leveraging Azure isolation technologies, as well as intelligent edge capabilities (such as Azure Stack and Azure Data Box Edge), customers can process confidential and sensitive data in secure isolated infrastructure within Azure’s multi-tenant regions or highly sensitive data at the edge under the customer’s full operational control.

To get answers to common cloud security related questions, government customers worldwide should review Azure for Secure Worldwide Public Sector Cloud Adoption. To learn more about how Microsoft helps customers meet their own compliance obligations across regulated industries and markets worldwide, review “Microsoft Azure compliance offerings.”

Author: Microsoft News Center

Create and configure a shielded VM in Hyper-V

Creating a shielded VM to protect your data is a relatively straightforward process that consists of a few simple steps and PowerShell commands.

A shielded VM depends on a dedicated server, separate from the Hyper-V host, that runs the Host Guardian Service (HGS). The HGS server must not be domain-joined because it is going to take on the role of a special-purpose domain controller. To install HGS, open an administrative PowerShell window and run this command:

Install-WindowsFeature -Name HostGuardianServiceRole -Restart

Once the server reboots, create the required domain. In this example, the domain name is PoseyHGS.net; substitute a strong administrator password of your own. Create the domain by entering these commands:

$AdminPassword = ConvertTo-SecureString -AsPlainText '<YourPassword>' -Force

Install-HgsServer -HgsDomainName 'PoseyHGS.net' -SafeModeAdministratorPassword $AdminPassword -Restart

Figure A. This is how to install the Host Guardian Service server.

The next step in the process of creating and configuring a shielded VM is to create two certificates: an encryption certificate and a signing certificate. In production, you must use certificates from a trusted certificate authority. In a lab environment, you can use self-signed certificates, such as those used in the example below. To create these certificates, use the following commands:

$CertificatePassword = ConvertTo-SecureString -AsPlainText '<YourPassword>' -Force
$SigningCert = New-SelfSignedCertificate -DNSName 'signing.poseyhgs.net'
Export-PfxCertificate -Cert $SigningCert -Password $CertificatePassword -FilePath 'C:\Certs\SigningCert.pfx'
$EncryptionCert = New-SelfSignedCertificate -DNSName 'encryption.poseyhgs.net'
Export-PfxCertificate -Cert $EncryptionCert -Password $CertificatePassword -FilePath 'C:\Certs\EncryptionCert.pfx'

Figure B. This is how to create the required certificates.

Now, it’s time to initialize the HGS server. To perform the initialization process, use the following command:

Initialize-HgsServer -HgsServiceName 'hgs' -SigningCertificatePath 'C:\Certs\SigningCert.pfx' -SigningCertificatePassword $CertificatePassword -EncryptionCertificatePath 'C:\Certs\EncryptionCert.pfx' -EncryptionCertificatePassword $CertificatePassword -TrustTPM

Figure C. This is what the initialization process looks like.

The last thing you need to do when provisioning the HGS server is to set up conditional domain name service (DNS) forwarding. To do so, use the following commands:

Add-DnsServerConditionalForwarderZone -Name "PoseyHDS.net" -ReplicationScope "Forest" -MasterServers

Netdom trust PoseyHDS.net /domain:PoseyHDS.net /userD:PoseyHDS.net\Administrator /password: /add

In the process of creating and configuring a shielded VM, the next step is to add the guarded Hyper-V host to the Active Directory (AD) domain that you just created. You must create a global AD security group called GuardedHosts. You must also set up conditional DNS forwarding on the host so the host can find the domain controller.

Once all of that is complete, retrieve the security identifier (SID) for the GuardedHosts group, and then add that SID to the HGS attestation host group. From the domain controller, enter the following command to retrieve the group’s SID:

Get-ADGroup “GuardedHosts” | Select-Object SID

Once you know the SID, run this command on the HGS server:

Add-HgsAttestationHostGroup -Name “GuardedHosts” -Identifier “

Now, it’s time to create a code integrity policy on the Hyper-V server. To do so, enter the following commands:

New-CIPolicy -Level FilePublisher -Fallback Hash -FilePath 'C:\Policy\HWLCodeIntegrity.xml'

ConvertFrom-CIPolicy -XmlFilePath 'C:\Policy\HWLCodeIntegrity.xml' -BinaryFilePath 'C:\Policy\HWLCodeIntegrity.p7b'

Now, you must copy the P7B file you just created to the HGS server. From there, run this command:

Add-HgsAttestationCIPolicy -Path 'C:\HWLCodeIntegrity.p7b' -Name 'StdGuardHost'

Get-HGSServer

At this point, the server should display an attestation URL and a key protection URL. Be sure to make note of both of these URLs. Now, go back to the Hyper-V host and enter this command:

Set-HGSClientConfiguration -KeyProtectionServerURL “” -AttestationServerURL “

To wrap things up on the Hyper-V server, retrieve an XML file from the HGS server and import it. You must also define the host’s HGS guardian. Here are the commands to do so:

Invoke-WebRequest "/service/metadata/2014-07/metadata.xml" -OutFile 'C:\Certs\metadata.xml'

Import-HgsGuardian -Path 'C:\Certs\metadata.xml' -Name 'PoseyHGS' -AllowUntrustedRoot

Figure D. Shield a Hyper-V VM by selecting a single checkbox.

Once you import the host guardian into the Hyper-V server, you can use PowerShell to configure a shielded VM. However, you can also enable shielding directly through the Hyper-V Manager by selecting the Enable Shielding checkbox on the VM’s Settings screen, as shown in Figure D above.
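If you prefer to script that last step, the Hyper-V and HGS client modules expose the same operation. A hedged sketch, assuming a VM named 'VM01' and the 'PoseyHGS' guardian imported earlier (the 'Owner' guardian name is illustrative):

```
# Create a local owner guardian and fetch the fabric guardian imported above
$owner    = New-HgsGuardian -Name 'Owner' -GenerateCertificates
$guardian = Get-HgsGuardian -Name 'PoseyHGS'

# Build a key protector, attach it to the VM and enable its virtual TPM
$kp = New-HgsKeyProtector -Owner $owner -Guardian $guardian -AllowUntrustedRoot
Set-VMKeyProtector -VMName 'VM01' -KeyProtector $kp.RawData
Enable-VMTPM -VMName 'VM01'

# Flip the VM to fully shielded
Set-VMSecurityPolicy -VMName 'VM01' -Shielded $true
```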

Windows 10 zero-day disclosed on Twitter, no fix in sight

A mishandled disclosure process saw proof-of-concept code for a Windows 10 zero-day flaw released on Twitter, but Microsoft has no patch available.

A self-described retired vulnerability researcher who goes by the handle SandboxEscaper announced the Windows 10 zero-day on Twitter on Aug. 27, complete with proof-of-concept (POC) code hosted on GitHub, but didn’t notify Microsoft beforehand. The flaw is part of the Windows Task Scheduler, and it can allow an attacker to obtain system privileges.

According to the CERT Coordination Center (CERT/CC) advisory, the “Windows task scheduler contains a local privilege escalation vulnerability in the Advanced Local Procedure Call (ALPC) interface.”

“We have confirmed that the public exploit code works on 64-bit Windows 10 and Windows Server 2016 systems,” Will Dormann, vulnerability analyst for CERT/CC, wrote in the advisory. “Compatibility with other Windows versions may be possible with modification of the publicly-available exploit source code.”

Dormann also confirmed on Twitter that although the POC released by SandboxEscaper was designed to be a Windows 10 zero-day and affect 64-bit systems, the exploit would also work on 32-bit systems with “minor tweaks.”

Craig Young, computer security researcher at Tripwire, based in Portland, Ore., noted that the Windows 10 zero-day would allow “the caller to manipulate file permissions of protected system files.”

“This can be used to overwrite system libraries with malicious code to hijack Windows. With this published exploit code, it is trivial for malware to take complete control of the system after the malware has been loaded,” Young wrote via email. “Without a privilege escalation bug like this, the malware would be dependent on users clicking through access control alerts or entering administrator credentials.”

Risk vs. exploit code  

Experts generally agreed the level of risk for this Task Scheduler Windows 10 zero-day wouldn’t normally be too severe, because the exploit requires local access. This means an attacker would have to trick a user into downloading and running a malicious program, or they would need to have previously gained access to a system. However, experts said the release of the POC code changes the risk profile for the Windows 10 zero-day.

Allan Liska, solutions architect at Recorded Future, based in Somerville, Mass., added that this Windows 10 zero-day is another flaw in a long history of issues in the Windows Task Scheduler service.

“At this time, there is no patch for the vulnerability. One possible mitigation is to prevent untrusted — usually guest — users from running code. However, if an attacker gains access with user-level privilege, this mitigation will not work,” Liska said in an email. “The best bet until Microsoft releases a patch is to monitor for suspicious activity from Task Scheduler, and for this specific POC, monitor for the print spooler service spawning unusual processes,” he continued.

“Though bear in mind that while the POC uses the print spooler service, this vulnerability is not limited to just the print spooler. With some minor tweaking, the POC code could be used to execute other services.”
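As a rough illustration of that monitoring advice, here is a PowerShell sketch that flags the print spooler spawning child processes. It requires an elevated session, and it only watches the spoolsv.exe case called out for this POC, not the vulnerability in general:

```
# Subscribe to process-start events and warn when spoolsv.exe is the parent
Register-WmiEvent -Query 'SELECT * FROM Win32_ProcessStartTrace' `
    -SourceIdentifier 'SpoolerWatch' -Action {
        $e = $Event.SourceEventArgs.NewEvent
        $parent = Get-Process -Id $e.ParentProcessID -ErrorAction SilentlyContinue
        if ($parent -and $parent.Name -eq 'spoolsv') {
            Write-Warning "spoolsv.exe spawned $($e.ProcessName) (PID $($e.ProcessID))"
        }
    }

# Later, remove the subscription:
# Unregister-Event -SourceIdentifier 'SpoolerWatch'
```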

Although there were no specific details, SandboxEscaper expressed frustration with Microsoft and infosec in general before releasing the Windows 10 zero-day on Twitter, but appeared regretful two days later.

SandboxEscaper had mentioned a battle with depression and a desire to quit vulnerability research in a number of tweets leading up to releasing the POC code, and the vast majority of commenters offered messages of empathy or aid.

Microsoft did not respond to requests for comment at the time of this post.

Plan your Exchange migration to Office 365 with confidence

Introduction

Choosing an Exchange migration to Office 365 is just the beginning of this process for administrators. Migrating all the content, troubleshooting the issues and then getting the settings just right in a new system can be overwhelming, especially with tricky legacy archives.

Even though it might appear that the Exchange migration to Office 365 is happening everywhere, transitioning to the cloud is not a black and white choice for every organization. On-premises servers still get the job done; however, Exchange Online offers a constant flow of new features and costs less in some cases. Administrators should also consider a hybrid deployment to get the benefits of both platforms.

Once you have determined the right configuration, you will have to choose how to transfer archived emails and public folders and which tools to use. Beyond relocating mailboxes, administrators have to keep content accessible and security a priority during an Exchange migration to Office 365.

This guide simplifies the decision-making process and steers administrators away from common issues. More advanced tutorials share the reasons to keep certain data on premises and the tricks to set up the cloud service for optimal results.

1. Before the move

Plan your Exchange migration

Prepare for your move from Exchange Server to the cloud by understanding your deployment options and tools to smooth out any bumps in the road.

2. After the move

Working with Exchange Online

After you’ve made the switch to Office 365’s hosted email platform, these tools and practices will have your organization taking advantage of the new platform’s perks without delay.

3. Glossary

Definitions related to Exchange Server migration

Understand the terms related to moving Exchange mailboxes.