Tag Archives: systems

7 PowerShell courses to help hone skills for all levels of expertise

PowerShell can be one of the most effective tools administrators have for managing Windows systems. But it can be difficult to master, especially when time is limited. An online PowerShell course can expedite this process by prioritizing the most important topics and presenting them in logical order.

Admins have plenty of PowerShell courses from which to choose, offered by well-established vendors. But with so many courses available, it isn’t always clear which ones will be the most beneficial. To help make the course selection process easier, here we offer a sampling of popular PowerShell courses that cater to varying levels of experience.

Windows currently ships with PowerShell 5.1, but PowerShell Core 6 is available for download, and PowerShell 7 is in preview. PowerShell Core is a cross-platform version of PowerShell that runs on Windows, Linux and macOS. It isn’t an upgrade to Windows PowerShell, but a separate application that runs alongside it on the same system.

Some of the PowerShell courses listed here, as well as other online classes, specify the PowerShell version on which the course is based. But not all classes offer this information, and some courses provide only a range, such as PowerShell 4 or later. So, before signing up for an online course, be sure to verify the PowerShell version.
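
As a quick sanity check, the installed version can be confirmed from any PowerShell prompt by querying the built-in $PSVersionTable variable:

# Show the full engine version, e.g. 5.1.x for Windows PowerShell 5.1
$PSVersionTable.PSVersion

# Or just the major version number
$PSVersionTable.PSVersion.Major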

Learning Windows PowerShell

This popular PowerShell tutorial from Udemy is designed for beginners: systems admins who have no prior PowerShell experience but want to use PowerShell to manage Windows desktops and servers. The course is based on PowerShell 5, but that shouldn’t be an issue when learning basic concepts, which are the primary focus of the tutorial.


The course provides background information about PowerShell and explains how to set up the PowerShell environment, including how to configure the console and work with profiles. The course introduces cmdlets, shows how they’re related to .NET objects and classes, and explains how to build a pipeline using cmdlets and other language elements. With this information, systems admins will have the basics they need to move on to the next topic: PowerShell scripts.
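
To give a flavor of that material, a basic pipeline built from common cmdlets might look like the following (the memory threshold here is purely illustrative):

# List running processes, keep the memory-hungry ones and sort the result
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 100MB } |
    Sort-Object WorkingSet64 -Descending |
    Select-Object Name, Id, WorkingSet64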

The tutorial on scripting is nearly as extensive as the section on cmdlets. The course examines the details of script elements, such as variables, constants, comparison operators, if statements, looping structures and regular expressions. This is followed by details on PowerShell providers and how to work with files and folders, and then a discussion of administration basics. This course can help provide participants with a solid foundation in PowerShell so they’re ready to take on more advanced topics.
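
For readers new to these constructs, a tiny illustrative snippet, not taken from the course, that combines a variable, an if statement, a loop and a regular expression might look like this:

# A variable holding a regular expression pattern
$Pattern = '^Win'

# Loop over services and test each name against the pattern
foreach ($Service in Get-Service) {
    if ($Service.Name -match $Pattern) {
        Write-Output "$($Service.Name) is $($Service.Status)"
    }
}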

Introduction to Windows PowerShell 5.1

This Udemy tutorial is based on PowerShell 5.1, so it’s more current than the previous course. The training is geared toward both beginner PowerShell users and more experienced admins who want to hone their PowerShell skills. The course covers a wide range of topics, from understanding PowerShell syntax to managing Active Directory (AD). Participants who sign up for this course should already know how to run PowerShell, but they don’t need to be advanced users.

The course covers the basics of how to use both the PowerShell console and the Integrated Scripting Environment (ISE). It explains what steps to take to get help and find commands. This is followed by an in-depth look at the PowerShell command syntax. The material also covers objects and their properties and methods, as well as an explanation of how to build a PowerShell pipeline.
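
Help and command discovery in PowerShell rely on a handful of built-in cmdlets, for example:

# Find cmdlets that operate on services
Get-Command -Noun Service

# Read the help and examples for a specific cmdlet
Get-Help Get-Service -Examples

# Inspect the properties and methods on the objects a cmdlet returns
Get-Service | Get-Member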

Participants can then move on to the section on scripting, which starts with a discussion of arrays and variables. Users then learn how to build looping structures and conditional statements, and how to use PowerShell functions. The course also demonstrates how to use PowerShell to work with AD, covering such tasks as installing and configuring server roles.

PowerShell version 5.1 and 6: Step-by-Step

This tutorial, which is one of Udemy’s highest rated PowerShell courses, is geared toward admins who want to learn how to use PowerShell to perform management tasks. The course is broad in scope and covers both PowerShell 5.1 and PowerShell Core 6. Users who sign up for this course should have a basic understanding of the Windows OS — both desktop and server versions.

Because the course covers so many topics, it’s longer than the previous two training sessions and goes into more detail. It explains the differences between PowerShell and the Windows Command Prompt, how to determine the PowerShell version and how to work with aliases. The course also examines the steps necessary to run unsupported commands and create PowerShell transcripts.

This PowerShell tutorial also examines more advanced topics, such as working with object members, creating hash tables and managing execution policy levels. This is followed by a detailed discussion about the Common Information Model (CIM) and how it can manage hard drives and work with BIOS. In addition, participants will learn how to create profile scripts, functions and modules, as well as how to use script parameters and to pause script execution. Because the course is so comprehensive, admins should come away with a solid understanding of how to use PowerShell to script their daily management tasks.
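
For context, CIM queries of the sort described above use the Get-CimInstance cmdlet; for example:

# Query BIOS details through CIM
Get-CimInstance -ClassName Win32_BIOS

# List physical disks with their size in gigabytes
Get-CimInstance -ClassName Win32_DiskDrive |
    Select-Object Model, @{Name = 'SizeGB'; Expression = { [math]::Round($_.Size / 1GB, 1) }}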

Udemy course pricing

Udemy distinguishes between personal and business users. For personal users, Udemy charges by the course, with prices for PowerShell courses ranging between $25 and $200. Udemy also offers personal users a 30-day, money-back guarantee.

Udemy also offers two business plans that provide unlimited access to its courses. The Team plan supports between five and 20 users and costs $240 per user, per year. It also comes with a 14-day trial. Contact Udemy for details regarding its Enterprise plan, which supports 21 or more users. Udemy also offers courses to help users prepare for IT certifications, supporting such programs as Cisco CCNA, Oracle Certification and Microsoft Certification.

Windows PowerShell: Essentials

Pluralsight offers a variety of PowerShell courses, as well as learning paths. A path is a series of related courses that provide users with a strategy for learning a specific technology. This path includes six courses ranging from beginner to advanced user. Participants should come away with a strong foundation in how to create PowerShell scripts that automate administrative processes. Before embarking on this path, however, they should have a basic understanding of Windows networking and troubleshooting.

The beginning courses on this path provide users with the information they need to start working with PowerShell, even if they’re first-timers. Users will learn how to use cmdlets, work with objects and get help when they need it. These courses also introduce concepts such as aliases, providers and mapping network drives. The intermediate tutorials build on the beginning courses by explaining how to work with objects and the PowerShell pipeline, and how to format output. The intermediate courses also focus on using PowerShell in a networked environment, covering such topics as CIM and Windows Management Instrumentation.
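
As an example of the provider and drive-mapping material, mapping a network share to a drive letter can be done with New-PSDrive (the server and share names below are placeholders):

# Map a persistent drive letter to a network share
New-PSDrive -Name 'S' -PSProvider FileSystem -Root '\\fileserver\share' -Persist

# List the providers available in the current session
Get-PSProvider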

The advanced courses build on the beginning and intermediate tutorials by focusing on automation scripts. Admins will learn how to use PowerShell scripting to automate their routine processes and tasks. They’ll also learn how to troubleshoot problems in their scripts if PowerShell exhibits unusual behavior. The path approach might not be for everyone, but for those ready to invest their time in a comprehensive program, this path could prove a valuable resource.

Practical Desired State Configuration

Admins who would rather not follow a full learning path can choose from a variety of other Pluralsight courses that address specific technologies. This highly rated course caters to advanced users and provides real-world examples of how to use PowerShell to write Desired State Configurations (DSCs). Those interested in the course should be familiar with PowerShell and DSC principles.

DSC refers to a new way of managing Windows Server that shifts the focus from point-and-click GUIs to infrastructure as code. To achieve this, admins can use PowerShell to build DSCs. This process is the focus of this course, which covers several advanced topics ranging from writing configurations with custom resources to building dynamic collector configurations.
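
For readers who haven't seen one, a minimal DSC configuration (a deliberately simple sketch, not an example from the course) looks roughly like this:

Configuration WebServerBaseline {
    # The node name is a placeholder for a target server
    Node 'Server01' {
        # Ensure the IIS role is present
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
    }
}

# Compile the configuration to a MOF file and push it to the node
WebServerBaseline -OutputPath 'C:\DSC'
Start-DscConfiguration -Path 'C:\DSC' -Wait -Verbose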

The tutorial demonstrates how to use custom resources in a configuration and offers an in-depth discussion of securing DSC operations. Participants then learn how to use the DSC model to configure and manage AD, covering such topics as building domains and creating users and groups. The course demonstrates how to set up Windows event forwarding. Although not everyone is looking for such advanced topics, for some users, this course might be just what they need to progress their PowerShell skills.

Pluralsight pricing

Pluralsight doesn’t charge by the course, but rather it offers three personal plans and two business plans. The personal plans start at $299 per year, and the business plans start at $579 per user, per year. All plans include access to the entire course library. In addition, Pluralsight offers a 10-day personal free trial and, like Udemy, courses geared toward IT certification.

PowerShell 5 Essential Training

Of the 13 online PowerShell courses offered by LinkedIn Learning — formerly Lynda.com — this is the most popular. The course targets beginner and intermediate PowerShell users who are Windows systems admins. Although the course is based on PowerShell 5, the basic information remains applicable today, as with other courseware written for that version.

The material covers most of the basics one would expect from a course at this level. It explains how to set up and customize PowerShell, introduces admins to cmdlets and their syntax, and shows how to find help. This is followed by a section on installing modules and packages. The course also describes how to use the PowerShell pipeline, covering such topics as working with files and printers, as well as storing data as a webpage.
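
Module installation, for instance, is handled through the PowerShell Gallery cmdlets:

# Search the PowerShell Gallery for a module
Find-Module -Name Pester

# Install it for the current user only
Install-Module -Name Pester -Scope CurrentUser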

The course then moves on to objects and their properties and methods. Participants can learn how to create scripts that incorporate variables and parameters so they can automate administrative tasks. Participants are also introduced to the PowerShell ISE and shown how to use PowerShell remoting to manage multiple systems at once, along with practical examples of administrative operations at scale.
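
A minimal remoting example of the kind the course builds toward (the computer names are placeholders) might be:

# Run the same command on several machines at once
Invoke-Command -ComputerName 'PC01', 'PC02', 'PC03' -ScriptBlock {
    Get-Service -Name Spooler
}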

PowerShell: Scripting for Advanced Automation

This course, which is also offered by LinkedIn Learning, focuses on automating advanced administrative operations in a Windows network. Those planning to take the course should have a strong foundation in managing Windows environments. As its name suggests, the course is geared toward advanced users.

After a brief introduction, the course jumps into DSC automation, providing an overview of DSC and explaining how to set up DSCs. Users can learn how to work with DSC resources, push DSCs and create pull configurations. The course then moves on to Just Enough Administration (JEA), explaining JEA concepts and best practices. In this part of the course, participants learn how to create role capability files and JEA session configurations, as well as how to register JEA endpoints.
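
For orientation, the JEA cmdlets involved in that workflow look roughly like this; the paths, group name and role name below are illustrative only:

# Create a role capability file that exposes only specific cmdlets
New-PSRoleCapabilityFile -Path 'C:\JEA\ServiceOperator.psrc' -VisibleCmdlets 'Restart-Service'

# Create a session configuration file that maps an AD group to that role
New-PSSessionConfigurationFile -Path 'C:\JEA\ServiceOperator.pssc' -SessionType RestrictedRemoteServer `
    -RoleDefinitions @{ 'CONTOSO\ServiceOperators' = @{ RoleCapabilities = 'ServiceOperator' } }

# Register the JEA endpoint on the server
Register-PSSessionConfiguration -Name 'ServiceOperator' -Path 'C:\JEA\ServiceOperator.pssc'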

The final section of the tutorial describes how to troubleshoot PowerShell scripts. The discussion begins with an overview of PowerShell workflows and examines the specifics of troubleshooting PowerShell in both the console and ISE. The section ends with information about using the PSScriptAnalyzer tool for quality control. As with any advanced course, not all users will benefit from this information. But the tutorial could provide a valuable resource for admins looking to refine their PowerShell skills.
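
PSScriptAnalyzer, mentioned above, is installed from the PowerShell Gallery and invoked with a single cmdlet; the script path below is a placeholder:

# Install the analyzer and lint a script for common quality issues
Install-Module -Name PSScriptAnalyzer -Scope CurrentUser
Invoke-ScriptAnalyzer -Path 'C:\Scripts\MyScript.ps1'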

LinkedIn Learning pricing

LinkedIn Learning sells courses individually, offers a one-month free trial and provides both personal and business plans. Individual PowerShell courses cost between $30 and $45, and individual subscription plans start at $20 per month. Contact LinkedIn Learning regarding business plans. LinkedIn Learning also offers courses aimed at IT certifications.


What are Windows virtualization-based security features?

Windows administrators must maintain constant vigilance over their systems to prevent a vulnerability from crippling their systems or exposing data to threat actors. For shops that use Hyper-V, Microsoft offers another layer of protection through its virtualization-based security.

Virtualization-based security uses Hyper-V and the machine’s hardware virtualization features to isolate and protect an area of system memory that runs the most sensitive and critical parts of the OS kernel and user modes. Once deployed, these protected areas can guard other kernel and user-mode instances.

Virtualization-based security effectively reduces the Windows attack surface, so even if a malicious actor gains access to the OS kernel, the protected content can prevent code execution and the access of secrets, such as system credentials. In theory, these added protections would prevent malware attacks that use kernel exploits from gaining access to sensitive information.

Code examining, malware prevention among key capabilities

Virtualization-based security is a foundation technology and must be in place before adopting a range of advanced security features in Windows Server. One example is Hypervisor-Enforced Code Integrity (HVCI), which examines code — such as drivers — and ensures the kernel mode drivers and binaries are signed before they load into memory. Unsigned content gets denied, reducing the possibility of running malicious code.

Other advanced security capabilities that rely on virtualization-based security include Windows Defender Credential Guard, which prevents malware from accessing credentials, and the ability to create virtual trusted platform modules (TPMs) for shielded VMs.

In Windows Server 2019, Microsoft expanded its shielded VMs feature beyond the Windows platform to cover Linux workloads running on Hyper-V, preventing data leakage both while the VM is at rest and when it moves to another Hyper-V host.

New in Windows Server 2019 is a feature called host key attestation, which uses asymmetric key pairs to authenticate hosts covered by the Host Guardian Service. Microsoft describes this as an easier deployment method because it doesn’t require an Active Directory trust arrangement.

What are the virtualization-based security requirements?

Virtualization-based security has numerous requirements, so it’s important to investigate the complete set of hardware, firmware and software requirements before adopting it. A missing requirement may make it impossible to enable virtualization-based security and can undermine the system security features that depend on it.

At the hardware level, virtualization-based security needs a 64-bit processor with virtualization extensions (Intel VT-x or AMD-V) and second-level address translation, implemented as Extended Page Tables on Intel or Rapid Virtualization Indexing on AMD. I/O virtualization must be supported through Intel VT-d or AMD-Vi, and the server hardware must include TPM 2.0 or newer.

System firmware must support the Windows System Management Mode Security Mitigations Table specification. Unified Extensible Firmware Interface must support memory reporting features such as the UEFI v2.6 Memory Attributes Table. Support for Secure Memory Overwrite Request v2 will inhibit in-memory attacks. All drivers must be compatible with HVCI standards.
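
On systems that meet the requirements, the configured and running state of virtualization-based security can be checked from PowerShell through the Device Guard CIM class; a minimal check, assuming Windows 10 or Windows Server 2016 or later, looks like this:

# Query virtualization-based security status and the security services in use
Get-CimInstance -ClassName Win32_DeviceGuard -Namespace root\Microsoft\Windows\DeviceGuard |
    Select-Object VirtualizationBasedSecurityStatus, SecurityServicesConfigured, SecurityServicesRunning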


Using wsusscn2.cab to find missing Windows updates

Keeping your Windows Server and Windows desktop systems updated can be tricky, and finding missing patches in conventional ways might not be reliable.

There are a few reasons why important security patches might not get installed. They could be mistakenly declined in Windows Server Update Services or get overlooked in environments that lack an internet connection.

Microsoft provides a Windows Update offline scan file, also known as wsusscn2.cab, to help you check Windows systems for missing updates. The CAB file contains information about most patches for Windows and Microsoft applications distributed through Windows Update.

The challenge with the wsusscn2.cab file is its size. It weighs in at around 650 MB, and distributing it to all the servers that need a scan can be tricky and time-consuming. This tutorial explains how to avoid those issues and run the scan on all of your servers in a secure and timely manner, using IIS for the file transfer instead of SMB or PowerShell sessions.

Requirements for offline scanning

There are some simple requirements to use this tutorial:

  • a server or PC running Windows Server 2012 or newer or Windows 10;
  • a domain account with local administrator on the servers you want to scan; and
  • PowerShell remoting enabled on the servers you want to scan (see the snippet below).
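
If remoting isn't already turned on, it can typically be enabled on each target server from an elevated PowerShell prompt (or centrally through Group Policy):

# Enable PowerShell remoting on a server you want to scan
Enable-PSRemoting -Force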

Step 1. Install IIS

First, we need a web server we can use to distribute the wsusscn2.cab file. There are several ways to copy the file, but they all have different drawbacks.

For example, we could distribute the wsusscn2.cab file with a regular file share, but that requires a double-hop. You could also copy the wsusscn2.cab file over a PowerShell session, but that causes a lot of overhead and is extremely slow for large files. An easier and more secure way to distribute the file is through HTTP and IIS.

Installing on Windows Server

Start PowerShell as admin and type the following to install IIS:

Install-WindowsFeature -name Web-Server -IncludeManagementTools

Installing on Windows 10

Start PowerShell as an admin and type the following to install IIS:

Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServer

The IIS role should now be installed, and the default site will serve files from the wwwroot folder (C:\inetpub\wwwroot).

We can now proceed to download wsusscn2.cab from Microsoft.

Step 2. Download wsusscn2.cab

The link for this file can be tricky to find. You can either download it from Microsoft and save it to the IIS wwwroot folder, or run the following script as admin on the IIS server:

# Default Site path, change if necessary
$IISFolderPath = "C:\inetpub\wwwroot"

# Download wsusscn2.cab
Start-BitsTransfer -Source "http://go.microsoft.com/fwlink/?linkid=74689" -Destination "$IISFolderPath\wsusscn2.cab"

The script downloads the file to the wwwroot folder. We can verify the download by browsing to http://<IIS server>/wsusscn2.cab.

You also need to get the hash value of wsusscn2.cab to verify it. After saving it, run the following PowerShell command to check the file hash:

(Get-FileHash C:\inetpub\wwwroot\wsusscn2.cab).Hash

31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

Step 3. Run the check on a server

Next, you can use a PowerShell script that downloads the wsusscn2.cab file and scans for missing updates on a PC or server. The script runs on Windows Server 2008 or newer, which helps avoid compatibility issues. To do this securely and efficiently over HTTP, the script computes the file hash of the downloaded wsusscn2.cab and compares it with the hash of the CAB file on the IIS server.

We can also use the file hash to see when Microsoft releases a new version of wsusscn2.cab.

Copy and save the following script as Get-MissingUpdates.ps1:

Param(
    [parameter(mandatory)]
    [string]$FileHash,

    [parameter(mandatory)]
    [string]$Wsusscn2Url
)


Function Get-Hash($Path){
    
    $Stream = New-Object System.IO.FileStream($Path,[System.IO.FileMode]::Open) 
    
    $StringBuilder = New-Object System.Text.StringBuilder 
    $HashCreate = [System.Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($Stream)
    $HashCreate | Foreach {
        $StringBuilder.Append($_.ToString("x2")) | Out-Null
    }
    $Stream.Close() 
    $StringBuilder.ToString() 
}

$DataFolder = "$env:ProgramData\WSUS Offline Catalog"
$CabPath = "$DataFolder\wsusscn2.cab"

# Create download dir
mkdir $DataFolder -Force | Out-Null

# Check if cab exists
$CabExists = Test-Path $CabPath


# Compare hashes to decide whether a new download is needed
if($CabExists){
    Write-Verbose "Comparing hashes of wsusscn2.cab"

    $HashMatch = $FileHash -eq (Get-Hash -Path $CabPath)

    if(-not $HashMatch){
        Write-Warning "Filehash of $CabPath did not match $($FileHash) - downloading"
        Remove-Item $CabPath -Force
    }
    Else{
        Write-Verbose "Hashes matched"
    }
}

# Download wsusscn2.cab if it doesn't exist or the hashes don't match
if(!$CabExists -or $HashMatch -eq $false){
    Write-Verbose "Downloading wsusscn2.cab"
    # Works on Windows Server 2008 as well
    (New-Object System.Net.WebClient).DownloadFile($Wsusscn2Url, $CabPath)

    if($FileHash -ne (Get-Hash -Path $CabPath)){
        Throw "$CabPath did not match $($FileHash)"
    }

}

Write-Verbose "Checking digital signature of wsusscn2.cab"

$CertificateIssuer = "CN=Microsoft Code Signing PCA, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
$Signature = Get-AuthenticodeSignature -FilePath $CabPath
$SignatureOk = $Signature.SignerCertificate.Issuer -eq $CertificateIssuer -and $Signature.Status -eq "Valid"


If(!$SignatureOk){
    Throw "Signature of wsusscn2.cab is invalid!"
}


Write-Verbose "Creating Windows Update session"
$UpdateSession = New-Object -ComObject Microsoft.Update.Session
$UpdateServiceManager  = New-Object -ComObject Microsoft.Update.ServiceManager 

$UpdateService = $UpdateServiceManager.AddScanPackageService("Offline Sync Service", $CabPath, 1) 

Write-Verbose "Creating Windows Update Searcher"
$UpdateSearcher = $UpdateSession.CreateUpdateSearcher()  
$UpdateSearcher.ServerSelection = 3
$UpdateSearcher.ServiceID = $UpdateService.ServiceID.ToString()
 
Write-Verbose "Searching for missing updates"
$SearchResult = $UpdateSearcher.Search("IsInstalled=0")

$Updates = $SearchResult.Updates

$UpdateSummary = [PSCustomObject]@{

    ComputerName = $env:COMPUTERNAME    
    MissingUpdatesCount = $Updates.Count
    Vulnerabilities = $Updates | Foreach {
        $_.CveIDs
    }
    MissingUpdates = $Updates | Select Title, MsrcSeverity, @{Name="KBArticleIDs";Expression={$_.KBArticleIDs}}
}

Return $UpdateSummary

Run the script on one of the servers or computers to check for missing updates. To do this, copy the script to the machine and run it with the URL to wsusscn2.cab on the IIS server and the hash value from step two:

PS51> .\Get-MissingUpdates.ps1 -Wsusscn2Url "http://<IIS server>/wsusscn2.cab" -FileHash 31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

If there are missing updates, you should see output similar to the following:

ComputerName     MissingUpdatesCount Vulnerabilities  MissingUpdates
------------     ------------------- ---------------  --------------
UNSECURESERVER                    14 {CVE-2006-4685, CVE-2006-4686,
CVE-2019-1079, CVE-2019-1079...} {@{Title=MSXML 6.0 RTM Security Updat

If the machine is not missing updates, then you should see this type of output:

ComputerName MissingUpdatesCount Vulnerabilities MissingUpdates
------------ ------------------- --------------- --------------
SECURESERVER                   0

The script gives a summary of the number of missing updates, what those updates are and the vulnerabilities they patch.

This process is a great deal faster than searching for missing updates online. But this manual method is not efficient when checking a fleet of servers, so let’s learn how to run the script on all systems and collect the output.

Step 4. Run the scanning script on multiple servers at once

The easiest way to collect missing updates from all servers with PowerShell is with a PowerShell job. The PowerShell jobs run in parallel on all computers, and you can fetch the results.

On a PC or server, save the file from the previous step to the C drive — or another directory of your choice — and run the following as a user with admin permissions on your systems:

# The servers you want to collect missing updates from
$Computers = @(
        'server1',
        'server2',
        'server3'
)

# These are the arguments that will be sent to the remote servers
$RemoteArgs = @(
    # File hash from step 2
    "31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05",
    "http://$env:COMPUTERNAME/wsusscn2.cab"
)

$Params = @{
    ComputerName = $Computers
    ArgumentList = $RemoteArgs
    AsJob        = $True
    # Filepath to the script on the server/computer you are running this command on
    FilePath = "C:\Scripts\Get-MissingUpdates.ps1"
    # Maximum number of active jobs
    ThrottleLimit = 20
}

$Job = Invoke-Command @Params

# Wait for all jobs to finish
$Job | Wait-Job

# Collect Results from the jobs
$Results = $Job | Receive-Job

# Show results
$Results

This runs the Get-MissingUpdates.ps1 script on all servers in the $Computers variable in parallel to save time and make it easier to collect the results.

You should run these PowerShell jobs regularly to catch servers with a malfunctioning Windows Update and to be sure important updates get installed.
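
One simple way to schedule that, sketched here with a placeholder path and time, is to save the collection script from this step and register it as a scheduled job on the management server:

# Run the collection script every day at 03:00
$Trigger = New-JobTrigger -Daily -At '3:00 AM'
Register-ScheduledJob -Name 'MissingUpdateScan' -Trigger $Trigger -FilePath 'C:\Scripts\Collect-MissingUpdates.ps1'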


Epicor ERP system focuses on distribution

Many ERP systems try to be all things to all use cases, but that often comes at the expense of heavy customizations.

Some companies are discovering that a purpose-built ERP is a better and more cost-effective bet, particularly for small and midsize companies. One such product is the Epicor ERP system Prophet 21, which is primarily aimed at wholesale distributors.

The functionality in the Epicor ERP system is designed to help distributors run processes more efficiently and make better use of data flowing through the system.

In addition to distribution-focused functions, the Prophet 21 Epicor ERP system includes the ability to integrate value-added services, which could be valuable for distributors, said Mark Jensen, Epicor senior director of product management.

“A distributor can do manufacturing processes for their customers, or rentals, or field service and maintenance work. Those are three areas that we focused on with Prophet 21,” Jensen said.

Prophet 21’s functionality is particularly strong in managing inventory, including picking, packing and shipping goods, as well as receiving and put-away processes.

Specialized functions for distributors

Distribution companies that specialize in certain industries or products have different processes that Prophet 21 includes in its functions, Jensen said. For example, Prophet 21 has functionality designed specifically for tile and slab distributors.

“The ability to be able to work with the slab of granite or a slab of marble — what size it is, how much is left after it’s been cut, transporting that slab of granite or tile — is a very specific functionality, because you’re dealing with various sizes, colors, dimensions,” he said. “Being purpose-built gives [the Epicor ERP system] an advantage over competitors like Oracle, SAP, NetSuite, [which] either have to customize or rely on a third-party vendor to attach that kind of functionality.”

Jergens Industrial Supply, a wholesale supplies distributor based in Cleveland, has improved efficiency and is more responsive to shifting customer demands using Prophet 21, said Tony Filipovic, Jergens Industrial Supply (JIS) operations manager.


“We like Prophet 21 because it’s geared toward distribution and was the leading product for distribution,” Filipovic said. “We looked at other systems that say they do manufacturing and distribution, but I just don’t feel that that’s the case. Prophet 21 is something that’s been top of line for years for resources distribution needs.”

One of the key differentiators for JIS was Prophet 21’s inventory management functionality, which was useful because distributors manage inventory differently than manufacturers, Filipovic said.

“All that functionality within that was key, and everything is under one package,” he said. “So from the moment you are quoting or entering an order to purchasing the product, receiving it, billing it, shipping it and paying for it was all streamlined under one system.”

Another key new feature is an IoT-enabled button similar to Amazon Dash buttons that enables customers to resupply stocks remotely. This allows JIS to “stay ahead of the click” and offer customers lower cost and more efficient delivery, Filipovic said.

“Online platforms are becoming more and more prevalent in our industry,” he said. “The Dash button allows customers to find out where we can get into their process and make things easier. We’ve got the ordering at the point where customers realize that when they need to stock, all they do is press the button and it saves multiple hours and days.”

Epicor Prophet 21 a strong contender in purpose-built ERP

Epicor Prophet 21 is on solid ground with its purpose-built ERP focus, but companies have other options they can look at, said Cindy Jutras, president of Mint Jutras, an ERP research and advisory firm in Windham, NH.

“Epicor Prophet 21 is a strong contender from a feature and function standpoint. I’m a fan of solutions that go that last mile for industry-specific functionality, and there aren’t all that many for wholesale distribution,” Jutras said. “Infor is pretty strong, NetSuite plays here, and then there are a ton of little guys that aren’t as well-known.”

Prophet 21 may take advantage of new cloud capabilities to compete better in some global markets, said Predrag Jakovljevic, principal analyst at Technology Evaluation Centers, an enterprise computing analysis firm in Montreal.

“Of course a vertically-focused ERP is always advantageous, and Prophet 21 and Infor SX.e go head-to-head all the time in North America,” Jakovljevic said. “Prophet 21 is now getting cloud enabled and will be in Australia and the UK, where it might compete with NetSuite or Infor M3, which are global products.”


Cornell researchers call for AI transparency in automated hiring

Cornell University is becoming a hotbed of warnings about automated hiring systems. In two separate papers, researchers have given the systems considerable scrutiny. Both papers cite problems with AI transparency, or the ability to explain how an AI system reaches a conclusion.

Vendors are selling automated hiring systems partly as a remedy to human bias. But they also argue they can speed up the hiring process and select applicants who will make good employees.

Manish Raghavan, a computer science doctoral student at Cornell who led the most recent study, questions vendors’ claims. If AI is doing a better job than hiring managers, “how do we know that’s the case or when will we know that that’s the case?” he said.

A major thrust of the research is the need for AI transparency. That’s not only needed for the buyers of automated hiring systems, but for job applicants as well.

At Cornell, Raghavan knows students who take AI-enabled tests as part of a job application. “One common complaint that I’ve heard is that it just viscerally feels upsetting to have to perform for a robot,” he said.


A job applicant may have to install an app to film a video interview, play a game that may measure cognitive ability or take a psychometric test that can be used to measure intelligence and personality.

“This sort of feels like they’re forcing you [the job applicant] to invest extra effort, but they’re actually investing less effort into you,” Raghavan said. Rejected applicants won’t know why they were rejected, the standards used to measure their performance, or how they can improve, he said.

Nascent research, lack of regulation

The paper, “Mitigating Bias in Algorithmic Employment Screening: Evaluating Claims and Practices,” is the work of a multidisciplinary team of computer scientists, as well as those with legal and sociological expertise. It argues that HR vendors are not providing insights into automated hiring systems.


The researchers looked at the public claims of nearly 20 vendors that sell these systems. Many are startups, although some have been around for more than a decade. They argue that vendors are taking nascent research and translating it into practice “at sort of breakneck pace,” Raghavan said. They’re able to do so because of a lack of regulation.

Vendors can produce data from automated hiring systems that shows how their systems perform in helping achieve diversity, Raghavan said. “Their diversity numbers are quite good,” but they can cherry-pick what data they release, he said. Nonetheless, “it also feels like there is some value being added here, and their clients seem fairly happy with the results.”

But there are two levels of transparency that Raghavan would like to see improve. First, he suggested vendors release internal studies that show the validity of their assessments. The data should include how often vendors are running into issues of disparate impact, which refers to a U.S. Equal Employment Opportunity Commission formula for determining if hiring is having a discriminatory impact on a protected group.

A second step for AI transparency involves having third-party independent researchers do some of their own analysis.

Vendors argue that AI systems do a better job than humans in reducing bias. But researchers see a risk that they could embed certain biases against a group of people that won’t be easily discovered unless there’s an understanding for how these systems work.

One problem often cited is that an AI-enabled system can help improve diversity but still discriminate against certain groups or people. New York University researchers recently noted that most of the AI code today is being written by young white males, who may encode their biases.

Ask about the ‘magic fairy dust’

Ben Eubanks, principal analyst at Lighthouse Research & Advisory, believes the Cornell paper should be on every HR manager’s reading list, “not necessarily because it should scare them away but because it should encourage them to ask more questions about the magic fairy dust behind some technology claims.”

“Hiring is and has always been full of bias,” said Eubanks, who studies AI use in HR. “Algorithms are subject to some of those same constraints, but they can also offer ways to mitigate some of the very real human bias in the process.”

But the motivation for employers may be different, Eubanks said.

“Employers adopting these technologies may be more concerned initially with the outcomes — be it faster hiring, cheaper hiring, or longer retention rates — than about the algorithm actually preventing or mitigating bias,” Eubanks said. That’s what HR managers will likely be rewarded on.

In a separate paper, Ifeoma Ajunwa, assistant professor of labor relations, law and history at Cornell University, argued for independent audits and compulsory data retention in her recently published “Automated Employment Discrimination.”

Ajunwa’s paper raises problems with automated hiring, including systems that “discreetly eliminate applicants from protected categories without retaining a record.”

AI transparency adds confidence

Still, in an interview, Cornell’s Raghavan was even-handed about using AI and didn’t warn users away from automated hiring systems. He can see use cases but believes there is good reason for caution.

“I think what we can agree on is that the more transparency there is, the easier it will be for us to determine when is or is not the right time or the right place to be using these systems,” Raghavan said.

“A lot of these companies and vendors seem well-intentioned — they think what they’re doing is actually very good for the world,” he said. “It’s in their interest to have people be confident in their practices.”


Google-Ascension deal reveals murky side of sharing health data

One of the largest nonprofit health systems in the U.S. created headlines when it was revealed that it was sharing patient data with Google — under codename Project Nightingale.

Ascension, a Catholic health system based in St. Louis, partnered with Google to transition the health system’s infrastructure to the Google Cloud Platform, to use the Google G Suite productivity and collaboration tools, and to explore the tech giant’s artificial intelligence and machine learning applications. By doing so, it is giving Google access to patient data, which the search giant can use to inform its own products.

The partnership appears to be technically and legally sound, according to experts. After news broke, Ascension released a statement saying the partnership is HIPAA-compliant and that a business associate agreement, a contract required by the federal government that spells out each party’s responsibility for protected health information, is in place. Yet reports from The Wall Street Journal and The Guardian about the possible improper transfer of 50 million patients’ data have resulted in an Office for Civil Rights inquiry into the Google-Ascension partnership.

Legality aside, the resounding reaction to the partnership speaks to a lack of transparency in healthcare. Organizations should see the response as both an example of what not to do, as well as a call to make patients more aware of how they’re using health data, especially as consumer companies known for collecting and using data for profit become their partners.

Partnership breeds legal, ethical concerns

Forrester Research senior analyst Jeff Becker said Google entered into a similar strategic partnership with Mayo Clinic in September, and the coverage was largely positive.


According to a Mayo Clinic news release, the nonprofit academic medical center based in Rochester, Minn., selected Google Cloud to be “the cornerstone of its digital transformation,” and the clinic would use “advanced cloud computing, data analytics, machine learning and artificial intelligence” to improve healthcare delivery.

But Ascension wasn’t as forthcoming with its Google partnership. It was Google that announced its work with Ascension during a quarterly earnings call in July, and Ascension didn’t issue a news release about the partnership until after the news broke.

“There should have been a public-facing announcement of the partnership,” Becker said. “This was a PR failure. Secrecy creates distrust.”

Matthew Fisher, partner at Mirick O’Connell Attorneys at Law and chairman of its health law group, said the outcry over the Google-Ascension partnership was surprising. For years, tech companies have been trying to get access to patient data to help healthcare organizations and, at the same time, develop or refine their existing products, he said.

“I get the sense that just because it was Google that was announced to have been a partner, that’s what drove a lot of the attention,” he said. “Everyone knows Google mostly for purposes outside of healthcare, which leads to the concern of does Google understand the regulatory obligations and restrictions that come to bear by entering the healthcare space?”

Ascension’s statement in response to the situation said the partnership with Google is covered by a business associate agreement — a distinction Fisher said is “absolutely required” before any protected health information can be shared with Google. Parties in a business associate agreement are obligated by federal regulation to comply with the applicable portions of HIPAA, such as its security and privacy rules.

A business associate relationship allows identifiable patient information to be shared and used by Google only under specified circumstances. It is the legal basis for keeping patient data segregated and restricting Google from freely using that data. According to Ascension, the health system’s clinical data is housed within an Ascension-owned virtual private space in Google Cloud, and Google isn’t allowed to use the data for marketing or research.

“Our data will always be separate from Google’s consumer data, and it will never be used by Google for purposes such as targeting consumers for advertising,” the statement said.


But health IT and information security expert Kate Borten believes business associate agreements and the HIPAA privacy rule they adhere to don’t go far enough to ensure patient privacy rights, especially when companies like Google get involved. The HIPAA privacy rule doesn’t require healthcare organizations to disclose to patients who they’re sharing patient data with.

“The privacy rule says as long as you have this business associate contract — and business associates are defined by HIPAA very broadly — then the healthcare provider organization or insurer doesn’t have to tell the plan members or the patients about all these business associates who now have access to your data,” she said.

Chilmark Research senior analyst Jody Ranck said much of the alarm over the Google-Ascension partnership may be misplaced, but it speaks to a growing concern about companies like Google entering healthcare.

Since the Office for Civil Rights is looking into the partnership, Ranck said there is still a question of whether the partnership fully complies with the law. But the bigger question has to do with privacy and security concerns around collecting and using patient data, as well as companies like Google using patient data to train AI algorithms and the potential biases it could create.


Ranck believes consumer trust in tech companies is declining, especially as data privacy concerns get more play.

“Now that they know everything you purchase and they can listen in to that Alexa sitting beside your bed at night, and now they’re going to get access to health data … what’s a consumer to do? Where’s their power to control their destiny when algorithms are being used to assign you as a high-, medium-, or low-risk individual, as creditworthy?” Ranck said. “All of this starts to feel like a bit of an algorithmic iron cage.”

A call for more transparency

Healthcare organizations and big tech partnerships with the likes of Google, Amazon, Apple and Microsoft are growing. Like other industries, healthcare organizations are looking to modernize their infrastructure and take advantage of state of the art storage, security, data analytics tools and emerging tech like artificial intelligence.

But for healthcare organizations, partnerships like these have an added complexity — truly sensitive data. Forrester’s Becker said the mistake in the Google-Ascension partnership was the lack of transparency. There was no press release early on announcing the partnership, laying out what information is being shared, how the information will be used, and what outcome improvements the healthcare organization hopes to achieve.

“There should also be assurance that the partnership falls within HIPAA and that data will not be used for advertising or other commercial activities unrelated to the healthcare ambitions stated,” he said.

Fisher believes the Google-Ascension partnership raises questions about what the legal, moral and ethical aspects of these relationships are. While Ascension and Google may have been legally in the right, Fisher believes it’s important to recognize that privacy expectations are shifting, which calls for better consumer education, as well as more transparency around where and how data is being used.

Although he believes it would be “unduly burdensome” to require a healthcare organization to name every organization it shares data with, Fisher said better education on how HIPAA operates and what it allows when it comes to data sharing, as well as explaining how patient data will be protected when shared with a company like Google, could go a long way in helping patients understand what’s happening with their data.

“If you’re going to be contracting with one of these big-name companies that everyone has generalized concerns about with how they utilize data, you need to be ahead of the game,” Fisher said. “Even if you’re doing everything right from a legal standpoint, there’s still going to be a PR side to it. That’s really the practical reality of doing business. You want to be taking as many measures as you can to avoid the public backlash and having to be on the defensive by having the relationship found out and reported upon or discussed without trying to drive that discussion.”


Redis Labs eases database management with RedisInsight

The robust market of tools to help users of the Redis database manage their systems just got a new entrant.

Redis Labs disclosed the availability of its RedisInsight tool, a graphical user interface (GUI) for database management and operations.

Redis is a popular open source NoSQL database that is also increasingly used in cloud-native Kubernetes deployments as users move workloads to the cloud. Open source database use is growing quickly, according to recent reports, as the need for flexible, open systems that can serve different workloads has become commonplace.

Among the challenges often associated with databases of any type is ease of management, which Redis is trying to address with RedisInsight.

“Database management will never go out of fashion,” said James Governor, analyst and co-founder at RedMonk. “Anyone running a Redis cluster is going to appreciate better memory and cluster management tools.”

Governor noted that Redis is following a tested approach, by building out more tools for users that improve management. Enterprises are willing to pay for better manageability, Governor noted, and RedisInsight aims to do that.

RedisInsight based on RDBtools

The RedisInsight tool, introduced Nov. 12, is based on the RDBTools technology that Redis Labs acquired in April 2019. RDBTools is an open source GUI for users to interact with and explore data stores in a Redis database.


Over the last seven months, Redis added more capabilities to the RDBTools GUI, expanding the product’s coverage for different applications, said Alvin Richards, chief product officer at Redis.

One of the core pieces of extensibility in Redis is the ability to introduce modules that contain new data structures or processing frameworks. So for example, a module could include time series, or graph data structures, Richards explained.

“What we have added to RedisInsight is the ability to visualize the data for those different data structures from the different modules,” he said. “So if you want to visualize the connections in your graph data for example, you can see that directly within the tool.”

RedisInsight overview dashboard

RDBTools is just one of many different third-party tools that exist for providing some form of management and data insight for Redis. There are some 30 other third-party GUI tools in the Redis ecosystem, though lack of maturity is a challenge.

“They tend to sort of come up quickly and get developed once and then are never maintained,” Richards said. “So, the key thing we wanted to do is ensure that not only is it current with the latest features, but we have the apparatus behind it to carry on maintaining it.”

How RedisInsight works

For users, getting started with the new tool is relatively straightforward. RedisInsight is a piece of software that needs to be downloaded and then connected to an existing Redis database. The tool ingests all the appropriate metadata and delivers the visual interface to users.

RedisInsight is available for Windows, macOS and Linux, and also available as a Docker container. Redis doesn’t have a RedisInsight as a Service offering yet.
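
For the container route, the typical invocation (the image name, port and volume shown here reflect the v1 release at the time of writing and may change) is along these lines:

# Run RedisInsight in Docker and expose its web UI on port 8001
docker run -d --name redisinsight -p 8001:8001 -v redisinsight:/db redislabs/redisinsight:latest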

“We have considered having RedisInsight as a service and it’s something we’re still working on in the background, as we do see demand from our customers,” Richards said. “The challenge is always going to be making sure we have the ability to ensure that there is the right segmentation, security and authorization in place to put guarantees around the usage of data.”


InfoTrax settles FTC complaint, will implement infosec program

InfoTrax Systems this week settled with the Federal Trade Commission regarding allegations that the company failed to protect consumer data after a nearly two-year-long data breach.

The FTC filed a complaint against the Utah-based multi-level marketing software company in the wake of attackers stealing sensitive information on approximately one million customers over the course of more than 20 malicious infiltrations between May 2014 and March 2016.

InfoTrax only became aware of the attacks in March 2016 because a data archive file created by the malicious actors grew so large that servers reached maximum storage capacity. “Only then did Respondents begin to take steps to remove the intruder from InfoTrax’s network,” the FTC wrote in the complaint.

The FTC asserted that InfoTrax — in part — failed to implement a process to inventory and delete unnecessary customer information, failed to detect malicious file uploads and stored consumers’ personal information, including Social Security numbers, bank account information, payment card information and more, “in clear, readable text on InfoTrax’s network.”

The FTC added, “Respondents could have addressed each of the failures … by implementing readily available and relatively low-cost security measures.”

As a result of the settlement, InfoTrax is “prohibited from collecting, selling, sharing or storing personal information unless they implement an information security program that would address the security failures identified in the complaint. This includes assessing and documenting internal and external security risks; implementing safeguards to protect personal information from cybersecurity risks; and testing and monitoring the effectiveness of those safeguards,” the FTC wrote in a statement published Tuesday.

The company will also have to obtain assessments of its information security program from a third party, approved by the FTC, every two years.

InfoTrax responds

On Nov. 12, Scott Smith, president and newly appointed CEO of InfoTrax, released a statement that is no longer hosted at its original link on PR Newswire. A copy was published by The Herald Journal.

Smith claimed the company “took immediate action” to secure data and prevent further unauthorized access after discovering the breach. The company then contacted affected clients, law enforcement agencies, including the FBI, as well as “top forensic security experts to help us identify where our system was vulnerable and to take steps to improve our security and prevent further incidents like this.”

“Without agreeing with the FTC’s findings from their investigation, we have signed a consent order that outlines the security measures that we will maintain going forward, many of which were implemented before we received the FTC’s order,” Smith said. “We deeply regret that this security incident happened. Information security is critical and integral to our operations, and our clients’ and customers’ security and privacy is our top priority.”

In response to SearchSecurity’s request for the original statement, InfoTrax offered a slightly modified one from the CEO, which notably removed the part about not agreeing to the FTC’s findings:

“This incident happened nearly four years ago, at which time we took immediate steps to identify and remediate the issue. We notified our clients and worked closely with security experts and law enforcement. We deeply regret that the incident happened,” Smith said in the statement. “Even though the FTC has just now released their documents, this is an issue we responded to immediately and aggressively as soon as we became aware of it in 2016, and we have not experienced additional incidents since then. The privacy and security of our clients’ information continues to be our top priority today.”

Richard Newman, an FTC defense attorney at Hinch Newman LLP in New York, told SearchSecurity that his overall take on the case was that “The FTC’s enforcement of data security matters based upon alleged unreasonable data security practices is becoming an increasingly common occurrence. The Commission does so under various theories, including that such acts and practices are ‘unfair’ in violation of the FTC Act.”

He added that the stipulation that InfoTrax is prohibited from collecting, sharing, or selling user data until they fix their security issues is “not uncommon,” and that “stipulated settlement agreements in this area have recently undergone an overhaul based upon judicial developments and enforceability-related challenges. Terms such as mandated information security programs, security assessments, etc. are now commonplace in such settlements.”

Regarding whether or not the settlement is adequate, Adam Solander, a partner at King & Spalding LLP in Atlanta, told SearchSecurity, “It’s hard to judge without being involved intimately with the facts, but the FTC is an aggressive organization. They take privacy and security very seriously, and I think this is evidence of how aggressive they are in their enforcement of it.”


CIOs express hope, concern for proposed interoperability rule

While CIOs applaud the efforts by federal agencies to make healthcare systems more interoperable, they also have significant concerns about patient data security.

The Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare & Medicaid Services proposed rules earlier this year that would further define information blocking, or unreasonably stopping a patient’s information from being shared, as well as outline requirements for healthcare organizations to share data such as using FHIR-based APIs so patients can download healthcare data onto mobile healthcare apps.

The proposed rules are part of an ongoing interoperability effort mandated by the 21st Century Cures Act, a healthcare bill that provides funding to modernize the U.S. healthcare system. Final versions of the proposed information blocking and interoperability rules are on track to be released in November.

“We all now have to realize we’ve got to play in the sandbox fairly and maybe we can cut some of this medical cost through interoperability,” said Martha Sullivan, CIO at Harrison Memorial Hospital in Cynthiana, Ky.

CIOs’ take on proposed interoperability rule

To Sullivan, interoperability brings the focus back to the patient — a focus she thinks has been lost over the years.

She commended ONC’s efforts to make patient access to health information easier, yet she has concerns about data stored in mobile healthcare apps. Harrison’s system is API-capable, but Sullivan said the organization will not recommend APIs to patients for liability reasons.

Physicians and CIOs at EHR vendor Meditech’s 2019 Physician and CIO Forum in Foxborough, Mass. Helen Waters, Meditech executive vice president, spoke at the event.

“The security concerns me because patient data is really important, and the privacy of that data is critical,” she said.

Harrison may not be the only organization reluctant to promote APIs to patients. A study of 12 U.S. health systems that used APIs for at least nine months, published in the Journal of the American Medical Association, found “little effort by healthcare systems or health information technology vendors to market this new capability to patients” and went on to say “there are not clear incentives for patients to adopt it.”

Jim Green, CIO at Boone County Hospital in Iowa, said ONC’s efforts with the interoperability rule are well-intentioned but overlook a significant pain point: physician adoption. He said more efforts should be made to create “a product that’s usable for the pace of life that a physician has.”

The product also needs to keep pace with technology, something Green described as being a “constant battle.”


Interoperability is often temporary, he said. When a system gets upgraded or a new version of software is released, it can throw the system’s ability to share data with another system out of whack.

“To say at a point in time, ‘We’re interoperable with such-and-such a product,’ it’s a point in time,” he said.

Interoperability remains “critically important” for healthcare, said Jeannette Currie, CIO of Community Hospitals at Beth Israel Deaconess Medical Center in Boston. But so is patient data security. That’s one of her main concerns with ONC’s efforts and the interoperability rule, something physicians and industry experts also expressed during the comment period for the proposed rules.

“When I look at the fact that a patient can come in and say, ‘I need you to interact with my app,’ and when I look at the HIPAA requirements I’m still beholden to, there are some nuances there that make me really nervous as a CIO,” she said.
