Tag Archives: systems

Using wsusscn2.cab to find missing Windows updates

Keeping your Windows Server and Windows desktop systems updated can be tricky, and finding missing patches in conventional ways might not be reliable.

There are a few reasons why important security patches might not get installed. They could be mistakenly declined in Windows Server Update Services or get overlooked in environments that lack an internet connection.

Microsoft provides a Windows Update offline scan file, also known as wsusscn2.cab, to help you check Windows systems for missing updates. The CAB file contains information about most patches for Windows and Microsoft applications distributed through Windows Update.

The challenge with the wsusscn2.cab file is its size. It weighs in around 650 MB, and distributing it to all the servers to perform a scan can be tricky and time-consuming. This tutorial explains how to avoid those issues and run it on all of your servers in a secure and timely manner using IIS for file transfer instead of SMB or PowerShell sessions.

Requirements for offline scanning

There are some simple requirements to use this tutorial:

  • a server or PC running Windows Server 2012 or newer or Windows 10;
  • a domain account with local administrator on the servers you want to scan; and
  • PowerShell remoting enabled on the servers you want to scan.

Step 1. Install IIS

First, we need a web server we can use to distribute the wsusscn2.cab file. There are several ways to copy the file, but they all have different drawbacks.

For example, we could distribute the wsusscn2.cab file with a regular file share, but that requires a double-hop. You could also copy the wsusscn2.cab file over a PowerShell session, but that causes a lot of overhead and is extremely slow for large files. An easier and more secure way to distribute the file is through HTTP and IIS.

Installing on Windows Server

Start PowerShell as admin and type the following to install IIS:

Install-WindowsFeature -name Web-Server -IncludeManagementTools

Installing on Windows 10

Start PowerShell as an admin and type the following to install IIS:

Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServer

The IIS role is now installed. The default site points to the C:\inetpub\wwwroot folder.

We can now proceed to download wsusscn2.cab from Microsoft.

Step 2. Download wsusscn2.cab

The link for this file can be tricky to find. You can either download it directly from Microsoft and save it to the C drive or run the following script as admin on the IIS server:

# Default Site path, change if necessary
$IISFolderPath = "C:inetpubwwwroot"

# Download wsusscn2.cab
Start-BitsTransfer -Source "http://go.microsoft.com/fwlink/?linkid=74689" -Destination "$IISFolderPath\wsusscn2.cab"

The script downloads the file to the wwwroot folder. We can verify the download by browsing to http://<servername>/wsusscn2.cab from another machine.
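If you prefer to check from PowerShell rather than a browser, the sketch below sends a HEAD request to the same URL; the <servername> placeholder stands for whatever hostname your IIS server answers on.

# Quick reachability check from a remote machine; replace <servername> with
# the hostname of the IIS server. -UseBasicParsing keeps the call compatible
# with older Windows PowerShell versions.
$Response = Invoke-WebRequest -Uri "http://<servername>/wsusscn2.cab" -Method Head -UseBasicParsing

# The Content-Length header should report a file of roughly 650 MB
$Response.Headers["Content-Length"]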

You also need to get the hash value of wsusscn2.cab to verify it. After saving it, run the following PowerShell command to check the file hash:

(Get-FileHash C:\inetpub\wwwroot\wsusscn2.cab).Hash

31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

Step 3. Run the check on a server

Next, you can use a PowerShell script to download and scan for missing updates on a PC or server using the wsusscn2.cab file. The script works on Windows Server 2008 and newer, so older systems can be scanned without compatibility issues. To do this in a secure and effective manner over HTTP, we get the file hash of the downloaded wsusscn2.cab file and compare it with the file hash of the CAB file on the IIS server.

We can also use the file hash to see when Microsoft releases a new version of wsusscn2.cab.
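If you want to automate that check on the IIS server itself, the following is a minimal sketch of one way to do it: re-download the catalog to a temporary name and compare hashes. The temporary file name and paths are only examples; the download link is the same one used in step two.

# Re-download the catalog and compare hashes; a change means Microsoft has
# published a new wsusscn2.cab. Paths and file names are examples only.
$IISFolderPath = "C:\inetpub\wwwroot"
$OldHash = (Get-FileHash "$IISFolderPath\wsusscn2.cab").Hash
Start-BitsTransfer -Source "http://go.microsoft.com/fwlink/?linkid=74689" -Destination "$IISFolderPath\wsusscn2.new.cab"
$NewHash = (Get-FileHash "$IISFolderPath\wsusscn2.new.cab").Hash

if ($NewHash -ne $OldHash) {
    # Replace the published file and note the new hash to pass as -FileHash
    Move-Item "$IISFolderPath\wsusscn2.new.cab" "$IISFolderPath\wsusscn2.cab" -Force
    Write-Output "New wsusscn2.cab published - new hash: $NewHash"
}
else {
    Remove-Item "$IISFolderPath\wsusscn2.new.cab"
}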

Copy and save the following script as Get-MissingUpdates.ps1:

Param(
    [parameter(mandatory)]
    [string]$FileHash,

    [parameter(mandatory)]
    [string]$Wsusscn2Url
)


Function Get-Hash($Path){
    
    $Stream = New-Object System.IO.FileStream($Path,[System.IO.FileMode]::Open) 
    
    $StringBuilder = New-Object System.Text.StringBuilder 
    $HashCreate = [System.Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($Stream)
    $HashCreate | Foreach {
        $StringBuilder.Append($_.ToString("x2")) | Out-Null
    }
    $Stream.Close() 
    $StringBuilder.ToString() 
}

$DataFolder = "$env:ProgramDataWSUS Offline Catalog"
$CabPath = "$DataFolderwsusscn2.cab"

# Create download dir
mkdir $DataFolder -Force | Out-Null

# Check if cab exists
$CabExists = Test-Path $CabPath


# Compare hashes if download is needed
if($CabExists){
    Write-Verbose "Comparing hashes of wsusscn2.cab"
    
    $HashMatch = $FileHash -eq (Get-Hash -Path $CabPath)

    if(!$HashMatch){
        Write-Warning "File hash of $CabPath did not match $FileHash - downloading"
        Remove-Item $CabPath -Force
    }
    Else{
        Write-Verbose "Hashes matched"
    }
}

# Download wsusscn2.cab if it doesn't exist or the hashes mismatch
if(!$CabExists -or !$HashMatch){
    Write-Verbose "Downloading wsusscn2.cab"
    # Works on Windows Server 2008 as well
    (New-Object System.Net.WebClient).DownloadFile($Wsusscn2Url, $CabPath)

    if($FileHash -ne (Get-Hash -Path $CabPath)){
        Throw "Hash of the downloaded $CabPath did not match $FileHash"
    }

}

Write-Verbose "Checking digital signature of wsusscn2.cab"

$CertificateIssuer = "CN=Microsoft Code Signing PCA, O=Microsoft Corporation, L=Redmond, S=Washington, C=US"
$Signature = Get-AuthenticodeSignature -FilePath $CabPath
$SignatureOk = $Signature.SignerCertificate.Issuer -eq $CertificateIssuer -and $Signature.Status -eq "Valid"


If(!$SignatureOk){
    Throw "Signature of wsusscn2.cab is invalid!"
}


Write-Verbose "Creating Windows Update session"
$UpdateSession = New-Object -ComObject Microsoft.Update.Session
$UpdateServiceManager  = New-Object -ComObject Microsoft.Update.ServiceManager 

$UpdateService = $UpdateServiceManager.AddScanPackageService("Offline Sync Service", $CabPath, 1) 

Write-Verbose "Creating Windows Update Searcher"
$UpdateSearcher = $UpdateSession.CreateUpdateSearcher()  
$UpdateSearcher.ServerSelection = 3
$UpdateSearcher.ServiceID = $UpdateService.ServiceID.ToString()
 
Write-Verbose "Searching for missing updates"
$SearchResult = $UpdateSearcher.Search("IsInstalled=0")

$Updates = $SearchResult.Updates

$UpdateSummary = [PSCustomObject]@{

    ComputerName = $env:COMPUTERNAME    
    MissingUpdatesCount = $Updates.Count
    Vulnerabilities = $Updates | Foreach {
        $_.CveIDs
    }
    MissingUpdates = $Updates | Select Title, MsrcSeverity, @{Name="KBArticleIDs";Expression={$_.KBArticleIDs}}
}

Return $UpdateSummary

Run the script on one of the servers or computers to check for missing updates. To do this, copy the script to the machine and run it with the URL to the wsusscn2.cab file on the IIS server and the hash value from step two:

PS51> .\Get-MissingUpdates.ps1 -Wsusscn2Url "http://<servername>/wsusscn2.cab" -FileHash 31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05

If there are missing updates, you should see output similar to the following:

ComputerName   MissingUpdatesCount Vulnerabilities                    MissingUpdates
------------   ------------------- ---------------                    --------------
UNSECURESERVER                  14 {CVE-2006-4685, CVE-2006-4686,     {@{Title=MSXML 6.0 RTM Security Updat...
                                   CVE-2019-1079, CVE-2019-1079...}

If the machine is not missing updates, then you should see this type of output:

ComputerName MissingUpdatesCount Vulnerabilities MissingUpdates
------------ ------------------- --------------- --------------
SECURESERVER                   0

The script gives a summary of the number of missing updates, what those updates are and the vulnerabilities they patch.
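If you want to dig into the details rather than read the console output, you can capture the returned object. This is a small example built on the $UpdateSummary object the script returns, again with <servername> standing in for your IIS server.

# Capture the summary object and list the missing updates sorted by severity
$Summary = .\Get-MissingUpdates.ps1 -Wsusscn2Url "http://<servername>/wsusscn2.cab" -FileHash 31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05
$Summary.MissingUpdates | Sort-Object MsrcSeverity | Format-Table Title, MsrcSeverity, KBArticleIDs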

This process is a great deal faster than searching for missing updates online. But this manual method is not efficient when checking a fleet of servers, so let’s learn how to run the script on all systems and collect the output.

Step 4. Run the scanning script on multiple servers at once

The easiest way to collect missing updates from all servers with PowerShell is with a PowerShell job. The PowerShell jobs run in parallel on all computers, and you can fetch the results.

On a PC or server, save the file from the previous step to the C drive — or another directory of your choice — and run the following as a user with admin permissions on your systems:

# The servers you want to collect missing updates from
$Computers = @(
        'server1',
        'server2',
        'server3'
)

# These are the arguments that will be sent to the remote servers
$RemoteArgs = @(
    # File hash from step 2
    "31997CD01B8790CA68A02F3A351F812A38639FA49FEC7346E28F7153A8ABBA05",
    "http://$env:COMPUTERNAME/wsusscn2.cab"
)

$Params = @{
    ComputerName = $Computers
    ArgumentList = $RemoteArgs
    AsJob        = $True
    # Filepath to the script on the server/computer you are running this command on
    FilePath = "C:ScriptsGet-MissingUpdates.ps1"
    # Maximum number of active jobs
    ThrottleLimit = 20
}

$Job = Invoke-Command @Params

# Wait for all jobs to finish
$Job | Wait-Job

# Collect Results from the jobs
$Results = $Job | Receive-Job

# Show results
$Results

This runs the Get-MissingUpdates.ps1 script on all servers in the $Computers variable in parallel to save time and make it easier to collect the results.
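If you want to keep the results for reporting, one option is to export a summary to CSV; the report path below is only an example.

# Export a per-server summary to CSV for later review (example path)
$Results |
    Select-Object ComputerName, MissingUpdatesCount |
    Export-Csv -Path "C:\Reports\MissingUpdates_$(Get-Date -Format yyyyMMdd).csv" -NoTypeInformation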

You should run these PowerShell jobs regularly to catch servers with a malfunctioning Windows Update and to be sure important updates get installed.
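One way to do that is a scheduled task that launches a small wrapper script containing the job code above; the wrapper path, task name and schedule here are assumptions to adjust for your environment.

# Register a weekly scheduled task that runs a wrapper script containing the
# Invoke-Command job code above (wrapper path and schedule are examples only)
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -File C:\Scripts\Invoke-UpdateScan.ps1"
$Trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Monday -At 6am
Register-ScheduledTask -TaskName "Weekly missing-update scan" -Action $Action -Trigger $Trigger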


Epicor ERP system focuses on distribution

Many ERP systems try to be all things to all use cases, but that often comes at the cost of heavy customization.

Some companies are discovering that a purpose-built ERP is a better and more cost-effective bet, particularly for small and midsize companies. One such product is the Epicor ERP system Prophet 21, which is primarily aimed at wholesale distributors.

The functionality in the Epicor ERP system is designed to help distributors run processes more efficiently and make better use of data flowing through the system.

In addition to distribution-focused functions, the Prophet 21 Epicor ERP system includes the ability to integrate value-added services, which could be valuable for distributors, said Mark Jensen, Epicor senior director of product management.

“A distributor can do manufacturing processes for their customers, or rentals, or field service and maintenance work. Those are three areas that we focused on with Prophet 21,” Jensen said.

Prophet 21’s functionality is particularly strong in managing inventory, including picking, packing and shipping goods, as well as receiving and put-away processes.

Specialized functions for distributors

Distribution companies that specialize in certain industries or products have different processes that Prophet 21 includes in its functions, Jensen said. For example, Prophet 21 has functionality designed specifically for tile and slab distributors.

“The ability to be able to work with the slab of granite or a slab of marble — what size it is, how much is left after it’s been cut, transporting that slab of granite or tile — is a very specific functionality, because you’re dealing with various sizes, colors, dimensions,” he said. “Being purpose-built gives [the Epicor ERP system] an advantage over competitors like Oracle, SAP, NetSuite, [which] either have to customize or rely on a third-party vendor to attach that kind of functionality.”

Jergens Industrial Supply, a wholesale supplies distributor based in Cleveland, has improved efficiency and is more responsive to shifting customer demands using Prophet 21, said Tony Filipovic, Jergens Industrial Supply (JIS) operations manager.


“We like Prophet 21 because it’s geared toward distribution and was the leading product for distribution,” Filipovic said. “We looked at other systems that say they do manufacturing and distribution, but I just don’t feel that that’s the case. Prophet 21 is something that’s been top of line for years for resources distribution needs.”

One of the key differentiators for JIS was Prophet 21’s inventory management functionality, which was useful because distributors manage inventory differently than manufacturers, Filipovic said.

“All that functionality within that was key, and everything is under one package,” he said. “So from the moment you are quoting or entering an order to purchasing the product, receiving it, billing it, shipping it and paying for it was all streamlined under one system.”

Another key new feature is an IoT-enabled button similar to Amazon Dash buttons that enables customers to resupply stocks remotely. This allows JIS to “stay ahead of the click” and offer customers lower cost and more efficient delivery, Filipovic said.

“Online platforms are becoming more and more prevalent in our industry,” he said. “The Dash button allows customers to find out where we can get into their process and make things easier. We’ve got the ordering at the point where customers realize that when they need to stock, all they do is press the button and it saves multiple hours and days.”

Epicor Prophet 21 a strong contender in purpose-built ERP

Epicor Prophet 21 is on solid ground with its purpose-built ERP focus, but companies have other options they can look at, said Cindy Jutras, president of Mint Jutras, an ERP research and advisory firm in Windham, NH.

“Epicor Prophet 21 is a strong contender from a feature and function standpoint. I’m a fan of solutions that go that last mile for industry-specific functionality, and there aren’t all that many for wholesale distribution,” Jutras said. “Infor is pretty strong, NetSuite plays here, and then there are a ton of little guys that aren’t as well-known.”

Prophet 21 may take advantage of new cloud capabilities to compete better in some global markets, said Predrag Jakovljevic, principal analyst at Technology Evaluation Centers, an enterprise computing analysis firm in Montreal.

“Of course a vertically-focused ERP is always advantageous, and Prophet 21 and Infor SX.e go head-to-head all the time in North America,” Jakovljevic said. “Prophet 21 is now getting cloud enabled and will be in Australia and the UK, where it might compete with NetSuite or Infor M3, which are global products.”


Cornell researchers call for AI transparency in automated hiring

Cornell University is becoming a hotbed of warning about automated hiring systems. In two separate papers, researchers have given the systems considerable scrutiny. Both papers cite problems with AI transparency, or the ability to explain how an AI system reaches a conclusion.

Vendors are selling automated hiring systems partly as a remedy to human bias. But they also argue they can speed up the hiring process and select applicants who will make good employees.

Manish Raghavan, a computer science doctoral student at Cornell who led the most recent study, questions vendors’ claims. If AI is doing a better job than hiring managers, “how do we know that’s the case or when will we know that that’s the case?” he said.

A major thrust of the research is the need for AI transparency. That’s not only needed for the buyers of automated hiring systems, but for job applicants as well.

At Cornell, Raghavan knows students who take AI-enabled tests as part of a job application. “One common complaint that I’ve heard is that it just viscerally feels upsetting to have to perform for a robot,” he said.


A job applicant may have to install an app to film a video interview, play a game that may measure cognitive ability or take a psychometric test that can be used to measure intelligence and personality.

“This sort of feels like they’re forcing you [the job applicant] to invest extra effort, but they’re actually investing less effort into you,” Raghavan said. Rejected applicants won’t know why they were rejected, the standards used to measure their performance, or how they can improve, he said.

Nascent research, lack of regulation

The paper, “Mitigating Bias in Algorithmic Employment Screening: Evaluating Claims and Practices,” is the work of a multidisciplinary team of computer scientists, as well as those with legal and sociological expertise. It argues that HR vendors are not providing insights into automated hiring systems.


The researchers looked at the public claims of nearly 20 vendors that sell these systems. Many are startups, although some have been around for more than a decade. They argue that vendors are taking nascent research and translating it into practice “at sort of breakneck pace,” Raghavan said. They’re able to do so because of a lack of regulation.

Vendors can produce data from automated hiring systems that shows how their systems perform in helping achieve diversity, Raghavan said. “Their diversity numbers are quite good,” but they can cherry-pick what data they release, he said. Nonetheless, “it also feels like there is some value being added here, and their clients seem fairly happy with the results.”

But there are two levels of transparency that Raghavan would like to see improve. First, he suggested vendors release internal studies that show the validity of their assessments. The data should include how often vendors are running into issues of disparate impact, which refers to a U.S. Equal Employment Opportunity Commission formula for determining if hiring is having a discriminatory impact on a protected group.

A second step for AI transparency involves having third-party independent researchers do some of their own analysis.

Vendors argue that AI systems do a better job than humans in reducing bias. But researchers see a risk that they could embed certain biases against a group of people that won’t be easily discovered unless there’s an understanding for how these systems work.

One problem often cited is that an AI-enabled system can help improve diversity but still discriminate against certain groups or people. New York University researchers recently noted that most of the AI code today is being written by young white males, who may encode their biases.

Ask about the ‘magic fairy dust’

Ben Eubanks, principal analyst at Lighthouse Research & Advisory, believes the Cornell paper should be on every HR manager’s reading list, “not necessarily because it should scare them away but because it should encourage them to ask more questions about the magic fairy dust behind some technology claims.”

“Hiring is and has always been full of bias,” said Eubanks, who studies AI use in HR. “Algorithms are subject to some of those same constraints, but they can also offer ways to mitigate some of the very real human bias in the process.”

But the motivation for employers may be different, Eubanks said.

“Employers adopting these technologies may be more concerned initially with the outcomes — be it faster hiring, cheaper hiring, or longer retention rates — than about the algorithm actually preventing or mitigating bias,” Eubanks said. That’s what HR managers will likely be rewarded on.

In a separate paper, Ifeoma Ajunwa, assistant professor of labor relations, law and history at Cornell University, argued for independent audits and compulsory data retention in her recently published “Automated Employment Discrimination.”

Ajunwa’s paper raises problems with automated hiring, including systems that “discreetly eliminate applicants from protected categories without retaining a record.” 

AI transparency adds confidence

Still, in an interview, Cornell’s Raghavan was even-handed about using AI and didn’t warn users away from automated hiring systems. He can see use cases but believes there is good reason for caution.

“I think what we can agree on is that the more transparency there is, the easier it will be for us to determine when is or is not the right time or the right place to be using these systems,” Raghavan said.

“A lot of these companies and vendors seem well-intentioned — they think what they’re doing is actually very good for the world,” he said. “It’s in their interest to have people be confident in their practices.”


Google-Ascension deal reveals murky side of sharing health data

One of the largest nonprofit health systems in the U.S. created headlines when it was revealed that it was sharing patient data with Google — under codename Project Nightingale.

Ascension, a Catholic health system based in St. Louis, partnered with Google to transition the health system’s infrastructure to the Google Cloud Platform, to use the Google G Suite productivity and collaboration tools, and to explore the tech giant’s artificial intelligence and machine learning applications. By doing so, it is giving Google access to patient data, which the search giant can use to inform its own products.

The partnership appears to be technically and legally sound, according to experts. After news broke, Ascension released a statement saying the partnership is HIPAA-compliant and a business associate agreement, a contract required by the federal government that spells out each party’s responsibility for protected health information, is in place. Yet reports from The Wall Street Journal and The Guardian about the possible improper transfer of 50 million patients’ data has resulted in an Office for Civil Rights inquiry into the Google-Ascension partnership.

Legality aside, the resounding reaction to the partnership speaks to a lack of transparency in healthcare. Organizations should see the response as both an example of what not to do, as well as a call to make patients more aware of how they’re using health data, especially as consumer companies known for collecting and using data for profit become their partners.

Partnership breeds legal, ethical concerns

Forrester Research senior analyst Jeff Becker said Google entered into a similar strategic partnership with Mayo Clinic in September, and the coverage was largely positive.


According to a Mayo Clinic news release, the nonprofit academic medical center based in Rochester, Minn., selected Google Cloud to be “the cornerstone of its digital transformation,” and the clinic would use “advanced cloud computing, data analytics, machine learning and artificial intelligence” to improve healthcare delivery.

But Ascension wasn’t as forthcoming with its Google partnership. It was Google that announced its work with Ascension during a quarterly earnings call in July, and Ascension didn’t issue a news release about the partnership until after the news broke.

“There should have been a public-facing announcement of the partnership,” Becker said. “This was a PR failure. Secrecy creates distrust.”

Matthew Fisher, partner at Mirick O’Connell Attorneys at Law and chairman of its health law group, said the outcry over the Google-Ascension partnership was surprising. For years, tech companies have been trying to get access to patient data to help healthcare organizations and, at the same time, develop or refine their existing products, he said.

“I get the sense that just because it was Google that was announced to have been a partner, that’s what drove a lot of the attention,” he said. “Everyone knows Google mostly for purposes outside of healthcare, which leads to the concern of does Google understand the regulatory obligations and restrictions that come to bear by entering the healthcare space?”

Ascension’s statement in response to the situation said the partnership with Google is covered by a business associate agreement — a distinction Fisher said is “absolutely required” before any protected health information can be shared with Google. Parties in a business associate agreement are obligated by federal regulation to comply with the applicable portions of HIPAA, such as its security and privacy rules.

A business associate relationship allows identifiable patient information to be shared and used by Google only under specified circumstances. It is the legal basis for keeping patient data segregated and restricting Google from freely using that data. According to Ascension, the health system’s clinical data is housed within an Ascension-owned virtual private space in Google Cloud, and Google isn’t allowed to use the data for marketing or research.

“Our data will always be separate from Google’s consumer data, and it will never be used by Google for purposes such as targeting consumers for advertising,” the statement said.


But health IT and information security expert Kate Borten believes business associate agreements and the HIPAA privacy rule they adhere to don’t go far enough to ensure patient privacy rights, especially when companies like Google get involved. The HIPAA privacy rule doesn’t require healthcare organizations to disclose to patients who they’re sharing patient data with.

“The privacy rule says as long as you have this business associate contract — and business associates are defined by HIPAA very broadly — then the healthcare provider organization or insurer doesn’t have to tell the plan members or the patients about all these business associates who now have access to your data,” she said.

Chilmark Research senior analyst Jody Ranck said much of the alarm over the Google-Ascension partnership may be misplaced, but it speaks to a growing concern about companies like Google entering healthcare.

Since the Office for Civil Rights is looking into the partnership, Ranck said there is still a question of whether the partnership fully complies with the law. But the bigger question has to do with privacy and security concerns around collecting and using patient data, as well as companies like Google using patient data to train AI algorithms and the potential biases it could create.


Ranck believes consumer trust in tech companies is declining, especially as data privacy concerns get more play.

“Now that they know everything you purchase and they can listen in to that Alexa sitting beside your bed at night, and now they’re going to get access to health data … what’s a consumer to do? Where’s their power to control their destiny when algorithms are being used to assign you as a high-, medium-, or low-risk individual, as creditworthy?” Ranck said. “All of this starts to feel like a bit of an algorithmic iron cage.”

A call for more transparency

Healthcare organizations and big tech partnerships with the likes of Google, Amazon, Apple and Microsoft are growing. Like other industries, healthcare organizations are looking to modernize their infrastructure and take advantage of state of the art storage, security, data analytics tools and emerging tech like artificial intelligence.

But for healthcare organizations, partnerships like these have an added complexity — truly sensitive data. Forrester’s Becker said the mistake in the Google-Ascension partnership was the lack of transparency. There was no press release early on announcing the partnership, laying out what information is being shared, how the information will be used, and what outcome improvements the healthcare organization hopes to achieve.

“There should also be assurance that the partnership falls within HIPAA and that data will not be used for advertising or other commercial activities unrelated to the healthcare ambitions stated,” he said.

Fisher believes the Google-Ascension partnership raises questions about what the legal, moral and ethical aspects of these relationships are. While Ascension and Google may have been legally in the right, Fisher believes it’s important to recognize that privacy expectations are shifting, which calls for better consumer education, as well as more transparency around where and how data is being used.

Although he believes it would be “unduly burdensome” to require a healthcare organization to name every organization it shares data with, Fisher said better education on how HIPAA operates and what it allows when it comes to data sharing, as well as explaining how patient data will be protected when shared with a company like Google, could go a long way in helping patients understand what’s happening with their data.

“If you’re going to be contracting with one of these big-name companies that everyone has generalized concerns about with how they utilize data, you need to be ahead of the game,” Fisher said. “Even if you’re doing everything right from a legal standpoint, there’s still going to be a PR side to it. That’s really the practical reality of doing business. You want to be taking as many measures as you can to avoid the public backlash and having to be on the defensive by having the relationship found out and reported upon or discussed without trying to drive that discussion.”


Redis Labs eases database management with RedisInsight

The robust market of tools to help users of the Redis database manage their systems just got a new entrant.

Redis Labs disclosed the availability of its RedisInsight tool, a graphical user interface (GUI) for database management and operations.

Redis is a popular open source NoSQL database that is also increasingly being used in cloud-native Kubernetes deployments as users move workloads to the cloud. Open source database use is growing quickly, according to recent reports, as flexible, open systems that can meet a range of needs have become a common requirement.

Among the challenges often associated with databases of any type is ease of management, which Redis is trying to address with RedisInsight.

“Database management will never go out of fashion,” said James Governor, analyst and co-founder at RedMonk. “Anyone running a Redis cluster is going to appreciate better memory and cluster management tools.”

Governor noted that Redis is following a tested approach, by building out more tools for users that improve management. Enterprises are willing to pay for better manageability, Governor noted, and RedisInsight aims to do that.

RedisInsight based on RDBTools

The RedisInsight tool, introduced Nov. 12, is based on the RDBTools technology that Redis Labs acquired in April 2019. RDBTools is an open source GUI for users to interact with and explore data stores in a Redis database.


Over the last seven months, Redis added more capabilities to the RDBTools GUI, expanding the product’s coverage for different applications, said Alvin Richards, chief product officer at Redis.

One of the core pieces of extensibility in Redis is the ability to introduce modules that contain new data structures or processing frameworks. So for example, a module could include time series, or graph data structures, Richards explained.

“What we have added to RedisInsight is the ability to visualize the data for those different data structures from the different modules,” he said. “So if you want to visualize the connections in your graph data for example, you can see that directly within the tool.”

RedisInsight overview dashboard

RDBTools is just one of many different third-party tools that exist for providing some form of management and data insight for Redis. There are some 30 other third-party GUI tools in the Redis ecosystem, though lack of maturity is a challenge.

“They tend to sort of come up quickly and get developed once and then are never maintained,” Richards said. “So, the key thing we wanted to do is ensure that not only is it current with the latest features, but we have the apparatus behind it to carry on maintaining it.”

How RedisInsight works

For users, getting started with the new tool is relatively straightforward. RedisInsight is a piece of software that needs to be downloaded and then connected to an existing Redis database. The tool ingests all the appropriate metadata and delivers the visual interface to users.

RedisInsight is available for Windows, macOS and Linux, and also available as a Docker container. Redis doesn’t have a RedisInsight as a Service offering yet.
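For those who choose the container route, a typical invocation looks like the sketch below; the image name, port and volume path are based on what Redis Labs documented when RedisInsight launched (redislabs/redisinsight on port 8001), so check the current documentation for the exact image and tag.

# Run RedisInsight as a container, then browse to http://localhost:8001
# (image name, port and volume path are assumptions from launch-era docs)
docker run -d --name redisinsight -p 8001:8001 -v redisinsight:/db redislabs/redisinsight:latest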

“We have considered having RedisInsight as a service and it’s something we’re still working on in the background, as we do see demand from our customers,” Richards said. “The challenge is always going to be making sure we have the ability to ensure that there is the right segmentation, security and authorization in place to put guarantees around the usage of data.”


InfoTrax settles FTC complaint, will implement infosec program

InfoTrax Systems this week settled with the Federal Trade Commission regarding allegations that the company failed to protect consumer data after a nearly two-year-long data breach.

The FTC filed a complaint against the Utah-based multi-level marketing software company in the wake of attackers stealing sensitive information on approximately one million customers over the course of more than 20 malicious infiltrations between May 2014 and March 2016.

InfoTrax only became aware of the attacks in March 2016 because a data archive file created by the malicious actors grew so large that servers reached maximum storage capacity. “Only then did Respondents begin to take steps to remove the intruder from InfoTrax’s network,” the FTC wrote in the complaint.

The FTC asserted that InfoTrax — in part — failed to implement a process to inventory and delete unnecessary customer information, failed to detect malicious file uploads and stored consumers’ personal information, including Social Security numbers, bank account information, payment card information and more, “in clear, readable text on InfoTrax’s network.”

The FTC added, “Respondents could have addressed each of the failures … by implementing readily available and relatively low-cost security measures.”

As a result of the settlement, InfoTrax is “prohibited from collecting, selling, sharing or storing personal information unless they implement an information security program that would address the security failures identified in the complaint. This includes assessing and documenting internal and external security risks; implementing safeguards to protect personal information from cybersecurity risks; and testing and monitoring the effectiveness of those safeguards,” the FTC wrote in a statement published Tuesday.

The company will also have to obtain assessments of its information security program from a third party, approved by the FTC, every two years.

InfoTrax responds

On Nov. 12, Scott Smith, president and newly appointed CEO of InfoTrax, released a statement that is no longer hosted at its original link on PR Newswire. A copy was published by The Herald Journal.

Smith claimed the company “took immediate action” to secure data and prevent further unauthorized access after discovering the breach. The company then contacted affected clients, law enforcement agencies, including the FBI, as well as “top forensic security experts to help us identify where our system was vulnerable and to take steps to improve our security and prevent further incidents like this.”

“Without agreeing with the FTC’s findings from their investigation, we have signed a consent order that outlines the security measures that we will maintain going forward, many of which were implemented before we received the FTC’s order,” Smith said. “We deeply regret that this security incident happened. Information security is critical and integral to our operations, and our clients’ and customers’ security and privacy is our top priority.”

In response to SearchSecurity’s request for the original statement, InfoTrax offered a slightly modified one from the CEO, which notably removed the part about not agreeing to the FTC’s findings:

“This incident happened nearly four years ago, at which time we took immediate steps to identify and remediate the issue. We notified our clients and worked closely with security experts and law enforcement. We deeply regret that the incident happened,” Smith said in the statement. “Even though the FTC has just now released their documents, this is an issue we responded to immediately and aggressively as soon as we became aware of it in 2016, and we have not experienced additional incidents since then. The privacy and security of our clients’ information continues to be our top priority today.”

Richard Newman, an FTC defense attorney at Hinch Newman LLP in New York, told SearchSecurity that his overall take on the case was that “The FTC’s enforcement of data security matters based upon alleged unreasonable data security practices is becoming an increasingly common occurrence. The Commission does so under various theories, including that such acts and practices are ‘unfair’ in violation of the FTC Act.”

He added that the stipulation that InfoTrax is prohibited from collecting, sharing, or selling user data until they fix their security issues is “not uncommon,” and that “stipulated settlement agreements in this area have recently undergone an overhaul based upon judicial developments and enforceability-related challenges. Terms such as mandated information security programs, security assessments, etc. are now commonplace in such settlements.”

Regarding whether or not the settlement is adequate, Adam Solander, a partner at King & Spalding LLP in Atlanta, told SearchSecurity, “It’s hard to judge without being involved intimately with the facts, but the FTC is an aggressive organization. They take privacy and security very seriously, and I think this is evidence of how aggressive they are in their enforcement of it.”


CIOs express hope, concern for proposed interoperability rule

While CIOs applaud the efforts by federal agencies to make healthcare systems more interoperable, they also have significant concerns about patient data security.

The Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare & Medicaid Services proposed rules earlier this year that would further define information blocking, or unreasonably stopping a patient’s information from being shared, as well as outline requirements for healthcare organizations to share data such as using FHIR-based APIs so patients can download healthcare data onto mobile healthcare apps.

The proposed rules are part of an ongoing interoperability effort mandated by the 21st Century Cures Act, a healthcare bill that provides funding to modernize the U.S. healthcare system. Final versions of the proposed information blocking and interoperability rules are on track to be released in November.

“We all now have to realize we’ve got to play in the sandbox fairly and maybe we can cut some of this medical cost through interoperability,” said Martha Sullivan, CIO at Harrison Memorial Hospital in Cynthiana, Ky.

CIOs’ take on proposed interoperability rule

To Sullivan, interoperability brings the focus back to the patient — a focus she thinks has been lost over the years.

She commended ONC’s efforts to make patient access to health information easier, yet she has concerns about data stored in mobile healthcare apps. Harrison’s system is API-capable, but Sullivan said the organization will not recommend APIs to patients for liability reasons.

Physicians and CIOs at EHR vendor Meditech’s 2019 Physician and CIO Forum in Foxborough, Mass. Helen Waters, Meditech executive vice president, spoke at the event.

“The security concerns me because patient data is really important, and the privacy of that data is critical,” she said.

Harrison may not be the only organization reluctant to promote APIs to patients. A study published in the Journal of the American Medical Association of 12 U.S. health systems that used APIs for at least nine months found “little effort by healthcare systems or health information technology vendors to market this new capability to patients” and went on to say “there are not clear incentives for patients to adopt it.”

Jim Green, CIO at Boone County Hospital in Iowa, said ONC’s efforts with the interoperability rule are well-intentioned but overlook a significant pain point: physician adoption. He said more efforts should be made to create “a product that’s usable for the pace of life that a physician has.”

The product also needs to keep pace with technology, something Green described as being a “constant battle.”


Interoperability is often temporary, he said. When a system gets upgraded or a new version of software is released, it can throw the system’s ability to share data with another system out of whack.

“To say at a point in time, ‘We’re interoperable with such-and-such a product,’ it’s a point in time,” he said.

Interoperability remains “critically important” for healthcare, said Jeannette Currie, CIO of Community Hospitals at Beth Israel Deaconess Medical Center in Boston. But so is patient data security. That’s one of her main concerns with ONC’s efforts and the interoperability rule, something physicians and industry experts also expressed during the comment period for the proposed rules.

“When I look at the fact that a patient can come in and say, ‘I need you to interact with my app,’ and when I look at the HIPAA requirements I’m still beholden to, there are some nuances there that make me really nervous as a CIO,” she said.


Navy sails SAP ERP systems to AWS GovCloud

The U.S. Navy has moved several SAP and other ERP systems from on premises to AWS GovCloud, a public cloud service designed to meet the regulatory and compliance requirements of U.S. government agencies.

The project entailed migrating 26 ERPs across 15 landscapes that were set up for around 60,000 users across the globe. The Navy tapped SAP National Security Services Inc. (NS2) for the migration. NS2 was spun out of SAP specifically to sell SAP systems that adhere to the highly regulated conditions that U.S. government agencies operate under.

Approximately half of the systems that moved to AWS GovCloud were SAP ERP systems running on Oracle databases, according to Harish Luthra, president of NS2 secure cloud business. SAP systems were also migrated to the SAP HANA database, while non-SAP systems remain on their respective databases.

Architecture simplification and reducing TCO

The Navy wanted to move the ERP systems to take advantage of the new technologies that are more suited for cloud deployments, as well as to simplify the underlying ERP architecture and to reduce the total cost of ownership (TCO), Luthra said.

The migration enabled the Navy to reduce its data footprint from 80 TB to 28 TB.


“Part of it was done through archiving, part was disk compression, so the cost of even the data itself is reducing quite a bit,” Luthra said. “On the AWS GovCloud side, we’re using one of the largest instances — 12 terabytes — and will be moving to a 24 terabyte instance working with AWS.”

The Navy also added applications to consolidate financial systems and improve data management and analytics functionality.

“We added one application called the Universe of Transactions, based on SAP Analytics that allows the Navy to provide a consolidated financial statement between Navy ERP and their other ERPs,” Luthra said. “This is all new and didn’t exist before on-premises and was only possible to add because we now have HANA, which enables a very fast processing of analytics. It’s a giant amount of transactions that we are able to crunch and produce a consolidated ledger.”


Accelerated timeline

The project was done at an accelerated pace that had to be sped up even more when the Navy altered its requirements, according to Joe Gioffre, SAP NS2 project principal consultant. The original go-live date was scheduled for May 2020, almost two years to the day when the project began. However, when the Navy tried to move a command working capital fund onto the on-premises ERP system, it discovered the system could not handle the additional data volume and workload.

This drove the HANA cloud migration go-live date to August 2019 to meet the fiscal new year start of Oct. 1, 2019, so the fund could be included.

“We went into a re-planning effort, drew up a new milestone plan, set up Navy staffing and NS2 staffing to the new plan so that we could hit all of the dates one by one and get to August 2019,” Gioffre said. “That was a colossal effort in re-planning and re-resourcing for both us and the Navy, and then tracking it to make sure we stayed on target with each date in that plan.”


Governance keeps project on track

Tight governance over the project was the key to completing it in the accelerated timeframe.

“We had a very detailed project plan with a lot of moving parts and we tracked everything in that project plan. If something started to fall behind, we identified it early and created a mitigation for it,” Gioffre explained. “If you have a plan that tracks to this level of detail and you fall behind, unless you have the right level of governance, you can’t execute mitigation quickly enough.”

The consolidation of the various ERPs onto one SAP HANA system was a main goal of the initiative, and it now sets up the Navy to take advantage of next-generation technology.

“The next step is planning a move to SAP S/4HANA and gaining process improvements as we go to that system,” he said.

Proving confidence in the public cloud

It’s not a particular revelation that public cloud hyperscaler storage providers like AWS GovCloud can handle huge government workloads, but it is notable that the Department of Defense is confident in going to the cloud, according to analyst Joshua Greenbaum, principal at Enterprise Applications Consulting, a firm based in Berkeley, Calif.

“The glitches that happened with Amazon recently and [the breach of customer data from Capital One] highlight the fact that we have a long way to go across the board in perfecting the cloud model,” Greenbaum said. “But I think that SAP and its competitors have really proven that stuff does work on AWS, Azure and, to a lesser extent, Google Cloud Platform. They have really settled in as legitimate strategic platforms and are now just getting the bugs out of the system.”

Greenbaum is skeptical that the project was “easy,” but it would be quite an accomplishment if it was done relatively painlessly.

“Every time you tell me it was easy and simple and painless, I think that you’re not telling me the whole story because it’s always going to be hard,” he said. “And these are government systems, so they’re not trivial and simple stuff. But this may show us that if the will is there and the technology is there, you can do it. It’s not as hard as landing on the moon, but you’re still entering orbital space when you are going to these cloud implementations, so it’s always going to be hard.”


Stibo Systems advances multidomain MDM system

Stibo Systems is helping to advance the market for Master Data Management (MDM) with its latest release. The Stibo Systems 9.2 release of its multidomain MDM solution provides users with new features to manage, organize and make sense of data.

Stibo Systems got its start four years ago and is a division of Denmark-based Stibo A/S, an IT and print technology multinational that was founded in 1794 as a printing company. As part of the 9.2 update, the multidomain MDM system gains enhanced machine learning capabilities to help manage data across multiple data domains.

The update, which became generally available Sept. 4, also includes a bundled integration with the Sisense BI-analytics platform for executing data analytics on the multidomain MDM.

Though MDM is not a term that is heard as often in recent years as big data, Forrester vice president and research director Gene Leganza said MDM is as relevant now as it ever was, despite the significant changes that have come about in the big data era.

“For one thing, the years of stories of firms doing innovative things with data and analytics have gotten business leaders’ attention and anyone who was unaware of the value hidden in their data assets has gotten the message that they cannot afford to leave that value unmined,” Leganza said. “For these data management laggards — and there are a lot of them — newfound enthusiasm to improve their data capabilities usually means getting started with data governance and MDM.”

Simply collecting data, though, isn’t enough. Leganza said all data analysis is a “garbage-in-garbage-out” proposition, and the reliability and trustworthiness of data has never been more important as organizations work harder to evolve into data- and insights-driven cultures. Keeping data clean and usable is where multidomain MDM plays a key role.


Looking at Stibo Systems, Leganza said that in the last few years, the vendor has significantly bolstered its general MDM capabilities and Forrester included them in the Q1 2019 Forrester Wave evaluation of MDM systems, in which Stibo was ranked a “contender.” He noted that the evaluation did not include the features in the new 9.2 release, and adding machine learning to improve data quality and governance is something Forrester had noted customers were asking for.

“This new release strengthens both their product domain dominance as well as their general MDM capabilities, which should serve them well in the marketplace,” Leganza said.

Multidomain MDM

The MDM system is a purpose-built platform for mastering data and the various domains that go into that, whether that be product domain, customer information, supplier details or vendor locations, said Doug Kimball, vice-president of global solution strategy at Stibo Systems.

Kimball said that with a multidomain MDM, Stibo Systems customers can connect data across different pieces of their domain. For example, a company could map customers to products and know where those products are by location, he said.

A sizeable amount of what goes into multidomain MDM is data governance, enabling data traceability, as well as compliance with regulations. The Stibo Systems platform brings data in from wherever a company has it, be it a database, data lake, ERP system or otherwise, Kimball said.

“We do the de-duplication, the matching of records, the address verification and all the things that make the data good and usable,” he said.

9.2 enhancements 

Among the changes in the 9.2 release, Kimball noted, is that the Cassandra database is now a database option, providing an alternative to just running an Oracle database.

For the product master data management component, Stibo Systems now has a partnership with Sisense to deliver embedded analytics. It’s now possible to create data visualizations and actionable insights that are effectively embedded in the user experience, Kimball said.

Also in the new release is an application called Smartsheet that can help to bridge the gap between multidomain MDM and a simple Excel spreadsheet.

Kimball said Stibo Systems is working on a new user experience interface that is intended to make it easier for users to navigate the multidomain MDM. The vendor is also working on MDM on the edge.

“We’re looking at the fact that you’ve got all these devices out there: smart watches, refrigerators, beacons, creating all this additional data that needs to be mastered,” Kimball said. “The data is on the edge, instead of being in traditional data stores.”
