What’s new with Seeing AI

Saqib Shaikh holds his camera phone in front of his face with Seeing AI open on the screen

By Saqib Shaikh, Software Engineering Manager and Project Lead for Seeing AI

Seeing AI gives people who are blind or have low vision an easier way to understand the world around them through the cameras on their smartphones. Whether in a room, on a street, in a mall or an office, people are using the app to independently accomplish daily tasks like never before. Seeing AI helps users read printed text in books, restaurant menus, street signs and handwritten notes, as well as identify banknotes and products via their barcodes. Leveraging on-device facial-recognition technology, the app can even describe the physical appearance of people and predict their mood.

Today, we are announcing new Seeing AI features for the enthusiastic community of users who share their experiences with the app, recommend new capabilities and suggest improvements to its functionality. Inspired by this rich feedback, here are the updates rolling out to Seeing AI to enhance the user experience:

  • Explore photos by touch: Leveraging technology from Azure Cognitive Services, including Custom Vision Service in tandem with the Computer Vision API, this new feature enables users to tap an image on a touchscreen to hear a description of the objects within it and the spatial relationships between them. Users can explore photos of their surroundings taken on the Scene channel, family photos stored in their photo browser, and even images shared on social media by summoning the options menu while in other apps.
  • Native iPad support: For the first time we’re releasing iPad support, to provide a better Seeing AI experience that accounts for the larger display requirements. iPad support is particularly important to individuals using Seeing AI in academic or other professional settings where they are unable to use a cellular device.
  • Channel improvements: Users can now customize the order in which channels are shown, enabling easier access to favorite features. We’ve also made it easier to access the face recognition function while on the Person channel, by relocating the feature directly on the main screen. Additionally, when analyzing photos from other apps, the app will now provide audio cues that indicate Seeing AI is processing the image.

Since the app’s launch in 2017, Seeing AI has leveraged AI technology and inclusive design to help people with more than 10 million tasks. If you haven’t tried Seeing AI yet, download it for free on the App Store. If you have, please share your thoughts, feedback or questions with us at [email protected], or through the Disability Answer Desk and Accessibility User Voice Forum.

Go to Original Article
Author: Steve Clarke

How to implement a winning interoperability testing strategy

With the increased popularity of APIs as the go-to means to integrate multiple applications, exchange data, and perform shared calculations and functions, organizations can’t afford to ignore interoperability testing.

Interoperability testing verifies that components within the application, server and database work together and deliver the expected results. It’s not sufficient to only test components or applications; you must test all the components with which they interact.

As a solid first step, a development team can create mock systems that simulate its partners' components. But simulations don't replace interoperability tests, which should cover as many connection points and functions as possible across all partners.

Interoperability testing is challenging, which is why software development teams attempt to get around it. For example, in a partnership, the development team at Company A won't have its code ready until right before the expected release date, while Company B wants to test its interoperable code thoroughly before release. So, Company B's developers create mock code that simulates the behavior of Company A's expected code. That simulation, while imperfect, helps both teams avoid logistical challenges.
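As a rough sketch of the mock-first approach (in Python, with purely illustrative names; this is not code from either company), the key idea is that Company B writes its logic against an interface that a stand-in for Company A's service can implement until the real code arrives:

```python
class PartnerClient:
    """Interface both the real partner client and the mock implement."""
    def submit_order(self, order):
        raise NotImplementedError


class MockPartnerClient(PartnerClient):
    """Simulates Company A's not-yet-delivered service for early testing."""
    def __init__(self):
        self.received = []

    def submit_order(self, order):
        # Record what was sent, then mirror the acknowledgement format
        # the partner has agreed to return.
        self.received.append(order)
        return {"status": "accepted", "order_id": order["id"]}


def process_order(client, order):
    """Company B's code under test; it depends only on the interface."""
    ack = client.submit_order(order)
    return ack["status"] == "accepted"
```

When Company A's real client ships, it implements the same interface, and the mock is retired; the interoperability tests then rerun against the genuine endpoint.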

Despite the difficulties, interoperability testing is the only sure way to confirm that components function and communicate as expected, that security practices are up to snuff, and that messages get through and reach the correct place.

Barriers to efficient interoperability testing are surmountable, with effort from the development team.

Compatible, secure communication

Security is often the first failure point in interoperability test execution. Get all parties on the same page to facilitate a secure handshake for every connection, response, message and data byte transferred or shared.

By design, the system components must communicate with each other securely. Work with all partners or vendors to establish a standard security exchange protocol, such as HTTPS, Lightweight Directory Access Protocol or Security Assertion Markup Language; otherwise, systems cannot communicate. The aforementioned handshakes must always occur. Verify that all providers use the same security best practices and re-create this setup in your testing environment.
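One low-level check worth automating is that test clients enforce the same certificate and hostname verification expected in production, rather than silently downgrading. A minimal sketch using Python's standard library (any endpoint names are up to your environment):

```python
import ssl

# Build the same TLS settings a production HTTPS client should use.
ctx = ssl.create_default_context()

# A properly configured context refuses unverified peers. If a test
# environment needs these relaxed, record that as a finding, don't hide it.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate chain is checked
print(ctx.check_hostname)                    # hostname must match the cert
```

Re-creating the production handshake settings in the test environment surfaces certificate and protocol mismatches before partners connect for real.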

Data transfer and management

Data exchange is the most common form of communication between systems. Data must end up in the correct location, and it must be 100% complete. Data loss is unacceptable.

For interoperability testing, make sure you can exchange realistic data types. You can mock or simulate the data, such as healthcare lab results, but the data type, such as a PDF, must be real. Otherwise, you will run into data compatibility issues and application functionality failures in live operations.
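To illustrate the "mock the content, keep the type real" point (a Python sketch; the byte pattern is the standard PDF header marker, everything else is illustrative), a test fixture can fabricate the contents of a lab report while still producing a payload that downstream file-type checks will accept:

```python
def looks_like_pdf(data: bytes) -> bool:
    """Cheap content-type check: real PDF files start with the %PDF- marker."""
    return data.startswith(b"%PDF-")


# Mocked *content*, real *type*: the payload is fake lab data, but the
# framing matches what a genuine PDF exporter would emit.
mock_report = b"%PDF-1.4\n% fake lab result for testing\n%%EOF\n"

assert looks_like_pdf(mock_report)
assert not looks_like_pdf(b"plain text masquerading as a report")
```

A partner system that validates incoming attachments by type will reject the second payload but accept the first, which is exactly the behavior the interoperability test needs to exercise.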

Network complexity and connection

Multiple network components, such as partner vendors, data exchanges or connection endpoints, complicate the system. These systems are like a house of cards, in which every card relies on another to support the structure; one wrong move, and the entire system goes down.

Set up all the possible connections you can, and test against them as soon as possible. You might not duplicate 100% of the system complexity of the production deployment, but get as close as is feasible. The more you test upfront, the better you can address variations in system performance with real clients and customers.

Timing with partners

When working with partner vendors, it's a big challenge to time the development effort so that every organization's contribution is done and ready for testing on the same schedule. So, plan what you will do if a partner falls behind schedule. For one, you can set up a QA environment with all the pieces in place, mocked or simulated. Then, fill in the missing pieces as development winds down and unit tests have been executed on the integrated code.

You can also save resources by using an integration platform that provides the back end and connectivity, customizing code only for what you need beyond the tool's native features, and simulating tests once partners' code is integrated into the branch and passes integrated unit tests. Companies such as InterSystems, IBM and Oracle provide the pieces to create an interoperable system, and you put them together in whatever way works for your application, network and database.

A place to test

An organization can find many ways to skimp on its testing budget, but you simply cannot skip building a completely interoperable testing system. It doesn’t necessarily need to be state-of-the-art, but the system must represent the majority of your anticipated user base, which includes partner organizations, their endpoints, shared databases and application end users.

Decide how you will perform the test. You and your partners can meet up and hold an interoperability test marathon. Alternatively, you each can work separately with each other’s test cases — or even stick with your own test cases and improvise the rest.

Interoperability testing marathons, also called testathons, provide quick access to each partner's developer and QA resources. These events are a popular way to get IT staff to work together and test, fix and retest until each partner gets positive results. Organizations can hold testathons in a shared physical space, within an online group chat, or a mix of both. With few partners, a shared phone call might suffice. But, generally, it's faster and more effective to test together.


For Sale – HP Desktops and Acer Monitors for Sale

8 Units- HP Desktop Computer Elite 8000 Core 2 Duo E8400 (3.00 GHz) 4 GB DDR3 160 GB HDD Windows 7 Professional 64-Bit

1 Unit- ASUS VS207T-P Black 19.5″ 5ms Widescreen LED Backlight LCD Monitor

7 Units- Acer K2 K202HQL Abd (UM.IX3AA.A04) Black 19.5″ Widescreen LED Backlight Monitors – LCD Flat Panel

All were purchased refurbished and were never used thereafter. All computers come with keyboard and mouse.

Whole Lot Available for $1200, pm if interested.

– Andrew

Price and currency: 1200
Delivery: Goods must be exchanged in person
Payment method: venmo, cash
Location: brooklyn, ny
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected



How to Build a Hyper-V Performance Monitoring Tool with PowerShell

#requires -version 5.0
#requires -runasAdministrator
#requires -module Hyper-V

#use a max of 3 VMs in the console. You may also need to resize the console window.
#use a max of 4 VMs in the ISE.

Function Show-VMMemoryPressure {
    [cmdletbinding(DefaultParameterSetName = "interval")]
    Param(
        [Parameter(Position = 0, Mandatory, HelpMessage = "Enter the name of a virtual machine.")]
        [string[]]$VMName,

        [Parameter(HelpMessage = "The name of the Hyper-V Host")]
        [Alias("CN", "vmhost")]
        [string]$Computername = $env:computername,

        [Parameter(HelpMessage = "The sample interval in seconds")]
        [int32]$Interval = 5,

        [Parameter(HelpMessage = "The maximum number of samples", ParameterSetName = "interval")]
        [ValidateScript( {$_ -gt 0})]
        [int32]$MaxSamples = 2,

        [Parameter(HelpMessage = "Take continuous measurements.", ParameterSetName = "continuous")]
        [switch]$Continuous,

        [switch]$ClearHost
    )

    DynamicParam {
        if ($host.name -eq 'ConsoleHost') {
            #define a collection for attributes
            $attributes = New-Object System.Management.Automation.ParameterAttribute
            $attributes.Mandatory = $False
            $attributes.HelpMessage = "Enter a console color"
            $attributeCollection = New-Object -Type System.Collections.ObjectModel.Collection[System.Attribute]
            $attributeCollection.Add($attributes)

            #define the first dynamic param
            $dynParam1 = New-Object -Type System.Management.Automation.RuntimeDefinedParameter("ProgressBackground", [system.consolecolor], $attributeCollection)
            $dynParam1.Value = $host.PrivateData.ProgressBackgroundColor

            #define the second dynamic param
            $dynParam2 = New-Object -Type System.Management.Automation.RuntimeDefinedParameter("ProgressForeground", [system.consolecolor], $attributeCollection)
            $dynParam2.Value = $host.PrivateData.ProgressForegroundColor

            #create a dictionary of dynamic parameters
            $paramDictionary = New-Object -Type System.Management.Automation.RuntimeDefinedParameterDictionary
            $paramDictionary.Add("ProgressBackground", $dynParam1)
            $paramDictionary.Add("ProgressForeground", $dynParam2)

            #use the dictionary
            return $paramDictionary
        } #if consolehost
    } #close DynamicParam

    Begin {
        $counters = @()
        $vmhash = @{}
        $j = 0

        if ($ClearHost) {
            Clear-Host
        }

        #limit the number of VMs based on the host
        if ($host.name -eq 'ConsoleHost') {
            $max = 3
            if ($PSBoundParameters.ContainsKey("ProgressBackground")) {
                $savedbg = $host.PrivateData.ProgressBackgroundColor
                $host.PrivateData.ProgressBackgroundColor = $PSBoundParameters.ProgressBackground
                Write-Verbose "Using progress background color $($PSBoundParameters.ProgressBackground)"
            }
            if ($PSBoundParameters.ContainsKey("ProgressForeground")) {
                $savedfg = $host.PrivateData.ProgressForegroundColor
                $host.PrivateData.ProgressForegroundColor = $PSBoundParameters.ProgressForeground
                Write-Verbose "Using progress foreground color $($PSBoundParameters.ProgressForeground)"
            }
        }
        elseif ($host.name -match "ISE") {
            $max = 4
        }
        else {
            #fallback for other hosts
            $max = 3
        }
    } #begin

    Process {
        foreach ($item in $VMName[0..($max - 1)]) {
            Try {
                Write-Verbose "Verifying $item on $Computername"
                $vm = Get-VM -ComputerName $computername -Name $item -ErrorAction stop
                if ($vm.state -ne 'running') {
                    $msg = "The VM {0} on {1} is not running. Its current state is {2}." -f $item.ToUpper(), $Computername, $vm.state
                    Write-Warning $msg
                }
                else {
                    Write-Verbose "Adding VM data"
                    $counters += "\Hyper-V Dynamic Memory VM($($vm.vmname))\Average Pressure"
                    #create a hash with VM names and a number for their progress id
                    $j++
                    $vmhash.Add($vm.vmname, $j)
                }
            } #Try
            Catch {
                Write-Warning $_.exception.message
            } #Catch
        } #foreach item

        if ($counters.count -gt 0) {
            $counterparams = @{
                Counter          = $counters
                ComputerName     = $Computername
                SampleInterval   = $Interval
                PipelineVariable = "pv"
            }
            if ($Continuous) {
                $counterparams.Add("Continuous", $True)
            }
            else {
                $counterparams.Add("MaxSamples", $MaxSamples)
            }
            Write-Verbose "Getting counter data"
            $counterparams | Out-String | Write-Verbose

            Get-Counter @counterparams | ForEach-Object {
                $_.countersamples | Sort-Object -Property InstanceName |
                    Group-Object -Property InstanceName | ForEach-Object {
                        #scale values over 100
                        $pct = ($_.group.cookedvalue) * .8
                        #if the scaled value is over 100, max out the percentage
                        if ($pct -gt 100) {
                            $pct = 100
                        }
                        $progparams = @{
                            Activity         = $_.Name.ToUpper()
                            Status           = "$($pv.Timestamp)"
                            CurrentOperation = "Average Pressure: $($_.group.cookedvalue)"
                            PercentComplete  = $pct
                            Id               = $vmhash[$_.Name]
                        }
                        Write-Progress @progparams
                    } #foreach group
            } #foreach counter sample
        } #if counters
    } #process

    End {
        #restore the saved private data values
        if ($savedbg) {
            $host.PrivateData.ProgressBackgroundColor = $savedbg
        }
        if ($savedfg) {
            $host.PrivateData.ProgressForegroundColor = $savedfg
        }
    } #end
} #close function

Author: Jeffery Hicks

Announcing Windows 10 Insider Preview Build 18855 | Windows Experience Blog

Hello Windows Insiders, today we are releasing 20H1 Build 18855 to Windows Insiders who have opted into Skip Ahead. Remember: these builds are from the 20H1 development branch, and some things we are working on in 20H1 require a longer lead time. We will begin releasing 19H2 bits to Insiders later this spring, once 19H1 is nearly finished and ready; at that point, we'll also use the Release Preview ring for previews of drivers and quality updates on 19H1.
IMPORTANT: As is normal with builds early in the development cycle, these builds may contain bugs that might be painful for some. If you take this flight, you won’t be able to switch back to the Fast or Slow rings without doing a clean-install on your PC and starting over.
If you are looking for a complete look at what build is in which Insider ring – head on over to Flight Hub. You can also check out the rest of our documentation here including a complete list of new features and updates that have gone out as part of Insider flights for the current development cycle.

  • Notepad now automatically restores unsaved content when Windows restarts for updates.
  • We've enabled the microphone in Windows Sandbox, which among other things will improve several accessibility scenarios.
  • We've added functionality to configure the audio input device via the Windows Sandbox config file.
  • We've fixed an issue in which the Windows Sandbox time zone was not synchronized with the host.
  • We've enabled the Shift + Alt + PrintScreen key sequence in Windows Sandbox, which activates the Ease of Access dialog for enabling high-contrast mode.
  • We've enabled the Ctrl + Alt + Break key sequence in Windows Sandbox, which allows entering and exiting full-screen mode.
  • We fixed a recent issue resulting in some Insiders experiencing bug checks upon lid close, monitor plug, or monitor unplug.
  • We fixed an issue resulting in preferred region settings getting reset on upgrade in the last few flights.
  • We fixed an issue resulting in the Chinese version of multiple games not working.
  • We fixed an issue in memcpy that caused some drivers to hard-hang the system on load; this could manifest as a hang on upgrade, depending on the system.
  • We fixed an issue from recent builds that could result in monitors being missing from the built-in Color Management application.
  • We fixed an issue causing Explorer.exe to crash for some Insiders when Jump List content was updated.
  • We fixed an issue where text scaling values did not persist across upgrades for Win32 applications.
  • Due to a Narrator reading reliability issue with the "Change how capitalized text is read" feature, the feature has been disabled starting in Build 18855.
  • We fixed an issue that could cause the touch keyboard to crash when switching from IME-based languages to another language.
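For reference, the Windows Sandbox configuration file mentioned in the changes above is a small XML document saved with a .wsb extension and opened to launch a configured sandbox. A minimal sketch that enables audio input might look like the following; the AudioInput element name is taken from Microsoft's later public Windows Sandbox documentation, so treat the exact schema for this preview build as an assumption:

```xml
<Configuration>
  <!-- Forward the host's microphone into the sandbox -->
  <AudioInput>Enable</AudioInput>
</Configuration>
```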

  • Launching games that use anti-cheat software may trigger a bugcheck (GSOD).
  • While this flight contains some night light improvements, we're continuing to investigate reported issues in this space.
  • When performing Reset this PC and selecting Keep my files on a device that has Reserved Storage enabled, the user will need to initiate an extra reboot to ensure Reserved Storage is working properly again.
  • Some Realtek SD card readers are not functioning properly. We are investigating the issue.
  • Creative X-Fi sound cards are not functioning properly. We are partnering with Creative to resolve this issue.
  • We're investigating an issue preventing VMware from being able to install or update Windows Insider Preview builds. Hyper-V is a viable alternative if available to you.

If you install any of the recent builds from the Skip Ahead and switch to either the Fast ring or the Slow ring, optional content such as enabling developer mode will fail. You will have to remain in the Fast ring to add/install/enable optional content. This is because optional content will only install on builds approved for specific rings.

With Bing Pages, we will build your social media brand like we did for this celeb. You can promote your own posts on Bing.com, and these get promoted on relevant Bing queries. Try it out and tell us what you think. Your support will help us identify and smooth out the kinks before we launch externally!
If you want to be among the first to learn about these Bing features, join our Bing Insider Program.
No downtime for Hustle-As-A-Service, Dona

Use PowerShell Docker to manage Windows container components

Container management can be a daunting task, but if you’re familiar with PowerShell, the learning curve might not be as steep. You can install the PowerShell Docker module to manage Windows container components.

You probably already know that you can manage almost all Windows OS features and roles using PowerShell. Microsoft developers designed PowerShell cmdlets to specifically manage Hyper-V and the VMs running on it. However, the underlying architecture of Docker differs from Hyper-V, so you can’t apply those same cmdlets to containers. The Docker command-line interface (CLI) is the primary method for managing Docker components, but you can also use PowerShell for Docker. PowerShell for Docker uses the Docker REST API to connect to Docker Engine.

Benefits of using PowerShell Docker over Docker CLI

There are a few reasons why you might use PowerShell cmdlets instead of the Docker CLI. First, Docker CLI syntax can be complex; for example, to create a network stack with the Docker CLI, you must supply several parameters and make sure the complete command is lowercase. Second, PowerShell offers greater flexibility and helps simplify command use; to get help, simply run Get-Help against any cmdlet in the PowerShell Docker module. Finally, PowerShell is a long-established technology, so you most likely have more experience with it than with the Docker CLI.

Install the PowerShell for Docker module

To install PowerShell for Docker, you must use the NuGet package manager. Execute the PowerShell command below on a Windows host machine running Windows Server 2016 to install the NuGet package:

Install-PackageProvider -Name NuGet -Force

Once the above command processes, the NuGet package installs on the machine, as shown in Figure A below:

Install NuGet
Figure A. Download NuGet to install the PowerShell for Docker module

The next step is to register the PowerShell for Docker repository using the Register-PSRepository PowerShell cmdlet as shown in the command below:

Register-PSRepository -Name DockerPS-Dev -SourceLocation https://ci.appveyor.com/nuget/docker-powershell-dev

To make sure the PowerShell Docker repository registered successfully, execute the following PowerShell command:

Get-PSRepository -Name DockerPS-Dev

After registering the repository, install the PowerShell for Docker module with the following command:

Install-Module -Name Docker -Repository DockerPS-Dev

If you only need to install the PowerShell Docker module for the current user, add the -Scope CurrentUser parameter as shown below:

Install-Module -Name Docker -Repository DockerPS-Dev -Scope CurrentUser

Validate the module installation

To validate the PowerShell Docker installation, execute the Get-InstalledModule -Name Docker command. You should see the version information for the installed module, as shown in Figure B below:

Validate module installation
Figure B. Validate the module installation using PowerShell

Before you use the module, import it into the current PowerShell session with the Import-Module -Name Docker command. To validate that it loaded successfully, run the Get-Module -Name Docker command.

Get existing containers and create new containers

Now, you can use PowerShell Docker cmdlets to work with containers. To list all the containers available, execute the Get-Container PowerShell cmdlet, as shown in Figure C below:

Use Get-Container cmdlet
Figure C. Use the Get-Container PowerShell cmdlet to obtain a list of existing containers

To create a container, use the New-Container PowerShell cmdlet. An example PowerShell script appears below, for testing purposes:

$isolation = [Docker.PowerShell.Objects.IsolationType]::HyperV
$container = New-Container -Id "$ImageName" -Isolation $isolation -Command @("cmd", "/c", "echo Worked")
$container | Start-Container
$container | Wait-Container

To get a list of all the PowerShell cmdlets available to work with containers, execute the Get-Command -Module Docker command in a PowerShell window.

Go to Original Article

10 of the latest Microsoft Teams integrations to help you work smarter, not harder – Microsoft 365 Blog

We built Microsoft Teams as a platform to bring together all of your workplace tools, apps, and services—whether or not we built them—to allow you to deliver better workday flow for you and your employees. A lot of you recognize the power of Teams, and you’ve been asking how to use Teams to its full advantage. Look no further. Today, we’re sharing ten of the latest Teams integrations you can use every day to simplify workflows, refocus your attention, and get back to working smarter—not harder. This is something our CEO, Satya Nadella, recently addressed in his interview on the future of communication at work with the Wall Street Journal.

Ten of the latest integrations to try in Teams

These ten integrations bring everything from customer feedback and employee polls, to workflow and project management, into Teams, to make your apps work for you.

  1. Funnel customer feedback straight into Teams: Twitter
    As one of the largest social media platforms around, Twitter mostly needs no introduction. However, did you know it’s a great way to gather customer feedback? By integrating Twitter into Teams, you can set up alerts relevant to your company. So, when a customer tweets at your handle or uses your hashtag, it flows directly into Teams, where you can share or respond without stopping your workflow.
  2. Transform the way you work: ServiceNow
    ServiceNow delivers digital workflows that create great experiences and unlock productivity. The cloud-based Now Platform transforms old, manual ways of working into modern digital workflows, so employees and customers get what they need when they need it. Read more about the ServiceNow integration for Virtual Agent, a chatbot that helps build conversational workflows to resolve common ServiceNow actions, as well as IntegrationHub, which lets anyone break down development backlog with codeless workflows in an easy-to-use interface.
  3. Get organized with your very own automated administrative assistant: Zoom.ai
    Zoom.ai lives inside your chat, email inbox, and calendar to help you offload and automate tasks. You interact with it by typing commands in a chat window, where it can schedule Teams meetings for you, brief you on your day, send and receive reminders, and create documents when you need them. It works for you, where you work. Watch the Zoom.ai video to learn more.
  4. Organize any of life’s projects: Trello
    Trello is a project management software whose boards, lists, and cards enable you to organize and prioritize your projects in a flexible way. By integrating in Teams, you can see your Trello assignments, tasks, and notifications and have conversations about them—without leaving Teams. A fun way to bring together project management and project collaboration. Watch the Trello video to learn more.
  5. Run polls in tandem with your conversations: Polly
    Polly is a survey app that lets you create surveys in Teams. You can quickly create polls in your Teams channels and view results in real-time. You have the option to create multiple choice polls, freeform polls, or a mixture of both. Turn on comments and you’ve got yourself a full discussion board. Get the answers you need without disrupting workflows or clogging inboxes. Visit Polly for Microsoft Teams to learn more.
  6. Celebrate your organization’s culture and values: Disco
    Disco is a solution that rallies your entire company around your core values. It makes it easy to give public shout-outs and congratulate your colleagues in real-time. So, next time a team member delivers a project ahead of schedule or demonstrates one of your team or company values in their work, pay it forward by giving them Disco “points” in Teams. They’ll feel supported and, who knows, maybe repay your appreciation. Watch the Disco video to learn more.
  7. Help teams deliver value to customers faster by releasing earlier, more often, and more iteratively: Jira
    Jira Software is a leading software development tool used by agile teams to plan, track, and release great software. Integrate Jira with Teams for a seamless way to visualize the important things like development velocity, workloads, bug resolution, and app performance all in real-time—from Teams. This makes it easy to inject insights into group collaboration without disrupting workflows. Learn more about Microsoft Teams Jira Connector.
  8. Bring more structure to online brainstorming: MindMeister
    MindMeister is an online mind-mapping tool that lets you capture, develop, and share ideas visually. And by integrating in Teams, you can take notes, brainstorm, visualize project plans, and easily show connections between ideas all while discussing details with your team in the chat. Read Create and Manage All Your Mind Maps in Microsoft Teams! to learn more.
  9. Bring creative work to team work: Adobe Creative Cloud
    Adobe Creative Cloud gives you the world's best apps and services for video, design, photography, and the web, including Adobe Photoshop, Illustrator CC, InDesign CC, Premiere Pro CC, and more. Integrate with Teams to bring your creative work and teamwork together. You can share work, get feedback, and stay up to date on tasks and actions. Read Adobe XD Adds Integration with Microsoft Teams—Creativity meets collaboration to learn more.
  10. Build software in the way that works best for you: GitHub
    GitHub is the platform where developers work together, solve challenging problems, and create the world’s most important technologies. Whether you are a student, hobbyist, consultant, or enterprise professional, the GitHub integration in Teams allows you to create, share, and ship the best code possible.

Get started with Teams

Bringing these apps and tools together in Teams is a great way to bring focus back to your workflow. They’re easy to integrate and offer something for everyone, whether you’re developing software, managing projects, or gathering customer feedback. And with new apps going live on Teams every day, your next productivity superpower is only a few clicks away. Check the Teams Store today so you don’t miss out!


Understanding the three main goals of PowerShell Core 6

PowerShell is a critical tool for many administrators who manage Windows, but the goals of PowerShell Core 6 and its shift to open source might be a mystery to some in IT.

The book Learn PowerShell Core 6.0 by David das Neves and Jan-Hendrik Peters can help beginner PowerShell users with step-by-step guides to get started with the scripting language and management tool on Windows, Linux and Mac platforms, as well as Cloud Shell in Azure.

In addition to its technical content, the book covers the history of PowerShell — Jeffrey Snover’s Monad Manifesto in 2002 introduced the main concepts of the automation tool — and why Microsoft decided to open up its development and make it available on other operating systems.

The book explains that while Windows PowerShell made task automation easier for administrators, it vexed the PowerShell team. Tedious, manual legacy controls, a lack of organized reviews and a frustrating feedback process pushed Microsoft to overhaul the development process, which steered the company to make the switch to open source and publish the project on GitHub in 2016.

One of the goals of PowerShell Core 6, released for general availability in January 2018, is to build a more involved community around the tool and make it an integral component to automate processes beyond the Windows platform.

This excerpt from the book’s first chapter describes the goals of PowerShell Core 6 and how this new direction can help further the vision detailed by Snover for a more powerful automation tool.

There have been three primary goals for PowerShell Core 6. When we examine each of these goals, it becomes clear how PowerShell Core came into being and why it is a great management tool for any infrastructure:

  • Ubiquity describes the platform independence that allows PowerShell to run on Windows, Linux, and macOS. This is necessary because heterogeneous environments are today’s norm, and they matter to both developers and IT professionals.
  • Cloud refers to PowerShell being built for cloud scenarios, because IT is moving toward Azure, REST APIs (Swagger/OpenAPI), and other public clouds. For this, major improvements have been made to the Invoke-WebRequest, Invoke-RestMethod, and ConvertFrom-Json cmdlets. There is a collaboration with the Azure PowerShell team to support PowerShell Core. Third-party vendors, such as VMware and AWS, are also working to support PowerShell Core.
  • Community refers to being open source, accepting contributions directly into the product, and channeling customer feedback straight to the engineering team. The current Requests for Comments (RFCs), which gather feedback on the roadmap, new features, and breaking changes, as well as milestones, projects, and issues, should always be transparent and publicly available. This means there are pull requests against code, tests, and documentation. In addition, issues from the community are dynamically reprioritized and can be discussed in the PowerShell Core Community Call. These calls are free for everyone to join, and you can simply raise your voice and discuss your feedback directly with the engineers.
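As a quick illustration of the cloud-focused cmdlets called out above, the following PowerShell Core sketch fetches JSON from a REST endpoint. The URI is a placeholder, not an API from the book; substitute any JSON-returning endpoint.

```powershell
# Invoke-RestMethod downloads the response and parses JSON into objects.
# The URI below is a placeholder; substitute a real JSON-returning endpoint.
$releases = Invoke-RestMethod -Uri 'https://api.example.com/v1/releases'

# Parsed objects can be piped and filtered like any other PowerShell output
# (the property names here are assumed, for illustration only):
$releases | Select-Object -First 5 name, published_at

# Invoke-WebRequest returns the raw response instead; its content can be
# converted explicitly with ConvertFrom-Json when more control is needed:
$raw  = Invoke-WebRequest -Uri 'https://api.example.com/v1/releases'
$data = $raw.Content | ConvertFrom-Json
```

The same session runs unchanged on Windows, Linux, and macOS under PowerShell Core, which is the point of the ubiquity goal above.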

Editor’s note: This excerpt is from Learn PowerShell Core 6.0, authored by David das Neves and Jan-Hendrik Peters and published by Packt Publishing.

Go to Original Article

Postnatal care pathway moves to dashboards at U.K. hospital

Sunita Sharma ordered a pizza online one day and marveled at how much visibility she had into the process — from order to oven to delivery. If ordering a pizza could be that transparent, there must be a way to bring the same kind of digital experience to healthcare, she thought.

Sharma is a consultant obstetrician and gynecologist at Chelsea and Westminster Hospital in London, a 430-bed care facility with a 27-bed postnatal care ward. Sharma is in charge of the postnatal care ward, which manages about 5,500 deliveries annually. On the ward, communication between team members about patient status used to take place on a whiteboard, she said.  

But after her online pizza ordering experience, Sharma said she began to feel communicating by whiteboard was outdated, and she wanted a better way for team members to talk to each other — a change that would make for a better postnatal discharge experience for patients, as well.

“My vision was clear: I wanted to re-create what that pizza experience was like,” she said. “I had to order the pizza, and I thought, ‘Wow, I know it has gone in the oven; I know it’s coming to the door.’ I said, ‘If this could be in postnatal, what a transformation it would be.'”

Hospital goes digital in postnatal care

Before postnatal patients are discharged, they have to go through a distinct process, known as a care pathway, to ensure mom and baby go home safely. The care team at Chelsea and Westminster Hospital used whiteboards to keep track of which steps mom and baby, whom the facility previously treated as separate patients, had completed. It was a process Sharma said she wanted to digitize.

“How efficiently a postnatal ward works is very important for any maternity service, because you need to have a good flow of patients through the system so that people get safely where they need to be,” she said. “You have to ensure they’re safely discharged in a timely manner, or else you’ll have a bottleneck. Flow matters.”

Using care pathway management software from Lumeon, Sharma and her team helped create a digital dashboard accessible from any computer or mobile device in the hospital. It enabled staff to monitor patients, communicate and digitally track each patient’s progress along the care pathway leading up to discharge.

In place of a whiteboard, a monitor displays a dashboard of patient statuses in the postnatal care ward. When a staff member logs on through a portal to access the digital postnatal care system, they can accept tasks to complete for patients. Sharma said the dashboard uses a traffic-light design, so when a care pathway task is complete, the task’s button goes from red to green. If the step has yet to be completed, the task’s button remains red.

Implementing the digital postnatal discharge system brought better communication and more transparency into the patient’s progress, according to Sharma. It also tied mom and baby together.

“It is cumbersome when you don’t have a tool that helps you bring everything together into one portal,” she said. “So, what Lumeon has enabled us to achieve is that mom and baby are seen as one unit we’re looking after, and we can have visibility into each of those journeys.”

Sharma said team members are now better informed about the holistic care of the patient.

Care pathway system improves communication

One of the main benefits of using Lumeon’s software is improved communication between staff members, Sharma said.

Mom and baby see a stream of healthcare professionals when they’re in the hospital, and getting information from one person to another is crucial when the average length of stay in the postnatal care ward is 1.8 days.

“Communication, I can confidently say, has become easier,” she said.

Part of Sharma’s vision for the digital postnatal discharge system is to give patients an opportunity to communicate through the dashboard. The technology can function as a connection from patient to caretaker, allowing patients to notify ward staff of their specific needs.

“Women may sometimes trivialize their needs because they don’t want to bother someone, but if there was a portal through which they could let someone know, ‘Could I have something?’ that will be delivering on the good experience,” Sharma said.

The Lumeon technology allows that sort of interaction with the dashboard, Sharma said — a function she plans to eventually add to the system.

The newest buzzword

Digitization is the newest buzzword in care pathway management, according to Sowmya Rajagopalan, Frost & Sullivan’s global program director for transformational health. She said she believes digitizing care pathways will result in improved efficiency in patient care.  

Care pathways are beneficial in clinical conditions that require real-time monitoring, as well as regular clinical interventions, she said. She noted that there is widespread adoption of cancer and cardiac care pathways.

Care pathway management ties the clinical and administrative systems together. Organizations can customize, design and implement their own care pathway rules to achieve the desired outcome, Rajagopalan said.

“It supports a single, real-time view of the entire patient journey from referral to outcome,” she said.

Go to Original Article

For Sale – Watercooled PC (i7 4770K, GTX 980ti, 16GB DDR3, Corsair AX860) 480mm + 420mm Rads

Plan to strip this down. Would like to sell as complete system first.

Price: 1,100 GBP

Phanteks Enthoo Primo (includes PWM fan controller)


Price and currency: 1,100 GBP
Delivery: Delivery cost is not included
Payment method: BT
Location: Bristol / North Somerset
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected


Go to Original Article