
Experts say there’s still a long road ahead for the FHIR standard

A major issue hindering interoperability in healthcare is a lack of data standardization, something federal regulators are trying to change by pushing adoption of the Fast Healthcare Interoperability Resources standard.

FHIR is an interoperability standard developed by Health Level Seven International (HL7) for the electronic exchange of health data. The standard has gone through multiple iterations over five years of development. It defines consistent healthcare data formats and application programming interfaces (APIs) that healthcare organizations can use to exchange electronic health records.
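Concretely, FHIR models clinical data as JSON "resources" retrieved over a plain REST API. A minimal sketch of what a Patient resource looks like and how a client addresses it; the server URL is hypothetical, and the resource carries far fewer fields than a real one:

```python
import json

# A minimal FHIR R4 Patient resource, as a server might return it.
# Values are illustrative; real resources carry many more fields.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

# A client would fetch this over plain HTTPS, e.g.:
#   GET https://fhir.example.org/baseR4/Patient/example
#   Accept: application/fhir+json
# (fhir.example.org is a hypothetical endpoint.)

name = patient["name"][0]
full_name = " ".join(name["given"]) + " " + name["family"]
print(full_name)  # Peter James Chalmers
```

Because every conformant server exposes the same resource shapes at the same relative URLs, a client written against one FHIR endpoint can, in principle, read records from any other.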

In a set of proposed rules for interoperability from the Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare and Medicaid Services (CMS), the agencies would require healthcare organizations to use FHIR-enabled healthcare APIs that would allow patients to download their standardized electronic health information into a healthcare app on their smartphones.

During a panel discussion on the future of interoperability at ONC’s 3rd Interoperability Forum in Washington, D.C., Thursday, panelists including Kisha Hawthorne, CIO of Children’s Hospital of Philadelphia, focused on the reality of using the FHIR standard, and whether the standard will help achieve interoperability in healthcare.

The reality of FHIR standard use today

Will the FHIR standard be a key facilitator of interoperability in healthcare? Panelists agreed that it will — in time. Right now, though, the standard still needs work in the implementation department.


Hawthorne said her team at Children’s Hospital of Philadelphia is looking to use the FHIR standard in the provider space to bridge the gaps between the different software vendors with which the organization works.

The hospital uses an Epic EHR, and Hawthorne said that while she thinks vendors like Epic are beginning to implement and use the FHIR standard, she hopes to see that work “fast forward” with Epic and other vendors to make it easier to gather and share, as well as use, data in the provider space. FHIR standard use is something that’s not quite there yet, she said.

“In the provider space, there’s a ways to go,” Hawthorne said. “But we’re excited and we think it will take hold.”

The potential of the FHIR standard is exciting and it will “open a lot of doors,” but the reality is that the standard is immature, said Kristen Valdes, CEO of personal health app b.well Connected Health.

Valdes said that although she thinks the FHIR standard will create a push toward interoperability in healthcare, challenges associated with implementation of the FHIR standard are hindering progress.

A significant number of providers and organizations aren’t “using a fraction” of the implementation guidelines that have been made available for the FHIR standard, she said. And while organizations are thinking through the operational impacts of using FHIR on behalf of users, ongoing debate about how HIPAA rules apply to giving consumers access to their own data further hinders implementation, she said.

“We really have to think about the operational workflows and how it’s going to affect the people who are expected to implement and deploy FHIR,” she said.

The problem with the FHIR standard isn’t the technical aspects of the standard, but the process and people implementing it, said Vik Kheterpal, principal of interoperability product vendor CareEvolution.

As a technology standard, Kheterpal said, FHIR makes sense and has already seen relative success in programs such as CMS’ Blue Button 2.0, which uses the FHIR standard for beneficiary data, such as drug prescriptions, primary care costs and treatment. For the rest of healthcare, however, the problem often lies in misinterpretation of policy when it comes to sharing patient data.

Anil Jain, chief health informatics officer at IBM Watson Health, said he thinks the value of the FHIR standard is real, and organizations already need to think about what’s next once the standard matures.

As use of the FHIR standard grows among healthcare organizations, Jain said, it’s important to create business cases and models for sharing data that will work using the standard. Otherwise, providers and patients will continue to lack trust in the data, something a standard like FHIR alone won’t provide.


Success of healthcare APIs hinges on data safety, patient awareness

The Office of the National Coordinator for Health IT is steadfast in fostering interoperability through healthcare APIs. But health IT leaders are asking for more nuance: specifically, how APIs can also keep patient data safe.

During ONC’s 3rd Interoperability Forum this week, Don Rucker, national coordinator for health IT, made it clear that the federal agency is dedicated to pursuing greater patient access to data through healthcare APIs, or code that enables software programs to talk to each other.

Earlier this year, ONC and the Centers for Medicare and Medicaid Services proposed new rules on interoperability and information blocking. The proposed rules would require healthcare organizations to use APIs, which ONC hopes will create a market for healthcare apps and inject competition into the mix.

“We are very serious in getting the American public to have the benefits of interoperability on their smartphone,” he said.

Rucker and ONC are focused on the interoperability rule and getting patient data access through apps, as well as keeping data secure. But during the forum, a panel of healthcare experts raised other issues that could affect the use of healthcare APIs.

Patient data safety


Based on more than 2,000 comments on the proposed rules from the healthcare community, ONC is taking a harder look at a growing concern: secondary uses of data when healthcare apps store medical records.

Indeed, concerns about patient data safety were voiced even before the comment period on the rules concluded. During a hearing in May held by the U.S. Senate Committee on Health, Education, Labor and Pensions, several Senate members questioned whether patient data would be safe in an app ecosystem.

The community’s worry has to do with end-user license agreements, which users are asked to sign off on when using an app. The agreements are often cumbersome, long and filled with small print that, in part, detail potential secondary uses of data, something a patient could overlook or accept blindly.

The agreements “don’t work in this modern world,” Rucker said, and the agency is working to find more transparent ways of getting patient consent.

Patient awareness

Healthcare organizations are not yet required to use APIs so that patients can download their electronic health information into healthcare apps — nor are they incentivized to make it known when they do. Indeed, another concern the panel raised had less to do with functionality and more to do with awareness.

Philip Parker, CEO at Boston-based Coral Health, said as a tech company with a healthcare app, he works closely with EHR vendors and provider organizations to connect to APIs they have available. One of the issues he sees is lack of patient awareness about the availability of healthcare APIs enabling them to download their data into an app.

“There’s a big gap there where patients aren’t asking for this yet because they don’t know about it, and it makes it difficult,” Parker said.

While ONC’s proposed rule requiring organizations to use healthcare APIs has not been finalized, early adopters have seen dismal results, according to new research in the Journal of the American Medical Association.

Researchers studied 12 U.S. health systems with at least nine months of experience using healthcare APIs. From March to December 2018, the study found that only 0.7% of patients who logged into their patient portal also used an API to download their health data into an app.

The study acknowledged that because the capability is new, few applications are able to access and use electronic health information. But it also stressed that there has been “little effort by healthcare systems or health information technology vendors to market this new capability to patients, and there are not clear incentives for patients to adopt it.”

While APIs will be a good way to share information once patients become more familiar with the capability, another challenge is the content, according to panelist Jim Barnett, director of strategic intelligence analysis at AARP. Clinical or claims data doesn’t always make sense to consumers and can be difficult to interpret, he said. “We need more work there,” he said.


Expert questions funds for interoperability challenges in healthcare

One expert says the $2 million in funding ONC is offering developers to address interoperability challenges in healthcare — although commendable — may not be enough.

“I applaud ONC for recognizing this challenge and making funds available for development of interoperability platforms and solutions,” said John McDaniel, senior vice president of innovation and technology for health IT consulting firm The HCI Group. “However, based on the work we have done with vendors that offer interoperability solutions, I don’t believe $2 million will address the issue.”

ONC funding offered in two areas

ONC will provide up to $2 million in funding to two recipients focused on developing innovative and breakthrough advances in two areas: expanding the scope of population-level data-focused application programming interfaces (APIs) and advancing clinical knowledge at the point of care, according to ONC.

For expanding the scope of population-level APIs, ONC wants to see projects that reduce provider burdens associated with reporting through API technology, as well as projects that assess trade-offs among big data formats and challenges to the scope of FHIR-based APIs.
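Population-level access of this kind is what the FHIR Bulk Data Access pattern targets: a client kicks off an asynchronous $export operation and later polls for the prepared files. A sketch of constructing such a request against a hypothetical FHIR server; nothing is sent over the network here:

```python
from urllib.request import Request

BASE = "https://fhir.example.org/baseR4"  # hypothetical FHIR server

# Bulk export is asynchronous: the client requests Patient/$export with
# "Prefer: respond-async", receives 202 Accepted, then polls the status
# URL returned in the Content-Location header until the files are ready.
req = Request(
    BASE + "/Patient/$export?_type=Patient,Observation",
    headers={
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",
    },
)

print(req.get_full_url())
```

The asynchronous kickoff-and-poll design is what lets one request cover an entire patient population without tying up the server connection while the export runs.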

As for advancing clinical knowledge at the point of care, ONC hopes to see “emerging innovations” in clinical medicine, as well as data-driven medicine infrastructure and legal and policy implications for innovative approaches, according to the ONC news release.

Additional funding may be available

ONC will fund up to $1 million per area of interest by 2019. After the funds are awarded, there will be a two-year project and budget period, but applicants are encouraged to submit responses based on a five-year project and budget period because additional funding for three to five years could be provided based on the availability of funds and “meaningful progress.”


The funding opportunity will be open for three years, allowing for the possibility that ONC will issue additional awards to other eligible applicants for future “priority areas of interest.”  

ONC expects the funding to “further a new generation of health IT development and inform the innovative implementation and refinement of standards, methods and techniques for overcoming major barriers and challenges as they are identified.” Though he questions whether $2 million will be enough to address interoperability challenges in healthcare, McDaniel said he has seen ONC be successful with similar initiatives in the past, such as establishing incentives to motivate healthcare organizations to implement EHRs, which enabled the digitization of patient care documentation.

The full scope of interoperability challenges in healthcare

Now, McDaniel said, the challenge is to enable full interoperability to not only digitize retrospective patient data, but to “capture and use real-time patient information coupled with cognitive computing to assist care providers with decision-making and best practices given the full view of all relevant patient data.”

“Developing interoperability between EHRs is a good start, but since only a percentage of relevant retrospective patient data is maintained in those systems, we need to establish interoperability standards for dynamic exchange of data from all source systems, including IoT devices, EHRs, medical devices and personal health devices, to enable precision and predictive care models,” McDaniel said.

UNH InterOperability Lab expands IPv6 testing amid SDN growth

The University of New Hampshire InterOperability Lab updated its IPv6 testing program to comply with new government requirements specified by the National Institute of Standards and Technology. UNH-IOL, a technology testing facility in Durham, N.H., also added support for SDN protocols in its updated program.

The testing program applies specifically to U.S. government agencies, such as NASA, that procure networking equipment and need independent certification that the products meet the regulations, according to Timothy Winters, senior IP manager at UNH-IOL. The new requirements come as IPv6 adoption continues to grow globally; Google, for example, reports that more than 20% of its users now reach its services over IPv6, Winters added.

Agencies and product vendors that are UNH-IOL members send devices needing certification to the lab, where UNH students and staff spend a month testing the products to verify IPv6 support and compliance with the specification.

UNH-IOL tests a range of products, including routers, switches, phones, printers and security cameras. Increasingly, however, agencies and service providers have requested UNH-IOL’s help with SDN and IoT devices, Winters said.

“We’re encountering more devices we haven’t seen,” he said. “Some of this is because of IoT, where things are actually being networked and put on a network. They’re not sitting on a proprietary link anymore.”

IPv6 testing ramps up


As operators and service providers confront dwindling IPv4 address space, they’ve started moving to IPv6-only networks, Winters said. This transition prompted UNH-IOL to update its IPv6 testing program accordingly.

“UNH-IOL is trying to push that support, so people building applications and services — or even routers and switches — can know which things work or don’t work in an IPv6-only network,” he said. These changes look at the requirements for building, installing and updating applications — processes that sometimes sound simple, but can actually be quite complicated, he added.
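One building block of such checks is simply classifying address literals correctly, something applications moving to IPv6-only networks often get wrong. Python’s standard library can sketch the idea; this is illustrative only, not part of the USGv6 test suite:

```python
import ipaddress

def address_family(addr: str) -> str:
    """Classify an IP literal as IPv4, IPv6, or invalid."""
    try:
        ip = ipaddress.ip_address(addr)
    except ValueError:
        return "invalid"
    return "IPv6" if ip.version == 6 else "IPv4"

print(address_family("2001:db8::1"))     # IPv6
print(address_family("192.0.2.10"))      # IPv4
print(address_family("not-an-address"))  # invalid

# IPv4-mapped IPv6 addresses are one transition mechanism an
# IPv6-only application may still have to recognize:
print(ipaddress.ip_address("::ffff:192.0.2.10").ipv4_mapped)  # 192.0.2.10
```

An application that hardcodes IPv4 assumptions (dotted-quad parsing, 32-bit address fields) fails exactly these kinds of checks on an IPv6-only network.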

UNH-IOL also patched security loopholes in the IPv6 testing program and made the overall testing more generic, so governments outside the U.S. and other user groups could adopt it, Winters said.

Equipment suppliers have two years to comply with the new IPv6 testing specification. As a result, UNH-IOL will likely see 200 to 300 devices return to the lab to undergo the updated testing, according to Winters.

“I’m sure there are companies that have made some products legacy or don’t sell them anymore, so those won’t come back in,” Winters said. “But that’s a challenge: We have to get everybody back through the program.”

[Figure: USGv6 testing program flow chart, showing the process vendors undergo for IPv6 testing of their products.]

IPv6 complements SDN


Additionally, the lab now regularly receives routers without a command-line interface to test. This change comes as more service providers and equipment providers find value in SDN and discover how IPv6 complements SDN deployments, Winters said.

“For SDN, the ability to address multiple services is helpful when you’re trying to get into networks that are so complex they have to be programmed,” he said. Service providers, for example, can use IPv6, along with disaggregation, network slicing and segment routing. The IPv6 address helps identify to which service any particular packet is going.
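The idea of using addresses to steer traffic to services can be sketched as longest-prefix matching over per-service IPv6 prefixes. The prefixes and service names below are hypothetical; a real deployment would carry this mapping in routing or segment-routing policy rather than application code:

```python
import ipaddress

# Hypothetical per-service prefixes: a sliced or segment-routed network
# might allocate a distinct IPv6 prefix to each service it carries.
SERVICE_PREFIXES = {
    ipaddress.ip_network("2001:db8:aaaa::/48"): "video",
    ipaddress.ip_network("2001:db8:bbbb::/48"): "voice",
    ipaddress.ip_network("2001:db8::/32"): "best-effort",  # catch-all
}

def service_for(dst: str) -> str:
    """Longest-prefix match: the most specific service prefix wins."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in SERVICE_PREFIXES if addr in net]
    if not matches:
        return "unknown"
    best = max(matches, key=lambda net: net.prefixlen)
    return SERVICE_PREFIXES[best]

print(service_for("2001:db8:aaaa::42"))  # video
print(service_for("2001:db8:1234::1"))   # best-effort
```

The vast IPv6 address space is what makes this practical: there are enough bits to encode a service, slice or segment directly in the destination address.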

Along with the other testing updates, UNH-IOL added support for SDN protocols, such as NETCONF and YANG, as well as specs for IoT capabilities. By doing so, Winters said he hopes the lab will help push IPv6 deployments. And, as another plus, UNH-IOL students tackle “the latest and greatest stuff” in networking.

“For us, the exciting part is getting students involved in learning a technology like this,” he said. “It gives students the ability to build tools, see devices and test them.”

MEF targets multivendor interoperability for network services

MEF this week touted its progress in multivendor interoperability by announcing its active software-defined WAN implementation project. Three SD-WAN vendors — Riverbed Technology, Nuage Networks from Nokia and VMware’s VeloCloud — are leading the MEF project, focusing on multivendor SD-WAN use cases. Software development services provider Amartus is also participating with the SD-WAN vendors.

MEF — a Los Angeles-based association, with more than 200 members — launched its multivendor SD-WAN implementation project last year in an attempt to standardize services across multiple providers and technologies. But multivendor interoperability has numerous aspects, according to Joe Ruffles, global standards architect at Riverbed, based in San Francisco, and co-leader of the SD-WAN implementation project. Companies merge; they need to partner with somebody to increase geographic reach, or they want basic interoperability and service chaining, he said.

The implementation project allows member vendors to get their hands dirty, actively testing and proving out proposed solutions to SD-WAN interoperability issues, Ruffles said. Each vendor uses MEF’s cloud-based dev-test platform, MEFnet, to develop its respective SD-WAN technology. They then interconnect and orchestrate those SD-WAN implementations using MEF’s Presto API, which is part of MEF’s Lifecycle Service Orchestration (LSO) framework.

The Presto API communicates with orchestration and management to help service providers manage multiple SD-WAN implementations with a single orchestrator. Additionally, it helps create better multivendor interoperability among SD-WAN controllers and edge devices, according to Ralph Santitoro, head of SDN, network functions virtualization and SD-WAN at Fujitsu and MEF distinguished fellow.

“Member companies can get together and connect their appliances or run software in the environment and actually do things,” Santitoro said. “They can actually prove out different topics or items that are important to them or the industry.”

Other MEF members can build from the existing SD-WAN implementation project or suggest additional projects and issues, Ruffles said. “It’s not so much a phase as it is continuous, depending on who has an issue and who’s available to work on it,” he added.

Standardized specs lead to better automation processes

The SD-WAN implementation project work benefits more than its current participants, according to Santitoro. By “playing in the sandbox,” members can feed the knowledge learned from the testing environment into MEF’s work on SD-WAN specifications. For example, participants can more accurately define SD-WAN requirements, capabilities, architecture and what’s needed for multivendor interoperability.

“We learn by hand what has to be done, and then we use that information to make changes or additions to the API,” Ruffles said.

In addition to the SD-WAN specs, MEF this week published specs for retail and wholesale Ethernet services, subscriber and operator Layer 1 services, and IP services. These services — especially IP services — have historically been defined in various ways, Santitoro said, which can impede automation. To combat the discrepancies, MEF is defining the fundamentals of IP services and their attributes, which will then help define and build broader services.

“We’ll create things like the VPN [virtual private network] service, internet access service, private cloud access service and operator wholesale services — particularly the IP-VPN case,” said David Ball, MEF’s services committee co-chair and editor of the MEF IP services project.

These definitions and specs will then be fed into MEF’s LSO architecture to help establish a standard vocabulary, so SD-WAN buyers and sellers understand what they need or get with particular services, Santitoro said. Further, defining services and their requirements helps create standardized processes for orchestration and automation, he added.

“Automation is really about consistency and being able to create a model of a service, so services are deployed, designed and implemented in a similar fashion,” he said.

One Virtual System Worldwide: Intra-Epic interoperability

One Virtual System Worldwide, Epic Systems Corp.’s new intra-Epic interoperability framework, is getting a warm reception from electronic health record users and others in health IT.

The new features are contained in a simple and apparently easy-to-use clinician-facing interface in the vendor’s Epic 2018 EHR system upgrade, expected to be released in February.

Health data sharing for Epic users

The functions enable different Epic healthcare providers around the world to share medical images, book appointments, search health data and exchange text messages with one another. Another function that is part of One Virtual System Worldwide, intra-Epic system messaging, had already been available.

“I’m strongly encouraged. It’s really important for the electronic medical record vendors to lower barriers to interoperability,” Brian Clay, M.D., an Epic user and chief medical informatics officer at University of California San Diego Health, told SearchHealthIT. “This move by Epic should make sharing information easier, both for providers and patients.”

Despite wide and long-standing criticism of the giant vendor for allegedly making it hard to share data from its EHR, Epic has long maintained that it has always provided full interoperability within its own user base and with outside entities, as well.

New openness for Epic

In what looks like part of a concerted new effort to combat those perceptions, the privately held company revealed the One Virtual System Worldwide concept with an upbeat news release. It may have been the first time Epic made a major announcement publicly.

Nancy Fabozzi, principal analyst of connected health at Frost & Sullivan, said she was impressed after looking over publicly available materials about One Virtual System Worldwide.

“Anything they can do to move the needle forward on interoperability is going to be appreciated in the marketplace. What’s not to like?” Fabozzi said. “The interface, from what I see, with its clean buttons, is really nice. This is exactly the kind of thing that clinicians want to see and how they want to interact with electronic health records.”

Fabozzi added that she sees the latest Epic interoperability move as a simultaneous way to open up to the outside world, answer questions about its commitment to interoperability, and stay abreast of the fast-changing healthcare and health IT markets.

Healthcare markets changing quickly


In addition to Apple’s move into health records, the healthcare industry was roiled in recent days by the blockbuster news that Amazon, Berkshire Hathaway and JPMorgan Chase are forming an independent healthcare company for their employees.

Meanwhile, huge deals — such as CVS’ $69 billion acquisition of Aetna last year — are also reshaping healthcare, and many expect Amazon and Google, among other tech giants, to make major healthcare moves.

Amid that upheaval, Fabozzi said she thinks Epic understands it is no longer an unrivaled leader in health IT, a position it occupied — along with its chief competitor, Cerner Corp., to some extent — during the meaningful use era when Epic grew explosively, as dozens of big healthcare systems standardized on its EHR platform.

“I think Epic understands that the world is changing very, very dramatically, and the cloistered world they had is gone,” Fabozzi said. “Now, it’s about optimizing these EHR systems and responding to this changing ecosystem that demands more openness and more interoperability.”

On the patient side, Epic said its MyChart patient portal already gives patients of Epic-based healthcare systems the ability to combine health data from different providers as a personal health record that is portable among different providers.

Perhaps coincidentally, Epic recently collaborated with Cerner to help develop Apple’s new personal health record system for the Apple Health app, a similarly interoperability-focused new product.

Epic’s ‘Working Together’

With One Virtual System Worldwide, Epic is expanding data sharing and other options on the provider-facing side for clinicians and other hospital staff.

These fall under the “Working Together” concept, the newest level of the three-tier system that makes up One Virtual System Worldwide.

The first tier, Come Together, which gathers data in one place, and the second tier, Happy Together, which presents combined health data in an easy-to-read format, are not new; both have been included in versions of the Epic EHR for several years.

Epic describes Working Together as new software capabilities that enable healthcare providers to take actions across organizations.

“We’re taking interoperability from being able to ‘view more’ to being able to ‘do more,'” Dave Fuhrmann, vice president of interoperability at Epic, based in Verona, Wis., said in the release. “Over the last decade, we expanded the amount of data that customers can exchange. Now, our new functionality ‘Working Together’ will allow clinicians to work across Epic organizations to improve the care for their patients.” 

New Epic interoperability functions

These One Virtual System Worldwide features, according to the vendor, include the following:

  • Images Everywhere enables Epic users to see medical image thumbnails from other Epic providers, click on an image from the original source and retrieve an image for review.
  • Book Anywhere allows schedulers who refer a patient to another Epic provider to directly book the appointment in the other system.
  • Search Everywhere allows users to search data from other healthcare organizations on Epic and also examine free text, such as in notes and documents.

Clay, the San Diego healthcare system CMIO, noted that physicians at UC San Diego Health routinely coordinate care with nearby providers, such as Rady Children’s Hospital, another Epic user, and now clinicians from both systems can share health data faster and better.

“This will enable us to share information more easily,” he said.

ONC unveils plan for health information sharing framework

If you want a say in how the government deals with health data interoperability, now’s your chance.

The Office of the National Coordinator for Health IT (ONC) has released draft rules for a health information sharing plan, called the Trusted Exchange Framework, and the public has until Feb. 18 to comment.

The framework stems from the interoperability provisions of the 21st Century Cures Act of 2016, a wide-ranging law that includes many aspects of healthcare and health IT, of which the health information sharing plan is only one part.

In a conference call with reporters, ONC National Coordinator Donald Rucker, M.D., called the framework concept a “network of networks,” and he noted that Congress explicitly called for a way to link disparate existing health information networks.

“How do these networks, which are typically moving very similar sets of information, how do we get them connected?” Rucker said.


The framework, Rucker said, is a response to what he called the “national challenge” of interoperability.

“It hasn’t been easy. Folks have made some great progress, but obviously there’s a lot of work to be done,” he said.

Among the existing networks that ONC officials are looking to link within the health information sharing framework are the many health information exchanges that have sprung up since the HITECH Act of 2009 spurred data sharing with the meaningful use program.

Other such networks include vendor-driven interoperability environments, such as the one overseen by the CommonWell Health Alliance.

CommonWell’s director, Jitin Asnaani, told Politico that he thinks the ONC model is a “path to scalable nationwide interoperability.”

Mariann Yeager, CEO of another vendor network, the Sequoia Project, was quoted by Politico expressing a somewhat more neutral assessment: “Overall, the approach seems reasonable,” but “we need to better understand the details.”

ONC envisions the Trusted Exchange Network — expected to be started by the end of 2018 and fully built out by 2021 — as being used by federal agencies, individuals, healthcare providers, public and private health organizations, insurance payers and health IT developers.


The agency conceives of the network as a technical and governance infrastructure that connects health information networks around a core of “qualified health information networks” (Qualified HINs), overseen by a single “recognized coordinating entity” to be chosen by ONC in a competitive bid process.

According to ONC, among other things, Qualified HINs must be able to locate and transmit electronic protected health information between multiple organizations and individuals; have mechanisms to audit participants’ compliance with certain core obligations; use a connectivity broker; and be neutral as to which participants are allowed to use the network.

A connectivity broker is a service provided by a Qualified HIN that provides the following:

  • A master patient index to accurately identify and match patients with their health information;
  • A health records locator service;
  • Both widely broadcast and specifically directed queries for health information; and
  • Guaranteed return of electronic health information to an authorized Qualified HIN that requests it.
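The first of those broker functions, a master patient index, reduces to matching the demographic records that different organizations hold for the same person. A toy deterministic sketch; the field names and the matching rule are hypothetical, and production MPIs use probabilistic matching over many more attributes:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Demographics:
    """Hypothetical demographic fields two networks might each hold."""
    last_name: str
    first_name: str
    birth_date: str  # ISO 8601

def match_key(d: Demographics) -> tuple:
    """Normalize demographics into a deterministic match key:
    lowercase surname, first initial, and birth date."""
    return (
        d.last_name.strip().lower(),
        d.first_name.strip().lower()[:1],
        d.birth_date,
    )

# Records for the same person held by two different networks,
# with typical formatting differences.
a = Demographics("O'Brien", "Katherine", "1980-04-02")
b = Demographics("o'brien ", "K.", "1980-04-02")

print(match_key(a) == match_key(b))  # True: linked under this rule
```

The hard part in practice is tuning such rules so that records link across organizations without merging two different patients, which is why the framework treats accurate patient matching as a core Qualified HIN obligation.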

Governance for the proposed framework consists of two parts. Part A is a set of “guardrails” and principles that health information networks should adopt to support interoperability; part B is a set of minimum required legal terms and conditions detailing how network participation agreements should be constructed to ensure health information networks can communicate with each other.

Genevieve Morris, ONC’s principal deputy national coordinator, specifically acknowledged the efforts of private sector organizations in laying groundwork for health data interoperability and noted that another private organization will coordinate the health information sharing framework.

“We at ONC recognize that our role is to make sure there is equity, scalability, integrity and sustainability in health information sharing,” Morris said.

Microsoft showcases latest industrial IoT innovations at SPS 2017

We are excited to extend our lead in standards-based Industrie 4.0 cloud solutions using the industrial interoperability standard OPC UA, with several new product announcements at SPS IPC Drives 2017 in Nuernberg, Europe’s leading industrial automation exhibition, which takes place next week.

We continue to be the only cloud provider that offers both OPC UA client/server as well as the upcoming (in OPC UA version 1.04) Publish/Subscribe communication pattern to the cloud and back with open-source modules for easy connection to existing machines, without requiring changes to these machines and without requiring opening the on-premises firewall. We achieve this through the two Azure IoT Edge modules OPC Proxy and OPC Publisher, which are available open-source on GitHub and as Docker containers on DockerHub.
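To give a feel for the publish side of this pattern, the snippet below builds a telemetry message in the general shape OPC Publisher forwards to the cloud for a monitored OPC UA node (node identifier, source application, value, and source timestamp). The exact field names vary by module version, so treat this as an illustrative sketch rather than the wire format.

```python
import json
from datetime import datetime, timezone


def make_telemetry(node_id: str, value, application_uri: str) -> str:
    """Build a JSON telemetry message in the general shape an OPC UA
    publisher emits for a monitored node (field names are illustrative)."""
    message = {
        "NodeId": node_id,                      # OPC UA node being monitored
        "ApplicationUri": application_uri,      # which server produced the value
        "Value": {
            "Value": value,                     # the sampled data value
            "SourceTimestamp": datetime.now(timezone.utc).isoformat(),
        },
    }
    return json.dumps(message)


# Usage: a temperature reading from a (hypothetical) station on the factory floor
payload = make_telemetry("ns=2;s=AssemblyStation.Temperature", 72.4,
                         "urn:factory:station01")
print(payload)
```

Because messages like this flow outbound from the edge module to the cloud, no inbound port on the on-premises firewall needs to be opened, which is the key operational benefit of the publish/subscribe pattern described above.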

As previously announced, we have now contributed an open-source OPC UA Global Discovery Server (GDS) and Client to the OPC Foundation GitHub. This contribution brings us close to the landmark of 4.5 million source lines of contributed code, keeping us in the lead as the largest contributor of open-source software to the OPC Foundation. This server can be run in a container and used for self-hosting in the OT network.

Additionally at SPS, Microsoft will demo its commercial, fully Azure IoT-integrated version of the GDS and accompanying client at the OPC Foundation booth. This version runs as part of the Azure IoT Edge offering made available to the public last week.

We have continued to release monthly updates to our popular open-source Azure IoT Suite Connected Factory preconfigured solution which we launched at Hannover Messe this year. Most recently, we have updated the look and feel to be in line with the new set of preconfigured solutions currently being released. We will continue to release new versions of the Connected Factory on a monthly basis with improvements and new features, so check them out on a regular basis.

Our OPC UA Gateway program also keeps growing rapidly. Since we launched the program just six months ago at Hannover Messe, we now have 13 partners signed up, including Softing, Unified Automation, Hewlett Packard Enterprise, Kepware, Cisco, Beckhoff, Moxa, Advantech, Nexcom, Prosys OPC, MatrikonOPC, Kontron, and Hilscher.

Furthermore, we are excited to announce that the Industrial IoT Starter Kit, previously announced at OPC Day Europe, is now available to order online from our partner Softing. The kit empowers companies to securely link their production line from the physical factory floor to the Microsoft Azure IoT Suite Connected Factory in less than one day, enabling centralized management, collection, and analysis of OPC UA telemetry data to gain valuable business insights immediately and improve operational efficiency. As with all our Industrial IoT products, the kit uses OPC UA; it also bundles rich PLC protocol translation software from Softing called dataFEED OPC Suite, along with industry-grade hardware from Hewlett Packard Enterprise in the form of the HPE GL20 IoT Gateway. Swing by the Microsoft booth to check it out and get the chance to try it in your own industrial installation for six months.

Stop by our booth in Hall 6, as well as the OPC Foundation booth in Hall 7 to see all of this for yourself!

New Skype, Microsoft Teams integrations focus on video

Microsoft has been retooling its collaboration roadmap to focus on Microsoft Teams, the cloud and interoperability. As a result, many of the vendor’s partners are capitalizing on Microsoft’s roadmap with new integrations and services.

At Microsoft Ignite, several Microsoft partners unveiled Skype for Business and Microsoft Teams integrations that focused on video conferencing. These integrations focus on interoperability and user experience.

Polycom said its video and audio portfolio will integrate with Microsoft Teams. The Microsoft Teams integrations will allow users and administrators to have the Teams user interface, workflow and functionality on Polycom devices.

The integration extends to Polycom’s RealPresence Group Series, a Microsoft-certified standards-based group video conferencing system, the RealConnect interoperability service, and the Polycom Trio conferencing phones.

At Ignite, Polycom also announced RealConnect Hybrid, which connects on-premises Skype for Business users with existing video devices. When users are ready to move to Office 365 and Skype for Business Online, they can update their subscription to RealConnect. The hybrid interoperability service will be available in October.

The Polycom MSR Series, its next-generation Skype for Business room system, is also available for pre-order. The MSR Series includes a Surface Pro tablet with the Skype for Business interface, an MSR dock to connect to existing room displays and peripherals, the Polycom Trio 8500/8800 speakerphone and an EagleEye camera for medium-to-large meeting rooms.

Pexip develops Microsoft Teams video interoperability

Pexip said it will develop and deliver standards-based video conferencing interoperability with Microsoft Teams, which will allow traditional video conferencing users to join Microsoft Teams video calls and meetings.

Pexip offers a similar service for Skype for Business in its Infinity Fusion gateway. Pexip’s Microsoft Teams integration will use the platform to provide video, audio and content sharing capabilities between Microsoft Teams users and video conferencing users. Organizations can extend Microsoft Teams video meetings to legacy video meeting room services.

The gateway service will offer a native user experience for both Teams users and legacy video conferencing users. Video conferencing systems joining a Microsoft Teams meeting can be managed like a Teams participant, while users on the video conferencing system will have the standard video conferencing experience.

The Infinity Fusion gateway for Microsoft Teams is currently in development. Pexip said it is designed for organizations of any size and can accommodate any number of users and simultaneous meetings.

Videxio offers Skype for Business conferencing gateway

Videxio introduced a video conferencing gateway service within the Microsoft Azure cloud that enables users on dedicated video conferencing devices to join Skype for Business meetings.

The company said the gateway service addresses interoperability challenges between Skype for Business and video conferencing devices from vendors such as Cisco and Huawei. The service allows video conferencing and Skype for Business users to hold meetings with their respective native user experience.

Tom-Erik Lia, Videxio CEO, said in a statement that the number of Skype for Business users connecting with third-party video systems on the Videxio service has increased significantly over the past two years, indicating a growing need for interoperability.

The service was created in partnership with Pexip’s Skype for Business Server-certified gateway and will be deployed in Microsoft Azure. The gateway service will work with Skype for Business Online, as well as on-premises and hybrid Skype for Business deployments. It will be available in the fourth quarter for video conferencing systems hosted in Videxio’s cloud.

Healthcare quality goals set for telehealth, interoperability

The quality of healthcare and health IT interoperability are continuing concerns among healthcare professionals. To address these concerns, the National Quality Forum and its telehealth committee met recently to discuss ways to measure healthcare quality and interoperability.

The National Quality Forum (NQF) was asked by the U.S. Department of Health and Human Services to accomplish two tasks: identify critical areas where measurement can effectively assess the healthcare quality and impact of telehealth services, and assess the current state of interoperability and its impact on quality processes and outcomes.

In a media briefing last week, NQF experts and members of the committee in charge of the two aforementioned tasks discussed the thought process behind the development of healthcare quality measures and the goal the committee hopes these measures will help achieve.

“After a comprehensive literature review conducted by NQF staff, the telehealth committee developed measurement concepts … across four distinct domains: access to care; financial impact and cost; telehealth experience for patients, care givers, care team members and others; as well as effectiveness, including system, clinical, operational and technical,” said Jason Goldwater, senior director at NQF, during the briefing.

Goldwater said that, ultimately, the following areas were identified as the highest priorities: “The use of telehealth to decrease travel, timeliness of care, actionable information, the added value of telehealth to provide evidence-based best practices, patient empowerment and care coordination.”

Those of us that live in the world of telemedicine believe not only are there quality enhancements, but there’s convenience enhancements that are going to make medicine easier to deliver.
Judd Hollander, associate dean of strategic health initiatives, Thomas Jefferson University

Judd Hollander, associate dean of strategic health initiatives at Thomas Jefferson University and a member of the NQF telehealth committee, explained that the committee wanted to begin this process of creating measures for telehealth and interoperability in healthcare by conducting an “environmental scan.”

“Where is there data and where are there data holes and what do we need to know?” Hollander said. “After we informed that and took a good look at it we started thinking, what are types of domains and subdomains and measure concepts that the evidence out there helps us illustrate but the evidence we’re lacking can also be captured? … So it was a really nice way to begin the discussion.”

Hollander added that the implications of the NQF report and the measures the committee is working on are “expected to inform policy across the entire spectrum of alternative payment models, government funded healthcare, and care funded by commercial payers because it’s just what you should be assessing to provide quality care.”

NQF’s telehealth measures: Patient experience

For healthcare to truly reap the benefits of telehealth, the industry has to focus on quality first. And to improve healthcare quality, there has to be a way to measure and report it, Hollander said.

“Those of us that live in the world of telemedicine believe not only are there quality enhancements, but there’s convenience enhancements that are going to make medicine easier to deliver,” Hollander said.

Hollander used a personal experience as an example of the benefits telehealth can bring to patients, even if a diagnosis isn’t or cannot be made via telehealth technologies.

“I had a patient who hurt his knee working in Staples, actually, at about 5:15, 5:30 in the evening. He had a prior knee injury and he had an orthopedist, but he couldn’t reach the orthopedist because their offices were closed,” Hollander said.

Without telehealth, this patient would have had to go to the emergency department, he would’ve waited hours to be seen, and then he would’ve been examined and had X-rays done, Hollander said.

Not only would this have taken a long time, it also would’ve cost this patient a lot of money, Hollander added.

Instead of going to the ER, the patient was able to connect with Hollander through JeffConnect, Jefferson University Hospitals’ app that enables patients to connect with doctors anytime, anywhere.

“I was the doc on call. We do know how to examine knees by telemedicine and we can tell with over 99% accuracy whether someone has a fracture or not and he did not,” he said.

Hollander explained that they then did a little “wilderness medicine.” Using materials lying around, the patient was splinted with yardsticks and an Ace bandage and then was able to wait to see his orthopedist the next day.

“So we didn’t actually really solve his problem, but we saved him a ton of time and money; he didn’t have to go get X-rays one day, [then] have them repeated by the orthopedist who couldn’t see him [until] the next day because the systems aren’t interoperable,” Hollander said.

NQF’s telehealth measures: Rural communities

Marcia Ward, director of the Rural Telehealth Research Center at the University of Iowa and also an NQF telehealth committee member, brings a rural perspective to the telehealth conversation.

“Creating this framework we had to look across all of those different aspects of telehealth and how it could be applied. I find it particularly interesting that telehealth has been thought of as an answer for increasing access in rural healthcare … and I think that’s been one of the strongest suits,” she said during the briefing. “But now it’s developing into an urban application and I think we’ll see particular growth in that.”

Ward used the concept of travel in rural areas as an example of thinking of a unique, and maybe not always obvious, issue to address when creating telehealth measures.

“Travel is a concept that is very important, particularly in rural telehealth,” Ward said. “An example of that is there’s a telestroke program at the Medical University of South Carolina and one of the measures that they use is how many of the patients that are seen through their telestroke program at the rural hospitals are able to stay at their local rural hospital.”

This is an example of a healthcare quality measure that wouldn’t normally be seen in conventional medicine but is very appropriate for telehealth in rural areas.

“That’s a very important measure concept … able to be captured. Another one particularly important in the rural area is workforce shortages and we’re seeing evidence that telehealth programs can be implemented that help bridge that gap [and] be able to deliver services in very rural areas and have the backup from [a] telehealth hub where there’s emergency physicians,” Ward said. “And we’re seeing evidence that telehealth, in terms of rural communities in particular, it’s really filling a particular need.”

NQF’s interoperability measures

While the experts focused mainly on telehealth during the briefing, Goldwater explained that when the committee was discussing and creating measures for interoperability they conducted several interviews to help them define guiding principles.

Goldwater said that these guiding principles include:

  • “Interoperability is more than just EHR to EHR;
  • “Various stakeholders with diverse needs are involved in the exchange and use of data, and the framework and concepts will differ based on these perspectives;
  • “The term ‘electronically exchanged information’ is more appropriate to completely fulfill the definition of interoperability;
  • “And all critical data elements should be included in the analysis of measures as interoperability increases access to information.”

Ultimately, the committee developed healthcare quality measures across four domains, Goldwater said:

  • The exchange of electronic health information, including the quality of data content and the method of exchange;
  • The usability of exchanged electronic health information, such as the data’s relevance and its accessibility;
  • The application of exchanged electronic health information, such as “Is it computable?”; and
  • The impact of interoperability, such as patient safety and care coordination.