
UNH InterOperability Lab expands IPv6 testing amid SDN growth

The University of New Hampshire InterOperability Lab updated its IPv6 testing program to comply with new government requirements specified by the National Institute of Standards and Technology. UNH-IOL, a technology testing facility in Durham, N.H., also added support for SDN protocols in its updated program.

The testing program applies specifically to U.S. government agencies, such as NASA, that procure networking equipment and need independent certification that the products meet regulation, according to Timothy Winters, senior IP manager at UNH-IOL. The new requirements come as IPv6 adoption continues to grow globally, as indicated by Google, which said over 20% of its users now have IPv6 addresses, Winters added.

Agencies and product vendors that are UNH-IOL members send devices that need certification to the lab, where UNH students and staff spend a month testing the products to ensure they support IPv6 and comply with the government requirements.

UNH-IOL tests a range of products, including routers, switches, phones, printers and security cameras. Increasingly, however, agencies and service providers have requested UNH-IOL’s help with SDN and IoT devices, Winters said.

“We’re encountering more devices we haven’t seen,” he said. “Some of this is because of IoT, where things are actually being networked and put on a network. They’re not sitting on a proprietary link anymore.”

IPv6 testing ramps up

Timothy Winters, UNH-IOL senior IP manager

As operators and service providers confront dwindling IPv4 address space, they've started moving to IPv6-only networks, Winters said. That transition prompted UNH-IOL to update its IPv6 testing program accordingly.

“UNH-IOL is trying to push that support, so people building applications and services — or even routers and switches — can know which things work or don’t work in an IPv6-only network,” he said. These changes look at the requirements for building, installing and updating applications — processes that sometimes sound simple, but can actually be quite complicated, he added.
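
To make the application side of that concrete, here is a minimal sketch, in no way part of UNH-IOL's actual test program, of the kind of check a developer might run to see whether a service is usable from an IPv6-only network; the hostname and port are placeholders.

```typescript
// Minimal sketch (not UNH-IOL's test suite): check whether a service publishes
// AAAA records and accepts a TCP connection over IPv6 only.
import { promises as dns } from "node:dns";
import net from "node:net";

async function reachableOverIPv6Only(host: string, port = 443): Promise<boolean> {
  let addresses: string[];
  try {
    addresses = await dns.resolve6(host); // AAAA records only, no IPv4 fallback
  } catch {
    return false; // no AAAA records published
  }
  return new Promise((resolve) => {
    const socket = net.connect({ host: addresses[0], port, family: 6 });
    socket.setTimeout(5000, () => { socket.destroy(); resolve(false); });
    socket.once("connect", () => { socket.destroy(); resolve(true); });
    socket.once("error", () => resolve(false));
  });
}

reachableOverIPv6Only("example.com").then((ok) =>
  console.log(ok ? "reachable over IPv6" : "no IPv6 path"),
);
```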

UNH-IOL also patched security loopholes in the IPv6 testing program and made the overall testing more generic, so governments outside the U.S. and other user groups could adopt it, Winters said.

Equipment suppliers have two years to comply with the new IPv6 testing specification. As a result, UNH-IOL will likely see 200 to 300 devices return to the lab to undergo the updated testing, according to Winters.

“I’m sure there are companies that have made some products legacy or don’t sell them anymore, so those won’t come back in,” Winters said. “But that’s a challenge: We have to get everybody back through the program.”

USGv6 testing program flow chart: this chart shows the process vendors follow for IPv6 testing of their products.

IPv6 complements SDN


The lab also now regularly receives routers without a command-line interface to test, Winters said. This change comes as more service providers and equipment providers find value in SDN and discover how IPv6 complements SDN deployments.

“For SDN, the ability to address multiple services is helpful when you’re trying to get into networks that are so complex they have to be programmed,” he said. Service providers, for example, can use IPv6, along with disaggregation, network slicing and segment routing. The IPv6 address helps identify to which service any particular packet is going.

Along with the other testing updates, UNH-IOL added support for SDN protocols, such as NETCONF and YANG, as well as specs for IoT capabilities. By doing so, Winters said he hopes the lab will help push IPv6 deployments. And, as another plus, UNH-IOL students tackle “the latest and greatest stuff” in networking.

“For us, the exciting part is getting students involved in learning a technology like this,” he said. “It gives students the ability to build tools, see devices and test them.”

MEF targets multivendor interoperability for network services

MEF this week touted its progress in multivendor interoperability by announcing its active software-defined WAN implementation project. Three SD-WAN vendors — Riverbed Technology, Nuage Networks from Nokia and VMware’s VeloCloud — are leading the MEF project, focusing on multivendor SD-WAN use cases. Software development services provider Amartus is also participating with the SD-WAN vendors.

MEF, a Los Angeles-based association with more than 200 members, launched its multivendor SD-WAN implementation project last year in an attempt to standardize services across multiple providers and technologies. But multivendor interoperability has numerous aspects, according to Joe Ruffles, global standards architect at Riverbed, based in San Francisco, and co-leader of the SD-WAN implementation project. Companies merge, need to partner with others to increase geographic reach, or simply want basic interoperability and service chaining, he said.

The implementation project lets member vendors get their hands dirty, actively testing and working through proposed SD-WAN interoperability issues, Ruffles said. Each vendor uses MEF's cloud-based dev-test platform, MEFnet, to develop its respective SD-WAN technology. They then interconnect and orchestrate those SD-WAN implementations using MEF's Presto API, which is part of MEF's Lifecycle Service Orchestration (LSO) framework.

The Presto API communicates with orchestration and management to help service providers manage multiple SD-WAN implementations with a single orchestrator. Additionally, it helps create better multivendor interoperability among SD-WAN controllers and edge devices, according to Ralph Santitoro, head of SDN, network functions virtualization and SD-WAN at Fujitsu and MEF distinguished fellow.
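
As a rough illustration of what "a single orchestrator managing multiple SD-WAN implementations" can look like in code, here is a hedged sketch of an orchestrator issuing a provisioning request over a REST interface in the spirit of LSO Presto. The endpoint path, payload shape and token are invented for illustration and are not the published Presto API.

```typescript
// Hypothetical sketch of orchestrator-to-controller interaction in the spirit
// of MEF's LSO Presto interface point. Endpoint, payload and token are invented.
interface SdwanServiceRequest {
  serviceName: string;
  endpoints: { siteId: string; vendor: string }[]; // e.g. sites on different SD-WAN vendors
  bandwidthMbps: number;
}

async function provisionSdwanService(orchestratorUrl: string, req: SdwanServiceRequest) {
  const response = await fetch(`${orchestratorUrl}/presto/v1/sdwan-services`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: "Bearer <token>" },
    body: JSON.stringify(req),
  });
  if (!response.ok) throw new Error(`provisioning failed: ${response.status}`);
  return response.json(); // e.g. a service identifier the orchestrator can track
}
```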

“Member companies can get together and connect their appliances or run software in the environment and actually do things,” Santitoro said. “They can actually prove out different topics or items that are important to them or the industry.”

Other MEF members can build from the existing SD-WAN implementation project or suggest additional projects and issues, Ruffles said. “It’s not so much a phase as it is continuous, depending on who has an issue and who’s available to work on it,” he added.

Standardized specs lead to better automation processes

The SD-WAN implementation project work benefits more than its current participants, according to Santitoro. By “playing in the sandbox,” members can feed the knowledge learned from the testing environment into MEF’s work on SD-WAN specifications. For example, participants can more accurately define SD-WAN requirements, capabilities, architecture and what’s needed for multivendor interoperability.

“We learn by hand what has to be done, and then we use that information to make changes or additions to the API,” Ruffles said.

In addition to the SD-WAN specs, MEF this week published specs for retail and wholesale Ethernet services, subscriber and operator Layer 1 services, and IP services. These services — especially IP services — have historically been defined in various ways, Santitoro said, which can impede automation. To combat the discrepancies, MEF is defining the fundamentals of IP services and their attributes, which will then help define and build broader services.

“We’ll create things like the VPN [virtual private network] service, internet access service, private cloud access service and operator wholesale services — particularly the IP-VPN case,” said David Ball, MEF’s services committee co-chair and editor of the MEF IP services project.

These definitions and specs will then be fed into MEF’s LSO architecture to help establish a standard vocabulary, so SD-WAN buyers and sellers understand what they need or get with particular services, Santitoro said. Further, defining services and their requirements helps create standardized processes for orchestration and automation, he added.

“Automation is really about consistency and being able to create a model of a service, so services are deployed, designed and implemented in a similar fashion,” he said.

One Virtual System Worldwide: Intra-Epic interoperability

One Virtual System Worldwide, Epic Systems Corp.'s new intra-Epic interoperability framework, is getting a warm reception from electronic health record users and others in health IT.

The new features are contained in a simple and apparently easy-to-use clinician-facing interface in the vendor’s Epic 2018 EHR system upgrade, expected to be released in February.

Health data sharing for Epic users

The functions enable different Epic healthcare providers around the world to share medical images, book appointments and search one another's health data, including free text. Another function that is part of One Virtual System Worldwide, intra-Epic system messaging, had already been available.

“I’m strongly encouraged. It’s really important for the electronic medical record vendors to lower barriers to interoperability,” Brian Clay, M.D., an Epic user and chief medical informatics officer at University of California San Diego Health, told SearchHealthIT. “This move by Epic should make sharing information easier, both for providers and patients.”

Despite wide and long-standing criticism of the giant vendor for allegedly making it hard to share data from its EHR, Epic has long maintained that it has always provided full interoperability within its own user base and with outside entities, as well.

New openness for Epic

In what looks like part of a concerted new effort to combat those perceptions, the privately held company revealed the One Virtual System Worldwide concept with an upbeat news release. It may have been the first time Epic made a major announcement publicly.

Nancy Fabozzi, principal analyst of connected health at Frost & Sullivan, said she was impressed after looking over publicly available materials about One Virtual System Worldwide.

“Anything they can do to move the needle forward on interoperability is going to be appreciated in the marketplace. What’s not to like?” Fabozzi said. “The interface, from what I see, with its clean buttons, is really nice. This is exactly the kind of thing that clinicians want to see and how they want to interact with electronic health records.”

Fabozzi added that she sees the latest Epic interoperability move as a simultaneous way to open up to the outside world, answer questions about its commitment to interoperability, and stay abreast of the fast-changing healthcare and health IT markets.

Healthcare markets changing quickly


In addition to Apple’s move into health records, the healthcare industry was roiled in recent days by the blockbuster news that Amazon, Berkshire Hathaway and JPMorgan Chase are forming an independent healthcare company for their employees.

Meanwhile, huge deals — such as CVS’ $69 billion acquisition of Aetna last year — are also reshaping healthcare, and many expect Amazon and Google, among other tech giants, to make major healthcare moves.

Amid that upheaval, Fabozzi said she thinks Epic understands it is no longer an unrivaled leader in health IT, a position it occupied — along with its chief competitor, Cerner Corp., to some extent — during the meaningful use era when Epic grew explosively, as dozens of big healthcare systems standardized on its EHR platform.

“I think Epic understands that the world is changing very, very dramatically, and the cloistered world they had is gone,” Fabozzi said. “Now, it’s about optimizing these EHR systems and responding to this changing ecosystem that demands more openness and more interoperability.”

On the patient side, Epic said its MyChart patient portal already gives patients of Epic-based healthcare systems the ability to combine health data from different providers into a personal health record that is portable among those providers.

Perhaps coincidentally, Epic recently collaborated with Cerner to help develop Apple’s new personal health record system for the Apple Health app, a similarly interoperability-focused new product.

Epic’s ‘Working Together’

With One Virtual System Worldwide, Epic is expanding data sharing and other options on the provider-facing side for clinicians and other hospital staff.

These fall under the “Working Together” concept, the newest level of the three-tier system that makes up One Virtual System Worldwide.

The first tier, Come Together, gathers data in one place; the second, Happy Together, presents combined health data in an easy-to-read format. Neither is new: both have been included in versions of the Epic EHR for several years.

Epic describes Working Together as new software capabilities that enable healthcare providers to take actions across organizations.

“We’re taking interoperability from being able to ‘view more’ to being able to ‘do more,’” Dave Fuhrmann, vice president of interoperability at Epic, based in Verona, Wis., said in the release. “Over the last decade, we expanded the amount of data that customers can exchange. Now, our new functionality ‘Working Together’ will allow clinicians to work across Epic organizations to improve the care for their patients.”

New Epic interoperability functions

These One Virtual System Worldwide features, according to the vendor, include the following:

  • Images Everywhere enables Epic users to see thumbnails of medical images from other Epic providers and click a thumbnail to retrieve the full image from the original source for review.
  • Book Anywhere allows schedulers who refer a patient to another Epic provider to directly book the appointment in the other system.
  • Search Everywhere allows users to search data from other healthcare organizations on Epic and also examine free text, such as in notes and documents.

Clay, the San Diego healthcare system CMIO, noted that physicians at UC San Diego Health routinely coordinate care with nearby providers, such as Rady Children’s Hospital, another Epic user, and now clinicians from both systems can share health data faster and better.

“This will enable us to share information more easily,” he said.

ONC unveils plan for health information sharing framework

If you want a say in how the government deals with health data interoperability, now’s your chance.

The Office of the National Coordinator for Health IT (ONC) has released draft rules for a health information sharing plan, called the Trusted Exchange Framework, and the public has until Feb. 18 to comment.

The framework stems from the interoperability provisions of the 21st Century Cures Act of 2016, a wide-ranging law that addresses many aspects of healthcare and health IT; the health information sharing plan is only one part of it.

In a conference call with reporters, ONC National Coordinator Donald Rucker, M.D., called the framework concept a “network of networks,” and he noted that Congress explicitly called for a way to link disparate existing health information networks.

“How do these networks, which are typically moving very similar sets of information, how do we get them connected?” Rucker said.

Donald Rucker, M.D., ONC national coordinator

The framework, Rucker said, is a response to what he called the “national challenge” of interoperability.

“It hasn’t been easy. Folks have made some great progress, but obviously there’s a lot of work to be done,” he said.

Among the existing networks that ONC officials are looking to link within the health information sharing framework are the many health information exchanges that have sprung up since the HITECH Act of 2009 spurred data sharing with the meaningful use program.

Other such networks include vendor-driven interoperability environments, such as the one overseen by the CommonWell Health Alliance.

CommonWell’s director, Jitin Asnaani, told Politico that he thinks the ONC model is a “path to scalable nationwide interoperability.”

Mariann Yeager, CEO of another vendor network, the Sequoia Project, was quoted by Politico expressing a somewhat more neutral assessment: “Overall, the approach seems reasonable,” but “we need to better understand the details.”

ONC envisions the Trusted Exchange Network — expected to be started by the end of 2018 and fully built out by 2021 — as being used by federal agencies, individuals, healthcare providers, public and private health organizations, insurance payers and health IT developers.


The agency conceives of the network as a technical and governance infrastructure that connects health information networks around a core of “qualified health information networks” (Qualified HINs), overseen by a single “recognized coordinating entity” to be chosen by ONC in a competitive bid process.

According to ONC, among other things, Qualified HINs must be able to locate and transmit electronic protected health information between multiple organizations and individuals; have mechanisms to audit participants’ compliance with certain core obligations; use a connectivity broker; and be neutral as to which participants are allowed to use the network.

A connectivity broker is a service provided by a Qualified HIN that provides the following:

  • A master patient index to accurately identify and match patients with their health information;
  • A health records locator service;
  • Both widely broadcast and specifically directed queries for health information; and
  • Guaranteed return of electronic health information to an authorized Qualified HIN that requests it.
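
As a purely illustrative sketch of the record-locator idea behind the connectivity broker services listed above, the following matches a patient on simple demographics and returns where records can be found. Real Qualified HINs would use far more robust matching, consent and authorization; every name here is hypothetical.

```typescript
// Toy record-locator sketch: index record locations by basic demographics.
interface PatientDemographics { lastName: string; dateOfBirth: string; zip: string; }
interface RecordLocation { networkId: string; organization: string; endpoint: string; }

class RecordLocatorService {
  private index = new Map<string, RecordLocation[]>();

  private key(d: PatientDemographics): string {
    return `${d.lastName.toLowerCase()}|${d.dateOfBirth}|${d.zip}`;
  }

  register(d: PatientDemographics, loc: RecordLocation): void {
    const k = this.key(d);
    this.index.set(k, [...(this.index.get(k) ?? []), loc]);
  }

  locate(d: PatientDemographics): RecordLocation[] {
    return this.index.get(this.key(d)) ?? [];
  }
}

// Usage with hypothetical data:
const locator = new RecordLocatorService();
locator.register(
  { lastName: "Rivera", dateOfBirth: "1970-01-01", zip: "03824" },
  { networkId: "hin-01", organization: "Example Hospital", endpoint: "https://records.example.org" },
);
console.log(locator.locate({ lastName: "Rivera", dateOfBirth: "1970-01-01", zip: "03824" }));
```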

Governance for the proposed framework consists of two parts. Part A is a set of “guardrails” and principles that health information networks should adopt to support interoperability; Part B is a set of minimum required legal terms and conditions detailing how network participation agreements should be constructed to ensure health information networks can communicate with each other.

Genevieve Morris, ONC’s principal deputy national coordinator, specifically acknowledged the efforts of private sector organizations in laying groundwork for health data interoperability and noted that another private organization will coordinate the health information sharing framework.

“We at ONC recognize that our role is to make sure there is equity, scalability, integrity and sustainability in health information sharing,” Morris said.

Microsoft showcases latest industrial IoT innovations at SPS 2017

We are excited to extend our lead in standards-based Industrie 4.0 cloud solutions using the industrial interoperability standard OPC UA, with several new product announcements at SPS IPC Drives 2017 in Nuernberg, Europe’s leading industrial automation exhibition, which takes place next week.

We continue to be the only cloud provider that offers both OPC UA client/server and the upcoming (in OPC UA version 1.04) Publish/Subscribe communication patterns to the cloud and back, with open-source modules for easy connection to existing machines, without requiring changes to these machines and without requiring the on-premises firewall to be opened. We achieve this through the two Azure IoT Edge modules OPC Proxy and OPC Publisher, which are available as open source on GitHub and as Docker containers on DockerHub.
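
For readers who want to see what talking OPC UA from code looks like, here is a minimal sketch using the open-source node-opcua library rather than the Microsoft modules described above; the endpoint URL and node ID are placeholders for a machine on the OT network.

```typescript
// Minimal OPC UA read using the open-source node-opcua library (illustrative;
// not the OPC Publisher/OPC Proxy modules themselves).
import { OPCUAClient, AttributeIds } from "node-opcua";

async function readTemperature(): Promise<void> {
  const client = OPCUAClient.create({ endpointMustExist: false });
  await client.connect("opc.tcp://machine.local:4840"); // placeholder endpoint
  const session = await client.createSession();
  const dataValue = await session.read({
    nodeId: "ns=2;s=Temperature", // hypothetical node on the machine
    attributeId: AttributeIds.Value,
  });
  console.log("Temperature:", dataValue.value.value);
  await session.close();
  await client.disconnect();
}

readTemperature().catch(console.error);
```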

As previously announced, we have now contributed an open-source OPC UA Global Discovery Server (GDS) and Client to the OPC Foundation GitHub. This contribution brings us close to the 4.5 million source lines of contributed code landmark, keeping us in the lead as the largest contributor of open-source software to the OPC Foundation. This server can be run in a container and used for self-hosting in the OT network.

Additionally at SPS, Microsoft will demo its commercial, fully Azure IoT integrated version of the GDS and accompanying client at the OPC Foundation booth. This version runs as part of the Azure IoT Edge offering made available to the public last week.

We have continued to release monthly updates to our popular open-source Azure IoT Suite Connected Factory preconfigured solution which we launched at Hannover Messe this year. Most recently, we have updated the look and feel to be in line with the new set of preconfigured solutions currently being released. We will continue to release new versions of the Connected Factory on a monthly basis with improvements and new features, so check them out on a regular basis.

Our OPC UA Gateway program also keeps growing rapidly. Since we launched the program just six months ago at Hannover Messe, we now have 13 partners signed up including Softing, Unified Automation, Hewlett Packard Enterprise, Kepware, Cisco, Beckhoff, Moxa, Advantech, Nexcom, Prosys OPC, MatrikonOPC, Kontron, and Hilscher.

Furthermore, we are excited to announce that the Industrial IoT Starter Kit, previously announced at OPC Day Europe, is now available to order online from our partner Softing. The kit empowers companies to securely link their production line from the physical factory floor to the Microsoft Azure IoT Suite Connected Factory in less than one day. This enables the management, collection, and analysis of OPC UA telemetry data in a centralized way to gain valuable business insights immediately and improve operational efficiency. As with all our Industrial IoT products, this kit uses OPC UA, and it also comes with rich PLC protocol translation software from Softing called dataFEED OPC Suite, as well as industry-grade hardware from Hewlett Packard Enterprise in the form of the HPE GL20 IoT Gateway. Swing by the Microsoft booth to check it out and get the chance to try it out for six months in your own industrial installation.

Stop by our booth in Hall 6, as well as the OPC Foundation booth in Hall 7 to see all of this for yourself!

New Skype, Microsoft Teams integrations focus on video

Microsoft has been retooling its collaboration roadmap to focus on Microsoft Teams, the cloud and interoperability. As a result, many of the vendor’s partners are capitalizing on Microsoft’s roadmap with new integrations and services.

At Microsoft Ignite, several Microsoft partners unveiled Skype for Business and Microsoft Teams integrations centered on video conferencing, with a focus on interoperability and user experience.

Polycom said its video and audio portfolio will integrate with Microsoft Teams. The Microsoft Teams integrations will allow users and administrators to have the Teams user interface, workflow and functionality on Polycom devices.

The integration extends to Polycom’s RealPresence Group Series, a Microsoft-certified standards-based group video conferencing system, the RealConnect interoperability service, and the Polycom Trio conferencing phones.

At Ignite, Polycom also announced RealConnect Hybrid, which connects on-premises Skype for Business users with existing video devices. When users are ready to move to Office 365 and Skype for Business Online, they can update their subscription to RealConnect. The hybrid interoperability service will be available in October.

The Polycom MSR Series, its next-generation Skype for Business room system, is also available for pre-order. The MSR Series includes a Surface Pro tablet with the Skype for Business interface, an MSR dock to connect to existing room displays and peripherals, the Polycom Trio 8500/8800 speakerphone and an EagleEye camera for medium-to-large meeting rooms.

Pexip develops Microsoft Teams video interoperability

Pexip said it will develop and deliver standards-based video conferencing interoperability with Microsoft Teams, which will allow traditional video conferencing users to join Microsoft Teams video calls and meetings.

Pexip offers a similar service for Skype for Business in its Infinity Fusion gateway. Pexip’s Microsoft Teams integration will use the platform to provide video, audio and content sharing capabilities between Microsoft Teams users and video conferencing users. Organizations can extend Microsoft Teams video meetings to legacy video meeting room services.

The gateway service will offer a native user experience for both Teams users and legacy video conferencing users. Video conferencing systems joining a Microsoft Teams meeting can be managed like a Teams participant, while users on the video conferencing system will have the standard video conferencing experience.

The Infinity Fusion gateway for Microsoft Teams is currently in development. It is designed for any size organization and can accommodate any number of users and simultaneous meetings.

Videxio offers Skype for Business conferencing gateway

Videxio introduced a video conferencing gateway service within the Microsoft Azure cloud that enables users on dedicated video conferencing devices to join Skype for Business meetings.

The company said the gateway service addresses interoperability challenges between Skype for Business and video conferencing devices from vendors such as Cisco and Huawei. The service allows video conferencing and Skype for Business users to hold meetings with their respective native user experience.

Tom-Erik Lia, Videxio CEO, said in a statement that the number of Skype for Business users connecting with third-party video systems on the Videxio service has increased significantly over the past two years, indicating a growing need for interoperability.

The service was created in partnership with Pexip’s Skype for Business Server-certified gateway and will be deployed in Microsoft Azure. The gateway service will work with Skype for Business Online, as well as on-premises and hybrid Skype for Business deployments. It will be available in the fourth quarter for video conferencing systems hosted in Videxio’s cloud.

Healthcare quality goals set for telehealth, interoperability

The quality of healthcare and health IT interoperability are continuing concerns among healthcare professionals. To address these concerns, the National Quality Forum and its telehealth committee met recently to discuss ways to measure healthcare quality and interoperability.

The National Quality Forum (NQF) was asked by the U.S. Department of Health and Human Services to accomplish two tasks: identify critical areas where measurement can effectively assess the quality and impact of telehealth services, and assess the current state of interoperability and its impact on quality processes and outcomes.

In a media briefing last week, NQF experts and members of the committee in charge of the two aforementioned tasks discussed the thought process behind the development of healthcare quality measures and the goal the committee hopes these measures will help achieve.

“After a comprehensive literature review conducted by NQF staff, the telehealth committee developed measurement concepts … across four distinct domains: access to care; financial impact and cost; telehealth experience for patients, care givers, care team members and others; as well as effectiveness, including system, clinical, operational and technical,” said Jason Goldwater, senior director at NQF, during the briefing.

Goldwater said that, ultimately, the following areas were identified as the highest priorities: “The use of telehealth to decrease travel, timeliness of care, actionable information, the added value of telehealth to provide evidence-based best practices, patient empowerment and care coordination.”


Judd Hollander, associate dean of strategic health initiatives at Thomas Jefferson University and a member of the NQF telehealth committee, explained that the committee wanted to begin this process of creating measures for telehealth and interoperability in healthcare by conducting an “environmental scan.”

“Where is there data and where are there data holes and what do we need to know?” Hollander said. “After we informed that and took a good look at it we started thinking, what are types of domains and subdomains and measure concepts that the evidence out there helps us illustrate but the evidence we’re lacking can also be captured? … So it was a really nice way to begin the discussion.”

Hollander added that the implications of the NQF report and the measures the committee is working on are “expected to inform policy across the entire spectrum of alternative payment models, government funded healthcare, and care funded by commercial payers because it’s just what you should be assessing to provide quality care.”

NQF’s telehealth measures: Patient experience

For healthcare to truly reap the benefits of telehealth, the industry has to focus on quality first. And to improve healthcare quality, there has to be a way to measure and report it, Hollander said.

“Those of us that live in the world of telemedicine believe not only are there quality enhancements, but there’s convenience enhancements that are going to make medicine easier to deliver,” Hollander said.

Hollander used a personal experience as an example of the benefits telehealth can bring to patients, even if a diagnosis isn’t or cannot be made via telehealth technologies.

“I had a patient who hurt his knee working in Staples, actually, at about 5:15, 5:30 in the evening. He had a prior knee injury and he had an orthopedist, but he couldn’t reach the orthopedist because their offices were closed,” Hollander said.

Without telehealth, this patient would have had to go to the emergency department, he would’ve waited hours to be seen, and then he would’ve been examined and had X-rays done, Hollander said.

Not only would this have taken a long time, it also would’ve cost this patient a lot of money, Hollander added.

Instead of going to the ER, the patient was able to connect with Hollander through JeffConnect, Jefferson University Hospitals’ app that enables patients to connect with doctors anytime, anywhere.

“I was the doc on call. We do know how to examine knees by telemedicine and we can tell with over 99% accuracy whether someone has a fracture or not and he did not,” he said.

Hollander explained that they then did a little “wilderness medicine.” Using materials lying around, the patient was splinted with yard sticks and an ace bandage and then was able to wait to see his orthopedist the next day.

“So we didn’t actually really solve his problem, but we saved him a ton of time and money; he didn’t have to go get X-rays one day, [then] have them repeated by the orthopedist who couldn’t see him [until] the next day because the systems aren’t interoperable,” Hollander said.

NQF’s telehealth measures: Rural communities

Marcia Ward, director of the Rural Telehealth Research Center at the University of Iowa and also an NQF telehealth committee member, brings a rural perspective to the telehealth conversation.

“Creating this framework we had to look across all of those different aspects of telehealth and how it could be applied. I find it particularly interesting that telehealth has been thought of as an answer for increasing access in rural healthcare … and I think that’s been one of the strongest suits,” she said during the briefing. “But now it’s developing into an urban application and I think we’ll see particular growth in that.”

Ward used the concept of travel in rural areas as an example of thinking of a unique, and maybe not always obvious, issue to address when creating telehealth measures.

“Travel is a concept that is very important, particularly in rural telehealth,” Ward said. “An example of that is there’s a telestroke program at the Medical University of South Carolina and one of the measures that they use is how many of the patients that are seen through their telestroke program at the rural hospitals are able to stay at their local rural hospital.”

This is an example of a healthcare quality measure that wouldn’t normally be seen in conventional medicine but is very appropriate for telehealth in rural areas.

“That’s a very important measure concept … able to be captured. Another one particularly important in the rural area is workforce shortages and we’re seeing evidence that telehealth programs can be implemented that help bridge that gap [and] be able to deliver services in very rural areas and have the backup from [a] telehealth hub where there’s emergency physicians,” Ward said. “And we’re seeing evidence that telehealth, in terms of rural communities in particular, it’s really filling a particular need.”

NQF’s interoperability measures

While the experts focused mainly on telehealth during the briefing, Goldwater explained that when the committee was discussing and creating measures for interoperability they conducted several interviews to help them define guiding principles.

Goldwater said that these guiding principles include:

  • “Interoperability is more than just EHR to EHR;
  • “Various stakeholders with diverse needs are involved in the exchange and use of data, and the framework and concepts will differ based on these perspectives;
  • “The term ‘electronically exchanged information’ is more appropriate to completely fulfill the definition of interoperability;
  • “And all critical data elements should be included in the analysis of measures as interoperability increases access to information.”

Ultimately, the committee developed healthcare quality measures across four domains, Goldwater said: the exchange of electronic health information, such as the quality of data content and the method of exchange; the usability of exchanged information, such as the data's relevance and its accessibility; the application of exchanged information, such as whether it is computable; and the impact of interoperability, such as on patient safety and care coordination.

ONC focuses on creating interoperability between health systems

Efforts to create interoperability between health systems and stop data blocking have been going on for some time now. Although there are some isolated examples of interoperability in healthcare, in general, experts agree it still is not widely happening.

In a call with reporters, officials from the Office of the National Coordinator for Health IT (ONC) discussed the federal entity’s role in promoting and creating interoperability between health systems.

Donald Rucker, M.D., national coordinator for health information technology at ONC, said the agency is focused on three interoperability use cases. The first is enabling health information to move from point A to point B, regardless of location or IT system. The second is enterprise accountability. This use case mainly addresses “complaints about ‘I can’t get my data out of a system,'” Rucker said. “That bulk transferability, that sort of fundamental data liquidity and … data in bulk so you can actually do analytics on it and see what’s going on overall.” And the third is competition and modernity. “And that’s open APIs,” Rucker said. “You look at Silicon Valley, you look at modern computing, if you go to any of the modern computer science conferences it’s all about APIs.”


Genevieve Morris, principal deputy national coordinator for health information technology at ONC, explained that, because of the use cases Rucker cited, ONC is tweaking its interoperability roadmap.

“The way that we’re thinking about interoperability right now is basically four targets: technical, trust, financial and workforce,” she said.

The roadmap to interoperability

“The challenge is a lot of these things are far more than just standards,” Rucker said, explaining that business relationships tend to complicate things as well.

Rucker used problem lists, which catalog a patient's ailments, as an example.

“It can accrue over time. [For example, the patient] had a cold in 1955, right? Do they still have a cold? Probably not. So you have to curate it,” Rucker said. “It’s often said, ‘I don’t have a shareable problem list. I don’t have an interoperable problem list’. … We don’t have a business model to keep the problem list up to date and meaningful.”

Rucker’s point is that, when people talk about interoperability between health systems, they’re often talking about many different issues. “They’re often asking for a whole bunch of other stuff as well,” Rucker said. “They’re asking for data curation, and maybe a part of the goal of ACOs and HMOs and value-based payments is to provide incentives for these things, but we’re not there yet.”

Underneath all of this is one question, Rucker said: “Are we going to pay for structure?”

“Is it going to be free text and maybe we throw in natural language processing or machine learning or you just read it?” Rucker said. “We’re trying to be mindful of that, we’re trying to be mindful, if you will, of the physics of information and what is plausible to regulate, what society has to sort out, what payment mechanisms we have to sort out.”

APIs: It’s complicated

Some laud APIs as the key to interoperability between health systems. But Rucker says it’s a bit more complicated.

“Two things to consider: One is the API on the vendor level, right? The technical support for the EMR vendor. The second is what does an open API mean at the provider level? Open API sort of gets thrown around, but potentially they are very different approaches,” he said.

Rucker explained that while a vendor's open API is a set of tools enabling access to information, the information itself sits in the databases of healthcare providers, and that is where the complexities come into play.

“So if I’m a Silicon Valley app developer, I can’t hook up to some large national EMR vendor because they don’t actually have any of the clinical data,” he said. “The data is all sitting in, you know, pick your hospital system, pick your provider. So that’s really the dichotomy.”
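
To illustrate that dichotomy, here is a hedged sketch of an app calling a provider organization's open API rather than the EMR vendor's. FHIR is assumed here as the API style, although the article does not name a standard, and the base URL, patient ID and token are placeholders.

```typescript
// Sketch: an app queries the provider's endpoint, where the clinical data
// actually lives, not the EMR vendor's. FHIR-style read is an assumption.
const providerBaseUrl = "https://fhir.example-hospital.org/R4"; // hypothetical provider endpoint

async function fetchPatient(patientId: string, accessToken: string) {
  const res = await fetch(`${providerBaseUrl}/Patient/${patientId}`, {
    headers: { Authorization: `Bearer ${accessToken}`, Accept: "application/fhir+json" },
  });
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  return res.json(); // a FHIR Patient resource
}
```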

ONC’s role in preventing data blocking

From a regulatory point of view, creating boundaries around data blocking can be tough, Rucker said.

“A large part of our work is coming up with language that meets everyone’s needs here and that’s a difficult task,” Rucker said.

He added that ONC can’t simply mandate everyone throw away whatever systems they’re currently using and implement a totally new IT system. “We have to be mindful of what’s out there and what can be done,” Rucker said.

Rucker pointed out, however, that the 21st Century Cures Act, a U.S. law enacted in December 2016, does include a data blocking reporting requirement, meaning that when healthcare organizations experience or come across any instance of data blocking, they must report it to ONC.

“To the extent that information blocking exists, we’re presumably going to see some set of reports from some folks on that,” he said.


Introducing WebRTC 1.0 and interoperable real-time communications in Microsoft Edge

We’re excited to announce the preview availability of the WebRTC 1.0 API, and support for the H.264/AVC and VP8 video codecs for RTC in Microsoft Edge, enabling plugin-free, interoperable video communications solutions across browsers and platforms.

These features are enabled by default in Windows Insider Preview builds starting with last week’s release, 15019, and will be in stable releases beginning with the Windows 10 Creator’s Update.

Background and Object RTC

Microsoft Edge introduced support for ORTC beginning in EdgeHTML 13 (Windows 10 version 1511), providing the initial foundation for real-time communications in Edge. Our priority with the WebRTC 1.0 API support is to provide interoperability with legacy implementations on existing websites, which leverage the WebRTC API as previously deployed in other browsers.

Our WebRTC 1.0 API implementation provides support for peer-to-peer audio and video based on a subset of the W3C WebRTC-PC API circa 2015, prior to the addition of the WebRTC object model.  Because this implementation is focused on legacy interoperability (including mobile applications built from early versions of the WebRTC.org source code), we don’t plan to further update the native WebRTC 1.0 API beyond this release.

To utilize the most advanced features in the Microsoft Edge RTC stack, we recommend considering the ORTC API, especially in situations where it is desirable to control individual transport, sender, and receiver objects directly, or to set up group audio and video calls. If you need support for objects or advanced features such as multi-stream and simulcast with the current WebRTC 1.0 API, we recommend the adapter.js library, which now supports Microsoft Edge.
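
For reference, the following is a minimal sketch of the WebRTC 1.0 offer/answer flow as used across browsers, with adapter.js smoothing over implementation differences; the STUN server and signaling transport are placeholders left to the application.

```typescript
// Minimal WebRTC 1.0 sketch: capture local media, create a peer connection
// and generate an SDP offer. Signaling is application-specific.
import "webrtc-adapter"; // adapter.js npm package, bundled by the app

async function startCall(sendToSignalingServer: (offerSdp: string) => void) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.example.org" }] });

  // Attach local media tracks to the peer connection.
  for (const track of stream.getTracks()) pc.addTrack(track, stream);

  pc.onicecandidate = (e) => { /* send e.candidate to the remote peer */ };
  pc.ontrack = (e) => { /* attach e.streams[0] to a <video> element */ };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToSignalingServer(offer.sdp ?? "");
  return pc;
}
```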

Codec interoperability

The H.264/AVC and VP8 video codecs are supported in the Microsoft Edge RTC stack, which means video communications are now interoperable between Microsoft Edge and other major WebRTC browsers and RTC services. We have implemented congestion control and robustness mechanisms for both the H.264/AVC and VP8 video codecs.

These features are available within both the ORTC API and native WebRTC 1.0 API, so you can make API and video codec decisions independently.

Note that while the Edge H.264/AVC implementation supports hardware offload within both the encoder and decoder, VP8 is implemented purely in software, and as a result may exhibit higher power consumption and CPU load.  If your application uses VP8, we recommend testing on lower-end devices to ensure acceptable performance.
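
One way an application can act on that recommendation in current browsers (this particular API postdates the Edge build described here) is to inspect which video codecs the local stack offers before deciding, for example:

```typescript
// Sketch: check whether the local WebRTC stack offers H.264/AVC and VP8,
// e.g. to prefer a hardware-accelerated codec on low-end devices.
const caps = RTCRtpSender.getCapabilities("video");
const hasH264 = caps?.codecs.some((c) => c.mimeType.toLowerCase() === "video/h264") ?? false;
const hasVP8 = caps?.codecs.some((c) => c.mimeType.toLowerCase() === "video/vp8") ?? false;
console.log({ hasH264, hasVP8 });
```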

What’s next

As we continue to chart our roadmap for real-time communications, our next priorities are adding support for the W3C Screen Capture specification, as well as improved support for enterprise scenarios. We look forward to sharing updates and preview builds as the work progresses.  Meanwhile, we look forward to your feedback on WebRTC and our current RTC roadmap. You can try out WebRTC 1.0 in preview builds today, and if you encounter any bugs, share feedback on Microsoft Edge Platform Issues, via Twitter at @MSEdgeDev, or in the comments below.

― Shijun Sun, Principal Program Manager, Microsoft Edge
― Bernard Aboba, Principal Architect, Microsoft Skype

Extending User Control of Flash with Click-to-Run

Adobe Flash has been an integral part of the web for decades, enabling rich content and animations in browsers since before HTML5 was introduced. In modern browsers, web standards pioneered by Microsoft, Adobe, Google, Apple, Mozilla, and many others are now enabling sites to exceed those experiences without Flash and with improved performance and security. Starting in the Windows 10 Anniversary Update, we began to give users more control over Flash by selectively pausing certain Flash content, like ads, that was not central to the page.

In our next release, we will extend this functionality and encourage the transition to HTML5 alternatives by providing additional user control over when Flash content loads. Windows Insiders will be able to try an early implementation of this feature soon in upcoming preview builds. The user experience will evolve as we move towards a stable release in the Windows 10 Creator’s Update next year.

Sites that support HTML5 will default to a clean HTML5 experience. In these cases, Flash will not even be loaded, improving performance, battery life, and security. For sites that still depend on Flash, users will have the opportunity to decide whether they want Flash to load and run, and this preference can be remembered for subsequent visits.

Screen capture: when the user clicks a blocked Flash control, Microsoft Edge shows a dialog from the address bar that reads "Adobe Flash content was blocked. Do you want to allow Adobe Flash to run on this site?" with the options Close, Allow Once, and Allow Always.

We are deeply aware that Flash is an integral part of many web experiences today. To ease the transition to HTML5, these changes initially will not affect the most popular sites which rely on Flash today. In the coming months, we will actively monitor Flash consumption in Microsoft Edge and will gradually shorten the list of automatic exceptions. At the end of this process, users will remain in control, and will be able to choose Flash for any site they visit.

We advise web developers to migrate to standardized content delivery mechanisms like JavaScript and HTML5 Encrypted Media Extensions, Media Source Extensions, Canvas, Web Audio, and RTC in the coming months.
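
As a small, hedged example of what that migration planning can look like in practice, the following sketch feature-detects several of the HTML5 replacements mentioned above before an app commits to a Flash fallback:

```typescript
// Runtime feature checks for HTML5 technologies that commonly replace Flash.
// Intended to run in a browser; all checks use standard web APIs.
const features = {
  mediaSource: "MediaSource" in window,                       // Media Source Extensions
  encryptedMedia: "requestMediaKeySystemAccess" in navigator, // Encrypted Media Extensions
  webAudio: "AudioContext" in window,
  canvas: typeof document.createElement("canvas").getContext === "function",
  webrtc: "RTCPeerConnection" in window,
};
console.table(features);
```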

This change will provide all users improved performance, greater stability, and stronger security. These changes are similar to updates coming from our friends at Apple, Mozilla, and Google. We look forward to continued work with these partners, and with Adobe, to improve the capabilities and security of the web for all users.

― John Hazen, PM Manager, Microsoft Edge
― Crispin Cowan, Senior Program Manager, Microsoft Edge