Tag Archives: ongoing

Healthcare APIs get a new trial run for Medicare claims

In the ongoing battle to make healthcare data ubiquitous, the U.S. Digital Service team at the Department of Health and Human Services has developed a new API, Blue Button 2.0, aimed at sharing Medicare claims information.

Blue Button 2.0 is part of an API-first strategy within HHS’ Centers for Medicare and Medicaid Services, and it comes at a time when a number of major companies, including Apple, have embraced the potential of healthcare APIs. APIs are the building blocks of applications; they let developers create software that shares information in a standardized way. Like Apple’s Health Records API, Blue Button 2.0 is based on a widely accepted healthcare API standard known as Fast Healthcare Interoperability Resources, or FHIR.

Blue Button 2.0 is the API gateway to the claims of 53 million Medicare beneficiaries, including comprehensive Part A, B and D data. “We’re starting to recognize that claims data has value in understanding the places a person has been in the healthcare ecosystem,” said Shannon Sartin, executive director of the U.S. Digital Service at HHS.

“But the problem is, how do you take a document that is mostly codes with very high-level information that’s not digestible and make it useful for a nonhealth-savvy individual? You want a third-party app to add value to that information,” Sartin said.

So, her team was asked to work on this problem. And out of their work, Blue Button 2.0 was born.
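
Under the hood, Blue Button 2.0 exposes Medicare claims as standard FHIR resources over HTTPS, so calling it looks much like calling any other FHIR REST API. The Python sketch below is illustrative only: the sandbox base URL, access token and patient ID are placeholders, and the production API returns claims only after the beneficiary has authorized the app through an OAuth 2.0 flow.

    # Illustrative sketch of a Blue Button 2.0-style FHIR request.
    # Base URL, token and patient ID are placeholders, not live values.
    import requests

    BASE_URL = "https://sandbox.bluebutton.cms.gov/v1/fhir"  # assumed sandbox endpoint
    ACCESS_TOKEN = "<token-from-oauth-flow>"

    def fetch_claims(patient_id: str) -> dict:
        """Fetch the FHIR ExplanationOfBenefit bundle (claims) for one beneficiary."""
        resp = requests.get(
            f"{BASE_URL}/ExplanationOfBenefit",
            params={"patient": patient_id},
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        )
        resp.raise_for_status()
        return resp.json()

    bundle = fetch_claims("<patient-id>")
    print(bundle.get("total"), "claims returned")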

More than 500 developers have signed on

To date, over 500 developers are working with the new API to develop applications that bring claims data to consumers, providers, hospitals and, ultimately, into an EHR, Sartin said. But while there is a lot of interest, she added, this is just the first step when it comes to healthcare APIs.

“The government does not build products super well, and it does not do the marketing engagement necessary to get someone interested in using it,” she said. “We’re taking a different approach, acting as evangelists, and we’re spending time growing the community.”

And while a large number of developers are experimenting with Blue Button 2.0, Sartin’s group will vet them heavily, eventually winnowing the field to a much smaller number that will be allowed to release applications, because of privacy concerns around the claims data.

Looking for a user-friendly approach

We’re … acting as evangelists, and we’re spending time growing the community.
Shannon Sartin, executive director of the U.S. Digital Service at HHS

In theory, the applications will make it easier for a Medicare consumer to let third parties access their claims information and then, in turn, make that data meaningful and actionable. But Arielle Trzcinski, senior analyst serving application development and delivery at Forrester Research, said she is concerned Blue Button 2.0 isn’t pushing the efforts around healthcare APIs far enough.

“Claims information is not the full picture,” she said. “If we’re truly making EHR records portable and something the consumer can own, you have to have beneficiaries download their medical information. That’s great, but how are they going to share it? What’s interesting about the Apple effort as a consumer is that you’re able to share that information with another provider. And it’s easy, because it’s all on your phone. I haven’t seen from Medicare yet how they might do it in the same user-friendly way.”

Sartin acknowledged Blue Button 2.0 takes aim at just a part of the bigger problem.

“My team is focused just on CMS and healthcare in a very narrow way. We recognize there are broader data and healthcare issues,” she said.

But when it comes to the world of healthcare APIs, it’s important to take that first step. And it’s also important to remember the complexity of the job ahead, something Sartin said her team — top-notch developers from private industry who chose government service to help — realized after they jumped into the world of healthcare APIs.

“We have engineers who’ve not worked in healthcare who thought the FHIR standard was overly complex,” she said. “But when you start to dig into the complexity of health data, you recognize sharing health data with each doctor means something different. This is not as seamless as with banks that can standardize on numbers. There, a one is a one. But in health terminology, a one can mean 10 different things. You can’t normalize it. Having an outside perspective forces the health community to question it all.”
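
A toy example makes that last point concrete. In FHIR, a code never travels alone; it is paired with the code system that defines it, which is what keeps one system’s “1” from colliding with another’s. The code systems and values below are invented purely for illustration:

    # Invented example: the same bare code "1" means two unrelated things
    # under two different code systems, mirroring FHIR's Coding data type,
    # which always pairs a code with its defining system URI.
    blood_type = {"system": "http://example.org/lab/blood-type", "code": "1"}
    visit_type = {"system": "http://example.org/claims/visit-type", "code": "1"}

    def codings_match(a: dict, b: dict) -> bool:
        """Codes are comparable only within the same code system."""
        return a["system"] == b["system"] and a["code"] == b["code"]

    print(codings_match(blood_type, visit_type))  # False: same digit, different meanings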

Chief data officer role: Searching for consensus

Big data continues to be a force for change. It plays a part in the ongoing drama of corporate innovation — in some measure, giving birth to the chief data officer role. But consensus on that role is far from set.

The 2018 Big Data Executive Survey of decision-makers at more than 50 blue-chip firms found 63.4% of respondents had a chief data officer (CDO). That is a big uptick since survey participants were asked the same question in 2012, when only 12% had a CDO. But this year’s survey, which was undertaken by business management consulting firm NewVantage Partners, disclosed that the background for a successful CDO varies from organization to organization, according to Randy Bean, CEO and founder of NewVantage, based in Boston.

For many, the CDO is likely to be an external change agent. For almost as many, the CDO may be a long-trusted company hand. The best CDO background could be that of a data scientist, line executive or, for that matter, a technology executive, according to Bean.

In a Q&A, Bean delved into the chief data role as he was preparing to lead a session on the topic at the annual MIT Chief Data Officer and Information Quality Symposium in Cambridge, Mass. A takeaway: Whatever it may be called, the chief data officer role is central to many attempts to gain business advantage from key emerging technologies. 

Do we have a consensus on the chief data officer role? What have been the drivers?

Randy Bean: One principal driver in the emergence of the chief data officer role has been the growth of data.

Randy Bean, CEO, NewVantage Partners

For about a decade now, we have been into what has been characterized as the era of big data. Data continues to proliferate. But enterprises typically haven’t been organized around managing data as a business asset.

Additionally, there has been a greater threat posed to traditional incumbent organizations from agile data-driven competitors — the Amazons, the Googles, the Facebooks.

Organizations need to come to terms with how they think about data and, from an organization perspective, to try to come up with an organizational structure and decide who would be a point person for data-related initiatives. That could be the chief data officer.

Another driver for the chief data officer role, you’ve noted, was the financial crisis of 2008.

Bean: Yes, the failures of the financial markets in 2008-2009, to a significant degree, were a data issue. Organizations couldn’t trace the lineage of the various financial products and services they offered. Out of that came an acute level of regulatory pressure to understand data in the context of systemic risk.

Banks were under pressure to identify a single person to regulators to address questions about data’s lineage and quality. As a result, banks took the lead in naming chief data officers. Now, we are into a third or fourth generation in some of these large banks in terms of how they view the mandate of that role.

Isn’t that type of regulatory driver somewhat spurred by the General Data Protection Regulation (GDPR), which recently went into effect? Also, for factors defining the CDO role, NewVantage Partners’ survey highlights concerns organizations have about being surpassed by younger, data-driven upstarts. What is going on there?

Bean: GDPR is just the latest of many previous manifestations of this. There have been the Dodd-Frank regulations, the various Basel reporting requirements and all the additional regulatory requirements that go along with classifying banks as ‘too large to fail.’

That is a defensive driver, as opposed to the offensive and innovation drivers that are behind the chief data officer role. On the offensive side, the chief data officer is about how your organization can be more data-driven, how you can change its culture and innovate. Still, as our recent survey finds, there is a defensive aspect, even there. Increasingly, organizations perceive threats coming from all kinds of agile, data-driven competitors.

Organizations need to come to terms with how they think about data and, from an organization perspective, to try to come up with an organizational structure and decide who would be a point person for data-related initiatives. That could be the chief data officer.
Randy Bean, CEO and founder, NewVantage

You have written that big data and AI are on a continuum. That may be worthwhile to emphasize, as so much attention turns to artificial intelligence these days.

Bean: A key point is that big data has really empowered artificial intelligence.

AI has been around for decades. One of the reasons it didn’t gain traction sooner is that, as a learning mechanism, it requires large volumes of data. In the past, data was only available in subsets or samples or in very limited quantities, and the corresponding learning on the part of the AI was slow and constrained.

Now, the massive proliferation of data and new sources — in addition to transactional information, you also have sensor data, locational data, pictures, images and so on — has led to the breakthrough in AI in recent years. Big data provides the data that is needed to train the AI learning algorithms.

So, it is pretty safe to say there is no meaningful artificial intelligence without good data — without an ample supply of big data.

And it seems to some of us, on this continuum, you still need human judgment.

Bean: I am a huge believer in the human element. Data can help provide a foundation for informed decision-making, but ultimately it’s the combination of human experience, human judgment and the data. If you don’t have good data, that can hamper your ability to come to the right conclusion. Just having the data doesn’t lead you to the answer.

One thing I’d say is, just because there are massive amounts of data, it hasn’t made individuals or companies any wiser in and of itself. It’s just one element that can be useful in decision-making, but you definitely need human judgment in that equation, as well.

Page Locking comes to OneNote Class Notebooks

Educators face an array of challenges, not least of which is ongoing classroom management. As more and more teachers use Class Notebooks, either standalone or integrated with Microsoft Teams, the most common request we’ve heard from teachers is the ability to “lock” a page. This capability gives educators control, making a OneNote page read-only for students while still allowing the teacher to add feedback or marks. Today, we are excited to deliver on this request and begin rolling out page locking broadly to help teachers manage their classrooms and save time.

Page Locking—To further simplify classroom workflows, we are delivering on the number-one request from teachers for OneNote Class Notebooks: the ability to lock pages. With our new page locking, the following capabilities are enabled:

  • Teachers can now lock all student copies of a distributed page as read-only after giving feedback to the student.
  • Teachers can unlock or lock an individual student’s page by simply right-clicking on it.
  • Teachers using Microsoft Teams to create OneNote assignments can have the page of the OneNote assignment automatically lock as read-only when the due date and time pass.

During our early testing process, we’ve had teachers trying out page locking in their classrooms. Robin Licato, an AP Chemistry and Forensic Science teacher at St. Agnes Academy in Houston, TX, had this to say: “This feature is an absolute game changer. I am enjoying the ability to unlock a specific student who has an extension on an assignment due to illness or absence while keeping the page locked for students who did not complete the assignment on time!”

Scott Titmas, Technology Integration Specialist at Old Bridge Township Public Schools, NJ, was also an early beta tester of the new page locking feature. “The page locking feature is extremely intuitive, easy to use, and opens a whole new world of possibilities for teachers. It will be a welcomed feature addition for all teachers. More encouraging than just this feature is the fact that Microsoft has consistently shown they listen to their users, and user voice drives the direction of product development.”

Platforms supported

Initially, we are rolling this out for OneNote for Windows 10, OneNote 2016 desktop (via add-in), OneNote Online, and OneNote for iPad. Most platforms will provide page locking built into the toolbar. For OneNote 2016 desktop, download the new free add-in.

For additional details on which version of OneNote is required for both teacher and students, please visit this new OneNote Class Notebook page locking support article.  It is important to read this article to understand the details before rolling this out.

Important Note #1: For OneNote 2016 Desktop MSI customers, you must deploy this Public Update first before student and teacher pages will properly lock. Please work with your IT admin to ensure you deploy this patch first. Page Locking is not supported for OneNote 2013 Desktop clients.

Important Note #2: Page Locking works best when a page is distributed or made into an assignment. For example, if students copy pages manually from the Content Library into their own notebooks and change the page title, the teacher will have to manually right-click on the student page to lock it, instead of being able to use the single checkbox to lock all pages.

Page Locking in OneNote for Windows 10

Page Locking in OneNote 2016 Desktop


Teacher right click to unlock a page

Class Notebook Addin version 2.5.0.0

  • Page Locking support to allow teachers to make a page or set of student pages read-only
  • Bug fixes and performance improvements

We hope you enjoy these new updates! Share any feedback at @OneNoteEDU, and if you need support or help, you can file a ticket here: http://aka.ms/edusupport.

PCs made everyone more productive at home, school, and at work — and artificial intelligence could change the world just as much

  • This post is part of Business Insider’s ongoing series on Better Capitalism.
  • We’re already seeing the impact of AI, argues Microsoft’s Harry Shum.
  • To make it truly benefit everyone, it needs to be developed responsibly.

When Bill Gates and Paul Allen founded Microsoft more than 40 years ago, their aim was to bring the benefits of computing — then largely locked up in mainframes — to everyone.

They set out to build software for a “personal” computer that would help people be more productive at home, at school and at work.

The personal computer democratized technology that was previously available only to a few select people. Today, artificial intelligence has the same potential.

AI offers incredible opportunities to drive global economic and social progress.

The key to bringing the benefits of AI to everyone — not just a select few — is to develop AI to be human-centered.

Put simply, AI systems should be created to augment human abilities. We want AI technology to enable people to achieve more, and we’re optimistic that this can happen.

Already, we are seeing how AI can have a tangible, useful impact.

For example, with the world’s population expected to grow by nearly 2.5 billion people over the next quarter century, AI can help to increase food production. A Microsoft research project called FarmBeats is providing farmers with insights that can help them improve agricultural yield, lower overall costs and reduce their environmental impact.

A collaboration between Microsoft and university researchers, called Project Premonition, aims to use AI to detect dangerous pathogens in the environment before a disease such as Zika becomes a full-fledged public health emergency. The system uses everything from autonomous drones to robotic mosquito traps to try to identify pathogens as they are emerging.

Microsoft recently announced a partnership with Seattle-based Adaptive Biotechnologies. Our shared goal is to create a universal blood test that reads a person’s immune system to detect a wide variety of diseases, including infections, cancers and autoimmune disorders, when they can be most effectively diagnosed and treated.

Clearly, AI is beginning to augment human understanding and decision-making. Therefore, it’s imperative for companies to develop and adopt clear principles that guide the people building, using and applying AI systems.

Among other things, these principles should ensure that AI systems are fair, reliable, safe, private, secure, inclusive, transparent and accountable.

To help achieve this, the people designing AI systems should be diverse, reflecting the diversity of the world in which we live.

When AI systems are used to help make life decisions, it is particularly important that they are transparent, so people understand how those decisions were made. And those who develop and deploy AI systems need to be accountable for how their systems operate.

There’s no single company or organization that can develop these principles in a vacuum.

Business leaders, policymakers, researchers, academics and representatives of non-governmental groups must work together to ensure that AI-based technologies are designed and deployed in a responsible manner. Organizations such as the Partnership on AI, which brings together experts from industry, academia and civil society, will be important vehicles in advancing this important dialogue, including by developing best practices.

By encouraging open and honest discussion, we believe that everyone can help create a culture of cooperation, trust and openness among AI developers, users, and the public at large.

Harry Shum is Microsoft’s executive vice president for Artificial Intelligence and Research. Microsoft recently published the book “The Future Computed: Artificial Intelligence and Its Role in Society.”

For Sale – PNY GTX470 GPU

As part of my ongoing clear out I have the following for sale;

GTX470 GPU – £30 inc

Sold;
i5 2500k gigabyte GA-Z68AP motherboard bundle with 8GB RAM (Corsair Vengeance LP 1600 or HyperX Genesis 1600 if you prefer – I don’t mind which sticks I keep of the 2 pairs) – £120 inc fully insured delivery

Price and currency: £150
Delivery: Delivery cost is included within my country
Payment method: ppg/bank xfer
Location: Farnborough
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

For Sale – i5 2500k mobo bundle & GTX470

As part of my ongoing clear out I have the following for sale;

i5 2500k gigabyte GA-Z68AP motherboard bundle with 8GB RAM (Corsair Vengeance LP 1600 or HyperX Genesis 1600 if you prefer – I don’t mind which sticks I keep of the 2 pairs) – £120 inc fully insured delivery
GTX470 GPU – £30 inc

£140 the pair inc fully insured delivery

Not 100% sure on Prices so if I’m out please feel free to offer

Price and currency: £150
Delivery: Delivery cost is included within my country
Payment method: ppg/bank xfer
Location: Farnborough
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference

For Sale – i5 2500k mobo bundle & GTX470

As part of my ongoing clear out I have the following for sale;

i5 2500k gigabyte motherboard bundle with 8GB RAM – £120 inc fully insured delivery
GTX470 GPU – £30 inc

£140 the pair inc fully insured delivery

Not 100% sure on Prices so if I’m out please feel free to offer

Price and currency: £150
Delivery: Delivery cost is included within my country
Payment method: ppg/bank xfer
Location: Farnborough
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference

GE’s Predix platform bolstered by data, domain expertise

BOSTON — CIOs looking for a digital transformation case study will find an ongoing master class at GE. With its Predix platform, which collects and analyzes sensor data from industrial assets such as MRI machines and has been generally available for 18 months, the company is attempting to position itself as the backbone of the industrial internet of things.

But the transformation efforts have been slow to produce results. The company’s earnings are lagging behind those of industrial competitors — making shareholders uneasy and ultimately leading to the recent departure of CEO Jeff Immelt, a digital advocate and proponent of the two-year-old GE Digital.

That was the backdrop for a recent media day event at the company’s temporary headquarters in Boston. Three representatives from GE Digital — Mark Bernardo, vice president of professional services; Mike Varney, senior director of product management; and Jeff Erhardt, vice president of intelligent systems — provided an informal presentation on GE’s Predix platform, the critical role of data and domain expertise for machine learning, and what the future of GE’s young business unit might look like.

Predix platform is key

Immelt was replaced last month by John Flannery, a GE veteran who most recently worked with the company’s healthcare division. One of Flannery’s early tasks as CEO is performing a deep dive into each of GE’s businesses. He plans to complete his audit later this year and present recommendations to investors.

What Flannery’s investigation will mean for the future of the company is yet to be seen. But the representatives from GE Digital said they’ve seen no change in strategy to date and that Immelt’s vision to create the platform for the industrial IoT will likely continue.

In fact, Bernardo, a GE employee for more than 10 years, described reports that GE Digital will need to step up revenue production in 2018 as “normal GE behavior” and not a deviation from strategy.

“Our platform, our application investments, our investment in machine learning, our investment in our talent, the reason why domain expertise is important to us is because we need it in order to generate the outcomes our customers need, and to generate the growth and productivity that we need as a business,” he said. “We are as dependent on this strategy as any of our customers.”

With the mention of machine learning, Bernardo is referring, in part, to GE Digital’s 2016 acquisition of Wise.io, a startup out of Berkeley, Calif., that specialized in predicting customer behavior. That may seem like a far cry from industrial assets, but Erhardt, CEO at Wise.io at the time of acquisition, said the key to solving hard problems like predicting customer or machine behavior hinges on a common, underlying data platform that provides a foundation for application development.

“That’s what Salesforce.com has done,” Erhardt said. GE’s Predix platform is built on the same basic model. Erhardt said Wise.io observed from dealings with customers that a data platform is necessary to successfully scale a company based around machine learning, and that it was one of the reasons why being acquired by GE made sense for the startup.

Data is the new oil pipeline

For Wise.io’s part, its job is to make GE applications intelligent. Doing so generally requires computational power and machine learning algorithms — both of which have become commoditized at this point — as well as the increasingly valuable data and domain expertise, according to Erhardt.

“[Data and domain expertise] are at the forefront of both research and how you apply these intelligent techniques, as well as where you can create value,” he said.

He used GE’s intelligent pipeline integrity services products, which rely on the same basic imaging technology packaged in the healthcare business’s products, as an example. “We stick [them] in an oil pipeline and we use [them] to look for defects and weaknesses indicative of that pipeline potentially blowing up,” Erhardt said.

But the technology captures so much data — Erhardt said roughly a terabyte of images — that it can take highly trained experts months to sort out. The machine learning technology, which he defines as “the ability for computers to mimic human decision-making around a data-driven work flow,” relies on past data and decisions to flag problematic areas at super-human speeds.

“The purpose and the idea behind this is to clean up the noise and allow the people to focus on the highest risk, [the] most uncertain areas,” Erhardt said.
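
As a rough illustration of that triage pattern (score everything, route only the riskiest segments to human experts), here is a minimal sketch using scikit-learn. The features, labels and threshold are invented for the example; GE’s actual inspection models are not public.

    # Invented sketch of ML-assisted inspection triage: train on past
    # expert-labeled readings, then score new segments so people review
    # only the highest-risk areas. Not GE's actual model or data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical features per pipeline segment (wall thickness in mm,
    # normalized signal anomaly score); labels: 1 = defect confirmed.
    X_train = np.array([[9.8, 0.2], [9.6, 0.3], [6.1, 0.9],
                        [5.8, 0.8], [8.9, 0.4], [6.5, 0.7]])
    y_train = np.array([0, 0, 1, 1, 0, 1])

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Score fresh segments; flag only those above a triage threshold.
    X_new = np.array([[9.9, 0.1], [6.0, 0.85], [8.0, 0.5]])
    for i, p in enumerate(model.predict_proba(X_new)[:, 1]):
        if p > 0.5:  # arbitrary cutoff for expert review
            print(f"segment {i}: defect risk {p:.2f} -> route to expert")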

The technology doesn’t replace human decision-making outright. Erhardt said his team is spending a good chunk of its time striking the right balance between automation, augmentation and deference. In the latter case, the system defers to domain experts, who may have decades of experience working with complex industrial assets. Domain experts also help GE’s managed service customers prioritize anomalies surfaced by machine learning technology.

Keeping a human in the loop, in other words, is essential. “What’s really important here — and this is different than the consumer space — the cost of being wrong can be very, very high,” Erhardt said.

It’s another reason why machine learning algorithms have to be well-trained, which requires enormous amounts of data. Instead of relying on data generated by a single pipeline integrity product or even a single customer, the Predix platform enables the company to collect and aggregate data across its customer base — and even across its businesses — in a single location. This gives the machine learning tech plenty of training data to learn with and potentially gives GE Digital the raw material to create new revenue streams.

“We’re looking for commonality across these very powerful business cases that exist within our business. What it then gives us the ability to do is to create these derivative products,” Erhardt said. He cited Google’s 2013 acquisition of Waze, an application that helps users avoid traffic jams by using geolocation driver data, as an example of how companies are using data generated by one application to help power other applications. Waze remains a stand-alone application, but the data shared by drivers is now used for city planning purposes.

“The way that we approach this is if you get the core product right — if you can entice your customers to contribute back more data — you not only make that good but you create opportunities you didn’t know about before,” Erhardt said. “That’s what we’re working on.”