Go to Original Article
The financial services community has unprecedented opportunity ahead. With new technologies like cloud, AI and blockchain, firms are creating new customer experiences, managing risk more effectively, combating financial crime, and meeting critical operational objectives. Banks, insurers and other service providers are choosing digital innovation to address these opportunities at a time when competition is increasing from every angle – from traditional and non-traditional players alike.
At the same time, our experience is that lack of clarity in regulation can hinder adoption of these exciting technologies, as regulatory compliance remains fundamental to financial institutions using technology they trust. Indeed, the common question I get from customers is: Will regulators let me use your technology, and have you built in the capabilities to help me meet my compliance obligations?
With this in mind, we applaud the European Banking Authority’s (EBA) revised Guidelines on outsourcing arrangements which, in part, address the use of cloud computing. For several years now we have shared perspectives with regulators on how regulation can be modernized to address cloud computing without diminishing the security, privacy, transparency and compliance safeguards necessary in a native cloud or hybrid-cloud world. In fact, cloud computing can afford financial institutions greater risk assurance – particularly on key things like managing data, securing data, addressing cyber threats and maintaining resilience.
At the core of the revised guidelines are a set of flexible principles addressing cloud in financial services. Indeed, the EBA has been clear these “guidelines are subject to the principle of proportionality,” and should be “applied in a manner that is appropriate, taking into account, in particular, the institution’s or payment institution’s size … and the nature, scope and complexity of its activities.” In addition, the guidelines set out to harmonize approaches across jurisdictions, a big step forward for financial institutions to have predictability and consistency among regulators in Europe. We think the EBA took this smart move to support leading-edge innovation and responsible adoption, and prepare for more advanced technology like machine learning and AI going forward.
Given these guidelines reflect a modernized approach that transcends Europe, we have updated our global Financial Services Amendment for customers to reflect these key changes. We have also created a regulatory mapping document which shows how our cloud services and underlying contractual commitments map to these requirements in an EU Checklist. The EU Checklist is accessible on the Microsoft Service Trust Portal. In essence, Europe offers the benchmark in establishing rules to permit use of cloud for financial services and we are proud to align to such requirements.
Because this is such an important milestone for the financial sector, we wanted to share our point-of-view on a few key aspects of the guidelines, which may help firms accelerate technology transformation with the Microsoft cloud going forward:
- Auditability: As cloud has become more prevalent, we think it is natural to extend audit rights to cloud vendors in circumstances that warrant it. We also think that audits are not one-size-fits-all but should be adapted to the use case – particularly whether it involves running core banking systems in the cloud. Microsoft has provided innovations to help customers supervise and audit hyper-scale cloud.
- Data localization: We are pleased there are no data localization requirements in the EBA guidance. Rather, customers must assess the legal, security and other risks where data is stored, as opposed to mandating data be stored strictly in Europe. We help customers manage and assess such risk by providing:
- Contractual commitments to store data at rest in a specified region (including Europe).
- Transparency where data is stored.
- Full commitments to meet key privacy requirements, like the General Data Protection Regulation (GDPR).
- Flow-through of such commitments to our subcontractors.
- Subcontractors: The guidelines address subcontractors, particularly those that provide “critical or important” functions. Management, governance and oversight of Microsoft’s subcontractors is core to what we do. Among other things:
- Microsoft’s subcontractors are subject to a vetting process and must follow the same privacy and governance controls we ourselves implement to protect customer data.
- We provide transparency about subcontractors who may have access to customer data, and we give 180 days’ notice of any new subcontractors.
- We provide customers termination rights should they conclude a subcontractor presents a material increase in risk to a critical or important function of their operations.
- Core platforms: We welcome the EBA’s position providing clarity that core platforms may run in the cloud. What matters is governance, documented protocols, the security and resiliency of such systems, appropriate oversight (and audit rights), and commitments to terminate an agreement if and when that becomes necessary. These are all capabilities Microsoft offers to its customers, and we now see movement among leading banks to put core systems into our cloud because of the benefits we provide.
- Business continuity and exit planning: Institutions must maintain business continuity plans and test them periodically for critical or important functions. Microsoft has supported our customers in meeting this requirement by providing a Modern Cloud Risk Assessment toolkit and, in the Service Trust Portal, documentation on our service resilience architecture, our Enterprise Business Continuity Management (EBCM) team, and a quarterly report detailing results from our recent EBCM testing. In addition, we have supported our customers in preparing exit planning documentation, and we work with industry bodies like the European Banking Federation towards further industry guidance for these new EBA requirements.
- Concentration risk: The EBA addresses the need to assess whether concentration risk may exist due to potential systemic failures in the use of cloud services. That assessment should be balanced against the single-point-of-failure risks and trade-offs of existing legacy infrastructure. In short, financial institutions should assess the resiliency and safeguards provided with our hyper-scale cloud services, which can offer a more robust approach than many systems in place today. When making those assessments, financial institutions may decide to lean in further on cloud as they transform their businesses going forward.
The EBA framework is a great step forward to help modernize regulation and take advantage of cloud computing. We look forward to participating in ongoing industry discussion, such as new guidance under consideration by the European Insurance and Occupational Pensions Authority concerning use of cloud services, as well as assisting other regions and countries in their journey to creating more modern policy that supports innovation while protecting the integrity of critical global infrastructure.
For more information on Microsoft in the financial services industry, please go here.
Top photo courtesy of the European Banking Authority.
Author: Microsoft News Center
Security researchers discovered a set of vulnerabilities in Supermicro servers that could allow threat actors to remotely attack systems as if they had physical access to the USB ports.
Researchers at Eclypsium, based in Beaverton, Ore., discovered flaws in the baseboard management controllers (BMCs) of Supermicro servers and dubbed the set of issues “USBAnywhere.” The researchers said authentication issues put servers at risk because “BMCs are intended to allow administrators to perform out-of-band management of a server, and as a result are highly privileged components.
“The problem stems from several issues in the way that BMCs on Supermicro X9, X10 and X11 platforms implement virtual media, an ability to remotely connect a disk image as a virtual USB CD-ROM or floppy drive. When accessed remotely, the virtual media service allows plaintext authentication, sends most traffic unencrypted, uses a weak encryption algorithm for the rest, and is susceptible to an authentication bypass,” the researchers wrote in a blog post. “These issues allow an attacker to easily gain access to a server, either by capturing a legitimate user’s authentication packet, using default credentials, and in some cases, without any credentials at all.”
The USBAnywhere flaws make the virtual USB drive act in the same way a physical USB drive would, meaning an attacker could load a new operating system image, deploy malware or disable the target device. However, the researchers noted the attacks would be possible only on systems where the BMCs are directly exposed to the internet or if an attacker already has access to the corporate network.
Rick Altherr, principal engineer at Eclypsium, told SearchSecurity, “BMCs are one of the most privileged components on modern servers. Compromise of a BMC practically guarantees compromise of the host system as well.”
Eclypsium said there are currently “at least 47,000 systems with their BMCs exposed to the internet and using the relevant protocol.” These systems would be at additional risk because BMCs are rarely powered off and the authentication bypass vulnerability can persist unless the system is turned off or loses power.
Altherr said he found the USBAnywhere vulnerabilities because he “was curious how virtual media was implemented across various BMC implementations,” but Eclypsium found that only Supermicro systems were affected.
According to the blog post, Eclypsium reported the USBAnywhere flaws to Supermicro on June 19 and provided additional information on July 9, but Supermicro did not acknowledge the reports until July 29.
“Supermicro engaged with Eclypsium to understand the vulnerabilities and develop fixes. Supermicro was responsive throughout and worked to coordinate availability of firmware updates to coincide with public disclosure,” Altherr said. “While there is always room for improvement, Supermicro responded in a way that produced an amicable outcome for all involved.”
Altherr added that customers should “treat BMCs as a vulnerable device. Put them on an isolated network and restrict access to only IT staff that need to interact with them.”
Supermicro noted in its security advisory that isolating BMCs from the internet would reduce the risk from USBAnywhere but not eliminate the threat entirely. Firmware updates are currently available for affected Supermicro systems, and in addition to updating, Supermicro advised users to disable virtual media by blocking TCP port 623.
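Until updates and isolation are in place, administrators can at least check whether the virtual media service is reachable from a given network position. The sketch below is a minimal reachability probe using Python's standard socket module; the hostname is a placeholder, not a real system.

```python
import socket

def virtual_media_port_open(host: str, port: int = 623, timeout: float = 2.0) -> bool:
    """Return True if the BMC virtual-media TCP port accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts and DNS failures alike.
        return False

# "bmc.example.internal" is a placeholder hostname for illustration.
if virtual_media_port_open("bmc.example.internal"):
    print("Port 623 reachable -- virtual media service may be exposed")
else:
    print("Port 623 blocked or host unreachable")
```

A blocked or filtered port causes `create_connection` to raise an `OSError` subclass, so the probe simply returns False from wherever the firewall rule applies.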
On last week’s earnings call with financial analysts, Workday Inc. CEO Aneel Bhusri was asked for his opinion on the broader economic outlook. He was both vague and definitive. His company wasn’t seeing problems in its own product pipeline, “but there is no question there is uncertainty in the air,” he said.
In HR departments, the uncertainty has turned into action, according to Gartner. In a survey of 171 HR managers, 92% said they are now “prioritizing budgeting and cost optimization initiatives.”
This means HR managers are taking specific steps to control spending, said Daniel Dirks, managing vice president at Gartner’s HR practice. The most likely effects are on department hiring and technology buying decisions, he said.
A hiring freeze would be near the top of HR budget actions, “because it is relatively easy to do,” Dirks said. HR managers are also taking a hard look at their tech vendor contracts. They are “making sure what was promised in the contract is really being delivered,” he said.
But no worries, so far, in HR tech
The concern about an economic downturn is not turning up in HR vendor spending.
On Aug. 29, Workday, for instance, reported total revenues of nearly $888 million, an increase of 32% from the same quarter a year ago.
ADP LLC recently reported a revenue increase of 6% to $14.2 billion. Lisa Ellis, a MoffettNathanson partner who leads its payments, processors, and IT services business, described the increase as “great results” on a July 31 analyst call. Ellis also noted on the call that ADP’s guidance for 2020 “implies a pretty robust outlook on the U.S. economy.”
Venture capital (VC) investments in HR remain strong, according to HRWins, which reported nearly $1.5 billion in global HR VC investment in the second quarter. It sees 2019 VC investment outpacing last year by a strong margin.
Nonetheless, Dirks said there is a “change in sentiment and in mindset” in HR because of the economy. For the last 10 years, HR priorities have focused on finding talent and investing in tech; cost optimization is now emerging as a new priority. But Dirks said the new priority isn’t necessarily emerging at the expense of HR’s other priorities.
Gartner is also advising HR managers to play a broader role in watching the economy. It recommends teaming up with peers in finance and sales, for instance, to look at broader economic data.
HR managers have expertise in the labor market and may be able to identify market shifts. This could include, for instance, an increase in part-time hiring, if firms are becoming more conservative in hiring full time.
At the IFA tradeshow in Berlin, ASUS celebrated its 30th anniversary and unveiled a lineup of digital solutions aimed at content creators and gamers. To meet the evolving needs of professional workflows in traditional content-creation fields, such as photography and videography, as well as provide new solutions for 3D designers, game developers and more, ASUS is introducing a brand-new lineup of ProArt products.
ASUS touts the ProArt StudioBook One as its most powerful StudioBook ever. It’s the first laptop to feature NVIDIA Quadro RTX 6000 graphics and is powered by the latest 9th Generation Intel Core i9 processor. NVIDIA Quadro RTX graphics provide users with more CUDA, RT and Tensor cores, enabling smoother and more efficient rendering of animations, 8K video editing and data calculations.
ProArt StudioBook One has a powerful cooling system featuring a lightweight aerospace-grade titanium alloy thermal module designed to optimize inlet and exhaust flow. All heat-generating components, including the CPU, GPU and thermal systems, are located in the lid to ensure comfortable use, even when placed on the user’s lap. When the laptop is opened, the outer cover of the lid opens automatically by 4.57 degrees to aid cooling.
The ProArt StudioBook Pro X is the first Quadro laptop to feature the four-sided ASUS NanoEdge display. This display design provides a 92% screen-to-body ratio and 16:10 aspect ratio for immersive visuals. It also supports a wide color gamut with 97% DCI-P3 color space coverage and delivers high color accuracy. This is also the first in the series to feature ScreenPad 2.0. This interactive secondary touchscreen upgrades the traditional laptop experience, providing an intuitive smartphone-like interface on which you can easily manage tasks and streamline your workflows.
You can edit and render multilayered files with professional-grade NVIDIA Quadro RTX 5000 graphics, optimized for stability and performance with professional software apps. This device is powered by a 9th Generation Intel Xeon or Intel Core i7 hexa-core processor designed to handle complex, multithreaded applications. It is designed to operate at full load without the need to throttle the speed of the CPU or GPU, making it outstandingly reliable for even the toughest workloads.
The ProArt Station D940MX is a compact workstation-grade desktop designed for content creators and media professionals. It has a dual-sided logic board to house its powerful CPU, GPU and memory. It’s powered by an Intel Core i9 processor with up to 64GB DDR4 2666MHz memory and NVIDIA Quadro RTX 4000 or NVIDIA GeForce RTX 2080 Ti graphics. It also has dual storage with a 512GB PCIe SSD and 1TB HDD, and ultrafast connectivity, including dual Thunderbolt 3 ports on the front panel.
Strix laptop in Glacier Blue
The ASUS Republic of Gamers also introduced laptops at IFA 2019. The Zephyrus and Strix laptops in a new Glacier Blue hue are intended to appeal to graphic artists, 3D animators, game designers and video editors – to name just a few examples – who can speed up their work thanks to pre-installed NVIDIA Creator Ready Drivers that improve performance for creative apps and programs, such as the Adobe Creative suite, CINEMA 4D and Unreal Engine.
Strix G elevates core gaming essentials in an affordable yet potent package, while Zephyrus M and Zephyrus S are hybrid powerhouses that offer a mix for work and play. Each is available with up to a 9th Generation Intel Core i7-9750H processor, enabling these machines to slice through serious workloads with speed. The new six-core CPU can hit frequencies of up to 4GHz on a single core with Turbo Boost 2.0 technology, and Hyper-Threading enables up to 12 parallel threads to accelerate heavy-duty work.
Add up to 32GB of DDR4-2666 memory, and each one of these machines can handle serious multitasking for gamers and content creators alike. Multimedia professionals can work on intensive projects, like video editing and 3D rendering, quickly and efficiently. Gamers benefit from the ability to broadcast high-quality streams, chat with the channel and play the latest titles – all at the same time.
Find out more about ASUS at IFA 2019.
Days after VMware’s CEO proclaimed his vSAN product the winner in the hyper-converged infrastructure space, the CEO of VMware rival Nutanix countered that VMware “sells a lot of vaporware.”
“We’re crushing Nu … I mean we’re winning in the marketplace,” VMware CEO Pat Gelsinger said during his opening VMworld keynote last week. “We’re separating from No. 2. We’re winning in the space.”
Two days later on Nutanix’s earnings call, CEO Dheeraj Pandey took a shot at VMware without mentioning the company by name. “We don’t sell vaporware,” he said, when referring to why Nutanix wins in competitive deals.
In an exclusive interview after the call, Pandey admitted the vaporware charge was aimed mostly at VMware’s vSAN HCI software.
“VMware sells a lot of vaporware,” Pandey said. “A lot of that vaporware becomes evident to customers who buy that stuff. When bundled products don’t deliver on their promise, they call us. What we sell is not shelfware.”
Whatever VMware is selling with its vSAN HCI software, it is working. VMware reported license bookings of its vSAN HCI software grew 45% year-over-year last quarter, while Nutanix revenue and bookings slipped from last year. VMware’s parent Dell also claimed a 77% increase in orders of its Dell EMC VxRail HCI appliances that run vSAN software. Those numbers suggest Dell increased market share against Nutanix, even if Nutanix did better than expected last quarter following a disappointing period. IDC listed VMware as the HCI software market leader and Dell as the hardware HCI leader in the first quarter of 2019, with Nutanix second in both categories. Gartner lists Nutanix as the HCI software leader, but No. 2 VMware made up ground in Gartner’s first-quarter numbers.
Nutanix’s Pandey attributed at least some of VMware’s HCI success to bundling its vSAN software with its overall virtualization stack. Like VMware, Nutanix has its own hypervisor (AHV) and its share of hardware partners — including Dell — but VMware has a huge vSphere installed base to sell vSAN into.
Pandey said he was unimpressed by VMware’s Kubernetes and open source plans laid out at VMworld, which included Tanzu and Project Pacific. Both are still roadmap items but reflect a commitment from VMware to containers and open source software.
“That’s worse than vaporware, that’s slideware,” Pandey said of VMware’s announcements. “Everything works in slides. We’re based on Linux; we get a lot of leverage out of open source. AHV was based on Linux, and we’ve made it enterprise grade.”
Making vSAN part of its vSphere virtualization platform has paid off for VMware. Customers at VMworld pointed to their familiarity with VMware and vSAN’s integration with vSphere, and its NSX software-defined networking as reasons for going with vSAN HCI.
“What really ended up selling it for us was, we were already using VMware for our base product and the vast majority of the deliverables that our customers request is in vSphere,” said Lester Shisler, senior IT systems engineer at Harmony Healthcare IT, based in South Bend, Ind. “So whatever pain points we learned along the way with vSAN, we were going to have to learn [with a competing HCI product] as well, along with new software and new management and everything else.”
Matthew Douglas, chief enterprise architect at Sentara Healthcare in Norfolk, Va., said Nutanix was among the HCI options he looked at before picking vSAN.
“VMware was ultimately the choice,” he said. “All the others were missing some components. VMware was a consistent platform for hyper-converged infrastructure. Plus, there was NSX and all these things that fit together in a nice, uniform fashion. And as an enterprise, I couldn’t make a choice of all these independent different tools. Having one consistent tool was the differentiator.”
Despite losing share, Nutanix’s last-quarter results were mixed. Its revenue of $300 million and billings of $372 million were both down from last year but better than expected following the disappointing previous quarter. Nutanix’s software and support revenue of $287 million increased 7%, a good sign for the HCI pioneer’s move to a software-centric business model. Nutanix also reported a 16% growth in deals over $1 million from the previous quarter.
However, operating expenses also increased. Sales and marketing spend jumped to $254 million from $183 million the previous year. Nutanix, which has never recorded a profit, lost $194 million in the quarter — more than double its losses from a year ago. It finished the quarter with $909 million in cash, down from $943 million last year.
Pandey said he is more concerned about growth and customer acquisition than profitability.
“Profitability is a nuanced word,” Pandey said. “We defer so much in our balance sheet. Right now we care about doing right by the customer when we sell them subscriptions.”
At IFA 2019, Acer announced its ConceptD Pro series notebook family with NVIDIA Quadro GPUs, built for peak performance and hours of uninterrupted use. The Pro series with Windows 10 meets the stringent requirements of applications in virtual reality, artificial intelligence and big data analytics. The notebooks also feature amber backlit keyboards. The new ConceptD Pro series notebooks featuring 9th Generation Intel Core processors and Windows 10 are designed not only for professional creators but also AI engineers and software developers.
At the top end, the ConceptD 9 Pro is a 17.3-inch notebook that targets complex engineering simulations with up to NVIDIA Quadro RTX 5000 GPUs with 16GB VRAM and 32 GB of DDR4 memory. Thanks to Acer’s CNC-machined Ezel Aero Hinge, the 4K display (3840 x 2160) can flip, extend and recline for convenient collaboration between team members. It’s built to handle AI/deep learning, engineering simulations and the workloads of large animation studios requiring power, flexibility and cross-compatibility. A Wacom EMR stylus is also included and magnetically attaches to the device.
At 15.6 inches, the ConceptD 7 Pro is designed for power on-the-go with up to NVIDIA Quadro RTX 5000 GPUs with 16 GB VRAM and 32 GB of DDR4 memory. This model is ideal for data scientists, software developers, engineers and professional design studios who demand power, flexibility and compatibility that is tested with all certified professional software applications.
The ConceptD 5 Pro is aimed at users in the advanced creative space and is available with 15- and 17-inch options – both with 4K UHD resolution – and NVIDIA Quadro RTX 3000 GPUs. It’s a device geared for complex CAD design, animation and simulation workflows, so it’s designed to appeal to architects, 3D animators, special effects producers and small design studios.
The ConceptD 3 Pro is a notebook for creators like photographers, industrial design students and graphic designers, featuring NVIDIA Quadro T1000 GPUs and a PANTONE-Validated display. Users can process all their media with true color reproduction and can login via a touch of the built-in fingerprint reader through Windows Hello for easy and more secure access.
Finally, the new ConceptD CM2241W is a desktop monitor featuring a slim bezel, excellent color accuracy supporting 99% of the Adobe RGB color gamut and hyper-smooth viewing, with a refresh rate up to 75 Hz.
Acer also introduced the Predator Triton 300 gaming notebook, which expands the Triton line with an affordable solution for mainstream gamers who also appreciate thin and lightweight designs. Up to a 9th Generation Intel Core i7 processor is paired with an NVIDIA GeForce GTX 1650 GPU and 16GB of DDR4 2666MHz memory (upgradable to 32GB). To accommodate massive amounts of game storage, it will support up to two 1TB PCIe NVMe SSDs in RAID 0 and up to a 2TB hard drive. Killer Wi-Fi 6 AX 1650, along with Killer Ethernet, keeps the action moving quickly and lag-free.
A bright 15.6-inch Full-HD IPS display with a narrow bezel design includes a 144Hz refresh rate and 3ms overdrive response time, delivering realistic graphics and colors that pop, highlight every detail and bring games to life.
Now with a 300Hz 15.6-inch Full-HD display, the Predator Triton 500 is a powerful gaming notebook slimmed down to just 0.70 inches thin and weighing 4.6 pounds. It has a durable, all-metal chassis and narrow bezels for an 81% screen-to-body ratio.
In addition to these announcements, Acer also debuted refreshed models in its popular Swift ultra-portable notebook and Aspire all-in-one desktop ranges, all powered by the latest 10th Generation Intel Core processors and Windows 10.
The latest-generation 14-inch Swift 5 measures just 14.95 mm thin and weighs just 990 grams with a new discrete NVIDIA GeForce MX250 graphics option – or even less with integrated Intel Iris Pro graphics only. The screen features a three-side narrow bezel with a high screen-to-body ratio of 86.4%. It has a full-function USB 3.1 Type-C connector supporting Thunderbolt 3, dual-band Wi-Fi 6 (802.11ax), and Windows Hello support through a fingerprint reader.
The Swift 3 packs a lot of power into its 15.95 mm thin chassis with up to a 10th Generation Intel Core i7-1065G7 processor and an optional discrete NVIDIA GeForce MX250 GPU, keeping everything running at its optimum speed. It also includes up to 512 GB PCIe Gen 3×4 SSD storage, 16GB LPDDR4 RAM and dual-band Wi-Fi 6, for a smoother and more enjoyable wireless experience, making it an ultra-portable, supercharged laptop for work and play. It delivers up to 12.5 hours of battery life and fast charging: 30 minutes of charging can provide four hours of battery life in video playback conditions.
Within its space-saving design, the ultra-thin Aspire C all-in-one comes with plenty of practical features for both students and the whole family to enjoy. It is available with 27-inch, 24-inch and 22-inch Full HD IPS screen options, all powered by 10th Generation Intel Core processors and an optional discrete NVIDIA MX130 GPU.
Find out more about all of Acer’s announcements at IFA.
We’re just a short time away from the biggest Gears yet, with Early Access for Xbox Game Pass Ultimate members and Gears 5 Ultimate Edition owners, and today, while fans eagerly await its release, we’ve got a brand-new trailer showing off Gears 5’s five modes.
In Gears 5, there are five thrilling ways to play: the all-new aggressive, high-stakes co-op mode Escape; the competitive Versus mode, featuring nine modes including the all-new Arcade for players of all levels; the deepest Horde Mode ever; the intuitive Map Builder and the biggest Campaign yet.
Yesterday, we shared news about the latest blockbuster partnership for Gears 5, with WWE Superstar Batista making an appearance as a multiplayer character, donning the armor of the legendary Marcus Fenix. Fans can also look forward to the inclusion of Sarah Connor and the T-800 Endoskeleton from Terminator: Dark Fate and Spartans Emile-A239 and Kat-B320 from Halo: Reach with Xbox Game Pass Ultimate and Gears 5 Ultimate Edition.
Gears 5 early access will begin at 9pm on September 5th in your local time zone. For countries with multiple time zones, the earliest time zone will determine when you can play. For example, North American early access will begin simultaneously at 9pm ET, 8pm CT and 6pm PT.
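The clock math above can be sanity-checked with standard time-zone conversions. This is just an illustration of the arithmetic using Python's zoneinfo module (Python 3.9+), not official launch logic:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Early access opens at 9pm Eastern on September 5, 2019.
launch_et = datetime(2019, 9, 5, 21, 0, tzinfo=ZoneInfo("America/New_York"))

# The same instant expressed in Central and Pacific time.
launch_ct = launch_et.astimezone(ZoneInfo("America/Chicago"))
launch_pt = launch_et.astimezone(ZoneInfo("America/Los_Angeles"))

print(launch_ct.strftime("%I:%M %p %Z"))  # 08:00 PM CDT
print(launch_pt.strftime("%I:%M %p %Z"))  # 06:00 PM PDT
```

Since the launch moment is fixed to the earliest (Eastern) zone, Central and Pacific players simply see the same instant rendered on earlier local clocks.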
Both Xbox Game Pass Ultimate members and Gears 5 Ultimate Edition owners will be able to gear up and take the fight to the Swarm. On behalf of everyone at The Coalition, we’re excited to have fans jump into the world of Gears 5.
Transitioning to value-based care can be a tough road for healthcare organizations, but creating a plan and focusing on communication with stakeholders can help drive the change.
Value-based care is a model that rewards the quality rather than the quantity of care given to patients. The model is a significant shift from how healthcare organizations have functioned, placing value on the results of care delivery rather than the number of tests and procedures performed. As such, it demands that healthcare CIOs be thoughtful and deliberate about how they approach the change, experts said during a recent webinar hosted by Definitive Healthcare.
Andrew Cousin, senior director of strategy at Mayo Clinic Laboratories, and Aaron Miri, CIO at the University of Texas at Austin Dell Medical School and UT Health Austin, talked about their strategies for transitioning to value-based care and focusing on patient outcomes.
Cousin said preparedness is crucial, as organizations can jump into a value-based care model, which relies heavily on analytics, without the institutional readiness needed to succeed.
“Having that process in place and over-communicating with those who are going to be impacted by changes to workflow are some of the parts that are absolutely necessary to succeed in this space,” he said.
Mayo Clinic Labs’ steps to value-based care
Cousin said his primary focus as a director of strategy has been on delivering better care at a lower cost through the lens of laboratory medicine at Mayo Clinic Laboratories, which provides laboratory testing services to clinicians.
That lens includes thinking in terms of a mathematical equation: price per test multiplied by the number of tests ordered equals total spend for that activity. Today, much of a laboratory’s relationship with healthcare insurers is measured by the price per test ordered. Yet data shows that 20% to 30% of laboratory testing is ordered incorrectly, which inflates the number of tests ordered as well as the cost to the organization, and little is being done to address the issue, according to Cousin.
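Cousin's equation is simple enough to sketch directly. The figures below are invented for illustration (the article gives no actual prices or volumes); only the 20% to 30% misordering range comes from the text.

```python
# Illustrative numbers only: price and volume are invented, not Mayo data.
price_per_test = 50.00      # average price per test, in dollars
tests_ordered = 1_000_000   # annual test volume

# Price per test x number of tests ordered = total spend for that activity.
total_spend = price_per_test * tests_ordered

# If 20% to 30% of tests are ordered incorrectly, that slice of total
# spend is inflated and potentially avoidable.
for error_rate in (0.20, 0.30):
    wasted = total_spend * error_rate
    print(f"{error_rate:.0%} misordered -> ${wasted:,.0f} of ${total_spend:,.0f}")
```

The point of the model is that reducing the error rate attacks the volume term of the equation, rather than negotiating down the price term.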
That was one of the reasons Mayo Clinic Laboratories decided to focus its value-based care efforts on reducing incorrect test ordering.
To mitigate the errors, Cousin said the lab created 2,000 evidence-based ordering rules, which will be integrated into a clinician’s workflow. There are more than 8,000 orderable tests, and the rules provide clinicians guidance at the start of the ordering process, Cousin said. The laboratory has also developed new datasets that “benchmark and quantify” the organization’s efforts.
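At its simplest, an order-time rule check is a lookup that either returns guidance to the clinician or lets the order through. The sketch below is hypothetical: the test codes, rule text and function name are invented for illustration and are not Mayo Clinic Laboratories' actual rules.

```python
from typing import Optional

# Toy rule table: test codes and guidance messages are invented examples.
ORDERING_RULES = {
    "VITD-REPEAT": "Repeat vitamin D testing within 90 days is rarely indicated.",
    "AMYLASE": "Lipase is preferred over amylase for suspected pancreatitis.",
}

def check_order(test_code: str) -> Optional[str]:
    """Return guidance to show the clinician at order time, or None if no rule fires."""
    return ORDERING_RULES.get(test_code)

print(check_order("AMYLASE"))  # rule fires: guidance is shown
print(check_order("CBC"))      # None: order proceeds unchallenged
```

In a real deployment the rules would be keyed on richer context (patient history, prior results, ordering interval) rather than the test code alone, but the workflow shape is the same: evaluate at the start of ordering, surface guidance before the order is placed.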
To date, Cousin said the lab has implemented about 250 of the 2,000 rules across the health system, and has identified about $5 million in potential savings.
Cousin said the lab crafted a five-point plan to begin the transition. The plan was based on its experience in adopting a value-based care model in other areas of the lab. The first three steps center on what Cousin called institutional readiness, or ensuring staff and clinicians have the training needed to execute the new model.
The plan’s first step is to assess the “competencies and gaps” of care delivery within the organization, benchmarking where the organization is today and where gaps in care could be closed, he said.
The second step is to communicate with stakeholders to explain what’s going to happen and why, what criteria they’ll be measured on and how, and how the disruption to their workflow will result in improving practice and financial reimbursement.
The third step is to provide education and guidance. “That’s us laying out the plans, training the team for the changes that are going to come about through the infusion of new algorithms and rules into their workflow, into the technology and into the way we’re going to measure that activity,” he said.
Cousin said it’s critical to accomplish the first three steps before moving on to the fourth step: launching a value-based care analytics program. For Mayo Clinic Laboratories, analytics are used to measure changes in laboratory test ordering and assess changes in the elimination of wasteful and unnecessary testing.
The fifth and final step focuses on alternative payments and collaboration with healthcare insurers, which Cousin described as one of the biggest challenges in value-based care. The new model requires a new kind of language that the payers may not yet speak.
Mayo Clinic Laboratories has attempted to address this challenge by taking its data and making it as understandable to payers as possible, essentially translating clinical data into claims data.
Cousin gave the example of showing payers how much money was saved by intervening in the over-ordering of tests. Presenting the data as cost savings can be more compelling than reporting how many laboratory test orders were eliminated, he said.
How a healthcare CIO approaches value-based care
UT Health Austin’s Miri approaches value-based care from both the academic and the clinical side. UT Health Austin functions as the clinical side of Dell Medical School.
The transition to value-based care in the clinical setting started with a couple of elements. Miri said, first and foremost, healthcare CIOs will need buy-in at the top. They also will need to start simple. At UT Health Austin, simple meant introducing a new patient-reported outcomes program, which aims to collect data from patients about their personal health views.
UT Health Austin has partnered with Austin-based Ascension Healthcare to collect patient-reported outcomes as well as social determinants of health, or a patient’s lifestyle data. Both patient-reported outcomes and social determinants of health “make up the pillars of value-based care,” Miri said.
The effort is already showing results, such as a 21% improvement in the Hip Disability and Osteoarthritis Outcome Score and a 29% improvement in the Knee Injury and Osteoarthritis Outcome Score. Miri said the organization is seeing improvement because it is being more proactive about patient outcomes both before and after discharge.
For the program to work, Miri and his team need to make the right data available for seamless care coordination. That means making sure proper data use agreements are established between all UT campuses, as well as with other health systems in Austin.
Value-based care data enables UT Health Austin to “produce those outcomes in a ready way and demonstrate that back to the payers and the patients that they’re actually getting better,” he said.
In the academic setting at Dell Medical School, Miri said the next generations of providers are being prepared for a value-based care world.
“We offer a dual master’s track academically … to teach and integrate value-based care principles into the medical school curriculum,” Miri said. “So we are graduating students — future physicians, future surgeons, future clinicians — with value-based at the core of their basic medical school preparatory work.”