Tag Archives: Director

Centra embraces transformation, improves patient care with Microsoft 365 intelligent business tools – Microsoft 365 Blog

Today’s post was written by Joseph (Jody) Hobbs, managing director of business applications and information security officer at Centra.

Centra is proud to count itself among the early adopters of cloud technology in the healthcare field. Back in 2014, we saw cloud computing as a way to keep up with the rapid growth we were experiencing across the enterprise—and the challenge of adapting to industry changes under the Patient Protection and Affordable Care Act (ACA). Five years later, we’re still using Microsoft Cloud services to remain on the leading edge of business productivity software so that we can provide exceptional patient care.

With Microsoft 365, we are better able to adapt to industry-wide changes introduced by ACA, such as the transition from a fee-for-service model to a quality-based model. This change made capturing data and analytics very important, because now reimbursement is based on quality of care, not quantity of services. We use Power BI, the data analytics tool from Microsoft Office 365 E5, to meet new healthcare reporting requirements and provide a wealth of data to our clinicians. They use this data to measure their performance against quality benchmarks to improve patient experiences and health outcomes.

We also turned to Microsoft 365 to help address Centra data security and privacy policies. Microsoft accommodated our requirement for data to remain in the continental United States, which helps us comply with Health Insurance Portability and Accountability Act (HIPAA) regulations that are standard in the healthcare industry. We also found a great solution for emailing sensitive information by combining a Microsoft two-factor authentication solution with our existing encryption appliance. Microsoft invests an incredible amount in its security posture, more than we ever could, and this, along with the knowledge that our data is not intermingled with others’ data in the tenant, gives us peace of mind. And we use Office 365 Advanced Threat Protection, which gives us great insight into malicious activities aimed at our employees’ inboxes.

Keeping our Firstline Workers flexible and mobile is another major priority. We plan to get all our clinical workers online with Office 365 to actualize our vision for a more productive, mobile workforce. We have almost 4,000 employees taking advantage of Office 365 ProPlus and downloading up to five instances of Office 365 on a range of devices. This makes it seamless for them to work from home or the office using the same powerful, cloud-based productivity apps.

As Centra continues to grow from a network of hospitals to an assortment of health-related enterprises, adding everything from a college of nursing to our own insurance business, we see a cloud-based workplace solution as key to staying agile and making the most of our momentum. In Microsoft 365, we have found a solution that marries the strict security requirements of our industry with the needs of a workforce that demands anytime, anywhere access to colleagues and information. For Centra, change isn’t just a matter of increasing productivity or mobility—at the end of the day, our ability to stay up to date with the latest technology innovations means we are providing the best care possible.

Read the case study to learn more about how Centra uses Microsoft 365 to improve quality-based healthcare practices and establish mobile, highly secure work environments to expedite patient care.

Adventist Health System is enhancing healthcare delivery using Microsoft 365 – Microsoft 365 Blog

Today’s post was written by Tony Qualls, director of enterprise technical services at Adventist Health System in Altamonte Springs, Florida.

Over the years, healthcare has changed from hospital-based care to preventive and continuous care that happens throughout an individual’s life—outside of hospital walls and inside patient homes and neighborhood clinics. Consequently, Adventist Health System is in the midst of a big transformation to a more consumer-centric organization to meet the needs of patients and families at every stage of health.

Our more than 80,000 employees are embracing this new care delivery model, and as many of them are frequently on the go, they need secure, quick access to information from anywhere.

With Microsoft 365, we’re able to give them access to the information they need in a secure, compliant environment. We’ve been a longtime user of Microsoft Office 365 to deliver the latest productivity innovations to our clinical and non-clinical employees. We migrated to Microsoft 365 to gain more flexibility with our licensing for Office 365 and for the Windows 10 operating system and Microsoft Enterprise Mobility + Security (EMS).

We have 28,000 Microsoft 365 E3 licenses for our office staff and 41,000 Office 365 F1 licenses for our Firstline team members—nurses, doctors, and other employees. These individuals carry laptops and tablets with them throughout the day or access shared devices using badge-tapping technology. With Microsoft 365, we can cost-effectively license the specific applications that employees need to accomplish various tasks throughout their workdays.

For example, our clinical staff uses Skype for Business Online to improve patient flow and connect physicians with remote patients. Now, we’re taking it to the next level with Microsoft Teams—probably the fastest-growing Office 365 application we have deployed. Everything’s in one place—SharePoint Online sites, files, chat, meetings, and Microsoft Planner. It’s so easy to use, and we find that after people get invited to one Teams channel, they turn around and create channels of their own to support other projects. With Teams, we have persistent conversations, documents, and other resources about a topic in one place, which helps groups focus and move faster. In addition, it’s a highly secure environment that we trust, and we can remain completely compliant with HIPAA and other healthcare regulations.

At Adventist Health System, we strive for excellence in all that we do. Our IT employees strive to be recognized as an industry leader. Utilizing Teams is just one way we are supporting our organization’s vision to be wholistic, exceptional, connected, affordable, and viable.

Communication is crucial to the success of any organization, and Adventist Health System is no different. The quicker we can share information, updates, and plans, the faster we gain buy-in from our team members. The clinical workspace thrives on rapid communication and collaboration around patient care. This, in turn, helps foster better outcomes and patient satisfaction.

It’s exciting to see the Teams roadmap incorporating artificial intelligence capabilities by offering speech-to-text and meeting transcription services. As we gather takeaways and valuable information from meetings, I am happy that Teams allows me to focus on listening to my staff and peers while it captures and transcribes meeting notes for later review.

There’s an abundance of innovation coming from Microsoft, and we’ve taken the approach of releasing new Office 365 applications directly to employees and letting user communities provide guidance, tips, and support on Yammer channels. This has been a great adoption model that has empowered employees to put these tools to work in ways that make sense for them.

Because Microsoft matches productivity innovation with security innovation, we can confidently utilize new technologies on tens of thousands of mobile devices. We’ve standardized on Windows 10 Enterprise, chiefly for security features such as default encryption. But EMS also includes a great bundle of security tools and licensing options that have significantly decreased our licensing costs while giving us enhanced security capabilities.

From a support perspective, Microsoft Intune and mobile email with Exchange Online have been tremendous timesavers. Employees had to unenroll and re-enroll devices in a previous email security program, and our infrastructure support team was inundated with support tickets around the need to resync mobile email accounts. But with Intune, employees download the Microsoft Outlook mobile app, we apply the correct policies, and they’re off and running.

With Microsoft 365, our clinical, support, and IT staffs are all better equipped to help Adventist Health System transform its business in a secure, compliant manner to meet the needs of today’s changing healthcare landscape.

—Tony Qualls

Pharmacy benefit manager turns to analytics to fight healthcare fraud

Billions of dollars are lost to healthcare fraud annually, and one pharmacy benefit manager is fighting back by applying advanced data analytics to a combination of pharmacy and medical claims.

Prime Therapeutics, a privately held company that serves 22 Blue Cross and Blue Shield plans (Blue Plans) and more than 27 million members, is partnering with SAS, an analytics software company, to build an analytics platform designed to help reduce healthcare fraud.

Prime’s relationship with Blue Plans gives it the unique opportunity to apply data analytics not only to its pharmacy claims, but also to the health plans’ medical claims — and the SAS Fraud Framework for Healthcare gives Prime the advanced analytics power to handle the volume of data. The two companies hope to see initial results of their work by fall 2018.

Fraud becoming more complex

In the industry, pharmacy benefit managers typically have built their analytics from the ground up internally, said Jo-Ellen Abou Nader, Prime’s assistant vice president of fraud, waste and abuse operations. “The problem is fraud has gotten much more complex, and we need a big engine, as well as a partner in the industry to move this forward,” Abou Nader said. “That’s why we selected SAS. They have experience in the healthcare industry with IT, and their Fraud Framework was a great engine and opportunity for us to partner with them on building this out.”

Stu Bradley, vice president of fraud and security intelligence at SAS

The SAS Fraud Framework, a cloud-based product hosted by SAS, enables Prime to manage the entire fraud protection lifecycle, said Stu Bradley, vice president of fraud and security intelligence at SAS. The product “has the capabilities to ingest, enrich, transform data. So we’ve got a big data management component which is inclusive of data quality. It gives us the ability to build and deploy rules all the way through to analytic models, leveraging things like machine learning and artificial intelligence, and deploy those models into a runtime environment such that we can execute it against claims from a healthcare perspective.”
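
To make that lifecycle concrete, here is a minimal, illustrative Python sketch of the general rules-plus-model scoring pattern Bradley describes. It is not SAS code; the claim fields, rule thresholds and stand-in model are invented for illustration.

```python
# Illustrative only: the general "rules plus analytic model" scoring pattern,
# not the SAS Fraud Framework itself. Thresholds and model are made up.
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    drug_class: str
    days_supply: int
    quantity: int

def rule_score(c: Claim) -> int:
    """Hand-written rules of the kind Prime already used; each hit adds a point."""
    score = 0
    if c.drug_class == "opioid" and c.days_supply > 90:
        score += 1          # unusually long controlled-substance supply
    if c.quantity > 360:
        score += 1          # implausibly large quantity
    return score

def model_score(c: Claim) -> float:
    """Stand-in for a trained model (e.g., gradient boosting) returning a 0-1 risk."""
    return min(1.0, c.days_supply / 365)

claims = [Claim("c1", "opioid", 120, 480), Claim("c2", "statin", 30, 30)]
ranked = sorted(claims, key=lambda c: rule_score(c) + model_score(c), reverse=True)
print([c.claim_id for c in ranked])   # highest-risk claims first
```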

Before SAS entered the picture, Prime used rule-based analytics to scour its pharmacy claims for potential fraud. “The rules are very foundational to identify fraud that we knew of,” Abou Nader said. “But where we’re heading with SAS is being able to look across not only one client’s claims, but we’re able to pull all of the claims in for all our clients under Prime, all of our Blue Plans, to be able to look across the entire network of pharmacies and of members and physicians so that we’re not working in a silo.”

Jo-Ellen Abou Nader, assistant vice president of fraud, waste and abuse operations at Prime Therapeutics

For Prime, the SAS Fraud Framework will include more than 1,000 scenarios, or models, designed to identify the risk of healthcare fraud. The framework will be built out by SAS “so that all these scenarios trigger risk scores,” Abou Nader said. “There are risk scores for each member, for each physician, for each pharmacy. That’s how our analytics team will prioritize.”

By integrating pharmacy and medical claims, Abou Nader said, “a scenario we may look at is pharmacy claims with no associated medical office visits – [for example,] a pain management patient where they are on controlled substances but we see no office visits within the last six months. We’re looking also at duplicate therapy across the pharmacy benefit and medical benefit.

“It’s not that you have to look in each scenario,” Abou Nader said. “SAS has done the job for us so it’s not a needle in a haystack. They have identified where each physician, for example, triggers on each scenario.”
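
As a rough illustration of how such a scenario might be expressed against joined data, the pandas sketch below flags controlled-substance fills with no office visit in the prior six months. The table layout and column names are hypothetical, not Prime’s or SAS’s actual schema.

```python
# Hypothetical pandas sketch of one scenario: controlled-substance fills with no
# medical office visit in the prior six months. Column names are illustrative.
import pandas as pd

pharmacy = pd.DataFrame({
    "member_id": ["m1", "m2"],
    "drug_class": ["opioid", "opioid"],
    "fill_date": pd.to_datetime(["2018-03-01", "2018-03-05"]),
})
medical = pd.DataFrame({
    "member_id": ["m2"],
    "visit_date": pd.to_datetime(["2018-01-15"]),
})

controlled = pharmacy[pharmacy["drug_class"] == "opioid"]
merged = controlled.merge(medical, on="member_id", how="left")

# A fill is "covered" if an office visit occurred within the prior 182 days.
window = pd.Timedelta(days=182)
gap = merged["fill_date"] - merged["visit_date"]
merged["covered"] = (gap >= pd.Timedelta(0)) & (gap <= window)

flagged = merged[~merged["covered"]]
print(flagged["member_id"].tolist())   # members whose fills trigger the scenario
```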

Taking a holistic approach

What Bradley finds intriguing from an analytics perspective is the ability to look at data patterns across various insurance plans.

By managing healthcare fraud detection at an individual level, the Blue Plans “have insight only into their own data and their own claims,” Bradley said. “But if you can look holistically across those 22 plans, you have a much better opportunity to identify some of the more complex fraud schemes that may extend across multiple plans and be able to link that data together such that you can find and identify where you might have great risk. We call that consortium modeling.”

This approach, he said, provides “better accuracy, better prioritization of risk and it reduces the false positives because you can look at, say, a specific pharmacy that is paying prescriptions across multiple plans and be able to look at their behavior holistically versus in their respective silos.”
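
A simple way to picture consortium modeling is to compare a pharmacy’s activity summed within each plan against the same activity pooled across all plans. The toy pandas sketch below, with invented numbers, shows how the pooled view can surface behavior no single plan would notice on its own.

```python
# Illustrative sketch of consortium-style aggregation: the same pharmacy viewed
# per plan versus pooled across all plans. The data is made up.
import pandas as pd

claims = pd.DataFrame({
    "plan":        ["A", "B", "C", "A"],
    "pharmacy_id": ["p1", "p1", "p1", "p2"],
    "paid_amount": [9000, 8500, 9200, 1200],
})

per_plan = claims.groupby(["plan", "pharmacy_id"])["paid_amount"].sum()
pooled = claims.groupby("pharmacy_id")["paid_amount"].sum()

# Within any single plan, p1 looks unremarkable; pooled across plans it stands out.
print(per_plan)
print(pooled.sort_values(ascending=False))
```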

Because the Fraud Framework is hosted by SAS, implementation on Prime’s part centered on ensuring data quality. “There has been a lot of discussion around pushing this amount of data to SAS,” Abou Nader said. The focus particularly has been on “how they digest the information, because medical claims look very different than pharmacy claims. With this integration we’ve really had to focus on how the analytics will look and the outcomes we want, so it’s really been a process to go through with SAS to get to where we need to be.”

Don’t overlook the importance of data

One example of the tandem approach Abou Nader mentioned centered on preserving the quality of the claims data under examination.

When you consider the overall lifecycle of fraud detection, Bradley said, “it’s got to start with the data. We’ve got to be able to provide the appropriate analytic capabilities to build and deploy those models. You need to be able to execute and score every single transaction, and in this case it’s claims.”

Ensuring the quality of Prime’s data was a joint effort, Bradley said. The effort starts “with the identification of the appropriate data sources. We leverage their domain experts who know a lot about Prime’s data, in concert with our domain experts who know a lot about broader pharmacy claims data across multiple organizations and how we need to best structure that for fraud. Part of that is cross-pollinating those skill sets and defining how we’re going to transform that data and get it into an analytical-ready format such that we can deliver [more quickly] analytics that is going to be more accurate, and serve up the data to the end users that is going to be in as valuable a format as possible.” 

Fraud continually evolves

SAS has “gathered all of our data,” Abou Nader said. “Now we’re in the process of pushing all that data and defining all the analytics and the model scoring as it relates to our set of data. So it’s quite a bit of lift on the SAS side.”

She said Prime and SAS are targeting an October go-live date for the healthcare fraud framework. “Our goal is to be able to get cases out to test them. We’re expecting to see results as early as September, when we will start getting alerts to test — the actual alerts to say look at this member, look at this physician, look at this pharmacy, they are the highest risk. We will start getting some of those in the September, October timeframe to be able to test and go validate to say is this fraud or is this not fraud.”

But this won’t be the end of the work. Fraud is an evolution, Abou Nader said, and so is healthcare fraud data analytics modeling. “I like to say [fraud] is like whack-a-mole. As soon as we’ve caught the last fraud, here comes the next scenario.

“The great thing about SAS is this is a long-term partnership where we will continue to update the models with the latest fraud schemes that are happening.”

REWIND’s high-flying work with Microsoft HoloLens

Leila Martine, Director of Product Marketing at Microsoft, sees first-hand the excitement HoloLens is causing. “HoloLens is helping companies to work better by empowering staff. Every day we are seeing that workers from a range of sectors can easily collaborate to make complicated problems simple to solve. It really is taking human experiences to the next level.”

Virtual, augmented and mixed reality is becoming increasingly important to companies across the globe. According to market intelligence firm IDC, “worldwide revenues for the augmented reality and virtual reality market will grow … to more than $162 billion in 2020”.

REWIND is at the cutting-edge of that market. The company, which is based in St Albans (Rogers: “We’re only 2.5 miles outside the M25, so we’re London”), was only founded in 2011 but has grown quickly, boasting a team of more than 50 people. The group has already created a multi-award-winning virtual reality spacewalk for the BBC, as well as experiences with Jaguar, Lexus, Nissan, Rolls-Royce, Nike, Stella Artois, Savills and singer Bjork, among many others.

That level of technical experience led to REWIND being added to Microsoft’s HoloLens Agency Readiness Partner programme, which means the company will help other businesses use the mixed-reality headset to transform how they work. Rogers is excited by the possibilities.

“HoloLens is the first device humans have ever had that can augment human intelligence in real time. We have the world’s knowledge at our fingertips with one of these [he holds up his smartphone] but it’s a layer away, a search algorithm away. We have laptops, but what if the second screen is a HoloLens screen? If I can make this [he points to my laptop] as good as talking like we are now, as though I’m really here in the same space as you [when I’m really somewhere else], then why do we need to commute in the way we currently do, why do we all need to be compacted down this end of the country? What if you don’t like the weather so you change it to something else? That’s a little far away, but it’s not that big a leap. HoloLens has some amazing stuff, which is just the tip of the iceberg of what mixed reality can do.”

However, rather than see what he can do with HoloLens in the commercial sector, where the device has been predominantly used since its launch in 2016, Rogers wants members of the public to get their hands on the technology, too.

FBI encryption argument draws fire from senator

The new FBI director has taken the same stance on encryption as his predecessor, and now a U.S. senator is asking for proof behind FBI claims that secure lawful access is possible.

FBI director Christopher Wray spoke publicly about encryption at Fordham University on Jan. 9, 2018, where he repeated the long-standing FBI encryption argument that security and lawful access shouldn’t be mutually exclusive. On Thursday, Senator Ron Wyden (D-Ore.) wrote a letter to Wray expressing disappointment that Wray made public comments on the FBI’s encryption stance without first discussing it with Wyden.

Wray claimed he didn’t want an FBI encryption “backdoor” but Wyden noted that not using the term “backdoor” doesn’t change the underlying issues.

“Regardless of whether the Federal Bureau of Investigation labels vulnerability by design a backdoor, a front door, or a ‘secure golden key,’ it is a flawed policy that would harm American security, liberty, and our economy,” Wyden wrote in the letter. “Your stated position parrots the same debunked arguments espoused by your predecessors, all of whom ignored the widespread and vocal consensus of cryptographers. For years, these experts have repeatedly stated that what you are asking for is not, in fact, possible.”

Wyden closed his letter with a challenge to director Wray to prove the FBI encryption argument transparently by providing “a list of the cryptographers” that Wray spoke with and to “specifically identify those experts who advised you that companies can feasibly design government access features into their products without weakening cybersecurity.”

Sen. Wyden has long been a defender of strong encryption and had questioned former FBI director James Comey in mid-2015 during the early FBI encryption debate related to iPhone access. At the time, Wyden asked Comey if he would be able to guarantee encryption keys needed for lawful access would never be stolen, to which Comey replied, “Surely not.”

Across the pond

One day after Sen. Wyden asked for more transparency behind the FBI encryption argument, British Prime Minister Theresa May voiced support for encryption backdoors at the World Economic Forum in Davos, Switzerland. May argued that because tech “companies have some of the best brains in the world,” a solution to secure encryption access must be possible, despite those “best brains” saying for decades that it is not.

May even chose to attempt some fear-mongering to drum up support for encryption backdoors.

“And just as these big companies need to step up, so we also need cross-industry responses because smaller platforms can quickly become home to criminals and terrorists. We have seen that happen with Telegram. And we need to see more co-operation from smaller platforms like this,” May said. “No-one wants to be known as ‘the terrorists’ platform’ or the first choice app for paedophiles.”

John Q. Martin, product manager for SentryOne, suggested on Twitter that May should listen to experts before making more public comments.

“Politicians need to take their responsibilities to society seriously and respect the rights of the people. You want a backdoor into encryption that only law enforcement can use. Not possible, it is fundamental mathematics preventing it, not tech companies being awkward,” Martin wrote. “How can you say this and at the same time espouse the weakening of encryption? You really need to pay attention to advisors who know what they are talking about or get better ones. Quit with the soundbites and actually listen to experts for once.”

Microsoft announces expansion of Montreal research lab, new director

Geoffrey Gordon has been named Microsoft Research Montreal’s new research director. Photo by Nadia Zheng.

Microsoft plans to significantly expand its Montreal research lab and has hired a renowned artificial intelligence expert, Geoffrey Gordon, to be the lab’s new research director.

The company said Wednesday that it hopes to double the size of Microsoft Research Montreal within the next two years, to as many as 75 technical experts. The expansion comes as Montreal is becoming a worldwide hub for groundbreaking work in the fields of machine learning and deep learning, which are core to AI advances.

“Montreal is really one of the most exciting places in AI right now,” said Jennifer Chayes, a technical fellow and managing director of Microsoft Research New England, New York City and Montreal.

Chayes said Gordon, currently a professor of machine learning at Carnegie Mellon University, was a natural choice for the job in part because he’s interested in both the foundational AI research that addresses fundamental AI challenges and the applied work that can quickly find its way into mainstream use.

“We want to be doing the research that will be infusing AI into Microsoft products today and tomorrow, and Geoff’s research really spans that,” she said. “He’ll be able to help us improve our products and he’ll also be laying the foundation for AI to do much more than is possible today.”

Jennifer Chayes, technical fellow and managing director of Microsoft Research New England, New York City and Montreal.

Chayes also noted that Gordon’s broad and deep AI expertise will be a major asset to the lab. Gordon is an expert in reinforcement learning, in which systems learn through trial and error, she said, and he has also done groundbreaking work in areas such as robotics and natural language processing. The ability to combine all those areas of expertise will be key to developing sophisticated AI systems in the future.

“Given that we want a very broad AI lab, Geoff is the ideal person to lead it, and to create the fundamental research that underlies the next generation of AI,” she said.

Gordon said he’s especially interested in creating AI systems that have what we think of as long-term thinking: the ability to come up with a coherent plan to solve a problem or to take multiple actions based on clues it gets along the way. That’s the kind of thing that comes easily to people but is currently rudimentary in most AI systems.

Over the last few years, AI systems have gotten very good at individual tasks, like recognizing images or comprehending words in a conversation, thanks to a combination of improved data, computing power and algorithms.

Now, researchers including Gordon are working on ways to combine those skills to create systems that can augment people’s work in more sophisticated ways. For example, a system that could accurately read clues based on what it is seeing and hearing to anticipate when it would be useful to step in and help would be much more valuable than one that requires a person to ask for help with a specific task when needed.

“We have, in some cases, superhuman performance in recognizing patterns, and in very restricted domains we get superhuman performance in planning ahead,” he said. “But it’s surprisingly difficult to put those two things together – to get an AI to learn a concept and then build a chain of reasoning based on that learned concept.”

Microsoft began developing its research presence in Montreal a year ago, when it acquired the deep learning startup Maluuba.

The Microsoft Research team in Montreal has already made groundbreaking advances in AI disciplines that are key to the type of systems Gordon imagines. That includes advances in machine reading comprehension – the ability to read a document and provide information about it in a plainspoken way – and in methods for teaching AI systems to do complex tasks, such as by dividing large tasks into small tasks that multiple AI agents can handle.

Gordon said he was drawn to the new position both because of the work the team in Montreal is doing and the opportunity to collaborate with the broader Montreal AI community.

“Research has always been about standing on the shoulders of giants, to borrow a phrase from a giant – and it’s even more so in the current age,” Gordon said.

The city has become a hotbed for AI advances thanks to a strong academic and research presence, as well as government funding commitments.



Yoshua Bengio, an AI pioneer who heads the Montreal Institute for Learning Algorithms, said Gordon’s presence and the Microsoft lab’s expansion will help continue to build the momentum that the Montreal AI community has seen in recent years. He noted that Gordon’s area of focus, on AI systems that can learn to do more complex tasks, is complementary to the work he and others in the community also are pursuing.

“It’s one of the strengths of Montreal,” said Bengio, who is also an AI advisor to Microsoft.

Joelle Pineau, an associate professor of computer science at McGill University and director of Montreal’s Facebook AI Research Lab, said she was thrilled to hear Gordon would be joining the Montreal AI ecosystem.

“There is no doubt that the Montreal AI community will be deeply enriched by his presence here,” Pineau said.

Navdeep Bains, Canada’s minister of innovation, science and economic development, said he was looking forward to seeing the work that Gordon and Microsoft Research Montreal will produce.

“I am pleased that our government’s investment in innovation and skills continues to position Canada as a world-leading destination for AI companies and impressive researchers like Geoff Gordon,” he said.

The expansion of the Montreal lab is part of Microsoft’s long history of investing in international research hubs, including labs in the U.S., Asia, India and Cambridge, United Kingdom. Chayes said the company’s international presence has helped it attract and retain some of the world’s best researchers in AI and other fields, and it also has helped ensure that the company’s AI systems reflect a diversity of experiences and cultures.

For example, Chayes said the fact that Montreal is a bilingual city could help inform the company’s work in areas such as translation and speech recognition.

“It’s a culture where you go back and forth between two languages. That’s a very interesting environment in which to develop tools for natural language understanding,” she said.

The French version of this blog post can be found on the Microsoft News Center Canada.

Allison Linn is a senior writer at Microsoft. Follow her on Twitter.

Panasas storage, director blades split into separate devices

Panasas has revamped its scale-out NAS, adding a separate hardware appliance to disaggregate ActiveStor director blades from its hybrid arrays of the same name.

The Panasas storage rollout encompasses two interrelated products with different launch dates. The ActiveStor Hybrid 100 (ASH-100), the latest generation of Panasas’ hybrid storage, is due for general availability in December. The ASH-100 uses solid-state drives to accelerate metadata requests.

The new product entry is the ActiveStor Director 100 (ASD-100), a control-plane engine that sits atop a rack of ActiveStor arrays. ASD-100 director blade appliances are scheduled for release by March 2018, in tandem with its PanFS 7.0 parallel file system.

The ASH-100 array and ASD-100 blade appliance are compatible with ActiveStor AS18 and AS20 systems. Until now, Panasas integrated director blades in a dedicated slot on the 11-slot array chassis.

Addison Snell, CEO of IT analyst firm Intersect360 in Sunnyvale, Calif., said adding a separate metadata server allows Panasas to expand on its PanFS parallel file system.

“The reason this is important is that different levels of workloads will require different levels of performance,” Snell said. “Panasas lets you right-size your metadata performance to your application. Enterprise storage increasingly is migrating to different things that are classified as high-performance workloads, beyond the traditional uses. You’ve got big data, AI and machine learning starting to take off. The attention has turned to ‘How do I achieve reliable performance at scale so that I can tailor to my individual workload?'”

The revamp improves performance of high-performance computing and hyperscale workloads, especially seeking and opening lots of small files, said Dale Brantley, a Panasas director of systems engineering.

“This is a disaggregated director appliance that lets you unlock the full functionality of the software contained within. You will be able to cache millions or tens of millions of entries in the Director’s memory, rather than doing memory thrashing,” Brantley said.

“These products together allow us to tailor the environment more for specific workloads. Our customers are using more small-file workloads. This is just one more workload that the HPC cluster has to support. This will be a foundational platform for our next-generations systems.”
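
The point above is about keeping metadata hot in the director’s RAM. As a generic illustration of that idea (not Panasas code), a large in-memory cache lets repeated lookups skip the trip back to the storage layer entirely:

```python
# Generic illustration of a large in-memory metadata cache; not Panasas code.
from functools import lru_cache

def fetch_entry_from_storage(path: str) -> dict:
    # Stand-in for an expensive metadata lookup against the storage blades.
    return {"path": path, "size": 4096, "owner": "svc"}

@lru_cache(maxsize=10_000_000)   # hold up to ten million directory entries in RAM
def lookup(path: str) -> dict:
    return fetch_entry_from_storage(path)

lookup("/projects/build/output.o")   # first call goes to storage
lookup("/projects/build/output.o")   # repeat calls are served from memory
print(lookup.cache_info())
```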

The Panasas ASD-100 director blade sits atop the vendor’s ActiveStor Hybrid storage, allowing customers to scale them separately.

Panasas storage protocol reworks memory allocation for streaming

ASH-100 uses a system-on-a-chip CPU design based on an Intel Atom C2558 processor. The 2U Panasas storage array tops out at 57 PB of raw capacity across 200 populated shelves. A shelf scales to 264 TB of disk storage and 21 TB of flash.

All I/O requests are buffered in RAM. Each ASH-100 blade includes a built-in 16 GB DDR3 RAM card to speed client requests. A new feature is the ability to independently scale HDDs and SSDs of varying capacities in the ASH-100 box.

Brantley said changes to the Linux kernel in recent years have hindered the streaming capability of large file systems. To compensate, Panasas wrote code that enables its DirectFlow parallel file system protocol in PanFS to enhance read-ahead techniques and boost throughput.
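
Read-ahead itself is a general technique: when a client’s reads look sequential, the file system fetches the next chunks before they are requested. The sketch below illustrates that pattern; it is not the DirectFlow or PanFS implementation, just the idea the paragraph refers to.

```python
# Toy read-ahead buffer illustrating the general technique; a real file system
# would prefetch asynchronously and manage memory far more carefully.
CHUNK = 1 << 20        # 1 MiB read unit
READ_AHEAD = 4         # chunks to prefetch once access looks sequential

class ReadAheadFile:
    def __init__(self, path: str):
        self.f = open(path, "rb")
        self.buffer = {}           # offset -> prefetched bytes
        self.last_offset = None

    def _prefetch(self, offset: int) -> None:
        for i in range(1, READ_AHEAD + 1):
            nxt = offset + i * CHUNK
            if nxt not in self.buffer:
                self.f.seek(nxt)
                self.buffer[nxt] = self.f.read(CHUNK)

    def read_chunk(self, offset: int) -> bytes:
        data = self.buffer.pop(offset, None)
        if data is None:           # not prefetched: go to storage
            self.f.seek(offset)
            data = self.f.read(CHUNK)
        if self.last_offset is not None and offset == self.last_offset + CHUNK:
            self._prefetch(offset) # sequential pattern detected: read ahead
        self.last_offset = offset
        return data
```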

The ASD-100 Director appliance is a 2U four-node chassis with 96 GB of DDR4 nonvolatile dual-inline memory modules (NVDIMM) to protect metadata transactions. Previous ActiveStor blades used an onboard battery to back up DRAM as persistent cache for the metadata logs.

Brantley said Panasas storage engineers wrote a SNIA-compatible NVDIMM driver that will be shared with the FreeBSD operating system community. PanFS 7.0 is also slated to bring updates to the FreeBSD operating system, an improved NFS server implementation, a dynamic GUI and aids for implementing NFS on Linux servers.

Introducing Microsoft Machine Learning Server 9.2 Release

This post is authored by Nagesh Pabbisetty, Partner Director of Program Management at Microsoft.

Earlier this year, Microsoft CEO Satya Nadella shared his vision for Microsoft and AI, pointing to Microsoft’s beginnings as a tools company, and our current focus on democratizing AI by putting tools “in the hands of every developer, every organization, every public sector organization around the world”, so that they can build their own intelligence and AI capabilities.

Today, we are taking a significant step in realizing Satya’s vision by launching Microsoft Machine Learning Server 9.2, our most comprehensive machine learning and advanced analytics platform for enterprises. We have exciting updates to share, including full data science lifecycle support (data preparation, modeling and operationalization) for Python as a peer to R, and a repertoire of high performance distributed ML and advanced analytics algorithm packages.

We started the journey of transforming Microsoft R Server into Machine Learning Server a year ago, by delivering innovations in Microsoft R Server 9.0 and 9.1. We made significant enhancements in this release to create the Machine Learning Server 9.2 platform which replaces Microsoft R Server and offers powerful ML capabilities.

Microsoft Machine Learning Server is the most inclusive enterprise platform that caters to the needs of all constituents – data engineers, data scientists, line-of-business programmers and IT professionals – with full support for Python and R. This flexible platform offers a choice of languages and features algorithmic innovation that brings the best of open source and proprietary worlds together. It enables best-in-class operationalization support for batch and real-time.

Microsoft Machine Learning Server includes:

  1. High-performance ML and AI wherever your data lives.
  2. The best AI innovation from Microsoft and open source.
  3. Simple, secure and high-scale operationalization and administration.
  4. A collaborative data science environment for intelligent application development.
  5. Deep ecosystem engagements, to deliver customer success with optimal TCO.

It’s now easier than ever to procure and use Microsoft Machine Learning Server on all platforms. Licensing has been simplified to the following, effective October 1st 2017:

  • Microsoft Machine Learning Server is built into SQL Server 2017 at no additional charge.
  • Microsoft Machine Learning Server stand-alone for Linux or Windows is licensed core-for-core as SQL Server 2017.
  • All customers who have purchased Software Assurance for SQL Server Enterprise Edition are entitled to use 5 nodes of Microsoft Machine Learning Server for Hadoop/Spark for each core of SQL Server 2017 Enterprise Edition under SA. In addition, we are removing the core limit per-node; customers can have unlimited cores per node of Machine Learning Server for Hadoop/Spark.
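
To make the Hadoop/Spark entitlement concrete: under these terms, a customer with 16 SQL Server 2017 Enterprise Edition cores covered by Software Assurance would be entitled to run Machine Learning Server for Hadoop/Spark on 16 × 5 = 80 nodes, with no limit on the number of cores in each of those nodes.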

You can immediately download Microsoft Machine Learning Server 9.2 from MSDN. It comes packed with the power of the open source R and Python engines, making both R and Python ready for enterprise-class ML and advanced analytics. Also check out the R Client for Windows, R Client for Linux, and Visual Studio 2017 with R and Python Tools.

Let’s take a peek at each of the key areas of the new Microsoft Machine Learning Server outlined above.

1. High-performance Machine Learning and AI, Wherever Data Lives

The volume of the data that’s being used by enterprises to make smart business decisions is growing exponentially. The traditional paradigm requires users to move data to compute, which introduces challenges with latency, governance and cost, even when moving the data is possible at all. The modern paradigm is to take compute to where the data is in order to unlock intelligence, and this is Microsoft’s approach.

In enterprises, it is common to have data spread across multiple data platforms and to migrate data from one platform to another over time. In such a world, it is essential that ML and analytics are available on multiple platforms and are portable, and Microsoft delivers on this need. Microsoft Machine Learning Server 9.2 runs on Windows, three flavors of Linux, the most popular distributions of Hadoop and Spark, and in the latest release of SQL Server 2017. As always, we will soon make this release available on Azure as Machine Learning Server VMs, SQL Server VMs, and as Machine Learning Services on Azure HDInsight, in addition to an ever-growing portfolio of cloud services.

Today, we are also announcing Public Preview of R Services on Azure SQL DB, to make it easy for customers who are going cloud-first or transitioning to the cloud from on-premises.

For more information, review the links below:

2. The Best AI Innovation from Microsoft and Open Source

As we work to make AI accessible to every individual and organization, one of our key goals is to amplify human ingenuity through intelligent technology. We are designing AI innovations that extend and empower human capabilities in all aspects of life. We are infusing AI across our most popular products and services, and creating new ways to interact more naturally with technology. Offerings such as the Microsoft Cognitive Toolkit for deep learning, our Cognitive Services collection of intelligent APIs, SQL Server Machine Learning Services and Azure Machine Learning exemplify our approach.

Microsoft Machine Learning Server includes a rich set of highly scalable, distributed algorithms, such as RevoScaleR, revoscalepy and MicrosoftML, that can work on data sizes larger than the size of physical memory and run on a wide variety of platforms in a distributed manner.
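
As a rough sketch of what that looks like from Python, the snippet below fits and scores a model over an on-disk XDF data source using revoscalepy. The function names follow the revoscalepy documentation, but treat the exact signatures and the file name as assumptions to verify against the current docs.

```python
# Minimal revoscalepy sketch: train and score against chunked, on-disk data that
# can exceed physical memory. Verify exact argument names against the docs.
from revoscalepy import RxXdfData, rx_lin_mod, rx_predict

flights = RxXdfData("flights.xdf")     # hypothetical XDF file as the data source
model = rx_lin_mod(formula="ArrDelay ~ DayOfWeek + CRSDepTime", data=flights)
scored = rx_predict(model, data=flights)   # scoring proceeds chunk by chunk
print(scored)
```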

The open source ecosystem is innovating at a fast pace as well, with AI toolkits such as TensorFlow, MXNet, Keras and Caffe, in addition to our open source Cognitive Toolkit.

Microsoft Machine Learning Server 9.2 bridges these two worlds, enabling enterprises to build on a single ML platform where one can bring any R or Python open source ML package, and have it work side-by-side with any proprietary innovation from Microsoft. This is a key investment area for us. You can learn more from the resources below:

3. Simple, Secure and High-Scale Operationalization and Administration

Enterprises that rely on traditional paradigms and environments for operationalization end up investing a lot of time and effort towards this area. It is not uncommon for data scientists to complete their models and hand them over to line-of-business programmers to translate that into popular LOB languages and APIs. The translation time for the model, iterations to keep it valid and current, regulatory approval, managing permissions through operationalization – all of these things are big pain points, and they result in inflated costs and delays.

Microsoft Machine Learning Server offers the best-in-class operationalization solution in the industry. From the time an ML model is completed, it takes just a few clicks to generate web services APIs that can be hosted on a server grid (either on premises or in the cloud) which can then be integrated with LOB applications easily. In addition, Microsoft Machine Learning Server integrates seamlessly with Active Directory and Azure Active Directory, and includes role-based access control to make sure that the security and compliance needs of the enterprise are satisfied. The ability to deploy to an elastic grid lets you scale seamlessly with the needs of your business, both for batch and real-time scoring.
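
In practice, the pattern for a line-of-business application looks roughly like the sketch below: obtain a token, then post inputs to the published service’s REST endpoint. The host, service path and token flow shown are hypothetical placeholders rather than the product’s exact API.

```python
# Hypothetical sketch of a LOB application calling a model published as a web
# service on the operationalization grid. URL, service name and auth are placeholders.
import requests

BASE = "https://mlserver.contoso.com:12800"        # placeholder grid endpoint
TOKEN = "<bearer token from an Active Directory sign-in>"

payload = {"inputs": {"age": 52, "bmi": 31.4, "smoker": True}}
resp = requests.post(
    f"{BASE}/api/riskmodel/v1.0.0",                # hypothetical published web service
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.json())                                 # model outputs returned to the LOB app
```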

For more information, refer to the links below:

4. A Collaborative Data Science Environment for Intelligent Application Development

In enterprises, different departments take the lead for different aspects of the data science life-cycle. For instance, data engineers lead data preparation, data scientists lead experimentation and model building, IT professionals lead deployment and operationalization, and LOB programmers develop and enhance applications with intelligence, tailoring them to the needs of the business. With the in-database analytics capability of SQL Server 2017 and SQL Server 2016 (powered by Microsoft Machine Learning Services), all these constituents can work collaboratively and in the context of the leading mission critical database that is trusted by enterprises all over the world.

Python and R are the most popular languages for ML and advanced analytics. The choice of language depends on the expertise and culture of engineers and scientists, the data science problems to be solved, and the availability of algorithm toolkits for the chosen language. Each language is supported by a choice of open-source IDEs. It’s not unusual to have debates on which language to choose, because enterprises think they have to make an either-or choice.

With Microsoft Machine Learning Server, both R and Python are fully supported. You can bring in and use the latest open source toolkits along with the included Microsoft toolkits for AI and advanced analytics, all on top of a single enterprise-grade platform. Specific enhancements to support Python in the current release include:

  • New Python packages: revoscalepy and microsoftml, bringing high performance and battle tested machine learning algorithms to Python users.
  • Pre-trained cognitive models for image classification and sentiment analysis.
  • Interoperability with PySpark.
  • Python models deployed as web services.
  • Real-time and batch scoring of Python models.

Concurrent with this release, Microsoft is also releasing a public preview of Azure Machine Learning, a comprehensive environment for data science and AI. We will integrate Microsoft Machine Learning Server capabilities with this platform, to realize an industry-leading workbench for data science and AI.

For more information, refer to the links below:

5. Deep Ecosystem Engagements, to Deliver Customer Success with Optimal TCO

Individuals embarking on the journey of making their applications intelligent, or simply wanting to learn the new world of AI and ML, need the right learning resources to help them get started. Microsoft provides several learning resources and has engaged several training partners to create a repertoire of solution templates to help you ramp up and become productive quickly, including the following:

Enterprises have big investments in infrastructure and applications and may need the help of partners such as Systems Integrators (SIs) and Independent Software Vendors (ISVs) to help them transform into the world of intelligent applications. Microsoft has nurtured a vibrant ecosystem of partners to help our customers here. Learn about some of our strategic partnerships at the links below:

Summary

With the launch of Microsoft Machine Learning Server 9.2, we are proud to bring enterprises worldwide an inclusive platform for machine learning and advanced analytics. We have created a better-together environment that brings intelligence to where the data lives, supports both R and Python as well as both open source and proprietary innovation, spans the data science lifecycle across a wide variety of platforms, and infuses intelligence at scale in both batch and real-time contexts, with APIs for the most popular LOB languages.

Adopting machine learning and advanced analytics requires a holistic approach that transcends technology, people and processes. We are proud to continue delivering the best tools, platforms and ecosystem to ensure that enterprise users are set up for success. Our next steps are to integrate Azure Machine Learning and Microsoft Machine Learning Server closely, and continue to take machine learning to our customers’ data, wherever it may reside.

Nagesh

Changing the world through data science

By Kenji Takeda, Director, Azure for Research

Alan Turing asked the question “can machines think?” in 1950 and it still intrigues us today. At The Alan Turing Institute, the United Kingdom’s national institute for data science in London, more than 150 researchers are pursuing this question by bringing their thinking to fundamental and real-world problems to push the boundaries of data science.

One year ago, The Turing first opened its doors to 37 PhD students, 117 Turing Fellows and visiting researchers, 6 research software engineers and more than 5,000 researchers for its workshops and events. I have been privileged to be one of these visiting fellows, helping the researchers take a cloud-first approach through our contribution of $5 million of Microsoft Azure cloud computing credits to The Turing. To be part of this world-leading center of data science research is exhilarating. Cloud computing is unlocking an impressive level of ambition at The Turing, allowing researchers to think bigger and unleash their creativity.


“We have had an exceptional first year of research at The Turing. Working with Microsoft, our growing community of researchers have been tooled up with skills and access to Azure for cloud computing and as a result they’ve been able to undertake complex data science tasks at speed and with maximum efficiency, as illustrated by some of the stories of Turing research showcased today. We look forward to growing our engagement with the Azure platform to help us to undertake even bigger and more ambitious research over the coming academic year.”
~ Andrew Blake, Research Director, The Alan Turing Institute

Human society is one of the most complex systems on the planet and measuring aspects of it has been extremely difficult until now. Merve Alanyali and Chanuki Seresinhe are graduate students from the University of Warwick who are spending a year at The Turing applying novel computational social science techniques to understand human happiness and frustration. They are using AI and deep neural networks to analyze millions of online photos with Microsoft Azure and their findings are providing deeper insights into the human condition.

Kenneth Heafield, Turing Fellow from the University of Edinburgh, has been using thousands of Azure GPUs (graphics processing units) to explore and optimize neural machine translation systems for multiple languages in the Conference on Machine Translation. Azure GPUs enabled the group to participate in more languages, producing substantially better results than last year and winning first place in some language pairs. The team is working closely with Intel on using new architectures, including FPGAs (field-programmable gate arrays) like Microsoft’s Project Catapult, to make even bigger gains in machine translation.

Microsoft is delighted to see The Alan Turing Institute setting up a deep research program around ethics, a crucial topic in data science, AI and machine learning. Our own human-centered design principles are that AI technology should be transparent, secure, inclusive and respectful, and also maintain the highest degree of privacy protection. We are pleased that Luciano Floridi is leading the Data Ethics research group at The Turing as his perspectives on areas such as healthcare are helping us to think about how we can ensure that technology is used in the most constructive ways.

The first year at The Turing has been impressive. We look forward to another exciting year as we work together on projects in data-centric engineering, blockchain, healthcare and secure cloud computing. Along with Microsoft’s data science collaborations at the University of California, Berkeley, and through the National Science Foundation Big Data Innovation Hubs, we are perhaps getting closer to answering Alan Turing’s profound question from 67 years ago.

Learn more: