The latest software patch for on-premises Skype for Business eliminates bugs and adds features for users that run the Microsoft platform on Mac OS, narrowing an already minimal gap between the Mac and Windows clients.
For Mac users, the Skype for Business update lets delegates — users designated to receive someone else’s calls — create and edit a meeting on behalf of a colleague. Also, users can now be made a delegate even if their account isn’t part of an organization’s enterprise voice plan.
Microsoft has enabled video-based screen sharing for Mac users, the result of a next-generation screen-sharing protocol that the vendor added to Skype for Business earlier this year. The new system is faster and more reliable than the traditional method and works better in low-bandwidth conditions.
The Skype for Business update, available for download now, also fixes several bugs on the Mac client, including a flaw that prevented users from joining a meeting hosted by someone outside their organization.
Microsoft seems to announce updates to the Mac client more quickly than it does for other changes to the Skype for Business platform, and describes Mac upgrades in more detail, said Jim Gaynor, a vice president of the consulting group Directions on Microsoft, based in Kirkland, Wash.
“There are still a few gaps between SfB Mac and Windows clients, most around some of the advanced call control features, file upload/sharing, and the ability to upload PowerPoint decks for online presentations,” Gaynor said. “But they’re fairly minimal.”
Skype for Business 2015 server nears its end of life
The improvements to the Mac client were among roughly 40 enhancements released as part of Microsoft’s biannual update to the Skype for Business 2015 server.
This summer’s Skype for Business update introduces location-based routing for Skype for Business mobile clients. The feature gives businesses more control when steering calls between VoIP and PSTN endpoints based on geography.
Microsoft is expected to stop releasing feature updates and bug fixes for the 2015 server in fall 2020, the end of the typical five-year lifespan for the product.
The 2019 server will encourage businesses to host some telephony and messaging features in the cloud. Meanwhile, Microsoft Teams, a team collaboration app similar to Slack, will soon replace Skype for Business Online within the cloud-based Office 365 suite.
Low-code BPM development tools today already help developers simplify and speed up business process application development. The next step is to make those apps smarter.
To that end, BP Logix, a business process management (BPM) company in Vista, Calif., recently introduced version 5.0 of its Process Director that adds AI features to enable predictive analysis, enhanced UIs and journals for configurable collaboration.
Rather than present complex AI features, Process Director 5.0 offers a set of basic machine learning tools that the average app developer can use, such as point-and-click graphical interfaces that guide configuration and display analytics results, with no coding required.
Embedding intelligence into business applications requires specialized knowledge and teams of data scientists, said Charles Araujo, principal analyst for Intellyx, a consulting firm in Glens Falls, N.Y. Process Director 5.0’s blend of AI and low-code features brings predictive application processes to nontechnical users.
“The value Process Director 5.0 delivers is less about features, per se, and more about accessibility,” Araujo said.
The AI tools inside Process Director 5.0 enable machine learning, sentiment analysis, capture and expression of dissimilar events and conditions in a single state, and configurable collaboration. The company also added UI features for iterative list search, calendar views, and inline HTML and text editing.
“AI and machine learning create prediction models that have been missing from BPM,” said Neil Ward-Dutton, research director for MWD Advisors, a U.K.-based IT consulting firm. With AI, the application learns from past history, identifies trends and makes recommendations for decisions.
As an example, Ward-Dutton pointed to how AI capabilities can help with a loan request by identifying factors that make the applicant and the loan’s purpose a low or high risk. Combined data mining and machine learning tools aggregate information about previous loan applications and current market conditions to help the loan officer make a decision.
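A minimal sketch of the pattern Ward-Dutton describes — aggregating historical loan outcomes to flag a new application as low or high risk. The data, field names and threshold are illustrative assumptions, not BP Logix's implementation:

```python
from collections import defaultdict

# Hypothetical historical records: (loan_purpose, defaulted?)
history = [
    ("home_improvement", False), ("home_improvement", False),
    ("home_improvement", True),
    ("debt_consolidation", True), ("debt_consolidation", True),
    ("debt_consolidation", False), ("debt_consolidation", True),
]

def default_rate_by_purpose(records):
    """Aggregate past applications into a default rate per loan purpose."""
    totals = defaultdict(int)
    defaults = defaultdict(int)
    for purpose, defaulted in records:
        totals[purpose] += 1
        if defaulted:
            defaults[purpose] += 1
    return {p: defaults[p] / totals[p] for p in totals}

def risk_label(purpose, rates, threshold=0.5):
    """Flag an application high risk when its purpose's historical
    default rate meets the threshold; unseen purposes stay 'unknown'."""
    if purpose not in rates:
        return "unknown"
    return "high" if rates[purpose] >= threshold else "low"

rates = default_rate_by_purpose(history)
print(risk_label("debt_consolidation", rates))  # 3 of 4 defaulted -> high
print(risk_label("home_improvement", rates))    # 1 of 3 defaulted -> low
```

A production system would add current market conditions and many more features, but the shape is the same: summarize the past, then score the present against it.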
Araujo said he sees businesses with reliable data on actions and outcomes adopt AI-enabled, predictive-type applications quickly and with good results. Developers can use that legacy data to build models that predict behavior of application users who meet certain criteria and perform specific actions. With these functions, the tool recommends a best action and prioritizes options that are presented to the user, so the application feels more intuitive or takes actions automatically.
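The recommendation loop Araujo describes can be sketched in a few lines: rank candidate actions by how often they succeeded for similar users in the legacy data, then surface the top one. Segment names, actions and outcomes here are hypothetical:

```python
from collections import defaultdict

# Hypothetical legacy records: (user_segment, action_taken, succeeded?)
legacy_data = [
    ("new_customer", "send_tutorial", True),
    ("new_customer", "send_tutorial", True),
    ("new_customer", "offer_discount", False),
    ("returning", "offer_discount", True),
    ("returning", "send_tutorial", False),
]

def rank_actions(records, segment):
    """Order actions for a segment by historical success rate."""
    wins, tries = defaultdict(int), defaultdict(int)
    for seg, action, ok in records:
        if seg == segment:
            tries[action] += 1
            wins[action] += int(ok)
    return sorted(tries, key=lambda a: wins[a] / tries[a], reverse=True)

# The top-ranked action becomes the recommended "best action" for the user.
print(rank_actions(legacy_data, "new_customer")[0])  # send_tutorial
```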
Applying AI for nontechnical users, even with accessible tools, requires a change in traditional BPM project approaches. Araujo said project teams will have to think like a data scientist.
“Applying intelligence to applications requires imagination,” he said. “Developers need to think about application usage patterns and imagine ways to use predictive capabilities to meet users’ needs.”
“That’s not the way we’ve historically approached applications, particularly business-process-based ones,” Araujo added.
Process Director 5.0 is generally available, with versions for both cloud and on premises. In addition to AI and low-code/no-code development tools, the platform includes traditional BPM capabilities for compliance automation, process modeling, multifactor authentication and other standard BPM features.
When incorporated into business applications, AI can provide insights into how best to engage potential customers, predict customer needs, answer questions and, ultimately, sell products. Microsoft hopes to enable all that in its latest Dynamics 365 updates for marketing, sales and customer service.
At the Microsoft Business Applications Summit 2018, James Phillips, corporate vice president of the vendor’s business applications group, said in a keynote that the Dynamics 365 updates change the platform from something that feels like “a surveillance system” to a business intelligence (BI) tool incorporating AI-driven analytics.
CRM “is not a category of software that people are deeply in love with,” Phillips said. “A salesperson sits down and enters their leads and opportunities. It’s for someone else, so they can track the forecast, understand the pipeline [and] whether you’re doing your job or not.”
The granularity of the AI tools Microsoft added with the Dynamics 365 updates will likely be useful for the average end user, said Kate Leggett, analyst at Forrester Research.
“What Microsoft is really excelling at is infusing AI into all their applications to help the business user — whether it is a marketing or salesperson or customer service agent — make the right decisions for that particular interaction,” Leggett said.
Business-user AI: Microsoft’s strength
On the stage at the July 2018 conference, Tammy Mihailidis, vice president of digital customer engagement at Polaris, a maker of power sport vehicles based in Medina, Minn., spoke about how Polaris uses Dynamics 365’s marketing, sales and service platforms to give its customers a more personalized shopping experience.
Phillips said Dynamics 365 and its Power BI tools help organizations analyze traffic patterns of email and other communications and marry that information with LinkedIn to help understand where they should focus sales efforts.
“As a salesperson, I have got a tool now that helps me focus my attention, helps guide me to success and isn’t simply about keeping track of what I am doing,” Phillips said.
Using Polaris as an example, Ryan Darby Martin, a senior product marketing manager at Microsoft, demonstrated how this process would look to both the customer — in this case, a fictional municipality — and to the Polaris agents, from within Dynamics 365.
The process with the Dynamics 365 updates includes a chatbot answering questions from a potential customer, predictive lead scoring recommending that sales staff focus on this particular lead, making the sale and welcoming the customer.
“We are actually able to track all of those interactions and calculate the health score of this particular lead,” Martin said. “We could see the time that was spent by us, but also the time that was spent by them. For example, I can actually see if they opened [an] email, if they clicked on the attachment, if they viewed the link [and] how many times they were responding to us.”
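A lead health score of the kind Martin describes can be modeled as a weighted sum over tracked interactions. The event names and weights below are illustrative assumptions, not Dynamics 365's actual scoring model:

```python
# Hypothetical weights for each tracked interaction type.
WEIGHTS = {"email_opened": 1, "attachment_clicked": 3, "link_viewed": 2, "replied": 5}

def health_score(events):
    """Sum weighted interaction counts into a single lead score."""
    return sum(WEIGHTS.get(name, 0) * count for name, count in events.items())

# One lead's tracked activity: opens, a click, two replies.
lead_events = {"email_opened": 4, "attachment_clicked": 1, "replied": 2}
print(health_score(lead_events))  # 4*1 + 1*3 + 2*5 = 17
```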
“Microsoft is trying to break down the artificial division between the front office and the back office by making the CRM assets available to all users with its integration into Skype and the Office products,” Leggett said. “They are making it very easy to consume. It’s probably one of the most inexpensive enterprise solutions available.”
In addition to being a venue for unveiling the Dynamics 365 updates, the conference was an opportunity for Microsoft to announce it would be releasing updates to its suite of products twice a year. Each release will be preceded by release notes that will help IT professionals prepare for the software updates months in advance, according to Microsoft.
“Companies have to be continually innovating,” Leggett said. “You are getting new releases twice a year, and when that happens, you need to have change management processes in place to be able to understand the changes, communicate them to the end users and then roll out these new releases to your CRM users.”
Today’s post was written by Joseph (Jody) Hobbs, managing director of business applications and information security officer at Centra.
Centra is proud to count itself among the early adopters of cloud technology in the healthcare field. Back in 2014, we saw cloud computing as a way to keep up with the rapid growth we were experiencing across the enterprise—and the challenge of adapting to industry changes under the Patient Protection and Affordable Care Act (ACA). Five years later, we’re still using Microsoft Cloud services to remain on the leading edge of business productivity software so that we can provide exceptional patient care.
With Microsoft 365, we are better able to adapt to industry-wide changes introduced by ACA, such as the transition from a fee-for-service model to a quality-based model. This change made capturing data and analytics very important, because now reimbursement is based on quality of care, not quantity of services. We use Power BI, the data analytics tool from Microsoft Office 365 E5, to meet new healthcare reporting requirements and provide a wealth of data to our clinicians. They use this data to measure their performance against quality benchmarks to improve patient experiences and health outcomes.
We also turned to Microsoft 365 to help address Centra data security and privacy policies. Microsoft accommodated our requirement for data to remain in the continental United States, which helps us comply with Health Insurance Portability and Accountability Act (HIPAA) regulations that are standard in the healthcare industry. We also found a great solution for emailing sensitive information by combining a Microsoft two-factor authentication solution with our existing encryption appliance. Microsoft invests an incredible amount in its security posture, more than we ever could, and this, along with the knowledge that our data is not intermingled with others’ data in the tenant, gives us peace of mind. And we use Office 365 Advanced Threat Protection, which gives us great insight into malicious activities aimed at our employees’ inboxes.
Keeping our Firstline Workers flexible and mobile is another major priority. We plan to get all our clinical workers online with Office 365 to actualize our vision for a more productive, mobile workforce. We have almost 4,000 employees taking advantage of Office 365 ProPlus and downloading up to five instances of Office 365 on a range of devices. This makes it seamless for them to work from home or the office using the same powerful, cloud-based productivity apps.
As Centra continues to grow from a network of hospitals to an assortment of health-related enterprises, adding everything from a college of nursing to our own insurance business, we see a cloud-based workplace solution as key to staying agile and making the most of our momentum. In Microsoft 365, we have found a solution that marries the strict security requirements of our industry with the needs of a workforce that demands anytime, anywhere access to colleagues and information. For Centra, change isn’t just a matter of increasing productivity or mobility—at the end of the day, our ability to stay up to date with the latest technology innovations means we are providing the best care possible.
Read the case study to learn more about how Centra uses Microsoft 365 to improve quality-based healthcare practices and establish mobile, highly secure work environments to expedite patient care.
WASHINGTON, D.C. — A time comes in a service provider’s life when the early adrenaline-rush of business growth begins to fade and a harsh reality settles in: To continue to grow, and grow profitably, an organization must take deliberate steps to boost operational efficiency.
“It’s not the sexiest topic, but it is so important,” said Carolyn April, senior director of industry analysis at CompTIA. April chaired a panel discussion on operational efficiency improvement at ChannelCon 2018, CompTIA’s annual channel partner event. At last year’s ChannelCon, the benefits of diversity were an important focus, while previous meetings highlighted such topics as compliance controls.
A recent CompTIA survey of managed service providers (MSPs) and other channel partners revealed the key operational pain points. Half of the 450 organizations polled cited pricing pressure from customers or competitors as the top operations issue hurting profitability in the last year. Inefficient service delivery and inefficient sales process followed, with 28% of the respondents identifying the former and 27% of the respondents citing the latter.
April said even MSPs raking in sales could be “leaking margin left and right” due to inefficiency.
Operational efficiency improvement calls for a proactive approach that ensures a service provider has processes in place to run the business well — versus a reactive approach that hinges on extinguishing fires, April noted.
“The respondents having the hardest time and experiencing the most problems in their businesses are the ones that are reactive and not proactive,” she said.
For Paul Cronin, operational efficiency is built on a cultural foundation. Cronin, facilitator of excellence at Cronin Corp., an MSP, participated in the ChannelCon 2018 panel discussion.
“I believe that operational efficiency begins with a culture of discipline,” he said.
A culture of discipline is one aspect of great companies, as defined in Good to Great: Why Some Companies Make the Leap … and Others Don’t. The book studied companies that made the jump to superior financial performance and were able to sustain that edge for at least 15 years.
Cronin said he taught a monthly leadership workshop on Good to Great while he was an executive at Atrion, a systems integrator acquired by Carousel Industries in 2017. He is now beginning to shape the culture at his new venture.
Another component of Cronin’s operational efficiency approach is Lean methodology and, in particular, the framework’s focus on small, incremental changes as a mechanism for process improvement.
“We are very focused on Lean,” he said, noting the aim of empowering people to make small changes happen.
Vince Tinnirello, CEO at Anchor Network Solutions, an MSP in the Denver area, said he uses the TruMethods framework, which provides metrics and key performance indicators (KPIs) to help MSPs determine when to hire and expand or contract. “Find a framework and stick to it,” he advised the ChannelCon 2018 audience.
Signing up for an MSP peer group can also help a company along the efficiency track.
“Join a peer group because it is business-changing,” he said.
April said professional services automation (PSA) and remote monitoring and management (RMM) tools are important, but suggested the human dimension may prove the bigger challenge for service providers. Indeed, operational efficiency panelists and audience members noted that companies sometimes struggle to get employees to follow the processes and procedures designed to improve efficiency.
An engineer, for example, may prefer to “just get the job done” and ignore KPIs, processes and procedures along the way, one audience member said. In addition, employees hired at a more senior level may think they are smarter than the “dumb” processes the company wants them to adopt, another audience member noted.
Tinnirello agreed that getting every employee to do things the same way has proved to be a difficult task — even with checklists and policies and procedures. A tool such as an RMM system can help standardize service delivery, but Tinnirello noted that an RMM can’t automate every customer request. He said his company now aims to “automate where we can and make processes that people actually follow where you can’t automate.”
Operational efficiency improvement: Keeping score
MJ Shoer, director of client engagement and virtual CIO at Onepath, an MSP and professional services firm, said the company drills employees in its processes from the time of onboarding. Onepath’s approach, he said, is to hire Level One employees and move them up to more advanced positions, rather than bring on employees at a higher level.
The approach, Shoer explained, lets the company “maintain the structure we want,” while also providing the appropriate flexibility to adapt to market changes.
Onepath, meanwhile, uses performance scorecards to maintain a focus on KPIs. The scorecards track KPIs at the individual level and roll up to department-level and companywide KPIs.
But as MSPs enforce KPIs that support corporate goals, they should limit the number of goals to a manageable amount. Cronin suggested the pursuit of too many goals is the mark of an inefficient company.
“We never had more than three company goals for the year,” he said. That’s not to say the company didn’t focus on other actions to move it forward, but the three main goals cascaded down through the organization, he added.
Cronin cited three top metrics: net promoter score, top-line revenue growth and EBITDA.
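Of those three, net promoter score is the one with a fixed formula: from 0-10 survey responses, it is the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 through 6). A short sketch with made-up responses:

```python
def net_promoter_score(responses):
    """NPS = % promoters (9-10) minus % detractors (0-6), rounded."""
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return round(100 * (promoters - detractors) / len(responses))

# 3 promoters and 2 detractors out of 7 responses.
print(net_promoter_score([10, 9, 9, 8, 7, 6, 3]))  # 14
```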
Hill cited as an example how her mentors, John Kotter and Warren Bennis, scholars who helped pioneer the contemporary field of leadership studies, defined leadership. For them, leadership is largely about the ability to come up with a vision, communicate that vision and then get people to fulfill that vision. While that sort of top-down approach is effective for leading change, leading innovation is different, argued Hill, who has spent the last decade studying leaders of innovation around the globe.
“Innovation leaders see their primary role not as the visionary, but as the creator of a context in which others are willing and able to innovate,” said Hill during a keynote session at the recent LiveWorx event in Boston in which she offered up advice to IT professionals on leading innovation.
In other words, as a leader of innovation, you’re not necessarily the person who’s out in front telling everybody “this is where we’re going,” Hill said. Instead, you’re the “stage setter” — giving the players a shared purpose and letting them go from there.
“If you want to innovate and if you want to get at something new, you have to unleash the talents and the passions of individuals,” Hill said.
However, if innovation leaders want those individual sparks of innovation to be useful, they have to figure out how to harness all the diverse ideas, talents and passions of their teams to do something that actually meets the needs of the collective good. That’s one of the many tests leaders face: whether they can unleash and harness, Hill said.
Give them space
When it comes to unleashing innovation, a gentle push in the right direction is better than a forceful shove, according to Hill.
“You cannot tell people to innovate,” said Linda Hill, professor of business administration at Harvard Business School. “Formal authority has nothing to do with whether they’ll innovate. You have to get people to volunteer and want to do what is really emotionally and intellectually taxing work.”
A piece of that puzzle is understanding that these people “don’t want to follow you to the future, they want to co-create that future with you,” Hill said. Innovation leaders need to create the space where that co-creation can happen — where a leader’s vision serves as a starting point rather than an ending point.
“Steve Jobs understood that innovation comes from collective genius, not solo genius,” Hill said. “Innovation is a team sport. [Successful innovation leaders like Jobs] really think you need to be a part of a community if you’re going to be able to innovate.”
Hill also relayed some interesting advice from her friend and study subject Bill Coughran, former senior vice president of engineering at Google. Coughran’s advice to innovation leaders when people come to them looking for guidance: “Keep it fuzzy.”
“They’re going to get nervous, depressed and frustrated and they’re going to come to you and want you to tell them what to do,” Hill said. “But the first time you tell them, that’s it. They’ll rely on you too much, they’ll delegate back to you and they won’t do the collaborative work that needs to be done.”
Hill said it’s the innovation leader’s job to coach everyone in the organization, no matter their position, how to be not only a “value creator,” but also a “game-changer.” A value creator is someone who knows how to identify and close the performance gap — the gap between where the organization is now and where it should be. A game-changer is someone who knows how to identify and close the “opportunity gap” — a gap between where the organization is now and where it could be.
In fact, if innovation leaders want to hold onto talent, they need to give the talent the chance to work on opportunity gaps, Hill said. That usually means letting the talent work on cutting-edge projects. If innovation leaders don’t do that, the talent is more likely to defect because they’re not going to learn the expertise required to make new things happen and therefore they’re not going to be as marketable.
Innovation leaders, it seems, can’t be without vision themselves. To create an environment where new things happen — and where the talent is excited to innovate — leaders need to have a clear “point of view” on technology and innovation, as Hill put it. “If you don’t have a point of view, they don’t want to play with you,” she said.
The right Windows server management tools keep the business running with minimal interruptions. But administrators should be open to change as the company’s needs evolve.
Many organizations run on a mix of new and old technologies that complicates the maintenance workload of the IT staff. Administrators need to take stock of their systems and get a complete rundown of all the variables associated with the server operating systems under their purview. While it might not be possible to use one utility to run the entire data center, administrators must assess which tool offers the most value by weighing the capabilities of each.
For these everyday tasks, administrators have a choice of several Windows server management tools that come at no extra cost. Some have been around for years, while others recently emerged from development. The following guide helps IT workers understand why certain tools work well in particular scenarios.
Choose a GUI or CLI tool?
Windows server management tools come in two flavors: graphical user interface (GUI) and command-line interface (CLI).
Many administrators will admit it’s easier to work with a GUI tool because the interface offers point-and-click management without a need to memorize commands. A disadvantage to a GUI tool is the amount of time it takes to execute a command, especially if there are a large number of servers to manage.
Learning how to use and implement a CLI tool can be a slow process because it takes significant effort to learn the language. One other downside is many of these CLI tools were not designed to work together; the administrator must learn how to pipe output from one CLI tool to the next to develop a workflow.
A GUI tool is ideal when there are not many servers to manage, or for one-time or infrequent tasks. A CLI tool is more effective for performing a series of actions on multiple servers.
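The multi-server case is where scripting pays off: the same task is generated once per machine instead of clicked through a GUI per machine. A minimal sketch, with hypothetical server names, that builds a standard PowerShell remoting command (`Invoke-Command -ComputerName … -ScriptBlock { … }`) for each server:

```python
SERVERS = ["web01", "web02", "db01"]

def build_commands(servers, service):
    """Generate one remote service-restart command per server."""
    return [
        f"Invoke-Command -ComputerName {s} -ScriptBlock {{ Restart-Service {service} }}"
        for s in servers
    ]

# A real script would hand each command to a remoting or scheduling tool;
# here we just show the batch that replaces per-server GUI clicks.
for cmd in build_commands(SERVERS, "Spooler"):
    print(cmd)
```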
Windows Admin Center, formerly Project Honolulu, is a GUI tool that combines local and remote server management tools in a single console for a consolidated administrative experience.
Windows Admin Center is one of Microsoft’s newer Windows server management tools that makes it easier to work with nondomain machines, particularly those running Server Core.
Windows Admin Center can only manage Windows systems and lacks the functionality IT workers have come to expect with the Remote Server Administration Tools application.
Administrators interested in using Windows Admin Center as one of their primary Windows server management tools should be aware of potential security issues before implementing it in their data center.
Now more than 10 years old, PowerShell is one of the key Windows server management tools due to its potent ability to manage multiple machines through scripting. No longer just a Windows product, Microsoft converted the automation and configuration management tool into an open source project. Microsoft initially called this new offering PowerShell Core, but now refers to it as just PowerShell. The open source version of PowerShell runs on Linux and macOS platforms. Microsoft supports Windows PowerShell but does not plan to add more features to it.
Administrators can use both PowerShell versions side by side, which might be necessary for some shops. At the moment, Windows PowerShell provides more functionality because certain features have yet to be ported to PowerShell Core.
Digital transformation has become a top business priority, with many companies across industries focused on transforming their systems, business models and customer engagement to ensure e-business processes create value for the organization.
This makes carving out an effective enterprise digital strategy paramount to success. But, too often, organizations focus on the technological aspects of the transformation and ignore the business side of the equation, according to speakers at the recent LiveWorx 2018 conference in Boston. When building an enterprise digital strategy, organizations should start by looking at the fundamental business problems its leaders want to solve, and then move on to exploring how they can use technology to solve them, according to LiveWorx speaker Sarah Peach, senior director of business development at Samsung Electronics.
If they start by experimenting with the capabilities of a technology, they might end up with something that works, but is of no value to their business, Peach explained during her session, titled “The Next Frontier for Digital Businesses.”
“Starting with the business problem — which then dictates the application and the technologies that you want to use to support that — is the approach that successful companies are taking,” she said.
Co-panelist Anand Krishnan said an enterprise digital strategy should be bucketed into three broad areas — products, platforms and partnerships — to help develop comprehensive use cases that benefit customers.
“Everyone is looking at digital touchpoints, but [should] set the focus on the journey itself, which is very important,” said Krishnan, vice president of big data and AI at Harman Connected Services.
An enterprise digital strategy has to also meet the needs of the organization’s overall business strategy, said Jeffrey Miller, vice president of customer success and advisory services at PTC, based in Needham, Mass.
“You cannot produce a digital strategy without understanding your business strategy,” Miller said during his session, titled “Digital Transformation: Creating a Pragmatic Vision and Strategy.”
To move their digital transformation programs forward, organizations should couple their business strategy with their goals for innovation and their digital strategy, then design use cases that create value for the business, he said.
The evolving enterprise digital strategy in an IoT era
As companies deal with increasing numbers of connected systems, products and people, their enterprise digital strategy should address how to bring these areas together to meet one of two primary business objectives, said Ranjeet Khanna, a co-panelist of Peach and Krishnan.
“Either create efficiencies for the use cases that they are dealing with, or create a new revenue opportunity,” said Khanna, director of product management for IoT, public key infrastructure and embedded security solutions at Entrust Datacard, based in Shakopee, Minn.
Designing connected products is no longer just about mechanical and physical design; manufacturers now have to worry about software design as well, Peach said. The manufacturing industry has, therefore, seen its digital strategy evolve rapidly in the last couple of years, she added.
According to a recent Capgemini study, manufacturers estimate 47% of all their products will be smart and connected by 2020.
“If you are an OEM, you are now expected to produce a smart, connected product, and all of your digital systems have to change to support that,” Peach said.
The data generated from connected products throughout their lifecycle is another big change that manufacturers deal with today, she said.
“Your digital strategy has to start at the design side and follow all the way through to the end of life of your product,” she said.
CAMBRIDGE, Mass. — While many are curious about the future of BI, here at a business intelligence conference, experts made clear it’s impossible to predict that future.
The best most of us can do is chart the past, keep an eye on the present and follow trends to suggest what might come next. That’s a point futurist Amy Webb and researcher Howard Dresner made quite clear in their keynotes at Dresner’s Real Business Intelligence Conference, which took place here this week.
Identifying trends is more of a science than an art, according to Webb, founder and CEO of the Future Today Institute and a professor at the New York University Stern School of Business.
Futurist sees AI in BI
As a futurist, Webb advises businesses and governments on technology trends. She said the advice she offers is “grounded in data.”
Every year, Webb publishes her Emerging Tech Trends Report. With close to 250 trends in this year’s edition, it covers topics ranging from the current state and future of BI and data management tools to emerging applications for various AI tools.
Machine learning, in automating parts of data analysis, theoretically can better position a business across all of its branches — from sales to marketing to customer service — based on how much data it has available, Webb said.
Emerging tools like deep learning — which functions in a similar way, but can ideally develop more important inferences with less data — have the potential to change the way businesses house their data and how their data retention policies are structured, Webb continued.
Small group of AI giants
Now, Webb said there are a total of nine companies “driving the entire culture of AI,” including Microsoft, Google and Amazon. She added that she expects the AI sector to consolidate even more.
In addition to their large budgets and powerful AI tools, these AI giants also maintain large-scale data storage tools and possess robust cloud platforms.
With the future of BI and analytics moving toward more cloud storage and automated analysis, Webb said she would expect to see a business “quickly moving to pick which system it will connect with” and flexibility becoming more limited.
Cloud critical to BI
Meanwhile, in his opening keynote at the conference, Dresner highlighted the results of surveys on the state of BI by his firm, Dresner Advisory Services, where he is chief research officer.
Dresner touched on the growing force of cloud computing. Since 2016, when its future was still somewhat uncertain, the cloud has “very much become mainstream,” he said.
AI technologies are developing quickly, too, he said, although perhaps not as fast as their widespread publicity would make it appear. “Smart features will increasingly creep into many solutions,” Dresner said.
Over the last few years, vendors have made big strides in providing software as a service, which has now become fairly common. While dashboards and reporting are still the main priorities of BI teams, emerging technology and tools, such as edge computing and text analytics, are gaining in significance, he said.
In general, the future of BI looks positive, he said, noting that BI budgets are increasing year by year.
Editor’s note: TechTarget, publisher of SearchBusinessAnalytics, is a media partner of the event.
News Director, Business Applications & Architecture Media Group
Billions of dollars are lost to healthcare fraud annually, and one pharmacy benefit manager is fighting back by applying advanced data analytics to a combination of pharmacy and medical claims.
Prime Therapeutics, a privately held company that serves 22 Blue Cross and Blue Shield plans (Blue Plans) and more than 27 million members, is partnering with SAS, an analytics software company, to build an analytics platform designed to help reduce healthcare fraud.
Prime’s relationship with Blue Plans gives it the unique opportunity to apply data analytics not only to its pharmacy claims, but also to the health plans’ medical claims — and the SAS Fraud Framework for Healthcare gives Prime the advanced analytics power to handle the volume of data. The two companies hope to see initial results of their work by fall 2018.
Fraud becoming more complex
In the industry, pharmacy benefit managers typically have built their analytics from the ground up internally, said Jo-Ellen Abou Nader, Prime’s assistant vice president of fraud, waste and abuse operations. “The problem is fraud has gotten much more complex, and we need a big engine, as well as a partner in the industry to move this forward,” Abou Nader said. “That’s why we selected SAS. They have experience in the healthcare industry with IT, and their Fraud Framework was a great engine and opportunity for us to partner with them on building this out.”
The SAS Fraud Framework, a cloud-based product hosted by SAS, enables Prime to manage the entire fraud protection lifecycle, said Stu Bradley, vice president of fraud and security intelligence at SAS. The product “has the capabilities to ingest, enrich, transform data. So we’ve got a big data management component which is inclusive of data quality. It gives us the ability to build and deploy rules all the way through to analytic models, leveraging things like machine learning and artificial intelligence, and deploy those models into a runtime environment such that we can execute it against claims from a healthcare perspective.”
Before SAS entered the picture, Prime used rule-based analytics to scour its pharmacy claims for potential fraud. “The rules are very foundational to identify fraud that we knew of,” Abou Nader said. “But where we’re heading with SAS is being able to look across not only one client’s claims, but we’re able to pull all of the claims in for all our clients under Prime, all of our Blue Plans, to be able to look across the entire network of pharmacies and of members and physicians so that we’re not working in a silo.”
For Prime, the SAS Fraud Framework will include more than 1,000 scenarios, or models, designed to identify the risk of healthcare fraud. The framework will be built out by SAS “so that all these scenarios trigger risk scores,” Abou Nader said. “There are risk scores for each member, for each physician, for each pharmacy. That’s how our analytics team will prioritize.”
By integrating pharmacy and medical claims, Abou Nader said, “a scenario we may look at is pharmacy claims with no associated medical office visits — [for example,] a pain management patient where they are on controlled substances but we see no office visits within the last six months. We’re looking also at duplicate therapy across the pharmacy benefit and medical benefit.
“It’s not that you have to look in each scenario,” Abou Nader said. “SAS has done the job for us so it’s not a needle in a haystack. They have identified where each physician, for example, triggers on each scenario.”
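As a rough illustration — not Prime’s or SAS’s actual implementation — a scenario like the one Abou Nader describes, controlled-substance pharmacy claims with no medical office visit in the prior six months, could be sketched in a few lines of Python. All data, field names and score weights here are hypothetical:

```python
from datetime import date, timedelta

# Hypothetical claims data; real pharmacy and medical claims carry far more fields.
pharmacy_claims = [
    {"member": "M1", "drug_class": "controlled", "fill_date": date(2018, 6, 1)},
    {"member": "M2", "drug_class": "controlled", "fill_date": date(2018, 6, 5)},
    {"member": "M2", "drug_class": "statin", "fill_date": date(2018, 5, 20)},
]
medical_claims = [
    {"member": "M2", "visit_date": date(2018, 4, 10)},  # office visit within the window
]

def no_visit_scenario(pharmacy_claims, medical_claims, window_days=180):
    """Flag members with controlled-substance fills but no office visit in the window."""
    flagged = {}
    for rx in pharmacy_claims:
        if rx["drug_class"] != "controlled":
            continue
        cutoff = rx["fill_date"] - timedelta(days=window_days)
        visits = [m for m in medical_claims
                  if m["member"] == rx["member"] and m["visit_date"] >= cutoff]
        if not visits:
            # Each triggered scenario adds to the member's cumulative risk score.
            flagged[rx["member"]] = flagged.get(rx["member"], 0) + 10
    return flagged

print(no_visit_scenario(pharmacy_claims, medical_claims))  # only M1 is flagged
```

In a production framework, hundreds of such scenarios would each contribute to risk scores per member, physician and pharmacy, which is how an analytics team prioritizes cases rather than reviewing every claim.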
Taking a holistic approach
What Bradley finds intriguing from an analytics perspective is the ability to look at data patterns across various insurance plans.
By managing healthcare fraud detection at an individual level, the Blue Plans “have insight only into their own data and their own claims,” Bradley said. “But if you can look holistically across those 22 plans, you have a much better opportunity to identify some of the more complex fraud schemes that may extend across multiple plans and be able to link that data together such that you can find and identify where you might have great risk. We call that consortium modeling.”
This approach, he said, provides “better accuracy, better prioritization of risk and it reduces the false positives because you can look at, say, a specific pharmacy that is paying prescriptions across multiple plans and be able to look at their behavior holistically versus in their respective silos.”
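The consortium-modeling idea Bradley describes can be sketched with a toy example — hypothetical numbers, not real claim volumes. A pharmacy’s activity can look unremarkable inside each plan’s silo, yet stand out once volumes are pooled across plans:

```python
from collections import defaultdict

# Hypothetical monthly claim counts for two pharmacies across three Blue Plans.
# Each plan individually sees a modest volume; only the pooled view stands out.
claims = [
    {"plan": "PlanA", "pharmacy": "P1", "count": 40},
    {"plan": "PlanB", "pharmacy": "P1", "count": 45},
    {"plan": "PlanC", "pharmacy": "P1", "count": 50},
    {"plan": "PlanA", "pharmacy": "P2", "count": 42},
]

def pooled_counts(claims):
    """Aggregate each pharmacy's volume across all plans (the consortium view)."""
    totals = defaultdict(int)
    for c in claims:
        totals[c["pharmacy"]] += c["count"]
    return dict(totals)

siloed_max = max(c["count"] for c in claims)  # 50: no single plan sees an outlier
pooled = pooled_counts(claims)                # P1 totals 135 across plans
outliers = [p for p, n in pooled.items() if n > 100]
print(outliers)  # P1 surfaces only in the pooled view
```

The arbitrary threshold stands in for the statistical and machine learning models a real framework would apply; the point is that linking data across plans changes what an outlier looks like.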
Because the Fraud Framework is hosted by SAS, implementation on Prime’s part centered on ensuring data quality. “There has been a lot of discussion around pushing this amount of data to SAS,” Abou Nader said. The focus particularly has been on “how they digest the information, because medical claims look very different than pharmacy claims. With this integration we’ve really had to focus on how the analytics will look and the outcomes we want, so it’s really been a process to go through with SAS to get to where we need to be.”
Don’t overlook the importance of data
One example of that joint approach, which Abou Nader mentioned, centered on preserving the quality of the claims data under examination.
When you consider the overall lifecycle of fraud detection, Bradley said, “it’s got to start with the data. We’ve got to be able to provide the appropriate analytic capabilities to build and deploy those models. You need to be able to execute and score every single transaction, and in this case it’s claims.”
Ensuring the quality of Prime’s data was a joint effort, Bradley said. The effort starts “with the identification of the appropriate data sources. We leverage their domain experts who know a lot about Prime’s data, in concert with our domain experts who know a lot about broader pharmacy claims data across multiple organizations and how we need to best structure that for fraud. Part of that is cross-pollinating those skill sets and defining how we’re going to transform that data and get it into an analytical-ready format such that we can deliver [more quickly] analytics that is going to be more accurate, and serve up the data to the end users that is going to be in as valuable a format as possible.”
Fraud continually evolves
SAS has “gathered all of our data,” Abou Nader said. “Now we’re in the process of pushing all that data and defining all the analytics and the model scoring as it relates to our set of data. So it’s quite a bit of lift on the SAS side.”
She said Prime and SAS are targeting an October go-live date for the healthcare fraud framework. “Our goal is to be able to get cases out to test them. We’re expecting to see results as early as September, when we will start getting alerts to test — the actual alerts to say look at this member, look at this physician, look at this pharmacy, they are the highest risk. We will start getting some of those in the September, October timeframe to be able to test and go validate to say is this fraud or is this not fraud.”
But that won’t be the end of the work. Fraud constantly evolves, Abou Nader said, and so does healthcare fraud data analytics modeling. “I like to say [fraud] is like whack-a-mole. As soon as we’ve caught the last fraud, here comes the next scenario.
“The great thing about SAS is this is a long-term partnership where we will continue to update the models with the latest fraud schemes that are happening.”