Using automated machine learning for AI in insurance

Mitsui Sumitomo Insurance, one of the largest insurance firms in Japan, began the process of digital transformation several years ago. The company has launched multiple projects, and continues to start new ones, to push further into the digital age.

One of MSI’s more ambitious undertakings is the MS1 Brain platform, an AI in insurance project to create a more personalized experience for customers.

AI in insurance

Released earlier this year, the MS1 Brain platform uses machine learning and predictive analytics, along with customer data, including contract details, accident information and lifestyle changes, to recommend products and services to customers based on their predicted needs.

The platform also generates personalized communications for customers.

“Our business model is B to B to C [business to business to consumer]. We provide our products through agencies,” said Teruki Yokoyama, deputy manager of digital strategy in the department of digital business at MSI. “Until now, we have provided products to customers, both individuals and corporations mostly by leveraging experienced agents’ intimate knowledge of client needs.”

“By providing the needs analysis outcomes of each customer to the agency by MS1Brain, now even an inexperienced agency can make optimal proposals to customers with higher demands,” he continued.

To build the platform, MSI chose dotData, a startup automated machine learning vendor based in San Mateo, Calif.

Mitsui Sumitomo Insurance used automated machine learning to build out a machine learning platform

Automated machine learning

MSI first connected with dotData in 2017, when MSI’s CIO visited Silicon Valley for a technical survey, Yokoyama said.

At that time, dotData was just getting started, and it hadn’t released a product. Still, MSI was intrigued by its automated machine learning platform, which claims to provide full-cycle machine learning automation. DotData competitors include DataRobot, H2O.ai and Auger.ai.

Automation of the data science process is the only way a company can truly deliver value from AI/ML investments and provide competitive differentiation by investing in predictive analytics.
Teruki Yokoyama, deputy manager of digital strategy, Mitsui Sumitomo Insurance

“When it comes to data analysis, model accuracy often gets the most attention; dotData, on the other hand, focuses on how quickly you can move from raw data to working models — the AI-based feature engineering is what stood out,” Yokoyama said.

MSI had to build a lot of intelligent models, said Ryohei Fujimaki, CEO and founder of dotData, but the firm didn’t have the data science team to build them.

DotData’s platform was scalable and enabled MSI to automate the entire AI building process, from feature generation to model implementation, Yokoyama said.
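
dotData's own platform and APIs are proprietary and are not shown here. As a rough, hypothetical sketch of the kind of pipeline such tools compress, from feature generation through model selection, the scikit-learn example below chains automated feature creation with a small hyperparameter search; the dataset and parameter choices are stand-ins, not MSI's data or dotData's method.

```python
# Minimal sketch of an automated feature-engineering + modeling pipeline.
# This is NOT dotData's API; it only illustrates the steps (feature generation,
# model fitting, hyperparameter search) that automated ML platforms take over.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Stand-in for raw customer data (contract, accident and lifestyle features).
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("features", PolynomialFeatures(include_bias=False)),  # automated feature generation
    ("scale", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])

# A small grid search stands in for the model-selection step an AutoML tool runs.
search = GridSearchCV(
    pipeline,
    {"features__degree": [1, 2], "model__C": [0.1, 1.0, 10.0]},
    cv=3,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("holdout accuracy:", round(search.best_estimator_.score(X_test, y_test), 3))
```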

“Everyone should embrace this approach,” said Yokoyama of the automated machine learning approach.

“Automation of the data science process is the only way a company can truly deliver value from AI/ML investments and provide competitive differentiation by investing in predictive analytics,” he said.


Jira Roadmaps connect to Confluence, await Code Barrel

Atlassian’s Jira Roadmaps began to sync up with the rest of its cloud-based product line this week, and more integrations will become available this quarter, as users await further streamlining of the company’s tools.

Jira Roadmaps, which offer high-level views into team projects and their projected delivery timelines, became available for the latest version of Jira Software Cloud in October 2018. Jira Software Cloud is distinct from Jira Server, a much older on-premises version of the nearly 20-year-old product.

This week’s updates include several refinements to the Roadmaps workflow interface, such as clearer visualizations of dependencies between Roadmap projects, and finer-grained workflow editing features in the top-level UI. Most significantly, users can now add multiple live Jira Roadmaps images to Confluence documents that offer business managers an organization-wide view of software projects, a key component of enterprise BizDevOps strategy.

“We use Confluence for our internal wiki,” said Chester Dean, director of business technology operations at Looker, a business intelligence firm in Santa Cruz, Calif. “The new integration will give us access to embedded visualizations of next-gen workflows.”

Looker, which Google acquired in June 2019, uses its own project-tracking tools within the previous version of Jira, known as Jira classic, which Atlassian also offers to customers through a partnership between the two companies. Looker still uses the older version of Jira along with the latest version, dubbed Jira next-gen, as users can get started quickly on projects in the newer edition, but the company still relies on some older features.

“We get people to model what they want in next-gen, then build it in classic,” he said. “Next-gen reduces the amount of admin time it takes to learn and understand how to use Jira, but it isn’t yet ready to replace classic for us.”

Atlassian’s Jira Roadmaps can now be embedded in Confluence documents

Jira Roadmaps, Code Barrel offer ease of use

One feature the latest version of Jira lacks is the ability to link workflows between different projects, but an Atlassian spokesperson said that feature is in the works. Dean said he understands that the priority for Atlassian is to keep Jira Roadmaps and the latest version of Jira Software Cloud current.

“There are a bunch of [vendors] building project management tools, and Atlassian has to be there for the next generation of developers,” he said.

Next month, Atlassian will also roll out integration between the latest version of Jira and the Jira automation tools it acquired with Code Barrel last fall. Code Barrel’s rules builder software automates routine tasks for Jira administrators, such as automatically pre-populating issues with associated subtasks.

Non-technical teams at Looker such as marketing and customer service have taken to the latest version of Jira because of such usability features, Dean said.

Still, Dean isn’t alone in wanting more cohesion between the two versions of Jira Software Cloud, as well as between the multiple products in the overall Jira line. Jira Roadmaps for the older version of the product are not yet generally available, but were previewed at the Atlassian Open summit in Boston last October, and users at that event also said they’d like to share information more easily between the two versions of the product.

However, Jira Roadmaps workflows are fundamentally designed to be independent from one another, so that Jira administrators don’t have to manage changes. This may complicate upgrades for users of the older version, but in the long run, analysts warn that enterprises should expect such disruptions.

“From one generation to another, there are new ways of working,” said Thomas Murphy, an analyst at Gartner. “Customers are used to a certain way of doing things, but those features might operate differently than they expect in a new product.”

Atlassian’s software integration balancing act

While cloud-only users wish for more features in common between Jira next-gen and classic, enterprise companies in on-premises and hybrid cloud environments would also like to see some next-gen Jira features added to Atlassian’s Jira Server.

But the company has made clear that its emphasis will be on cloud and next-gen products, and it says more than half of its enterprise customers have already moved to the cloud version. Some 45% of Jira users have also moved to next-gen as of this month, the company said. At this point, Jira Software Cloud and Server products are developed separately on different codebases, which introduce different constraints, making it unlikely they will share features.

In part, this is because Atlassian increasingly competes with Agile planning and DevOps software vendors that don’t offer on-premises products at all, such as Zendesk and GitLab, Gartner’s Murphy said. Another competitive product, Microsoft’s Azure DevOps, offers the same features both on-premises and in the cloud, but Azure DevOps users face their own integration and upgrade challenges as Microsoft moves toward GitHub.

Meanwhile, Atlassian sweetened the cloud deal for reluctant enterprise users when it shored up its cloud security features and began offering a cloud SLA last year, after a move to AWS in 2018 improved its reliability. In November 2019, the company introduced Atlassian Forge, a framework software partners and IT pros can use to convert popular plugins available for on-premises products for use with the cloud suite, which had been another major hindrance to enterprise cloud migration.

Atlassian has pledged to streamline and rationalize all of its Jira products, which include Portfolio for Jira and Jira Align, based on Atlassian’s acquisition of AgileCraft in 2019, and link them through a unified data repository. Company spokespeople said this week that work will continue throughout 2020, along with CI/CD pipeline integration for Jira, likely to be launched at Atlassian Summit in early April.


SAP leadership changes and S/4HANA were top 2019 SAP stories

SAP ended the decade with one of the most unsettled years in its 47-year existence.

The year began with layoffs, as the German enterprise giant restructured its workforce in March to the tune of 4,400 fewer employees. This was just the first shoe to drop in a year of organizational turmoil. In April, several prominent members of the SAP leadership ranks left for different pastures. The biggest shoe dropped in October with the abrupt departure of longtime CEO Bill McDermott. He was replaced by co-CEOs Jennifer Morgan and Christian Klein.

Bill McDermott

In 2019, the saga of S/4HANA migrations that has dogged the company for several years continued. The clock is running to get customers to move to the “next-generation ERP” system before 2025, when SAP’s support for legacy ERP systems will end. However, it appears that the migrations to S/4HANA continue to lag.

Here’s a look back at some of the most prominent issues that characterized 2019 for SAP.

SAP leadership changes

The first indication that it might be an interesting year for SAP came in March when the company announced the layoffs of 4,400 employees. The layoffs appeared to be particularly significant for SAP because they included veterans of HANA development and of ABAP, the programming language that underpins SAP’s business applications. This raised questions about SAP’s commitment to the future of its signature in-memory database platform. The layoffs also hit SAP leadership, as longtime executive Bjoern Goerke was cashiered from his role as CTO.

Jennifer Morgan

SAP leadership had a major shift in April as several executives left the company, led by cloud business head Rob Enslin, who later surfaced at Google. SAP stayed in-house by promoting veterans Jennifer Morgan, Adaire Fox-Martin and Juergen Mueller. The SAP leadership changes in April led many observers to question the future direction of the company, but the general feeling was that SAP’s deep bench would keep the company stable.

“The fact is, particularly at SAP, there’s a lot of structural organization underneath any individual leader, which keeps that organization moving forward,” analyst Joshua Greenbaum of Enterprise Applications Consulting said at the time. “Some people are saying this is going to have a huge impact across the board and have a direct impact on customers, but I just don’t see it.”

Christian Klein

The deep bench will be put to the test. In October, CEO Bill McDermott, who had led the company for nine years, abruptly resigned. McDermott, who served as the prominent public face of the company, was replaced by Morgan and Klein, who assumed the roles of co-CEO. But the direction under the new SAP leadership remains to be seen.

“It makes sense to pass the baton at this moment, when there’s a transition in place to the cloud internally,” Greenbaum said. “So it’s a good time to freshen up the scene with the two new co-CEOs and it was a good moment for [McDermott] to step out.”

S/4HANA migration debate continues

The questions of if, when and how to migrate to SAP S/4HANA have become perennial issues in the last few years for SAP customers. SAP has been touting the transformative nature of the “next-generation” ERP since it debuted in 2015, but the number of customers actually making the move continues to lag. SAP S/4HANA promises to transform existing business processes and develop new business models by adding intelligence that enables benefits such as real-time decision making and predictive analytics.

The willingness of SAP customers to move to S/4HANA is debatable. In June, a survey from Rimini Street, a firm that provides third-party support services for SAP and Oracle ERP systems, indicated that almost 80% of SAP customers plan to stay on SAP ECC at least until the 2025 deadline. The report showed that SAP customers want to control their own fate rather than be forced into an S/4HANA migration, according to Hari Candadai, Rimini Street global vice president of product marketing.

“SAP customers are now taking control of their roadmaps and are disconnecting, or want to disconnect, from SAP’s planned 2025 end of support to their flagship ECC product line,” Candadai said at the time.

Another survey conducted in December 2018 by ASUG (Americas SAP User Group) showed that more than half of the ASUG members who responded, 56%, said they plan to move to S/4HANA but have not taken concrete steps in that process yet.

“There’s a general sense that they want to do it, they’re planning for it, they’re gathering data and information, and they’re looking for use cases that drive a business case,” Geoff Scott, ASUG CEO, said at the time about the respondents.

Many in the SAP ecosphere say the question of an S/4HANA migration has moved from “if” to “when.” Some, like Chris Carter, CEO of Approyo, a Wisconsin-based SAP partner, believe that S/4HANA migration fears are unfounded and that customers delaying the move are in danger of missing out on the advantages that they will gain from S/4HANA.

“The S/4HANA migration is going to happen and must happen,” Carter said at the time. “It’s not only because of 2025, it’s because of the innovation that SAP is putting into the products, and that partners are putting into the products and the tools.”

For many, however, the lack of a compelling business case is the largest impediment to an S/4HANA migration. In order to make the case, SAP customers need to see an S/4HANA migration as part of a larger effort to transform business processes, according to experts. This can involve developing a strategy around a company’s SAP landscape, including cloud products like Ariba, Concur and SuccessFactors, said Len Riley, commercial advisory practice leader at UpperEdge, a Boston-based IT advisory firm.

Once companies decide that they are going to move to S/4HANA, they will need strong leadership and project management skills to manage the process successfully. Vinci Energies provided a case in point, as the Paris-based energy company completed a nine-month project to deploy S/4HANA. The company has begun to realize the benefits of the advanced but simplified financial model at the core of S/4HANA.

“We are currently running about 10 billion euros and more than 700 legal entities,” Dominique Tessaro, Vinci Energies’ CIO, said at the time. “All modules are running on a single SAP S/4HANA instance and client.”

The search for the intelligent enterprise

Related to SAP’s efforts to move customers to S/4HANA is the concept of the SAP intelligent enterprise. Although SAP positions S/4HANA as the digital core of the intelligent enterprise, there continues to be a lack of understanding as to what the term really means.

In SAP’s vision, the intelligent enterprise is more than just an implementation of a technology or technologies, but requires a transformative shift in the organization’s culture and processes. This means bringing together operational data, or “O” data, and experiential data, or “X” data, to enable companies to analyze data in new ways and reach decisions that may not have been possible before.
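
As a toy illustration of what bringing O data and X data together can mean in practice, the hypothetical pandas sketch below joins operational records with experience-survey responses on a shared customer key. The column names and the simple churn-risk rule are invented for illustration and do not reflect any SAP or Qualtrics API.

```python
# Hypothetical sketch: joining operational ("O") data with experiential ("X")
# data on a shared customer key. Columns and values are made up; this is not
# an SAP or Qualtrics interface, only an illustration of the integration idea.
import pandas as pd

o_data = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "orders_last_quarter": [4, 1, 7],
    "support_tickets": [0, 3, 1],
})

x_data = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "satisfaction_score": [9, 4, 8],  # e.g., survey responses
})

combined = o_data.merge(x_data, on="customer_id")
# One blended view: active customers reporting low satisfaction may signal churn risk.
combined["churn_risk"] = (combined["satisfaction_score"] < 5) & (combined["orders_last_quarter"] > 0)
print(combined)
```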

“The intelligent enterprise is about, ‘How do I run my organization in a way that is capable of responding to the outside world and leverages everything I have available?'” said Geoff Maxwell, program manager at SAP Transformation Office.

Still, defining the SAP intelligent enterprise remains elusive because the circumstances and reason for implementing it are unique to each organization, according to Paul Saunders, senior director and analyst at Gartner.

“What makes one organization an intelligent enterprise is not what makes another an intelligent enterprise,” Saunders said. “That’s where SAP can’t dictate what an intelligent enterprise is [and] needs to play a role in helping organizations become an intelligent enterprise.”

However, organizations will need to step up efforts to get to the intelligent enterprise — whatever their definition — in order to keep up with the pace of change in business and technology, said Steve Niesman, CEO of Itelligence, an SAP reseller that provides implementation and managed services for SAP systems.

When O’s and X’s meet

The SAP intelligent enterprise depends on the integration of O data with X data, which was the main reason SAP spent $8 billion on Qualtrics in 2018. How and why this integration will happen was a main concern of SAP in 2019.

At the Sapphire conference in May, former CEO McDermott and Qualtrics CEO Ryan Smith stressed the need for companies to provide an exceptional customer experience in the era of smartphones and social media.

“Today organizations are disproportionally rewarded when they deliver great experiences,” Smith said. “And are absolutely punished when they don’t.”

However, beyond a few interesting use cases, there was no sign yet that SAP customers were jumping headlong into the X and O marriage. The idea may be valid, and potentially valuable, but much like the SAP intelligent enterprise, organizations seem to be approaching Qualtrics-centered integrations cautiously.

“If SAP can change people’s mindsets, if they can show CIOs that this does more than just the back office, then [companies] can bring in their CEO and show it is an end-to-end platform and worth investing in,” Isaac Sacolick, president and CIO of consultancy StarCIO, said at the time.

After a tumultuous 2019, that’s likely top of SAP’s list for 2020.


How Microsoft re-envisioned the data warehouse with Azure Synapse Analytics

About four years ago, the Microsoft Azure team began to notice a big problem troubling many of its customers. A mass migration to the cloud was in full swing, as enterprises signed up by the thousands to reap the benefits of flexible, large-scale computing and data storage. But the next iteration of that tech revolution, in which companies would use their growing stores of data to get more tangible business benefits, had stalled.

Technology providers, including Microsoft, have built a variety of systems to collect, retrieve and analyze enormous troves of information that would uncover market trends and insights, paving the way toward a new era of improved customer service, innovation and efficiency.

But those systems were built independently by different engineering teams and sold as individual products and services. They weren’t designed to connect with one another, and customers would have to learn how to operate them separately, wasting time, money and precious IT talent.

“Instead of trying to add more features to each of our services, we decided to take a step back and figure out how to bring their core capabilities together to make it easy for customers to collect and analyze all of their increasingly diverse data, to break down data silos and work together more collaboratively,” said Raghu Ramakrishnan, Microsoft’s chief technology officer for data.

At its Ignite conference this week in Orlando, Florida, Microsoft announced the end result of a yearslong effort to address the problem: Azure Synapse Analytics, a new service that merges the capabilities of Azure SQL Data Warehouse with new enhancements such as on-demand query as a service.

Microsoft said this new offering will help customers put their data to work much more quickly, productively and securely by pulling together insights from all data sources, data warehouses and big data analytics systems. And, the company said, with deeper integration between Power BI and Azure Machine Learning, Azure Synapse Analytics can reduce the time required to process and share that data, speeding up the insights that businesses can glean.

What’s more, it will allow many more businesses to take advantage of game-changing technologies like data analytics and artificial intelligence, which are helping scientists to better predict the weather, search engines to better understand people’s intent and workers to more easily handle mundane tasks.

This newest effort to break down data silos also builds on other Microsoft projects, such as the Open Data Initiative and Azure Data Share, which lets organizations share data from multiple sources and even with other organizations.

Microsoft said Azure Synapse Analytics is also designed to support the increasingly popular DevOps strategy, in which development and operations staff collaborate more closely to create and implement services that work better throughout their lifecycles.


A learning process

Azure Synapse Analytics is the result of a lot of work, and a little trial and error.

At first, Ramakrishnan said, the team developed high-level guidelines showing customers how to glue the systems together themselves. But they quickly realized that was too much to ask.

“That required a lot of expertise in the nitty-gritty of our platforms,” Ramakrishnan said. “Customers made it overwhelmingly clear that we needed to do better.”

So, the company went back to the drawing board and spent an additional two years revamping the heart of its data business, Azure SQL Data Warehouse, which lets customers build, test, deploy and manage applications and services in the cloud.

A breakthrough came when the company realized that customers need to analyze all their data in a single service, without having to copy terabytes of information across various systems to use different analytic capabilities – as has traditionally been the case with data warehouses and data lakes.

With the new offering, customers can use their data analytics engine of choice, such as Apache Spark or SQL, on all their data. That’s true whether it’s structured data, such as rows of numbers on spreadsheets, or unstructured data, such as a collection of social media posts.
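
The sketch below illustrates that idea with generic PySpark rather than any Synapse-specific tooling: a single engine reads structured Parquet rows and unstructured text from hypothetical data lake paths and queries both through SQL.

```python
# Generic PySpark sketch: one engine querying structured and unstructured data.
# Paths, views and columns are hypothetical; these are standard Spark APIs used
# to illustrate the "one service, all data" idea, not Azure Synapse's own syntax.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("one-engine-demo").getOrCreate()

# Structured data: rows of sales figures stored as Parquet in a data lake.
sales = spark.read.parquet("/datalake/sales/2019/")
sales.createOrReplaceTempView("sales")

# Unstructured data: raw social media posts stored as plain text files.
posts = spark.read.text("/datalake/social_posts/")
posts.createOrReplaceTempView("posts")

# The same SQL surface works over both registered views.
top_regions = spark.sql("""
    SELECT region, SUM(amount) AS total_sales
    FROM sales
    GROUP BY region
    ORDER BY total_sales DESC
""")
refund_mentions = spark.sql("SELECT COUNT(*) AS n FROM posts WHERE value LIKE '%refund%'")

top_regions.show()
refund_mentions.show()
```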

This project was risky. It involved deep technical surgery: completely rewriting the guts of the SQL query processing engine to optimize it for the cloud and make it capable of instantly handling big bursts of work as well as very large and diverse datasets.

It also required unprecedented integration among several teams within Microsoft, some of whom would have to make hard choices. Established plans had to be scrapped. Resources earmarked for new features would be redirected to help make the entire system work better.

“In the beginning, the conversations were often heated. But as we got into the flow of it, they became easier. We began to come together,” Ramakrishnan said.

Microsoft also had to make sure that the product would work for any company, regardless of employees’ technical expertise.

“Most companies can’t afford to hire teams of 20 people to drive data projects and wire together multiple systems. There aren’t even enough skilled people out there to do all that work,” said Daniel Yu, director of product marketing for Azure Data and Artificial Intelligence.

Making it easy for customers

Customers can bring together various sources of data into a single feed with Azure Synapse Analytics Studio, a console, or single pane of glass, that will allow a business professional with minimal technical expertise to locate and collect data from multiple sources like sales, supply chain, finance and product development. They can then choose how and where to store that data, and they can use it to create reports through Microsoft’s popular Power BI analytics service.

In a matter of hours, Azure Synapse will deliver useful business insights that used to take days or even weeks and months, said Rohan Kumar, corporate vice president for Azure Data.

“Let’s say an executive wants a detailed report on sales performance in the eastern U.S. over the last six months,” Kumar said. “Today, a data engineer has to do a lot of work to find where that data is stored and write a lot of brittle code to tie various services together. They might even have to bring in a systems integrator partner. With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

The complexity of the technical problems Azure Synapse addressed would be hard to overstate. Microsoft had to meld multiple independent components into one coherent form factor, while giving a wide range of people – from data scientists to line of business owners – their preferred tools for accessing and using data.


With Azure Synapse, there’s no code required. It’s a much more intuitive experience.
Rohan Kumar, corporate vice president for Azure Data


That includes products like SQL Server, the open source analytics engine Apache Spark, Azure Data Factory and Azure Data Studio, as well as notebook interfaces preferred by many data professionals to clean and model data.

“Getting all those capabilities to come together fluidly, making it run faster, simpler, eliminating overlapping processes – there was some scary good stuff getting done,” Ramakrishnan said.

The result is a data analytics system that will be as easy to use as a modern mobile phone. Just as the smartphone replaced several devices by making all of their core capabilities intuitively accessible in a single device, the Azure Synapse “smartphone for data” now allows a data engineer to build an entire end-to-end data pipeline in one place. It also enables data scientists and analysts to look at the underlying data in ways that are natural to them.

And just as the phone has driven waves of collaboration and business innovation, Azure Synapse will free up individuals and companies to introduce new products and services as quickly as they can dream them up, Microsoft said.

“If we can help different people view data through a lens that is natural to them, while it’s also visible to others in ways natural to them, then we will transform the way companies work,” Ramakrishnan said. “That’s how we should measure our success.”

Top photo: Rohan Kumar, corporate vice president for Azure Data, says Azure Synapse will deliver useful business insights that used to take days or even weeks and months. Photo by Scott Eklund/Red Box Pictures.



MyPayrollHR collapse stirs allegations, questions, anger

For Tanya Willis, executive director at Agape Animal Rescue, last Thursday began like any other day — until an employee called. The employee’s deposited pay had been withdrawn from her account. The same was true for all seven of the organization’s employees. Willis called her payroll provider, MyPayrollHR.

A MyPayrollHR employee said there was a “glitch in the system,” told her to keep track of the missing money and said there were “no worries,” according to Willis. But it wasn’t just a glitch in the system, and the worries were only beginning.

Money continued to drain out of the animal rescue employees’ personal accounts. It didn’t stop with one paycheck reversal. “Most of our employees are still sitting with negative overdrafts in their accounts,” Willis said. One employee was told by a bank that it would be 45 to 60 days before she was made whole, she said. 

Some of the bank reversals sought arbitrary amounts, Willis said. The personal account of one employee of this Nashville-based organization showed “a negative $999,000.” She provided a screenshot of the employee’s account to SearchHRSoftware.

The collapse of MyPayrollHR set into motion unusual activity in employee accounts. Many were subject to reversals in pay. But one Agape Animal Rescue employee saw reversals nearing $1 million.

Agape Animal Rescue wasn’t alone in its experience. Not long after Willis spoke with MyPayrollHR, the Clifton Park, N.Y.-based firm said it was closing “due to unforeseen circumstances.” The firm had some 4,000 customers, mostly small and medium-sized businesses. The total payroll involved was about $26 million. That money is gone, and there is no explanation for it. It’s unclear how many workers saw reversals in their personal bank accounts.

With MyPayrollHR’s announcement, the company also shut its doors to answers, including press queries. New York State Gov. Andrew Cuomo announced an investigation late last week. “We will not allow these bad actors to take money away from the hard-working people in this state,” he said.

The questions pile up

The MyPayrollHR story has left people shocked. Employees, including low wage workers, discovered money removed from their accounts. Employers are scrambling to cover the amounts. Questions have multiplied.

A key player in this is MyPayrollHR’s automated clearing house (ACH) network vendor Cachet Financial Services. An ACH facilitates electronic payments. Cachet, which is based in Pasadena, Calif., said it is a fraud victim. It is pointing the finger at MyPayrollHR and its parent company, Valuewise Corp.

One of the services that Cachet offers is direct deposit for payroll processing firms. It was providing these services for MyPayrollHR for about 12 years.

There are two parts to an ACH transaction: First, the payroll processing firm directs the ACH to take money out of an employer’s account; the money is moved into a settlement or holding account, which is under the control of the ACH. Second, money is moved from the settlement account into various employee accounts. The process is standard in the industry, according to Wendy Slavkin, Cachet general counsel.
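
As a rough model of that two-step flow (hypothetical Python, not any real ACH or Cachet software), each payroll run should debit the employer, hold the funds in the processor's settlement account, and only then pay employees out of that balance:

```python
# Hypothetical model of the two-step ACH payroll flow described above.
# It is an illustration only, not real payments software.

class Account:
    def __init__(self, name, balance=0.0):
        self.name = name
        self.balance = balance

def transfer(src, dst, amount):
    src.balance -= amount
    dst.balance += amount

def run_payroll(employer, settlement, pay_due):
    """pay_due maps each employee Account to the amount owed."""
    total = sum(pay_due.values())
    # Step 1: pull the full payroll amount from the employer into settlement.
    transfer(employer, settlement, total)
    # Step 2: pay each employee out of the settlement account.
    for employee, amount in pay_due.items():
        transfer(settlement, employee, amount)

employer = Account("employer", 50_000)
settlement = Account("settlement")
staff = {Account("alice"): 2_000, Account("bob"): 1_800}
run_payroll(employer, settlement, staff)
print(settlement.balance)  # 0.0 -- the settlement account nets out when the flow works
```

In the scenario Slavkin describes, step one was redirected away from the settlement account while step two still paid employees, which is how the holding account ended up roughly $26 million short.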

MyPayrollHR uploaded a file instructing Cachet to take money out of employer accounts for payment to employees. Once Cachet did that, the money should have been put into a Cachet settlement account. But that didn’t happen here.

Account numbers were manipulated

“MyPayrollHR manipulated the account numbers in that electronic file so that the money was taken out of the employer’s account and put into an account controlled by MyPayrollHR, not a Cachet settlement account, as it should’ve been,” Slavkin said.

That problem did not stop the ACH transaction process.

The ACH system then automatically tried to access MyPayrollHR accounts, which came back as frozen. To fulfill the transaction, money was taken out of Cachet’s holding account and placed into employee accounts. “Now all of a sudden Cachet is out $26 million because it’s effectively made the payroll for all these companies,” Slavkin said.

Slavkin said Cachet then initiated reversals to get its money back from employee accounts. She said that is a standard process. But some accounts were dinged twice for reversals because the first reversal was not coded in accordance with National Automated Clearinghouse Association (NACHA) standards.

Slavkin said initial reversals “should have been by law and by NACHA standards rejected by the receiving bank,” so it initiated a second reversal to reclaim the money. She said banks, for the most part, rejected the first and second reversals. But many were still affected, including some of MyPayrollHR’s largest payroll processing customers.

But Cachet, apparently, had a change of heart about its direction. It then began a process to alert the hundred-plus banks involved in the employer wage payments and advised them to reject both sets of reversals so the employees could have their money, according to Slavkin.

The employees will be made whole

Cachet has been in touch with the FBI, Slavkin said. It said it is out about $26 million as a consequence of the MyPayrollHR incident.

“The employees will have their pay back, but in a sense, they’ve been paid by Cachet, and not their employer,” Slavkin said.

The total losses from MyPayrollHR may be as high as $35 million, if it includes an additional $9 million in unpaid tax withholding managed by a second ACH provider, according to Cachet.

“We just have to see how this plays out and how effective the FBI can be in helping us recoup our money,” Slavkin said.

Cachet said there were no more than two reversals, but some of the employers say there have been three.

What is clear is that this problem unfolded suddenly, and employers had very little time to react. 

Among those with questions is Jeff Salter, founder and CEO at Caring Senior Service in San Antonio, which has 1,500 employees in 23 states. As many as 1,200 of the elder care provider’s employees were affected by the problem. It was a customer of MyPayrollHR for more than three years and it was happy with the service — until last week. 

A scramble to fix the problem

At about 7 a.m. on Thursday, Salter discovered his own paycheck had a pending reversal. He started investigating and discovered a larger problem.

Cachet had initiated the reversals to Caring Senior Service’s accounts, Salter said. “I know [Cachet] may claim that one of the transactions was to reverse a non-funded transaction to them, but they have done the withdrawal twice and in some cases three times,” Salter said.

Salter questions how banks can approve two reversals.

“It is clearly evident that there is something nefarious occurring, yet they are allowing reversals to occur even when their account holders are being hurt by their decisions,” Salter said. “We’ve been focused on supporting our employees first; 90% of our employees are caregivers and none of our employees deserve this treatment.”

MyPayrollHR’s abrupt shutdown is forcing some affected businesses to use reserves to cover employees. Meanwhile, employees are in various stages of getting the banks to cover the payroll amounts.

Brad Mete is the managing partner of two recruiting and staffing firms, Affinity Resource, an employment agency, and IntellaPro LLC, a professional staffing firm. The Fort Lauderdale-based firms employ about 800 people, roughly 600 of whom were affected. Mete learned about the problem from an employee Thursday.

In the absence of clear answers from MyPayrollHR, Mete and his partners suspected a big problem. “We knew the writing was on the wall, that there was probably something going down there,” he said.

It was chaos times two

Mete and his team stayed late Thursday in the office and paid people directly out of its financial cushion. People had to be paid Friday. They issued direct deposits on cash pay cards, which are similar to debit cards.

In the meantime, Mete has questions. They weren’t pulling the money out of the employees’ accounts, but who was, and why? What was happening to it? They had no way of getting answers to these questions, but they had to solve the problem.

At 5:45 a.m. on Friday, Mete’s phone started ringing.

“I didn’t have time to breathe — it was chaos times two,” Mete said.

Mete had to talk to employees, clients and banks to work through the problem. Explaining the problem to employees and the plan to pay them wasn’t easy. Some of the employees work in warehouses, “and they’re not going to give you a hug when you take two weeks of pay out of their pocket,” Mete said. 

Mete worried clients might blame him. He had to tell them: “We’re not the ones that pulled the plug here. We are the victims. We are stepping up. I need you to stand by me. We’re going to do the right thing here,” he said.

Explanations leave questions

Cachet’s explanation is unlikely to put the questions to rest. Money is still missing from employee accounts, employees are still struggling with banks to fix the problems, and there are transactions that make no sense.

Agape Animal Rescue’s Willis said Thursday there was one withdrawal from employee accounts. On Friday there was another withdrawal or reversal to employee accounts.

Agape’s bank established an advocate to help employees work with their various banks to get the money returned. Some banks have been cooperative; in other cases, employees are still fighting for their money, she said.

Some of the second withdrawals on Friday were for the amount of an employee’s pay, and others were for arbitrary amounts, Willis said.

One Agape employee’s account was pinged again and again “by whatever company was trying to get the money,” Willis said. “It just continuously did it and it finally stopped at almost a million dollars.” The employee only had a relatively small amount of money in the account.

The reversals, or attempts to pull money out of employee accounts, haven’t ended, Willis said.  On Monday, there was another attempt at a reversal.

Willis has been reading the victim accounts on a new Facebook group. “You’re just inundated with horrible stories of people that were affected in different ways.”

The stories from victims “just continue and continue,” Willis said, and there are still people whose bank accounts are getting pinged for money.

What would the people responsible say?

Willis said the shelter has suffered, so far, about $10,000 in losses. Employee taxes were taken out of employee checks, but the payments weren’t made to tax authorities. “We still owe taxes in the next few weeks,” she said.

They are now appealing for financial help.

The organization rescues as many as 200 dogs a year, and it’s an expensive process that includes medical support such as spaying or neutering, helping owners who can no longer take care of their pets, taking in dogs from animal control and dealing with abuse cases. The shelter’s work can include behavioral help for dogs, as well as grooming to get them ready for their “forever families.”

Willis was asked what she would say to MyPayrollHR if the people responsible were in front of her.

“I would have to ask, what made you feel like you have the right to disrupt so many lives?” Willis said. “People don’t do this to people.”


Gears 5 Versus Multiplayer Tech Test Now Open to All Xbox Live Gold Members – Xbox Wire

The Gears 5 Versus Tech Test began last weekend, inviting both players who have pre-ordered Gears 5 and Xbox Game Pass members to rev up their Lancers and roadie run through their opponents on the way to victory. Thank you to the incredible Gears fans, both new and veteran, who made our initial Gears 5 Versus Tech Test weekend a smashing success. This weekend we’re extending the invitation to join the action to all Xbox Live Gold members as part of Xbox Live Free Play Days.

Starting at 10 a.m. PDT tomorrow, COG hopefuls who are new to the Gears 5 fight can enlist in the Boot Camp training mode and then jump right into Arcade, the new game type designed for over-the-top fun. If you’re returning for another round following last week’s Tech Test, be sure to check out fan-favorite King of the Hill, the updated competitive game type Escalation and continue your Tour of Duty, a series of challenges that grant sweet rewards. A new reward has also been added – the Wreath Bloodspray, which can be earned by completing one match during the Tech Test from 5:30 – 6:30 pm PDT on Friday, July 26. No matter your level of Gears familiarity, Gears 5 offers a game type for you.

To ensure you’re ready for battle, search the Microsoft Store for Gears 5 Tech Test and download to your library. The Tech Test is available for download right now, but servers will be offline until tomorrow at 10 a.m. PDT. We also wanted to remind you that online multiplayer will require Xbox Live Gold for Xbox Game Pass for Console members, and that because this Tech Test is to help test our servers, you might encounter some queueing as you start to play. We hope to make Gears 5 a great online experience for all players and Xbox Game Pass members on launch, and this Tech Test will help.

Thanks for your support and we look forward to seeing you online! Be sure to visit www.gears5.com and follow @GearsofWar on Facebook and Twitter to keep up-to-date with the latest on Gears 5 ahead of its worldwide launch on September 10.

Gears 5 will release on September 10 for Xbox One, Windows 10 PC and Xbox Game Pass. Early access starts on September 6 for Xbox Game Pass Ultimate members and Gears 5 Ultimate Edition purchasers. Pre-order details can be found on the Microsoft Store.


Another patched Apache Struts vulnerability exploited

At least one malicious actor began exploiting a critical vulnerability in Apache Struts in the wild, despite a patch being issued last week.

According to researchers at Volexity, a cybersecurity company based in Washington, D.C., the exploits of the Apache Struts vulnerability surfaced in the wild not long after a proof-of-concept (PoC) exploit was published publicly on GitHub.

The Apache Software Foundation posted a security bulletin about the vulnerability — tracked as CVE-2018-11776 — on Aug. 22, 2018, and said that a remote code execution attack is possible “when namespace value isn’t set for a result defined in underlying configurations and in same time, its upper action(s) configurations have no or wildcard namespace. Same possibility when using url tag which doesn’t have value and action set and in same time, its upper action(s) configurations have no or wildcard namespace.”

The flaw, which was discovered and reported in April by security researcher Man Yue Mo of Semmle Inc., a software analytics company based in San Francisco, affects Struts 2.3 through 2.3.34 and Struts 2.5 through 2.5.16. Apache patched the vulnerability and noted that upgrading to version 2.3.35 or 2.5.17 would solve the problem. However, only a day after Apache posted its security bulletin, a researcher posted a PoC exploit on GitHub.
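
As a quick way to reason about those ranges, the hypothetical Python helper below checks whether a Struts version string falls inside the affected ranges reported above; it is an illustration only, not an Apache-provided tool.

```python
# Illustrative check against the affected ranges reported for CVE-2018-11776:
# Struts 2.3 through 2.3.34 and 2.5 through 2.5.16. The helper is hypothetical,
# not part of any Apache tooling; only the version numbers come from the article.

def parse(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

AFFECTED_RANGES = [
    (parse("2.3"), parse("2.3.34")),
    (parse("2.5"), parse("2.5.16")),
]

def is_affected(version: str) -> bool:
    v = parse(version)
    return any(low <= v <= high for low, high in AFFECTED_RANGES)

for v in ("2.3.20", "2.3.35", "2.5.16", "2.5.17"):
    print(v, "affected" if is_affected(v) else "patched")
```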

“Shortly after the PoC code was released, Volexity began observing active scanning and attempted exploitation of the vulnerability across its sensor network,” Volexity researchers said in a blog post. “The in-the-wild attacks observed thus far appear to have been taken directly from the publicly posted PoC code.”

The researchers also noted that the vulnerability is “trivial to exploit” and has already seen at least one malicious actor attempt to exploit it “en masse in order to install the CNRig cryptocurrency miner.”

“Although the main payload for Apache Struts exploits appears to be cryptocurrency miners, failure to patch also leaves an organization open to significant risk that goes beyond cryptomining,” the researchers added.

In 2017, another Apache Struts vulnerability — enabling remote code execution exploits — was disclosed; shortly after that disclosure, the vulnerability was exploited in the massive Equifax data breach that exposed 148 million U.S. consumers’ personal data.

Enterprises and users are encouraged to update to the patched versions of Apache Struts immediately so as not to become the next victim of an Equifax-like data breach.

In other news:

  • Facebook removed its own security app, Onavo Protect, from Apple’s App Store this week because of its privacy issues. Onavo is a free VPN app that Facebook acquired in 2013 to collect data on how much its users use other mobile apps. Apple updated its App Store rules in June to ban the collection of information about other apps installed and in use on mobile devices. Apple reportedly urged Facebook to voluntarily remove the app from the App Store after Apple ruled that Onavo violated its new data collection policies. Onavo was downloaded more than 33 million times on both iOS and Android devices, and while it is no longer available in the App Store, it is still on offer in the Google Play Store.
  • NIST published guidance this week on securing wireless infusion pumps after research over the past few years has shown vulnerabilities in the internet-connected medical devices. The guidance, NIST SP 1800-8, “Securing Wireless Infusion Pumps in Healthcare Delivery Organizations,” suggests a defense-in-depth strategy for protecting wireless infusion pumps. “This strategy may include a variety of tactics: using network segmentation to isolate business units and user access; applying firewalls to manage and control network traffic; hardening and enabling device security features to reduce zero-day exploits; and implementing strong network authentication protocols and proper network encryption, monitoring, auditing, and intrusion detection systems (IDS) and intrusion prevention systems (IPS),” the guidance states. This special publication is part of NIST’s ongoing effort to secure IoT devices.
  • A researcher at Check Point uncovered new malware that hijacks browsers. A rootkit called CEIDPageLock is being distributed by the RIG Exploit kit, according to Check Point’s Israel Gubi. “It acts to manipulate the victim’s browser and turn their home-page into a site pretending to be 2345.com — a Chinese web directory,” Gubi explained, adding that it “monitors user browsing and dynamically replaces the content of several popular Chinese websites with the fake home page, whenever the user tries to visit them.” He said that CEIDPageLock targets Chinese victims specifically.

HYCU moves beyond Nutanix backup with Google Cloud support

HYCU is now well-versed in the Google cloud.

HYCU, which began with a backup application built specifically for hyper-converged vendor Nutanix, today launched a service to back up data stored on Google Cloud Platform (GCP).

HYCU sprang up in June 2017 as an application sold by Comtrade that offered native support for Nutanix AHV and Nutanix customers using VMware ESX hypervisors. The Comtrade Group spun off HYCU into a separate company in March 2018, with both the new company and the product under the HYCU brand.

Today, HYCU for Google became available through the GCP Marketplace. It is an independent product from HYCU for Nutanix, but there is a Nutanix angle to the Google backup: GCP is Nutanix’s partner for the hyper-converged vendor’s Xi public cloud services. HYCU CEO Simon Taylor said his team began working on Google backup around the time Nutanix revealed plans for Xi in mid-2017. HYCU beat Nutanix out of the gate, launching its Google service before any Nutanix Xi Cloud Services became generally available.

“We believe Nutanix is the future of the data center, and we place our bets on them,” Taylor said. “Everyone’s been asking us, ‘Beyond Nutanix, where do you go from here?’ We started thinking of the concept of multi-cloud. We see people running fixed workloads on-prem, and if it’s dynamic, they’ll probably put it on a public cloud. And Google is the public cloud that’s near and dear to Nutanix’s heart.”

HYCU backup for GCP is integrated into Google Cloud Identity and Access Management, installs without agents and backs up data to Google Buckets. The HYCU service uses native GCP snapshots for backup and recovery.

Subbiah Sundaram, vice president of products at HYCU, based in Boston, said HYCU provides application- and clone-consistent backups, and it allows single-file recovery. Sundaram said because HYCU takes control of the snapshot, data transfers do not affect production systems.

Sundaram said HYCU for GCP was built for Google admins, rather than typical backup admins.

“When customers use the cloud, they think of it as buying a service, not running software. And that’s the experience we want them to have,” Sundaram said. “It’s completely managed by us. We create and provision the backup targets on Google and manage it for you.”

HYCU for GCP uses only GCP to stage backups, backing up data in different Google Cloud zones. Sundaram said HYCU may back up to clouds or on-premises targets in future releases, but the most common request from customers so far is to back up to other GCP zones.

HYCU charges for data protected, rather than total storage allocated for backup. For example, a customer allocating 100 GB to a virtual machine with 20 GB of data protected is charged for the 20 GB. List price for 100 GB of consumed virtual machine capacity starts at $12 per month, or 12 cents per gigabyte, for data backed up every 24 hours. The cost increases for more frequent backups. Customers are billed through the GCP Marketplace.
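
A quick back-of-the-envelope calculation based on the list price quoted above (12 cents per gigabyte of protected data per month at a 24-hour backup interval; more frequent schedules cost more but are not priced here):

```python
# Rough cost sketch using the list price quoted above: $0.12 per GB of
# *protected* data per month for backups every 24 hours. More frequent
# backup schedules cost more, but those rates are not given in the article.

RATE_PER_GB_MONTH = 0.12  # daily-backup list price

def monthly_cost(protected_gb):
    return protected_gb * RATE_PER_GB_MONTH

# The article's example: a 100 GB VM with only 20 GB of data actually protected.
print(f"20 GB protected:  ${monthly_cost(20):.2f}/month")   # $2.40
print(f"100 GB protected: ${monthly_cost(100):.2f}/month")  # $12.00, the quoted starting price
```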

Industry analysts pointed out HYCU is a brand-new company in name only. Its Comtrade legacy gives HYCU 20-plus years of experience in data protection and monitoring, over 1,000 customers and hundreds of engineers. That can allow it to move faster than typical startups.

“They’re a startup that already has a ton of experience,” said Christophe Bertrand, senior data protection analyst for Enterprise Strategy Group in Milford, Mass. “When you’re a small organization, you have to make strategic calls on what to do next. So, now, they’re getting into Google Cloud, which is evolving to be more enterprise-friendly. Clearly, backup and recovery is one of the functions you need to get right for the enterprise. Combined with the way Nutanix supports Google, it’s a smart move for HYCU.”

Steven Hill, senior storage analyst for 451 Research, agreed that GCP support was a logical step for Nutanix-friendly HYCU.

We believe Nutanix is the future of the data center, and we place our bets on them.
Simon Taylor, CEO, HYCU

“Nutanix partnering with Google is a good hybrid cloud play. So, theoretically, what you’re running on-prem runs exactly the same once it’s on Google Cloud,” he said. “HYCU comes in and says, ‘We can do data protection and backup and workload protection that just fits seamlessly in with all of this. Whether you’re on Google Cloud or whether you’re on Nutanix via AHV or Nutanix via ESX, it’s all the same to us.'”

Taylor said HYCU is positioned well for when Nutanix makes Xi available. Nutanix has said its first Xi Cloud Service will be disaster recovery. “We will be delivering for Xi,” Taylor said. “You can imagine that will require a much closer bridge between these two different products. Once Xi is available, we’ll be fast on their heels with a product that will support both purpose-built backup for Nutanix and purpose-built recovery for Xi in a highly integrated fashion.”

Although HYCU for GCP can protect data for non-Nutanix users, Taylor said HYCU remains as dedicated to building a business around Nutanix as ever. He emphasized that HYCU develops its software independently from Nutanix and Google, although he is determined to have a good working relationship with both.

“We believe data protection should be an extension of the platform it serves, not a stand-alone platform,” he said.

Still, Taylor flatly denied his goal is for HYCU to become part of Nutanix.

“Right now, we want to build a brand,” he said. “This is about building a business that matters, not about a quick exit.”

Can value-based care models cut costs? Yes, they can

Nearly 25 years after value-based care models began, Change Healthcare’s just-released survey showed 100% of respondents are saving money using those principles.

The survey of 120 insurance company payers is Change Healthcare’s third effort since 2012 to understand what works — and what doesn’t — when it comes to value-based care models.

The universality of savings was a surprise to Andrei Gonzales, M.D., director of value-based reimbursement initiatives at Change Healthcare, a provider of payment, data analytics and a variety of other healthcare platforms based in Nashville, Tenn. “We expected medical cost savings,” Gonzales said, “but to have 100% say they achieved savings and with almost 25% having cost savings of over 7.5% was surprising. We’ve seen savings in our own work, but we wanted a more objective and quantifiable view of the impact of these programs.”

But Gonzales wasn’t surprised payers were struggling to innovate and to move quickly. Just 21% said they could roll out a new value-based care model in three to six months, while 13% said they needed a full two years.

“Agility in healthcare is difficult, but it is possible,” he said. “With more experience, this should get easier.”

Agility in healthcare is difficult, but it is possible.
Andrei Gonzales, M.D., director of value-based reimbursement initiatives, Change Healthcare

Payers were also frustrated with the technology underpinnings of their value-based care models, Gonzales said. More than half of those surveyed said their analytics, automation and reporting capabilities just weren’t doing the job. In many cases, these tools were custom-built specifically for the value-based care models, Gonzales explained. But as more and more initiatives roll out, the services can struggle to keep up.

The other struggle is to get hospitals and physicians on board with value-based care models, he said. “It’s not surprising to us how difficult it can be to engage providers in a value-based care model,” he said.

Busy doctors and staff can do a good job with patient care in their particular silo, but have a hard time looking across the spectrum of care, he said.

“But once they understand the problem and can see the data that they’re not doing as well as a competitor is doing and that they have to compete for patients, then they start to understand. That’s really where the work is right now.”

Minecraft: Education Edition reaches milestone of 2 million users and releases new Hour of Code tutorial

Three years ago, Minecraft became a part of the Microsoft family and my team began our work to extend Minecraft’s potential in classrooms around the world. Flash forward to today – one year after the release of Minecraft: Education Edition – and I’m thrilled to share that we’re continuing to see tremendous momentum with more than 2 million licensed users in 115 countries around the world and more than 250 educator-created lesson plans in our community.

We’ve seen firsthand how Minecraft’s open sandbox environment ignites students’ innate creativity and makes learning fun and collaborative. Educators are teaching a wide variety of subjects with Minecraft: Education Edition, and we’ve seen particularly exciting results when Minecraft is used to teach Computer Science. Though many don’t realize it, coding is in fact one of the most creative activities a student can do, building something with no limitations but his or her own creativity. And research continues to show that creativity, collaboration and coding are all critical skills for success in the modern workplace.

As part of Microsoft’s continued commitment to empower students with these skills, we’ve partnered with Code.org for the past two years to offer educators and students a free Hour of Code tutorial using Minecraft. The results are far beyond anything we could have imagined.

To date, nearly 70 million Minecraft Hour of Code sessions have introduced the basics of coding to people around the world, joining the global movement. Today, I’m excited to announce that we have built a new Minecraft Hour of Code tutorial called Minecraft: Hero’s Journey.

Available at Code.org/Minecraft today, Hero’s Journey introduces a fun character called the Agent and 12 new challenges that teach core coding concepts like loops, debugging, and functions. It’s free and playable across iOS, Android and Windows platforms. Upon completing the tutorial, students can import their code into Minecraft: Education Edition for the first time ever, bringing their work to life in the game, or share their work via email, text message or social media. Learn more about the tutorial at education.minecraft.net/hour-of-code.

Above: Loops enable players to repeat a set of instructions until a certain condition has been reached.
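
For readers new to the idea, a loop simply repeats a block of instructions until its condition is met. The tutorial teaches this with Minecraft's visual code blocks; the tiny Python sketch below is only a generic, non-Minecraft illustration of the same concept.

```python
# Generic illustration of the loop concept (not Minecraft code): repeat an
# action until a condition is reached.
blocks_placed = 0
while blocks_placed < 5:        # keep looping until five blocks are placed
    blocks_placed += 1
    print("placed block", blocks_placed)
print("wall finished")
```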

I hope you’ll join us in building a global appreciation for Computer Science by visiting Code.org today to try the tutorial and plan your very own Hour of Code! And please share your experience on Facebook or Twitter using: #Minecraft #Hourofcode to spread the excitement around the globe during Computer Science Education Week.

Each day I come into the office, I am inspired by the stories and videos we receive from our Minecraft: Education Edition community and innovative educators like Melissa Wrenchey, Doug Bergman and Jeff Gearhart who are using computer science to inspire the next generation of creators, innovators and leaders.

Thank you to everyone who made this first year such a success. Our continued growth is only possible with feedback and ideas from you – the passionate educators around the world who are changing education as we know it. Keep it coming!