How to achieve explainability in AI models

When machine learning models deliver problematic results, they often do so in ways that humans can't make sense of, and this becomes dangerous when the model's limitations are not understood, particularly for high-stakes decisions. Without straightforward tools that highlight explainability in AI models, organizations will continue to struggle to implement AI algorithms. Explainable AI refers to the process of making it easier for humans to understand how a given model generates its results, and of planning for cases when those results should be second-guessed.

AI developers need to incorporate explainability techniques into their workflows as part of their overall modeling operations. AI explainability can refer to the process of creating algorithms for teasing apart how black box models deliver results or the process of translating these results to different types of people. Data science managers working on explainable AI should keep tabs on the data used in models, strike a balance between accuracy and explainability, and focus on the end user.

Opening the black box

Traditional rule-based AI systems included explainability as part of the model, since humans typically handcrafted the rules mapping inputs to outputs. But deep learning techniques using semi-autonomous neural network models can't readily show how a model's results map to an intended goal.

Researchers are working to build learning algorithms that generate explainable AI systems from data. Currently, however, most of the dominant learning algorithms do not yield interpretable AI systems, said Ankur Taly, head of data science at Fiddler Labs, an explainable AI tools provider.

“This results in black box ML techniques, which may generate accurate AI systems, but it’s harder to trust them since we don’t know how these systems’ outputs are generated,” he said. 

AI explainability often describes post-hoc processes that attempt to explain the behavior of AI systems, rather than alter their structure. Other machine learning model properties like accuracy are straightforward to measure, but there are no corresponding simple metrics for explainability. Thus, the quality of an explanation or interpretation of an AI system needs to be assessed in an application-specific manner. It’s also important for practitioners to understand the assumptions and limitations of the techniques they use for implementing explainability.

“While it is better to have some transparency rather than none, we’ve seen teams fool themselves into a false sense of security by wiring an off-the-shelf technique without understanding how the technique works,” Taly said. 
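Permutation importance is one common off-the-shelf post-hoc technique of the kind Taly describes. The sketch below is illustrative only (a toy linear "model" and synthetic data, not Fiddler Labs' tooling): it shuffles one feature at a time and measures how much the model's score degrades, without ever opening the black box.

```python
import numpy as np

def permutation_importance(predict, X, y, metric, n_repeats=5, seed=0):
    """Post-hoc, model-agnostic importance: shuffle one feature at a time
    and measure how much the model's score degrades."""
    rng = np.random.default_rng(seed)
    baseline = metric(y, predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # break the link between feature j and y
            drops.append(baseline - metric(y, predict(Xp)))
        importances[j] = np.mean(drops)
    return importances

# Toy "black box": the target depends on feature 0 only.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=500)
predict = lambda X_: 3.0 * X_[:, 0]  # stand-in for a trained model
r2 = lambda t, p: 1 - np.sum((t - p) ** 2) / np.sum((t - t.mean()) ** 2)

imp = permutation_importance(predict, X, y, r2)
# imp[0] is large (shuffling feature 0 wrecks the score); imp[1] is zero.
```

Note Taly's caveat applies here too: this technique assumes features are roughly independent, and it explains the model's behavior, not the underlying causal process.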

Start with the data

The results of a machine learning model could be explained by the training data itself, or how a neural network interprets a dataset. Machine learning models often start with data labeled by humans. Data scientists can sometimes explain the way a model is behaving by looking at the data it was trained on.

“What a particular neural network derives from a dataset are patterns that it finds that may or may not be obvious to humans,” said Aaron Edell, director of applied AI at AI platform Veritone.

But it can be hard to understand what good data looks like. Bias in training data can show up in a variety of ways. A machine learning model trained to identify sheep might learn only from pictures of farms, causing it to miss sheep in other settings or to misinterpret white clouds in farm pictures as sheep. Facial recognition software might be trained on a company's employee photos, but if those faces are mostly male or white, the resulting model is biased.

One good practice is to train machine learning models on data that is indistinguishable from the data the model will be expected to run on. For example, a face recognition model that identifies how long Jennifer Aniston appears in every episode of Friends should be trained on frames from actual episodes rather than Google image search results for 'Jennifer Aniston.' In a similar vein, it's fine to train models on publicly available datasets, but generic pre-trained models as a service will be harder to explain and change if necessary.
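One cheap way to operationalize this practice is a distribution-shift check on production data. The sketch below is a hypothetical illustration (synthetic data and an arbitrary threshold, not from the article): it flags features whose mean in live data has drifted far from the training mean.

```python
import numpy as np

def drift_check(train, live, threshold=4.0):
    """Flag features whose mean in live data sits more than `threshold`
    standard errors from the training mean -- a cheap signal that the
    model is running on data it was not trained on."""
    flagged = []
    for j in range(train.shape[1]):
        mu = train[:, j].mean()
        se = train[:, j].std(ddof=1) / np.sqrt(len(train))
        if abs(live[:, j].mean() - mu) / se > threshold:
            flagged.append(j)
    return flagged

rng = np.random.default_rng(0)
train = rng.normal(size=(2000, 3))   # data the model was fit on
live = rng.normal(size=(2000, 3))
live[:, 2] += 0.5                    # simulate drift in feature 2 only

flagged = drift_check(train, live)
```

A mean-shift test is only the simplest possible check; production monitoring would also compare variances and full distributions, but the principle is the same.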

Balancing explainability, accuracy and risk

The real problem with implementing explainability in AI is that there are major trade-offs between accuracy, transparency and risk in different types of AI models, said Matthew Nolan, senior director of decision sciences at Pegasystems. More opaque models may be more accurate, but fail the explainability test. Other types of models like decision trees and Bayesian networks are considered more transparent but are less powerful and complex.

“These models are critical today as businesses deal with regulations such as GDPR that require explainability in AI-based systems, but this sometimes will sacrifice performance,” said Nolan.

Focusing on transparency can cost a business, but turning to more opaque models can leave a model unchecked and might expose the consumer, customer and the business to additional risks or breaches.

To address this gap, platform vendors are starting to embed transparency settings into their AI tool sets. This can make it easier for companies to adjust the acceptable amount of opaqueness or transparency in their AI models. It also gives enterprises the control to adjust models based on their needs or on corporate governance policy, so they can manage risk, maintain regulatory compliance and deliver customers a differentiated experience in a responsible way.

Data scientists should also identify when the complexity of new models is getting in the way of explainability. Yifei Huang, data science manager at sales engagement platform Outreach, said there are often simpler models available that attain the same performance, but machine learning practitioners have a tendency to reach for fancier, more advanced models.
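Huang's point can be demonstrated with a toy holdout comparison (synthetic data, illustrative only, not from Outreach): when the underlying relationship is simple, a two-coefficient linear fit performs as well on held-out data as a far less interpretable high-degree polynomial.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x + rng.normal(scale=0.2, size=200)  # the truth is linear
x_tr, x_te = x[:150], x[150:]
y_tr, y_te = y[:150], y[150:]

def holdout_mse(degree):
    """Fit a polynomial of the given degree, score it on held-out data."""
    coefs = np.polyfit(x_tr, y_tr, degree)
    return np.mean((np.polyval(coefs, x_te) - y_te) ** 2)

simple_err = holdout_mse(1)    # interpretable: slope plus intercept
complex_err = holdout_mse(9)   # 10 coefficients, much harder to explain
# Both land near the irreducible noise floor; the complex model buys
# nothing here except lost explainability.
```

In practice the comparison should be run on the actual candidate models, but the habit of benchmarking a simple baseline first is the transferable part.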

Focus on the user

Explainability means different things to a highly skilled data scientist compared with a call center worker who may need to make decisions based on an explanation. The task of implementing explainable AI is not just to foster trust in explanations but also to help end users make decisions, said Ankkur Teredesai, CTO and co-founder at KenSci, an AI healthcare platform.

Often data scientists make the mistake of thinking about explanations from the perspective of a computer scientist, when the end user is a domain expert who may need just enough information to make a decision. For a model that predicts the risk of a patient being readmitted, a physician may want an explanation of the underlying medical reasons, while a discharge planner may want to know the likelihood of readmission to plan accordingly.

Teredesai said there is still no general guideline for explainability, particularly across different types of users. It's also challenging to integrate these explanations into machine learning and end-user workflows. End users typically need explanations framed as possible actions to take based on a prediction, rather than just as reasons, and this requires striking the right balance between prediction fidelity and explanation fidelity.

There are a variety of tools for implementing explainability on top of machine learning models which generate visualizations and technical descriptions, but these can be difficult for end users to understand, said Jen Underwood, vice president of product management at Aible, an automated machine learning platform. Supplementing visualizations with natural language explanations is a way to partially bridge the data science literacy gap. Another good practice is to directly use humans in the loop to evaluate your explanations to see if they make sense to a human, said Daniel Fagnan, director of applied science on the Zillow Offers Analytics team. This can help lead to more accurate models through key improvements including model selection and feature engineering.

KPIs for AI risks

Enterprises should consider the specific reasons that explainable AI is important when looking towards how to measure explainability and accessibility. Teams should first and foremost establish a set of criteria for key AI risks including robustness, data privacy, bias, fairness, explainability and compliance, said Dr. Joydeep Ghosh, chief scientific officer at AI vendor CognitiveScale. It’s also useful to generate appropriate metrics for key stakeholders relevant to their needs.

External organizations like AI Global can help establish measurement targets that determine acceptable operating values. AI Global is a nonprofit organization that has established the AI Trust Index, a scoring benchmark for explainable AI similar to a FICO score. This enables firms not only to establish their own best practices, but also to compare themselves against industry benchmarks.

Vendors are starting to automate this process with tools for automatically scoring, measuring and reporting on risk factors across the AI operations lifecycle based on the AI Trust Index. Although the tools for explainable AI are getting better, the technology is at an early research stage with proof-of-concept prototypes, cautioned Mark Stefik, a research fellow at PARC, a Xerox Company. There are substantial technology risks and gaps in machine learning and in AI explanations, depending on the application.

“When someone offers you a silver bullet explainable AI technology or solution, check whether you can have a common-grounded conversation with the AI that goes deep and scales to the needs of the application,” Stefik said.

Microsoft announces new capabilities for a seamless, smart and secure IoT world

New solutions deliver IoT innovations from cloud to edge

REDMOND, Wash. — Oct. 28, 2019 — Microsoft Corp. on Monday announced new capabilities that further simplify the customer journey and deliver highly secured IoT solutions. These solutions help customers embrace IoT as a core strategy to drive better business outcomes, improve safety and address social issues by predicting and preventing equipment failures, optimizing smart buildings for space utilization and energy management, improving patient outcomes and worker safety, tracking assets across a supply chain, and more.

The proliferation of IoT devices is enabling companies to bring cloud intelligence to the edge, to create solutions that are adaptive and responsive to their environments. According to IDC,1 41.6 billion devices — including smartphones, smart home assistants and smart appliances — will be connected to the internet by 2025. Even sooner, by 2021, 94% of businesses surveyed will be using IoT, according to a recent Microsoft IoT Signals research report and, in nearly every case (97%), those companies are concerned about potential security risks.

“At Microsoft, we are committed to building a trusted, easy-to-use platform that allows our customers and partners to build seamless, smart, secure solutions regardless of where they are in the IoT journey,” said Sam George, CVP of Azure IoT at Microsoft. “That’s why we are investing $5B in IoT and intelligent edge — technology that is accelerating ubiquitous computing and bringing unparalleled opportunity across industries.”

Delivering new IoT innovations from cloud to edge

Our core focus is addressing the challenge of securing connected devices at every layer while advancing IoT to create a seamless experience between the physical and digital worlds. In the past year, we launched more than 100 new services and features that make IoT solutions more secure and scalable, reduce complexity, and create opportunities in new market areas.

Making IoT seamless

IoT Central is a fully managed IoT app platform that provides solution builders with built-in security, scale and extensibility needed to develop enterprise-grade IoT solutions. New features to IoT Central simplify challenges of building and deploying scalable and affordable enterprise applications:

  • 11 new industry-focused application templates to accelerate solution builders across retail, healthcare, government and energy.
  • API support for extending IoT Central or integrating it with other solutions, including API support for device modelling, provisioning, lifecycle management, operations and data querying.
  • IoT Edge support, including management for edge devices and IoT Edge module deployments, which enable customers to deploy cloud workloads, including AI, directly to connected devices.
  • IoT Plug and Play support, for rapid device development and connectivity.
  • The ability to Save & Load applications to enable application reusability.
  • More Data Export options for continually exporting data to other Azure PaaS services, such as storage for rich analytics.
  • Multitenancy support for building and managing a single application with multiple tenants, each with their own isolated data, devices, users and roles. And updates to that single application are visible to all tenants for easy manageability.
  • Custom user roles for fine-grained access control to data, actions and configurations in the system.
  • New pricing model for early 2020, designed to help customers and partners have predictable pricing as usage scales.

Making IoT smarter

Azure IoT Hub helps enterprise developers reduce costs and optimize operations through IoT cloud applications. New capabilities with IoT Hub message enrichment add the ability to stamp messages coming from devices with rich information before they are sent to downstream cloud services, making integration easy. IoT Hub integrates with Azure Event Grid, making it easy to consume IoT Hub device messages from an even broader variety of downstream services.

Azure Maps customers can add geospatial weather intelligence into their applications to enable scenarios like weather-based routing, weather-based targeted marketing and weather-based operations optimization, in partnership with AccuWeather. Azure Maps will now be available on Gov Cloud, simplifying the onboarding process for customers.

Azure Time Series Insights is announcing new preview capabilities including:

  • Multilayered storage provides the best of both worlds: lightning fast access to frequently used data (“warm data”) and fast access to infrequently used historical data (“cold data”).
  • Flexible cold storage: Historical data is stored in a customer’s own Azure Storage account, giving customers complete control of their IoT data. Data is stored in open source Apache Parquet format, enabling predictive analytics, machine learning and other custom computations using familiar technologies including Spark, Databricks and Jupyter.
  • Rich analytics: Rich query APIs and user experience support interpolation, new scalar and aggregate functions, categorical variables, scatter plots, and time shifting between time series signals for in-depth analysis.
  • Enterprise-grade scale: Scale and performance improvements at all layers, including ingestion, storage, query and metadata/model.
  • Extensibility and integration: New Time Series Insights Power BI connector allows customers to take queries from Time Series Insights into Power BI to get a unified view in a single pane of glass.

Through our Express Logic acquisition, Azure RTOS continues to enable new intelligent capabilities. It unlocks access to billions of new connected endpoints and grows the number of devices that can seamlessly connect to Azure. Renesas is a top microcontroller unit (MCU) manufacturer that shares our vision of making IoT development as easy and seamless as possible, and we are excited to announce that Azure RTOS will be broadly available across Renesas’ products, including the Synergy and RA MCU families. It is already integrated into the Renesas Synergy Software Package and will be integrated out of box with the Renesas RA Flexible Software Package.

Making IoT more secure

We have added new features to Azure Security Center for IoT with the announcement of a Security Partner program and support for national clouds, and we are excited to announce the upcoming general availability of Azure Sphere in February 2020.

Enabling a future of intelligent and secure computing at the edge for organizations, enterprises and consumers will require advances in computer architecture all the way down to the chip level, with security built in from the beginning. Microsoft Azure Sphere is taking a holistic approach to securing the intelligent edge and IoT from the silicon to the cloud in a way that gives customers flexibility and control. For example, Qualcomm recently announced a partnership with Microsoft to develop mobile hardware for Microsoft’s Azure Sphere IoT operating system.

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

1 IDC, Worldwide Global DataSphere IoT Device and Data Forecast, 2019–2023, Doc # US45066919, May 2019

For more information, press only:

Microsoft Media Relations, WE Communications, (425) 638-7777, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

Schlumberger, Chevron and Microsoft announce collaboration to accelerate digital transformation

Global organizations will work together to accelerate development of cloud-native solutions and deliver actionable data insights for the industry

MONACO — September 17, 2019 — On Tuesday at the SIS Global Forum 2019, Schlumberger, Chevron and Microsoft announced the industry’s first three-party collaboration to accelerate creation of innovative petrotechnical and digital technologies.

Data is quickly emerging as one of the most valuable assets to any company, yet extracting insights from it is often difficult as information gets trapped in internal silos. As part of the collaboration, the three companies will work together to build Azure-native applications in the DELFI* cognitive E&P environment, initially for Chevron, which will enable companies to process, visualize, interpret and ultimately obtain meaningful insights from multiple data sources.

DELFI* is a secure, scalable and open cloud-based environment providing seamless E&P software technology across exploration, development, production and midstream. Chevron and Schlumberger will combine their expertise and resources to accelerate the deployment of DELFI solutions in Azure, with support and guidance from Microsoft. The parties will ensure the software developments meet the latest standards in terms of security, performance, release management, and are compatible with the Open Subsurface Data Universe (OSDU) Data Platform. Building on this open foundation will amplify the capabilities of Chevron’s petrotechnical experts.

The collaboration will be completed in three phases starting with the deployment of the Petrotechnical Suite in the DELFI environment, followed by the development of cloud-native applications on Azure, and the co-innovation of a suite of cognitive computing native capabilities across the E&P value chain tailored to Chevron’s objectives.

Olivier Le Peuch, chief executive officer, Schlumberger, said, “Combining the expertise of these three global enterprises creates vastly improved and digitally enabled petrotechnical workflows. Never before has our industry seen a collaboration of this kind, and of this scale. Working together will accelerate faster innovation with better results, marking the beginning of a new era in our industry that will enable us to elevate performance across our industry’s value chain.”

“There is an enormous opportunity to bring the latest cloud and AI technology to the energy sector and accelerate the industry’s digital transformation,” said Satya Nadella, CEO of Microsoft. “Our partnership with Schlumberger and Chevron delivers on this promise, applying the power of Azure to unlock new AI-driven insights that will help address some of the industry’s—the world’s—most important energy challenges, including sustainability.”

Joseph C. Geagea, executive vice president, technology, projects and services, Chevron, said, “We believe this industry-first advancement will dramatically accelerate the speed with which we can analyze data to generate new exploration opportunities and bring prospects to development more quickly and with more certainty. It will pull vast quantities of information into a single source amplifying our use of artificial intelligence and high-performance computing built on an open data ecosystem.”

About Schlumberger

Schlumberger is the world’s leading provider of technology for reservoir characterization, drilling, production, and processing to the oil and gas industry. With product sales and services in more than 120 countries and employing approximately 100,000 people who represent over 140 nationalities, Schlumberger supplies the industry’s most comprehensive range of products and services, from exploration through production, and integrated pore-to-pipeline solutions that optimize hydrocarbon recovery to deliver reservoir performance.

Schlumberger Limited has executive offices in Paris, Houston, London, and The Hague, and reported revenues of $32.82 billion in 2018. For more information, visit the company’s website.

About Chevron

Chevron Corporation is one of the world’s leading integrated energy companies. Through its subsidiaries that conduct business worldwide, the company is involved in virtually every facet of the energy industry. Chevron explores for, produces and transports crude oil and natural gas; refines, markets and distributes transportation fuels and lubricants; manufactures and sells petrochemicals and additives; generates power; and develops and deploys technologies that enhance business value in every aspect of the company’s operations. Chevron is based in San Ramon, Calif. More information about Chevron is available at www.chevron.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

###

*Mark of Schlumberger

For further information, contact:

Moira Duff
Corporate Communication Manager−Western Hemisphere
Schlumberger
Tel: +1 281 285 4376
[email protected]

Sean Comey
Sr. Advisor, External Affairs
Chevron
Tel: +1 925 842 5509
[email protected]

Microsoft Media Relations
WE Communications for Microsoft
(425) 638-7777
[email protected]

Social determinants of health data provide better care

Social determinants of health data can help healthcare organizations deliver better patient care, but the challenge of knowing exactly how to use the data persists.

The healthcare community has long recognized the importance of a patient’s social and economic data, said Josh Schoeller, senior vice president and general manager of LexisNexis Health Care at LexisNexis Risk Solutions. The current shift to value-based care models, which are ruled by quality rather than quantity of care, has put a spotlight on this kind of data, according to Schoeller.

But social determinants of health also pose a challenge to healthcare organizations. Figuring out how to use the data in meaningful ways can be daunting, as healthcare organizations are already overwhelmed by loads of data.

A new framework, released last month by the not-for-profit eHealth Initiative Foundation, could help. The framework was developed by stakeholders, including LexisNexis Health Care, to give healthcare organizations guidance on how to use social determinants of health data ethically and securely.

Here’s a closer look at the framework.

Use cases for social determinants of health data

The push to include social determinants of health data into the care process is “imperative,” according to eHealth Initiative’s framework. Doing so can uncover potential risk factors, as well as gaps in care.

The eHealth Initiative’s framework outlines five guiding principles for using social determinants of health data. 

  1. Coordinating care

Determine whether a patient has access to transportation or is food insecure, according to the document. The data can also help a healthcare organization coordinate with community health workers and other organizations to craft individualized care plans.

  2. Using analytics to uncover health and wellness risks

Use social determinants of health data to predict a patient’s future health outcomes. Analyzing social and economic data can help the provider know if an individual is at an increased risk of having a negative health outcome, such as hospital re-admittance. The risk score can be used to coordinate a plan of action.

  3. Mapping community resources and identifying gaps

Use social determinants of health data to determine what local community resources exist to serve the patient populations, as well as what resources are lacking.

  4. Assessing service and impact

Monitor care plans or other actions taken using social determinants of health data and how it correlates to health outcomes. Tracking results can help an organization adjust interventions, if necessary.

  5. Customizing health services and interventions

Inform patients about how social determinants of health data are being used. Healthcare organizations can educate patients on available resources and agree on next steps to take.
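The risk-scoring idea in the analytics principle above can be sketched as a simple logistic score over binary social-determinant flags. The weights, flag names and threshold below are hypothetical illustrations, not values from the eHealth Initiative framework; a real model would be fit to observed readmission outcomes.

```python
import math

# Hypothetical weights for illustration only; a real model would learn
# these from outcome data, not have them hand-assigned.
WEIGHTS = {"no_transport": 1.2, "food_insecure": 0.9, "lives_alone": 0.4}
BIAS = -2.0

def readmission_risk(flags):
    """Logistic risk score over binary social-determinant flags."""
    z = BIAS + sum(w * flags.get(name, 0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

low = readmission_risk({})                       # no risk flags present
high = readmission_risk({"no_transport": 1, "food_insecure": 1})
# The score rises with each flagged determinant, giving a discharge
# planner a single number to act on.
```

The point of such a score is the last of the framework's principles: it turns raw social and economic data into something a care coordinator can attach an action to.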

Getting started: A how-to for healthcare organizations

The eHealth Initiative is not alone in its attempt to move the social determinants of health data needle.

Niki Buchanan, general manager of population health at Philips Healthcare, has some advice of her own.

  1. Lean on the community health assessment

Buchanan said most healthcare organizations conduct a community health assessment internally, which provides data such as demographics and transportation needs, and identifies at-risk patients. Having that data available and knowing whether patients are willing or able to take advantage of community resources outside of the doctor’s office is critical, she said.

  2. Connect the community resource dots

Buchanan said a healthcare organization should be aware of what community resources are available to them, whether it’s a community driving service or a local church outreach program. The organization should also assess at what level it is willing to partner with outside resources to care for patients.

“Are you willing to partner with the Ubers of the world, the Lyfts of the world, to pick up patients proactively and make sure they make it to their appointment on time and get them home?” she said. “Are you able to work within the local chamber of commerce to make sure that any time there’s a food market or a fresh produce kind of event within the community, the patients you serve have access?”

  3. Start simple

Buchanan said healthcare organizations should approach social determinants of health data with the patient in mind. She recommended healthcare organizations start small with focused groups of patients, such as diabetics or those with other chronic conditions, but that they also ensure the investment is a worthwhile one.

“Look for things that meet not only your own internal ROI in caring for your patients, but that also add value and patient engagement opportunities to those you’re trying to serve in a more proactive way,” she said.

AT&T Spark highlights big changes in networking market

SAN FRANCISCO — AT&T is revamping more than its massive network to deliver high-speed, low-latency 5G services to businesses and consumers. The company is also remaking its relationship with networking vendors.

At the heart of the change is an open-source-first policy that has redefined the role of tech vendors that historically supplied the service provider with proprietary hardware and software. Now, AT&T is telling its suppliers, which include Ericsson, Nokia and Samsung, they have to become software developers and system integrators.

“There is a place for all of those hardware vendors in this new ecosystem to become integrators, to become hardeners,” Amy Wheelus, vice president of AT&T’s Network Cloud, said in an interview this week at AT&T Spark, the service provider’s 5G conference.

A hardener, which is a word Wheelus takes credit for, is a tech company that bolts features onto open source software, such as security and the management applications necessary to troubleshoot and fix network problems. Open source technology, such as OpenStack and Open Network Automation Platform (ONAP), is the foundation of AT&T’s Network Cloud, a cloud computing platform under construction to support future 5G applications.

Work like AT&T’s, which is also underway in other service providers’ data centers, is changing the relationship between carriers and networking suppliers, said Rajesh Ghai, an analyst at IDC.

“There’s a big architectural change that is happening in telcos today,” he said. “[And] there will be a huge systems integration and services component to it.”

The wish list from AT&T Spark

What AT&T and other service providers want matters to networking vendors. That’s because the suppliers’ largest customer base — enterprises — is gradually trading their on-premises software for applications running in the cloud. The trend is shrinking the enterprise market, while increasing the buying clout of cloud providers — such as Amazon, Google and Microsoft — and service providers preparing for massive 5G rollouts.

As a result, AT&T, Vodafone, Verizon and other large service providers “are going to be much more prescriptive in how we want [technology] to evolve,” Wheelus said.

For AT&T, that means commodity x86 hardware for running all Layer 4-7 network services, such as routing, load balancing and firewalls. AT&T is using ONAP to turn those services into virtualized network functions. Under the VNFs is the OpenStack cloud computing platform.

AT&T also wants makers of Layer 4-7 software to rethink the design of their products. For example, rather than selling load balancers for specific purposes, AT&T wants one product that it can configure for multiple tasks using open source orchestration tools such as Ansible with YAML-based configuration.

“We don’t need eight different load balancers from eight different companies,” Wheelus said. “I need one load balancer, and then I fine-tune it.”

5G hype vs. reality

While revamping its network for 5G, AT&T is rolling out 5G radios and other infrastructure in U.S. cities. By early next year, AT&T plans to have 19 cities wired for 5G, including Atlanta, Dallas, Los Angeles and San Francisco.

What’s missing, however, are the applications that will deliver services that take advantage of 5G’s unique capabilities. At Spark, AT&T and its partners showed marketing videos touting the potential for better home gaming and entertainment and innovation in medicine and manufacturing. But none of the services exist, and it’s not clear how they will become a reality.

“Is [there] a hype cycle? Yeah, maybe it’s a hype cycle,” Wheelus said at AT&T Spark.

Nevertheless, AT&T claimed it is making progress on future products at its 5G development centers in Atlanta, Palo Alto, Calif., and Plano, Texas. The 5G infrastructure rolling out in the 19 cities is where AT&T will eventually test the services under development.

“We’re just getting standards-based equipment [for the cities],” Wheelus said.

With so much experimentation underway, no one can predict whether the billions of dollars AT&T and the rest of the tech industry are spending on 5G will generate a healthy return on investment. One of the first indicators will be the success of mobile services built for 5G-enabled smartphones, which analysts predict will arrive in 2021.

The potentially more lucrative services for businesses — the kind shown today in slick videos at 5G conferences — will take longer to hit the market. Those products are unlikely to be widely available until 2025, analysts said.

JDA Partners with Microsoft to Power Data-Driven Digital Transformations in the Cloud

JDA to build cognitive SaaS solutions on Microsoft Azure to deliver an intelligent, Autonomous Supply Chain to customers


Scottsdale, Ariz.
August 01, 2018

JDA Software, Inc., today announced a strategic partnership with Microsoft to enable JDA to build cognitive SaaS solutions on the market-leading Microsoft Azure cloud platform. This will, in turn, accelerate JDA’s vision to deliver an Autonomous Supply Chain™ through an infusion of advanced, intelligent cloud platform capabilities. This partnership further advances JDA’s innovation initiatives along with its recently announced definitive agreement to acquire Blue Yonder, a market leader in artificial intelligence (AI) and machine learning (ML) solutions for retail and supply chain. These announcements support JDA’s strategy to develop more cognitive and connected solutions to power digital transformations and create competitive advantage for its customers.

“JDA’s supply chain solutions provide a faster response to demand signals from consumers, cognitive insights, and intelligent decisions based on edge sensors. Microsoft Azure will fuel our ongoing SaaS momentum as JDA applications deliver seamless customer experiences across cloud, on-premise, and edge solutions,” said Girish Rishi, chief executive officer, JDA. “Our strategic partnership with Microsoft accelerates JDA’s mission as the supply chain platform company, enabling our broad ecosystem of joint partners and developers to further leverage our AI/ML-based solutions.”

Scott Guthrie, executive vice president, Microsoft Cloud + AI Group, Microsoft said, “Microsoft Azure is driving new levels of organizational productivity and intelligent data-driven experiences, making it the ideal platform to bring JDA’s vision of an Autonomous Supply Chain to life. The powerful combination of JDA’s proven applications with Azure will empower customers to take advantage of real-time insights for smarter business decisions and profitable business growth.”

Victoria Brown, research manager, IDC said, “This partnership between established, trusted providers, uniting cloud services via Microsoft Azure, and supply chain via JDA addresses a gap in the supply chain ecosystem as cloud becomes a prerequisite for enterprises today as they embark on their digital supply chain transformations. Cloud-based supply chain deployments account for only about 40 percent of deployments today, and this new, trusted partnership could send that on an upward trajectory quite quickly.”

JDA’s solutions optimize the entire supply chain from end to end – from supplier to factory, transportation network to warehouse, store to consumer – through its market-leading solutions offerings. JDA is the only company named a leader by Gartner across all five Magic Quadrants that cover supply chain and retail merchandising solutions. Joining forces with Microsoft for go-to-market and the development of forthcoming JDA SaaS solutions on the Azure platform will deliver a number of immediate benefits to JDA’s more than 4,000 customers, including in the following key areas.

JDA to build cognitive, connected SaaS solutions on Azure

  • This partnership accelerates JDA’s SaaS solutions roadmap, including next-generation solutions built on JDA Luminate™, JDA’s cognitive, connected supply chain platform
  • JDA’s customers will be able to tap into Microsoft’s large global footprint and global alliances network, while leveraging Azure’s large compliance portfolio, embedded security, enterprise-grade service level agreements, and industry-leading support.

JDA and Microsoft go to market together to digitally transform supply chain and retail operations

  • The companies will join forces in the market to drive digital transformations across key verticals such as retail, manufacturing and logistics with their combined solution portfolios
  • JDA’s leading supply chain and retail solutions strongly complement Microsoft’s enterprise business application solutions and will now serve as the cornerstone of Microsoft’s supply chain practice offerings

JDA Luminate ControlTower™ is the first solution built on Azure

  • JDA’s SaaS roadmap includes a first-of-its-kind digital control tower — JDA Luminate ControlTower — a virtual decision center providing real-time, 24/7, end-to-end visibility into global supply chains. It will serve as the nerve center of customers’ operations, identifying bottlenecks and proposing resolutions before they occur
  • Using Azure as the development platform for JDA Luminate ControlTower will accelerate JDA’s ability to deliver this key component of the autonomous supply chain.

Additional Resources:

 

Tweet this: JDA Partners with @Microsoft to Power Data-Driven #DigitalTransformations  in the Cloud and Deliver an #AutonomousSupplyChain http://bit.ly/2AqMOQg

 

About JDA Software, Inc.

JDA Software is the leading supply chain software provider powering today’s digital transformations. We help companies optimize delivery to customers by enabling them to predict and shape demand, fulfill faster and more intelligently, and improve customer experiences and loyalty. More than 4,000 global customers use our unmatched end-to-end software and SaaS solutions to unify and shorten their supply chains, increase speed of execution, and profitably deliver to their customers. Our world-class client roster includes 75 of the top 100 retailers, 77 of the top 100 consumer goods companies, and 8 of the top 10 global 3PLs. Running JDA, you can plan to deliver. www.jda.com

 

Social Networks:

Web: https://jda.com

Blog: https://blog.jda.com

Facebook: https://www.facebook.com/JDASoftwareGroup

Instagram: https://www.instagram.com/jdasoftware/ 

LinkedIn: https://www.linkedin.com/company/jda-software

Twitter: https://twitter.com/JDASoftware

YouTube: https://www.youtube.com/user/JDASoftware

 

“JDA” is a trademark or registered trademark of JDA Software Group, Inc. Any trade, product or service name referenced in this document using the name “JDA” is a trademark and/or property of JDA Software Group, Inc.

 

JDA Software, Inc.
15059 N Scottsdale Rd, Ste 400
Scottsdale, AZ 85254

###

SAP and Accenture collaborate on entitlement management platform

SAP and Accenture are teaming to deliver an intelligent entitlement management application intended to help companies build and deploy new business models.

Entitlement management applications help companies grant, enforce and administer customer access entitlements (usually referred to as authorizations, privileges, access rights or permissions) to data, devices and services — including embedded software applications — from a single platform.

The new SAP Entitlement Management allows organizations to dynamically change individual customer access rights and install renewal automation capabilities in applications, according to SAP. This means they can create new offerings that use flexible pricing structures.

The new platform’s entitlement management and embedded analytics integrate with SAP S/4HANA’s commerce and order management functions, which according to SAP, can help organizations create new revenue streams and get new products and services to market faster.

Accenture will provide consulting, system development and integration, application implementation, and analytics capabilities to the initiative.

“As high-tech companies rapidly transition from stand-alone products to highly connected platforms, they are under mounting pressure to create and scale new intelligent and digital business models,” said David Sovie, senior managing director of Accenture’s high-tech practice, in a press release. “The solution Accenture is developing with SAP will help enable our clients to pivot to as-a-service business models that are more flexible and can be easily customized.”

SAP and Accenture go on the defense

SAP and Accenture also unveiled a new platform that provides digital transformation technology and services for defense and security organizations.

The digital defense platform is based on S/4HANA, contains advanced analytics capabilities, and expands military personnel’s use of digital applications. It includes simulations and analytics applications intended to help defense and security organizations plan and run operations efficiently and respond quickly to changing operating environments, according to SAP and Accenture.

“This solution gives defense agencies the capabilities to operate in challenging and fast-changing geo-political environments that require an intelligent platform with deployment agility, increased situational awareness and industry-specific capabilities,” said Antti Kolehmainen, Accenture’s managing director of defense business, in a press release.

The platform provides data-driven insights intended to help leaders make better decisions, and it enables cross-enterprise data integration in areas like personnel, military supply chain, equipment maintenance, finances and real estate.

IoT integration will enable defense agencies to connect devices that can collect and exchange data. The digital defense platform technology is available to be deployed on premises or in the cloud, according to the companies.

“The next-generation defense solution will take advantage of the technology capabilities of SAP S/4HANA and Accenture’s deep defense industry knowledge to help defense agencies build and deploy solutions more easily and cost-effectively and at the same time enable the digital transformation in defense,” said Isabella Groegor-Cechowicz, SAP’s global general manager of public services, in a press release.

New application and customer experience tool for SAP environments

AppDynamics (a Cisco company) has unveiled a new application and customer experience monitoring software product for SAP environments.

AppDynamics for SAP provides visibility into SAP applications and customer experiences via code-level insights into customer taps, swipes and clicks, according to AppDynamics. This helps companies understand the performance of SAP applications and databases, as well as the code impact on customers and business applications.

To satisfy customer expectations, [the modern enterprise] needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.
– Thomas Wyatt, chief strategy officer, AppDynamics

“The modern enterprise is in a challenging position,” said Thomas Wyatt, AppDynamics’ chief strategy officer, in a press release. “To satisfy customer expectations, it needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.”

AppDynamics for SAP allows companies to collaborate around business transactions, using a unit of measurement that automatically reveals customers’ interactions with applications. They can then identify and map transactions flowing between each customer-facing application and systems of record — SAP ERP or CRM systems that include complex integration layers, such as SAP Process Integration and SAP Process Orchestration.

AppDynamics for SAP includes ABAP code-level diagnostics and native ABAP agent monitoring that provides insights into SAP environments with code and database performance monitoring, dynamic baselines, and transaction snapshots when performance deviates from the norm. It also includes intelligent alerting to IT based on health rules and baselines that are automatically set for key performance metrics on every business transaction. Intelligent alerting policies integrate with existing enterprise workflow tools, including ServiceNow, PagerDuty and JIRA.

This means that companies can understand dependencies across the entire digital business and baseline, identify, and isolate the root causes of problems before they affect customers. AppDynamics for SAP also helps companies to plan SAP application migrations to the cloud and monitor user experiences post-migration, according to AppDynamics.

27″ IPS Monitor – Yamakasi Catleap 2703

Yamakasi Catleap 2703 Monitor for sale

Looking for £100, can deliver locally or meet centrally, can be seen working if collected.

It requires your graphics card to have a dual-link DVI output. That’s the monitor’s only input, and the graphics card needs to be able to drive it.
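For context on the dual-link requirement: single-link DVI tops out at a 165 MHz pixel clock, while driving 2560 x 1440 at 60 Hz needs more than that even before blanking intervals are counted; hence dual-link DVI. A quick back-of-the-envelope check:

```python
# Why a 2560x1440 / 60 Hz panel exceeds single-link DVI.
# Active pixels per second (blanking intervals would only add more):
pixel_clock_hz = 2560 * 1440 * 60
single_link_limit_hz = 165_000_000  # DVI 1.0 single-link pixel clock ceiling

print(pixel_clock_hz)                         # 221184000
print(pixel_clock_hz > single_link_limit_hz)  # True
```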

Great monitor; the reason for the upgrade is that I’m moving to a 2560 x 1440, 144 Hz Crossover Fast.

No dead pixels that I can see; it’s perfectly fine. It’s an LG IPS panel too, the same one used in iMacs.

The monitor itself has a glass panel on the front, very sturdy…


BlackBerry and Microsoft partner to empower the mobile workforce

Companies deliver seamless Mobile App experience and policy compliance; BlackBerry Secure platform now available on Azure

WATERLOO, ONTARIO and REDMOND, Wash. – March 19, 2018 – BlackBerry Limited (NYSE: BB; TSX: BB) and Microsoft Corp. (NASDAQ: MSFT) today announced a strategic partnership to offer enterprises a solution that integrates BlackBerry’s expertise in mobility and security with Microsoft’s unmatched cloud and productivity products.

Through this partnership, the companies have collaborated on a first-of-its-kind solution: BlackBerry Enterprise BRIDGE. This technology provides a highly secure way for their joint customers – the world’s largest banks, healthcare providers, law firms, and central governments – to seamlessly use native Microsoft mobile apps from within BlackBerry Dynamics.

By making Microsoft’s mobile apps seamlessly available from within BlackBerry Dynamics, enterprise users will now have a consistent experience when opening, editing, and saving a Microsoft Office 365 file such as Excel, PowerPoint, and Word on any iOS® or Android™ device. This enables users to work anytime, anyplace, with rich file fidelity. At the same time, corporate IT departments benefit from a greater return on their existing investments, and added assurance that their company’s data and privacy is secured to the highest standards and in compliance with corporate and regulatory policies.

“BlackBerry has always led the market with new and innovative ways to protect corporate data on mobile devices,” said Carl Wiese, president of Global Sales at BlackBerry. “We saw a need for a hyper-secure way for our joint customers to use native Office 365 mobile apps. BlackBerry Enterprise BRIDGE addresses this need and is a great example of how BlackBerry and Microsoft continue to securely enable workforces to be highly productive in today’s connected world.”

“In an era when digital technology is driving rapid transformation, customers are looking for a trusted partner,” said Judson Althoff, executive vice president of Worldwide Commercial Business at Microsoft. “Our customers choose Microsoft 365 for productivity and collaboration tools that deliver continuous innovation, and do so securely. Together with BlackBerry, we will take this to the next level and provide enterprises with a new standard for secure productivity.”

“Along with a number of our peers in the Financial Services industry, we see strategic partnerships like this one as key to enhancing and bringing new products to market,” said George Sherman, Managing Director, CIO Global Technology Infrastructure, JPMorgan Chase. “This partnership will help create a more seamless mobile experience for end-users, which is a top priority for us at JPMorgan Chase.”

Lastly, the companies shared that the BlackBerry Secure platform for connecting people, devices, processes and systems, has been integrated with the Microsoft Azure cloud platform. Specifically, BlackBerry UEM Cloud, BlackBerry Workspaces, BlackBerry Dynamics, and BlackBerry AtHoc are now available on Azure.

To learn more, please visit BlackBerry.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) is the leading platform and productivity company for the mobile-first, cloud-first world, and its mission is to empower every person and every organization on the planet to achieve more.

About BlackBerry

BlackBerry is a cybersecurity software and services company dedicated to securing the Enterprise of Things. Based in Waterloo, Ontario, the company was founded in 1984 and operates in North America, Europe, Asia, Australia, the Middle East, Latin America and Africa. The Company trades under the ticker symbol “BB” on the Toronto Stock Exchange and New York Stock Exchange. For more information, visit www.BlackBerry.com.

BlackBerry and related trademarks, names and logos are the property of BlackBerry Limited and are registered and/or used in the U.S. and countries around the world. All other marks are the property of their respective owners. BlackBerry is not responsible for any third-party products or services.

###

Media Contacts:

BlackBerry

(519) 597-7273

[email protected]

Microsoft Media Relations

WE Communications for Microsoft

(425) 638-7777

[email protected]

Investor Contact:

BlackBerry Investor Relations

(519) 888-7465

[email protected]

 
