Tag Archives: Intelligence

Aviso introduces version 2.0 of AI-guided sales platform

Aviso announced version 2.0 of its artificial intelligence guided sales platform last week. The new version is aimed at lowering costs and reducing the time that sales reps spend working on CRM databases by providing them with AI tools that predict deal close probabilities and guide next best actions.
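
Predicting whether a deal will close is typically framed as a supervised classification problem over historical CRM records, with the model returning a probability for each open opportunity along with the factors that most influenced it. The sketch below illustrates that general technique only; it is not Aviso's actual model, and the feature names and training data are hypothetical.

```python
# Illustrative sketch of a generic deal close-probability model with "top factors".
# Not Aviso's algorithm; features and data below are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical historical deals: [deal_size, days_in_stage, rep_win_rate, emails_last_week]
X = rng.random((500, 4))
y = (0.6 * X[:, 2] + 0.3 * X[:, 3] - 0.2 * X[:, 1] + 0.1 * rng.random(500) > 0.45).astype(int)

model = GradientBoostingClassifier().fit(X, y)

open_deal = np.array([[0.8, 0.3, 0.7, 0.9]])              # one open opportunity
close_probability = model.predict_proba(open_deal)[0, 1]

feature_names = ["deal_size", "days_in_stage", "rep_win_rate", "emails_last_week"]
top_factors = sorted(zip(feature_names, model.feature_importances_),
                     key=lambda kv: kv[1], reverse=True)

print(f"Predicted close probability: {close_probability:.2f}")
print("Top factors:", top_factors[:2])
```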

Algorithmic-guided selling, which uses AI technology and existing sales data to guide sellers through deals, is a new but increasingly popular approach. About half of sales organizations (51%) have already deployed or plan to deploy algorithmic-guided selling in the next five years, according to a 2019 Gartner survey.

Aviso’s 2.0 sales platform uses AI tools to prioritize sales opportunities and analyze data from sources including CRM systems, emails, user calendars, chat transcripts and support and success tools to deliver real-time insights and suggest next best action for sales teams. The support and success tools are external offerings that Aviso’s platform can connect with, including customer support tools like Zendesk or Salesforce Service Cloud, and customer success tools like Gainsight or Totango, according to Amit Pande, vice president of marketing at Aviso.

The forecasting and sales guidance vendor claims the new version will help sales teams close 20% more deals and reduce spending on non-core CRM licenses by 30% compared with conventional CRM systems. The cost reduction calculation is based on “the number of non-core licenses that can be eliminated, as well as additional costs such as storage and add-ons that can be eliminated when underutilized or unused licenses are eliminated,” Pande said.

According to Aviso, new AI-based features in version 2.0 of its sales platform include:

  • Deal Execution Tools, a trio of tools meant to assist in finalizing deals. Bookings Timeline uses machine learning to calculate when deals will close based on an organization’s unique history. Each booking timeline also includes the top factors that influence the prediction. Opportunity Acceleration helps sales teams determine which opportunities carry the highest probability of closing early if they are pulled into the current quarter. Informed Editing is intended to limit typos and unsaved changes during entry of data. The tool gives users contextual help before they commit to edits, telling them what quarter or whose forecast they are updating. Changes and edits are automatically saved by the software.
  • Deal and Forecast Rooms enable users to do what-if analysis, use scenario modeling and automatically summarize forecast calls and deal review transcripts.
  • Coaching Rooms help sales managers improve sales rep performance with data from past and current deals and from team activity in Deal and Forecast Rooms. 
  • Nudges provide reminders for sales reps through an app on mobile devices. Nudges also offer recommendations for course corrections and potential next steps based on insights from the specific deal.

Aviso’s 2.0 sales platform is currently in beta with select customers.

Cybersecurity company FireEye has been using the Aviso platform for several years and is among those select customers. Andy Pan, director of Americas and public sector sales operations at FireEye, said the Aviso platform has helped FireEye operate in a more predictive manner through some of its new AI-driven features. “The predictive features help us review the macro business as a whole, and the deal-specific features provide guided pathways toward the inspection of deals.”

Other vendors offering sales forecasting tools include Salesforce and Clari. Salesforce’s sales forecasting feature enables organizations to tailor forecasts to their needs and lets managers track their teams’ performance. Clari’s product includes features such as predictive forecasting, which uses AI-based projections to estimate where the team will land at the end of the quarter, and history tracking to see who last changed the forecast.


The Week It Snowed Everywhere – New Zealand News Centre

NIWA and Microsoft Corp. are teaming up to make artificial intelligence handwriting recognition more accurate and efficient in a project that will support climate research.

The project aims to develop better training sets for handwriting recognition technology that will “read” old weather logs. The first step is to use weather information recorded during a week in July 1939 when it snowed all over New Zealand, including at Cape Reinga.

NIWA climate scientist Dr. Andrew Lorrey says the project has the potential to revolutionise how historic data can be used. Microsoft has awarded NIWA an AI for Earth grant for the artificial intelligence project, which will support advances in automating handwriting recognition. AI for Earth is a global programme that supports innovators using AI to support environmental initiatives related to water, climate change, sustainable agriculture and biodiversity.

Microsoft’s Chief Environmental Officer, Lucas Joppa, sees a project that could quite literally be world-changing. “This project will bring inanimate weather data to life in a way everyone can understand, something that’s more vital than ever in an age of such climate uncertainty.

“I believe technology has a huge role to play in shining a light on these types of issues, and grantees such as NIWA are providing the solutions that we get really excited about.”


Dr. Lorrey has been studying the weather in the last week of July 1939 when snow lay 5 cm deep on top of Auckland’s Mt. Eden, the hills of Northland turned white and snow flurries were seen at Cape Reinga. “Was 1939 the last gasp of conditions that were more common during the Little Ice Age, which ended in the 1800s? Or the first glimpse of the extremes of climate change thanks to the Industrial Revolution?”

Weather records at that time were meticulously kept in logbooks with entries made several times a day, recording information such as temperature, barometric pressure and wind direction. Comments often included cloud cover, snow drifts or rainfall.

“These logs are like time machines, and we’re now using their legacy to help ours,” Dr. Lorrey says.

“We’ve had snow in Northland in the recent past, but having more detail from further back in history helps us characterise these extreme weather events better within the long-term trends. Are they a one-in-80-year event, do they just occur at random, can we expect to see these happening with more frequency, and why, in a warming climate, did we get snow in Northland?”


Until now, however, computers haven’t caught up with humans when it comes to deciphering handwriting. More than a million photographed weather observations from old logbooks are currently being painstakingly entered by an army of volunteer “citizen scientists” and loaded by hand into the Southern Weather Discovery website. This is part of the global Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, which aims to produce better daily global weather animations and place historic weather events into a longer-term context.

“Automated handwriting recognition is not a solved problem,” says Dr. Lorrey. “The algorithms used to determine what a symbol is — is that a 7 or a 1? — need to be accurate, and of course for that there needs to be sufficient training data of a high standard.” The data captured through the AI for Earth grant will make the process of making deeper and more diverse training sets for AI handwriting recognition faster and easier.
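
As a rough illustration of why training data matters here, the toy classifier below learns handwritten digits from scikit-learn's bundled dataset and counts how often it still mistakes one symbol for another. It is a sketch of the general technique, not the model NIWA and Microsoft are building.

```python
# Toy digit classifier: accuracy depends directly on the labeled examples it sees.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

digits = load_digits()                          # 8x8 images of handwritten digits 0-9
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=42)

clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)
predictions = clf.predict(X_test)
print("Accuracy:", accuracy_score(y_test, predictions))

# Where does it still confuse one symbol for another (say, a 7 for a 1)?
wrong = np.flatnonzero(predictions != y_test)
print("Misclassified examples:", len(wrong), "of", len(y_test))
```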

“Old data is the new data,” says Patrick Quesnel, Senior Cloud and AI Business Group Lead at Microsoft New Zealand. “That’s what excites me about this. We’re finding better ways to preserve and digitise old data reaching back centuries, which in turn can help us with the future. This data is basically forgotten unless you can find a way to scan, store, sort and search it, which is exactly what Azure cloud technology enables us to do.”

Dr. Lorrey says the timing of the project is especially significant.
“This year is the 80th anniversary of The Week It Snowed Everywhere, so it’s especially fitting we’re doing this now. We’re hoping to have all the New Zealand climate data scanned by the end of the year, and quality control completed with usable data by the end of the next quarter.”

Ends.
About NIWA
The National Institute of Water and Atmospheric Research (NIWA) is New Zealand’s leading provider of climate, freshwater and ocean science. It delivers the science that supports economic growth, enhances human well-being and safety and enables good stewardship of our natural environment.
About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information contact:

Dr. Andrew Lorrey
NIWA climate scientist
Ph 09 375-2055
Mob 021 313-404
Andrea Jutson
On behalf of Microsoft New Zealand
Ph: (09) 354 0562 or 021 0843 0782
[email protected]


Tableau analytics platform gets AI, data management upgrades

LAS VEGAS — Improved augmented intelligence and data preparation capabilities are at the core of new updates to the Tableau analytics platform.

The Seattle-based vendor unveiled the enhancements on Wednesday at its annual user conference here.

Tableau revealed an upgrade to Ask Data, the Tableau analytics platform’s natural language processing tool that was introduced in February. Ask Data is now able to interpret more complex questions than it could before, and it can now do year-over-year geospatial comparisons, according to the vendor.

Tableau evolving its platform

In addition, with the recent introduction of Tableau Catalog and an upgrade to Prep Conductor, users of the Tableau analytics platform will be able to prepare data within Tableau itself, rather than cleansing it in another product before importing it into Tableau.

Finally, Tableau added Metrics, a mobile-first tool that will enable users to monitor key performance indicators, to the Tableau analytics platform.

The product moves unveiled at the conference show that Tableau is continuing to evolve its popular analytics and business intelligence platform by adding features to help end users do more with self-service, said Doug Henschen, principal analyst at Constellation Research.

“With great self-service freedom comes great responsibility to do a better job of data governance,” Henschen said. “Tableau Catalog gives Tableau customers a data access and data lineage tracking aid to help users spot the data they should be using to help avoid recreating data sets and analyses that already exist or that could easily be extended to cover new use cases.”


The host of upgrades comes a day after Tableau revealed Modern Cloud Analytics, an enhanced partnership agreement with Amazon Web Services designed to help Tableau’s many on-premises users migrate to the cloud.

A user looks at the new features

Meanwhile, one of the self-service customers Henschen alluded to is the University of Michigan, which has nearly 100,000 potential users with 50,000 employees and 48,000 students.

While it hasn’t yet taken advantage of the burgeoning data management capabilities of the Tableau analytics platform, the school is interested in Tableau’s natural language processing capabilities.

But with nearly 100,000 potential users — from hospital staff to the history department — nothing is as simple as choosing to use one BI tool within an overall system and eschewing another.

“Our data is relatively simple enough that we don’t need to constantly pivot or join a number of different things from a number of different places together,” Matthew Pickus, senior business intelligence analyst, said of Michigan’s decision to not yet employ tools like Tableau Catalog and Prep Conductor. “We try and keep the system as enterprise as possible.”

Christopher Gardner, business intelligence senior analyst at the University of Michigan, added that the potential cost of using the data preparation tools, given the number of users across the university, is a constraint.

That said, because data across the university’s myriad departments is often kept by each department according to that department’s own method — creating data silos — data standardization is something that could be on the horizon at Michigan, the school’s BI analysts said.

“It’s starting to get talked about a little more, so it may be something we start investigating,” Gardner said.

Bringing analytics to end users

“Some of the data management tools will become much more needed in the future,” Pickus added. “We’re just trying to figure out the best way to approach it. It’s going to become more important. Tableau reaching down not just how to visualize your data but how to help you manage and organize your data across all the sets is going to be very helpful in the future.”

NLP, meanwhile, is something Michigan’s IT leaders see as a way to make analytics more accessible to its employees and students.

A GIF shows Tableau’s Ask Data, its natural language processing tool, in action.

But Gardner and Pickus said they want more from NLP tools than the tools are currently capable of providing, whether as part of the Tableau analytics platform or any other BI vendor’s suite.

“Our executives are very interested in it,” said Gardner. “They’re looking for ways to make data more accessible to users who aren’t familiar with reporting tools. To us it’s kind of frustrating, because we’ve got the reporting tools. Let’s take it a step further, and instead of just reporting let’s start doing analysis and start getting trends.”

Perhaps that’s next for the Tableau analytics platform.


SwiftStack 7 storage upgrade targets AI, machine learning use cases

SwiftStack turned its focus to artificial intelligence, machine learning and big data analytics with a major update to its object- and file-based storage and data management software.

The San Francisco software vendor’s roots lie in the storage, backup and archive of massive amounts of unstructured data on commodity servers running a commercially supported version of OpenStack Swift. But SwiftStack has steadily expanded its reach over the last eight years, and its 7.0 update takes aim at the new scale-out storage and data management architecture the company claims is necessary for AI, machine learning and analytics workloads.

SwiftStack said it worked with customers to design clusters that scale linearly to handle multiple petabytes of data and support throughput of more than 100 GB per second. That allows it to handle workloads such as autonomous vehicle applications that feed data into GPU-based servers.
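
To put that figure in perspective, a quick back-of-the-envelope calculation shows what 100 GB per second means for a multi-petabyte corpus. The dataset size below is an assumed example, not a SwiftStack benchmark.

```python
# Rough arithmetic only: time for one full pass over an assumed 2 PB corpus
# at the claimed 100 GB/s of cluster throughput.
dataset_pb = 2                       # hypothetical corpus size in petabytes
throughput_gb_s = 100                # claimed cluster throughput

dataset_gb = dataset_pb * 1_000_000  # 1 PB = 1,000,000 GB (decimal units)
seconds = dataset_gb / throughput_gb_s
print(f"Full pass over {dataset_pb} PB at {throughput_gb_s} GB/s takes about {seconds / 3600:.1f} hours")
# Roughly 5.6 hours for 2 PB; at 10 GB/s the same pass would take more than two days.
```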

Marc Staimer, president of Dragon Slayer Consulting, said throughput of 100 GB per second is “really fast” for any type of storage and “incredible” for an object-based system. He said the fastest NVMe system tests at 120 GB per second, but it can scale only to about a petabyte.

“It’s not big enough, and NVMe flash is extremely costly. That doesn’t fit the AI [or machine learning] market,” Staimer said.

This is the second object storage product launched this week with speed not normally associated with object storage. NetApp unveiled an all-flash StorageGrid array Tuesday at its Insight user conference.

Staimer said SwiftStack’s high-throughput “parallel object system” would put the company into competition with parallel file system offerings such as DataDirect Networks, IBM Spectrum Scale and Panasas, but at a much lower cost.

New ProxyFS Edge

SwiftStack plans to introduce a new ProxyFS Edge containerized software component next year to give remote applications a local file system mount for data, rather than having to connect through a network file serving protocol such as NFS or SMB. SwiftStack spent about 18 months creating a new API and software stack to extend its ProxyFS to the edge.

Founder and chief product officer Joe Arnold said SwiftStack wanted to utilize the scale-out nature of its storage back end and enable a high number of concurrent connections to go in and out of the system to send data. ProxyFS Edge will allow each cluster node to be relatively stateless and cache data at the edge to minimize latency and improve performance.

SwiftStack 7 will also add 1space File Connector software in November to enable customers that build applications using the S3 or OpenStack Swift object API to access data in their existing file systems. The new File Connector is an extension to the 1space technology that SwiftStack introduced in 2018 to ease data access, migration and searches across public and private clouds. Customers will be able to apply 1space policies to file data to move and protect it.
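
For applications already written against the S3 API, reading data exposed through an S3-compatible endpoint typically looks like the sketch below. This is generic boto3 usage, not SwiftStack-specific configuration, and the endpoint URL, bucket and object names are hypothetical placeholders.

```python
# Generic S3-API access to an S3-compatible object store (illustrative only).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# List objects under a prefix, e.g. file data surfaced through a connector
resp = s3.list_objects_v2(Bucket="media-archive", Prefix="projects/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Download one object to local disk
s3.download_file("media-archive", "projects/clip001.mov", "/tmp/clip001.mov")
```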

Arnold said the 1space File Connector could be especially helpful for media companies and customers building software-as-a-service applications that are transitioning from NAS systems to object-based storage.

“Most sources of data produce files today and the ability to store files in object storage, with its greater scalability and cost value, makes the [product] more valuable,” said Randy Kerns, a senior strategist and analyst at Evaluator Group.

Kerns added that SwiftStack’s focus on the developing AI area is a good move. “They have been associated with OpenStack, and that is not perceived to be a positive and colors its use in larger enterprise markets,” he said.

AI architecture

A new SwiftStack AI architecture white paper offers guidance to customers building out systems that use popular AI, machine learning and deep learning frameworks, GPU servers, 100 Gigabit Ethernet networking, and SwiftStack storage software.

“They’ve had a fair amount of success partnering with Nvidia on a lot of the machine learning projects, and their software has always been pretty good at performance — almost like a best-kept secret — especially at scale, with parallel I/O,” said George Crump, president and founder of Storage Switzerland. “The ability to ratchet performance up another level and get the 100 GBs of bandwidth at scale fits perfectly into the machine learning model where you’ve got a lot of nodes and you’re trying to drive a lot of data to the GPUs.”

SwiftStack noted distinct differences between the architectural approaches that customers take with archive use cases versus newer AI or machine learning workloads. An archive customer might use 4U or 5U servers, each equipped with 60 to 90 drives, and 10 Gigabit Ethernet networking. By contrast, one machine learning customer clustered a larger number of lower-horsepower 1U servers, each with fewer drives and a 100 Gigabit Ethernet network interface card, for high bandwidth.

An optional new SwiftStack Professional Remote Operations (PRO) paid service is now available to help customers monitor and manage SwiftStack production clusters. SwiftStack PRO combines software and professional services.


Microsoft + The Jackson Laboratory: Using AI to fight cancer


Biomedical researchers are embracing artificial intelligence to accelerate the implementation of cancer treatments that target patients’ specific genomic profiles, a type of precision medicine that in some cases is more effective than traditional chemotherapy and has fewer side effects.

The potential for this new era of cancer treatment stems from advances in genome sequencing technology that enables researchers to more efficiently discover the specific genomic mutations that drive cancer, and an explosion of research on the development of new drugs that target those mutations.

To harness this potential, researchers at The Jackson Laboratory, an independent, nonprofit biomedical research institution also known as JAX and headquartered in Bar Harbor, Maine, developed a tool to help the global medical and scientific communities stay on top of the continuously growing volume of data generated by advances in genomic research.

The tool, called the Clinical Knowledgebase, or CKB, is a searchable database where subject matter experts store, sort and interpret complex genomic data to improve patient outcomes and share information about clinical trials and treatment options.

The challenge is to find the most relevant cancer-related information from the 4,000 or so biomedical research papers published each day, according to Susan Mockus, the associate director of clinical genomic market development with JAX’s genomic medicine institute in Farmington, Connecticut.

“Because there is so much data and so many complexities, without embracing and incorporating artificial intelligence and machine learning to help in the interpretation of the data, progress will be slow,” she said.

That’s why Mockus and her colleagues at JAX are collaborating with computer scientists working on Microsoft’s Project Hanover who are developing AI technology that enables machines to read complex medical and research documents and highlight the important information they contain.

While this machine reading technology is in the early stages of development, researchers have found they can make progress by narrowing the focus to specific areas such as clinical oncology, explained Peter Lee, corporate vice president of Microsoft Healthcare in Redmond, Washington.

“For something that really matters like cancer treatment where there are thousands of new research papers being published every day, we actually have a shot at having the machine read them all and help a board of cancer specialists answer questions about the latest research,” he said.

Peter Lee, corporate vice president of Microsoft Healthcare. Photo by Dan DeLong.

Curating CKB

Mockus and her colleagues are using Microsoft’s machine reading technology to curate CKB, which stores structured information about genomic mutations that drive cancer, drugs that target cancer genes and the response of patients to those drugs.

One application of this knowledgebase allows oncologists to discover what, if any, matches exist between a patient’s known cancer-related genomic mutations and drugs that target them as they explore and weigh options for treatment, including enrollment in clinical trials for drugs in development.
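
Conceptually, that matching step is a join between a patient's variant list and a curated drug-gene table. The sketch below illustrates the idea with a tiny, invented table; it does not reflect CKB's schema or contents.

```python
# Illustrative matching of a patient's mutations against a small, hypothetical
# drug-gene knowledge base (not CKB data).
ckb_like = [
    {"gene": "EGFR", "variant": "L858R", "drug": "osimertinib", "evidence": "approved"},
    {"gene": "BRAF", "variant": "V600E", "drug": "vemurafenib", "evidence": "approved"},
    {"gene": "KRAS", "variant": "G12C", "drug": "sotorasib", "evidence": "clinical trial"},
]

patient_mutations = {("EGFR", "L858R"), ("TP53", "R175H")}   # hypothetical tumor profile

matches = [row for row in ckb_like
           if (row["gene"], row["variant"]) in patient_mutations]

for m in matches:
    print(f'{m["gene"]} {m["variant"]} -> {m["drug"]} ({m["evidence"]})')
# -> EGFR L858R -> osimertinib (approved)
```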

This information is also useful to translational and clinical researchers, Mockus noted.

The bottleneck is filtering through the more than 4,000 papers published every day in biomedical journals to find the subset of about 200 related to cancer, reading them and updating CKB with the relevant information on the mutation, drug and patient response.

“What you want is some degree of intelligence incorporated into the system that can go out and not just be efficient, but also be effective and relevant in terms of how it can filter information. That is what Hanover has done,” said Auro Nair, executive vice president of JAX.

The core of Microsoft’s Project Hanover is the capability to comb through the thousands of documents published each day in the biomedical literature and flag and rank all that are potentially relevant to cancer researchers, highlighting, for example, information on gene, mutation, drug and patient response.

Human curators working on CKB are then free to focus on the flagged research papers, validating the accuracy of the highlighted information.

“Our goal is to make the human curators superpowered,” said Hoifung Poon, director of precision health natural language processing with Microsoft’s research organization in Redmond and the lead researcher on Project Hanover.

“With the machine reader, we are able to suggest that this might be a case where a paper is talking about a drug-gene mutation relation that you care about,” Poon explained. “The curator can look at this in context and, in a couple of minutes, say, ‘This is exactly what I want,’ or ‘This is incorrect.’”

Hoifung Poon, director of precision health natural language processing with Microsoft’s research organization, is leading the development of Project Hanover, a machine reading technology. Photo by Jonathan Banks.

Self-supervision

To be successful, Poon and his team need to train machine learning models in such a way that they catch all the potentially relevant information – ensure there are no gaps in content – and, at the same time, weed out irrelevant information sufficiently to make the curation process more efficient.

In traditional machine reading tasks such as finding information about celebrities in news stories, researchers tend to focus on relationships contained within a single sentence, such as a celebrity name and a new movie.

Since this type of information is widespread across news stories, researchers can skip instances that are more challenging such as when the name of the celebrity and movie are mentioned in separate paragraphs, or when the relationship involves more than two pieces of information.

“In biomedicine, you can’t do that because your latest finding may only appear in this single paper and if you skip it, it could be life or death for this patient,” explained Poon. “In this case, you have to tackle some of the hard linguistic challenges head on.”

Poon and his team are taking what they call a self-supervision approach to machine learning in which the model automatically annotates training examples from unlabeled text by leveraging prior knowledge in existing databases and ontologies.

For example, a National Cancer Institute initiative manually compiled information from the biomedical literature on how genes regulate each other but was unable to sustain the effort beyond two years. Poon’s team used the compiled knowledge to automatically label documents and train a machine reader to find new instances of gene regulation.
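
A minimal sketch of that self-supervision (distant supervision) idea follows: sentences are automatically labeled as positive training examples when they mention both genes of a pair the knowledge base already lists as a regulation. The gene pairs and sentences are invented for illustration, and the naive co-occurrence rule also shows why curators still validate what the machine reader flags.

```python
# Distant-supervision sketch: auto-label sentences using a known-relations table.
# Knowledge base and sentences below are hypothetical.
known_regulations = {("TP53", "MDM2"), ("MYC", "CDK4")}   # gene A regulates gene B

sentences = [
    "We show that TP53 directly induces MDM2 transcription in these cell lines.",
    "MYC was detected alongside CDK4 in the same tissue sample.",
    "BRCA1 expression was unchanged across all conditions.",
]

def auto_label(sentence, kb):
    """Label a sentence positive if it mentions both genes of a known pair."""
    for gene_a, gene_b in kb:
        if gene_a in sentence and gene_b in sentence:
            return (sentence, gene_a, gene_b, 1)
    return (sentence, None, None, 0)

training_examples = [auto_label(s, known_regulations) for s in sentences]
for example in training_examples:
    print(example)
# The second sentence exposes the known weakness: co-occurrence is not regulation,
# which is one reason human curation remains in the loop.
```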

They took the same approach with public datasets on approved cancer drugs and drugs in clinical trials, among other sources.

This connect-the-dots approach creates a machine learned model that “rarely misses anything” and is precise enough “where we can potentially improve the curation efficiency by a lot,” said Poon.

Collaboration with JAX

The collaboration with JAX allows Poon and his team to validate the effectiveness of Microsoft’s machine reading technology while increasing the efficiency of Mockus and her team as they curate CKB.

“Leveraging the machine reader, we can say here is what we are interested in and it will help to triage and actually rank papers for us that have high clinical significance,” Mockus said. “And then a human goes in to really tease apart the data.”

Over time, feedback from the curators will be used to help train the machine reading technology, making the models more precise and, in turn, making the curators more efficient and allowing the scope of CKB to expand.

“We feel really, really good about this relationship,” said Nair. “Particularly from the standpoint of the impact it can have in providing a very powerful tool to clinicians.”


John Roach writes about Microsoft research and innovation. Follow him on Twitter.


AI at the core of next-generation BI

Next-generation BI is upon us, and has been for a few years now.

The first generation of business intelligence, beginning in the 1980s and extending through the turn of the 21st century, relied entirely on information technology experts. It was about business reporting, and was inaccessible to all but a very few with specialized skills.

The second introduced self-service analytics, and lasted until just a few years ago. The technology was accessible to data analysts, and defined by data visualization, data preparation and data discovery.

Next-generation BI, the third generation, is characterized by augmented intelligence, machine learning and natural language processing. It's open to everyday business users, and trust and transparency are important aspects. It's also changing the direction in which data looks, becoming more predictive.

In September, Constellation Research released “Augmented Analytics: How Smart Features Are Changing Business Intelligence.” The report, authored by analyst Doug Henschen, took a deep look at next-generation BI.

Henschen reflected on some of his findings about the third generation of business analytics for a two-part Q&A.

In Part I, Henschen addressed what marked the beginning of this new era and who stands to benefit most from augmented BI capabilities. In Part II, he looked at which vendors are positioned to succeed and where next-generation business intelligence is headed next.

In your report you peg 2015 as the beginning of next generation BI — what features were you seeing in analytics platforms at that time that signaled a new era?


Doug Henschen: There was a lot percolating at the time, but I don’t think that it’s about a specific technology coming out in 2015. That’s an approximation of when augmented analytics really became something that was ensconced as a buying criteria. That’s I think approximately when we shifted — the previous decade was really when self-service became really important and the majority of deployments were driving toward it, and I pegged 2015 as the approximate time at which augmented started getting on everyone’s radar.

Beyond the technology itself, what were some things that happened in the market around the time of 2015 that showed things were changing?

Henschen: There were lots of technology things that led up to that — Watson playing Jeopardy was in 2011, SAP acquired KXEN in 2013, IBM introduced Watson Analytics in 2014. Some startups like ThoughtSpot and BeyondCore came in during the middle of the decade, Salesforce introduced Einstein in 2016 and ended up acquiring BeyondCore in 2016. A lot of stuff was percolating in the decade, and 2015 is about when it became about, ‘OK, we want augmented analytics on our list. We want to see these features coming up on roadmaps.’

What are you seeing now that has advanced next-generation BI beyond what was available in 2015?


Henschen: In the report I dive into four areas — data preparation, data discovery and analysis, natural language interfaces and interaction, and forecasting and prediction — and in every category you’ve seen certain capabilities become commonplace, while other capabilities have been emerging and are on the bleeding edge. In data prep, everyone can pretty much do auto data profiling, but recommended or suggested data sources and joins are a little bit less common. Guided approaches that walk you through how to cleanse this, how to format this, where and how to join — that’s a little bit more advanced and not everybody does it.

Similarly, in the other categories, recommended data visualization is pretty common in discovery and analysis, but intent-driven recommendations that track what individuals are doing and make recommendations based on patterns among people are more on the bleeding edge. It applies in every category. There’s stuff that is now widely done by most products, and stuff that is more bleeding edge where some companies are innovating and leading.
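
As a toy illustration of what auto data profiling and a suggested join can look like under the hood, the sketch below infers column types, reports missing values and proposes a join key based on shared column names and overlapping values. It is a simplification for illustration, not how any particular vendor implements these features.

```python
# Toy auto-profiling and join suggestion over two small, made-up tables.
import pandas as pd

orders = pd.DataFrame({"customer_id": [1, 2, 2, 3], "amount": [120.0, 55.5, None, 89.9]})
customers = pd.DataFrame({"customer_id": [1, 2, 3], "region": ["West", "East", "East"]})

def profile(df, name):
    print(f"--- {name} ---")
    print(df.dtypes)                               # inferred column types
    print(df.isna().mean().rename("null share"))   # fraction of missing values per column

profile(orders, "orders")
profile(customers, "customers")

# Naive join suggestion: same column name plus overlapping values
shared = set(orders.columns) & set(customers.columns)
suggested = [c for c in shared
             if set(orders[c].dropna()) & set(customers[c].dropna())]
print("Suggested join key(s):", suggested)
```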

Who benefits from next-generation BI that didn’t benefit in previous generations — what types of users?

Henschen: I think these features will benefit all. Anything that is proactive, that provides recommendations, that helps automate work that was tedious, that surfaces insights that humans would have a tough time recognizing but that machines can recognize — that’s helpful to everybody. It has long been an ambition in BI and analytics to spread this capability to the many, to the business users, as well as the analysts who have long served the business users, and this extends the trend of self-service to more users, but it also saves time and supports even the more sophisticated users.

Obviously, larger companies have teams of data analysts and data engineers and have more people of that sort — they have data scientists. Midsize companies don’t have as many of those assets, so I think [augmented capabilities] stand to be more beneficial to midsize companies. Things like recommended visualizations and starting points for data exploration, those are very helpful when you don’t have an expert on hand and a team at your disposal to develop a dashboard to address a problem or look at the impact of something on sales. I think [augmented capabilities] are going to benefit all, but midsize companies and those with fewer people and resources stand to benefit more.  

You referred to medium-sized businesses, but what about small businesses?

Henschen: In the BI and analytics world there are products that are geared to reporting and helping companies at scale. The desktop products are more popular with small companies — Tableau, Microsoft Power BI, Tibco Spotfire are some that have desktop options, and small companies are turning also to SaaS options. We focus on enterprise analytics — midsize companies and up — and I think enterprise software vendors are focused that way, but there are definitely cloud services, SaaS vendors and desktop options. Salesforce has some good small business options. Augmented capabilities are coming into those tools as well.

Editor’s note: This interview has been edited for clarity and conciseness.


Announcing AI Business School for Education for leaders, BDMs and students | Microsoft EDU

We live in an ever more digital, connected world. With the emergence of Artificial Intelligence, the opportunity we have to provide truly personalized, accessible learning and experiences to all students around the world is now upon us. Leaders in education have the opportunity to dramatically impact outcomes more than ever, from changing the way in which they engage with students throughout the student journey, to providing truly personalized learning, to improving operational efficiencies across the institution. At Microsoft, our mission in education is to empower every student on the planet to achieve more. Through that lens, we believe education leaders should consider opportunities to introduce new technologies like AI into their learning design and technological blueprint to expand the horizon for driving better outcomes and efficiencies for every student and institution around the world.

That’s why I’m excited to share that Microsoft’s AI Business School now offers a learning path for education. Designed for education leaders, decision-makers and even students, the Microsoft AI Business School for Education helps learners understand how AI can enhance the learning environment for all students—from innovations in the way we teach and assess, to supporting accessibility and inclusion for all students, to institutional effectiveness and efficiency with the use of AI tools. The course is designed to empower learners to gain specific, practical knowledge to define and implement an AI strategy. Industry experts share insights on how to foster an AI-ready culture and teach them how to use AI responsibly and with confidence. The learning path is available on Microsoft Learn, a free platform to support learners of all ages and experience levels via interactive, online, self-paced learning.

The Microsoft AI Business School for Education includes a number of modules across sales, marketing, technology and culture, but most importantly, it calls upon the expert insights from education leaders including:

  • Professor Peter Zemsky uses INSEAD’s Value Creation Framework to show the advantages AI presents for educational institutions and how an organization can determine the right approach that works with their strategy and goals.
  • Michelle Zimmerman, author of “Teaching AI: Exploring New Frontiers for Learning,” shares her experience as an educator and why she believes AI can transform how students learn.
  • David Kellerman of the University of New South Wales (UNSW) shares his perspective on what’s unique about AI in higher education and how using AI to transform the way institutions collaborate can encourage students to be lifelong learners. As a key research institution in Australia, UNSW is focused on being a learning institution that collaborates across academic and operational departments as it uses AI to create a personalized learning journey for students.

The Microsoft AI Business School for Education joins a larger collection of industry-specific courses including financial services, manufacturing, retail, healthcare and government. With this holistic portfolio, the AI Business School can also help students learn about AI application across a number of industries and roles. We’ve already seen several universities and vocational colleges incorporate this curriculum into their courses across business, finance, economics and health-related degrees as a means of providing real-world examples of AI opportunity and impact.

New research has highlighted the importance of adopting AI to transform the learning experience for students. Last week at the Asian Summit on Education and Skills (ASES) in India, Microsoft and IDC unveiled the latest findings from the study “Future-Ready Skills: Assessing the use of AI within the Education sector in Asia Pacific.” The study found that artificial intelligence will help double the rate of innovation improvements for higher education institutions across the region. Despite three in four education leaders agreeing that AI is instrumental to an institution’s competitiveness, 68% of education institutions in the region have yet to embark on their AI journey. Those who have started integrating AI have seen improvements in student engagement, efficiency and competitiveness, as well as increased funding and accelerated innovation.

Microsoft is proud to be working with schools and institutions around the world to improve understanding of artificial intelligence and support leaders, educators and students in getting ready for the future, such as the recent collaboration in India with CBSE to train more than 1,000 educators.



Novartis and Microsoft announce collaboration to transform medicine with artificial intelligence – Stories

  • Multiyear alliance underpins the Novartis commitment to leverage data & artificial intelligence (AI) to transform how medicines are discovered, developed and commercialized
  • Novartis to establish AI innovation lab to empower its associates to use AI across the business
  • Joint research activities will include co-working environments on Novartis Campus (Switzerland), at Novartis Global Service Center in Dublin, and at Microsoft Research Lab (UK) – starting with tackling personalized therapies for macular degeneration; cell & gene therapy; and drug design

Basel, and Redmond, October 1, 2019 – Novartis today announced an important step in reimagining medicine by founding the Novartis AI innovation lab and by selecting Microsoft Corp. as its strategic AI and data-science partner for this effort. The new lab aims to significantly bolster Novartis AI capabilities from research through commercialization and help accelerate the discovery and development of transformative medicines for patients worldwide.

As part of the strategic collaboration announced, Novartis and Microsoft have committed to a multi-year research and development effort. This strategic alliance will focus on two core objectives:

  • AI Empowerment. The lab will aim to bring the power of AI to the desktop of every Novartis associate. By bringing together vast amounts of Novartis datasets with Microsoft’s advanced AI solutions, the lab will aim to create new AI models and applications that can augment our associates’ capabilities to take on the next wave of challenges in medicine.
  • AI Exploration. The lab will use the power of AI to tackle some of the hardest computational challenges within the life sciences, starting with generative chemistry, image segmentation & analysis for smart and personalized delivery of therapies, and optimization of cell and gene therapies at scale.

Microsoft and Novartis will also collaborate to develop and apply next-generation AI platforms and processes that support future programs across these two focus areas. The overall investment will include project funding, subject-matter experts, technology, and tools.

Vas Narasimhan, CEO of Novartis, said, “As Novartis continues evolving into a focused medicines company powered by advanced therapy platforms and data science, alliances like this will help us deliver on our purpose to reimagine medicine to improve and extend lives. Pairing our deep knowledge of human biology and medicine with Microsoft’s leading expertise in AI could transform the way we discover and develop medicines for the world.”

Microsoft CEO, Satya Nadella, added, “Our strategic alliance will combine Novartis’ life sciences expertise with the power of Azure and Microsoft AI. Together, we aim to address some of the biggest challenges facing the life sciences industry today and bring AI capabilities to every Novartis employee so they can unlock new insights as they work to discover new medicines and reduce patient costs.”

Novartis Data & Digital

Novartis is positioning itself as a leading medicines company powered by advanced therapies and data science. Going big on data and digital is a key strategic pillar that helps Novartis realize that ambition. Data science and digital technologies allow the company to reimagine how it innovates in R&D, engages with patients and customers, and increases operational efficiencies. Novartis focuses its efforts around four strategic digital priority areas:

  • Scaling 12 digital lighthouse projects: building a strong foundation and jumpstarting digital transformation
  • Making Novartis digital: sharing, learning and talent acquisition
  • Becoming the #1 partner in the tech ecosystem: bridging Novartis with external expertise
  • Bolder moves: leading through future disruptive healthcare scenarios with large-scale partnerships

Disclaimer

This press release contains forward-looking statements within the meaning of the United States Private Securities Litigation Reform Act of 1995 that can generally be identified by words such as “to transform,” “multiyear,” “commitment,” “to found,” “aims,” “vision,” “potential,” “can,” “will,” “plan,” “expect,” “anticipate,” “committed,” or similar terms, or regarding the development or adoption of potentially transformational technologies and business models and the collaboration with Microsoft; or by express or implied discussions regarding potential marketing approvals, new indications or labeling for the healthcare products described in this press release, or regarding potential future revenues from collaboration with Microsoft or such products. You should not place undue reliance on these statements. Such forward-looking statements are based on our current beliefs and expectations regarding future events, and are subject to significant known and unknown risks and uncertainties. Should one or more of these risks or uncertainties materialize, or should underlying assumptions prove incorrect, actual results may vary materially from those set forth in the forward-looking statements. There can be no guarantee that the collaboration with Microsoft will achieve any or all of its intended goals or objectives, or in any particular time frame. Neither can there be any guarantee that any healthcare products described in this press release will be submitted or approved for sale or for any additional indications or labeling in any market, or at any particular time. Nor can there be any guarantee that the collaboration with Microsoft or such products will be commercially successful in the future. In particular, our expectations regarding the collaboration with Microsoft and such products could be affected by, among other things, uncertainties involved in the development or adoption of potentially transformational technologies and business models; the uncertainties inherent in research and development of new healthcare products, including clinical trial results and additional analysis of existing clinical data; regulatory actions or delays or government regulation generally, including potential regulatory actions or delays with respect to the collaboration with Microsoft; global trends toward health care cost containment, including government, payor and general public pricing and reimbursement pressures and requirements for increased pricing transparency; our ability to obtain or maintain proprietary intellectual property protection; the particular prescribing preferences of physicians and patients; general political, economic and trade conditions; safety, quality or manufacturing issues; potential or actual data security and data privacy breaches, or disruptions of our information technology systems, and other risks and factors referred to in Novartis AG’s current Form 20-F on file with the US Securities and Exchange Commission. Novartis is providing the information in this press release as of this date and does not undertake any obligation to update any forward-looking statements contained in this press release as a result of new information, future events or otherwise.

About Novartis

Novartis is reimagining medicine to improve and extend people’s lives. As a leading global medicines company, we use innovative science and digital technologies to create transformative treatments in areas of great medical need. In our quest to find new medicines, we consistently rank among the world’s top companies investing in research and development. Novartis products reach more than 750 million people globally and we are finding innovative ways to expand access to our latest treatments. About 108 000 people of more than 140 nationalities work at Novartis around the world. Find out more at www.novartis.com.

Novartis is on Twitter. Sign up to follow @Novartis at http://twitter.com/novartisnews

For Novartis multimedia content, please visit www.novartis.com/news/media-library

For questions about the site or required registration, please contact [email protected]

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

Novartis Media Relations

E-mail: me[email protected]

Peter Zuest

Novartis Global External Communications

+41 79 899 98 12 (mobile)

[email protected]

Microsoft Media Relations

WE Communications for Microsoft
(425) 638-7777
[email protected]

http://news.microsoft.com

Eric Althoff

Novartis US External Communications

+1 646 438 4335

[email protected]

Novartis Investor Relations

Central investor relations line: +41 61 324 7944

E-mail: [email protected]

Central
Samir Shah: +41 61 324 7944
Pierre-Michel Bringer: +41 61 324 1065
Thomas Hungerbuehler: +41 61 324 8425
Isabella Zinck: +41 61 324 7188

North America
Sloan Simpson: +1 862 778 5052
Cory Twining: +1 212 778 3258


Azure Sentinel—the cloud-native SIEM that empowers defenders is now generally available

Machine learning enhanced with artificial intelligence (AI) holds great promise in addressing many of the global cyber challenges we see today. It gives our cyber defenders the ability to identify, detect and block malware almost instantaneously, and it gives security admins the ability to deconflict tasks, separating the signal from the noise and allowing them to prioritize the most critical tasks. That is why today I’m pleased to announce that Azure Sentinel, a cloud-native SIEM that provides intelligent security analytics at cloud scale for enterprises of all sizes and workloads, is now generally available.

Our goal has remained the same since we first launched Microsoft Azure Sentinel in February: empower security operations teams to help enhance the security posture of our customers. Traditional Security Information and Event Management (SIEM) solutions have not kept pace with digital change. I commonly hear from customers that they’re spending more time on deployment and maintenance of SIEM solutions, which leaves them unable to properly handle the volume of data or the agility of adversaries.

Recent research tells us that 70 percent of organizations continue to anchor their security analytics and operations with SIEM systems,1 and 82 percent are committed to moving large volumes of applications and workloads to the public cloud.2 Security analytics and operations technologies must lean in and help security analysts deal with the complexity, pace, and scale of their responsibilities. To accomplish this, 65 percent of organizations are leveraging new technologies for process automation/orchestration, while 51 percent are adopting security analytics tools featuring machine learning algorithms.3 This is exactly why we developed Azure Sentinel—an SIEM re-invented in the cloud to address the modern challenges of security analytics.

Learning together

When we kicked off the public preview for Azure Sentinel, we were excited to learn and gain insight into the unique ways Azure Sentinel was helping organizations and defenders on a daily basis. We worked with our partners all along the way; listening, learning, and fine-tuning as we went. With feedback from 12,000 customers and more than two petabytes of data analysis, we were able to examine and dive deep into a large, complex, and diverse set of data. All of which had one thing in common: a need to empower their defenders to be more nimble and efficient when it comes to cybersecurity.

Our work with RapidDeploy offers one compelling example of how Azure Sentinel is accomplishing this complex task. RapidDeploy creates cloud-based dispatch systems that help first responders act quickly to protect the public. There’s a lot at stake, and the company’s cloud-native platform must be secure against an array of serious cyberthreats. So when RapidDeploy implemented a SIEM system, it chose Azure Sentinel, one of the world’s first cloud-native SIEMs.

Microsoft recently sat down with Alex Kreilein, Chief Information Security Officer at RapidDeploy. Here’s what he shared: “We build a platform that helps save lives. It does that by reducing incident response times and improving first responder safety by increasing their situational awareness.”

Now RapidDeploy uses the complete visibility, automated responses, fast deployment, and low total cost of ownership in Azure Sentinel to help it safeguard public safety systems. “With many SIEMs, deployment can take months,” says Kreilein. “Deploying Azure Sentinel took us minutes—we just clicked the deployment button and we were done.”

Learn even more about our work with RapidDeploy by checking out the full story.

Another great example of a company finding results with Azure Sentinel is ASOS. As one of the world’s largest online fashion retailers, ASOS knows they’re a prime target for cybercrime. The company has a large security function spread across five teams and two sites—but in the past, it was difficult for ASOS to gain a comprehensive view of cyberthreat activity. Now, using Azure Sentinel, ASOS has created a bird’s-eye view of everything it needs to spot threats early, allowing it to proactively safeguard its business and its customers. And as a result, it has cut issue resolution times in half.

“There are a lot of threats out there,” says Stuart Gregg, Cyber Security Operations Lead at ASOS. “You’ve got insider threats, account compromise, threats to our website and customer data, even physical security threats. We’re constantly trying to defend ourselves and be more proactive in everything we do.”

Already using a range of Azure services, ASOS identified Azure Sentinel as a platform that could help it quickly and easily unite its data. This includes security data from Azure Security Center and Azure Active Directory (Azure AD), along with data from Microsoft 365. The result is a comprehensive view of its entire threat landscape.

“We found Azure Sentinel easy to set up, and now we don’t have to move data across separate systems,” says Gregg. “We can literally click a few buttons and all our security solutions feed data into Azure Sentinel.”

Learn more about how ASOS has benefitted from Azure Sentinel.

RapidDeploy and ASOS are just two examples of how Azure Sentinel is helping businesses process data and telemetry into actionable security alerts for investigation and response. We have an active GitHub community of preview participants, partners, and even Microsoft’s own security experts who are sharing new connectors, detections, hunting queries, and automation playbooks.

With these design partners, we’ve continued our innovation in Azure Sentinel. It starts with the ability to connect to any data source, whether in Azure, on-premises or even other clouds. We continue to add new connectors to different sources and more machine learning-based detections. Azure Sentinel will also integrate with the Azure Lighthouse service, which will give service providers and enterprise customers the ability to view Azure Sentinel instances across different tenants in Azure.

Secure your organization

Now that Azure Sentinel has moved out of public preview and is generally available, there’s never been a better time to see how it can help your business. Traditional on-premises SIEMs require a combination of infrastructure costs and software costs, all paired with annual commitments or inflexible contracts. We are removing those pain points, since Azure Sentinel is a cost-effective, cloud-native SIEM with predictable billing and flexible commitments.

Infrastructure costs are reduced since you automatically scale resources as you need, and you only pay for what you use. Or you can save up to 60 percent compared to pay-as-you-go pricing by taking advantage of capacity reservation tiers. You receive predictable monthly bills and the flexibility to change capacity tier commitments every 31 days. On top of that, bringing in data from Office 365 audit logs, Azure activity logs and alerts from Microsoft Threat Protection solutions doesn’t require any additional payments.
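
As a back-of-the-envelope illustration of how a capacity reservation discount changes a monthly bill, the figures below are made-up placeholders rather than Azure Sentinel's published rates; only the "up to 60 percent" claim comes from the announcement.

```python
# Hypothetical pricing arithmetic only; not Microsoft's actual rates.
daily_ingest_gb = 300
pay_as_you_go_per_gb = 2.00          # placeholder $/GB
reservation_discount = 0.60          # "up to 60 percent" from the announcement

monthly_payg = daily_ingest_gb * 30 * pay_as_you_go_per_gb
monthly_reserved = monthly_payg * (1 - reservation_discount)

print(f"Pay-as-you-go:        ${monthly_payg:,.0f}/month")
print(f"Capacity reservation: ${monthly_reserved:,.0f}/month (illustrative)")
```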

Please join me for the Azure Security Expert Series where we will focus on Azure Sentinel on Thursday, September 26, 2019, 10–11 AM Pacific Time. You’ll learn more about these innovations and see real use cases on how Azure Sentinel helped detect previously undiscovered threats. We’ll also discuss how Accenture and RapidDeploy are using Azure Sentinel to empower their security operations team.

Get started today with Azure Sentinel!

1 Source: ESG Research Survey, Security Analytics and Operations: Industry Trends in the Era of Cloud Computing, September 2019
2 Source: ESG Research Survey, Security Analytics and Operations: Industry Trends in the Era of Cloud Computing, September 2019
3 Source: ESG Research Survey, Security Analytics and Operations: Industry Trends in the Era of Cloud Computing, September 2019


Magento BI update a benefit to vendor’s e-commerce customers

With the rollout of the Magento Business Intelligence Summer 2019 Release on Thursday, the Magento BI platform will get improved scheduling capabilities along with a host of new dashboard visualizations.

Magento, founded in 2008 and based in Culver City, Calif., is primarily known for its e-commerce platform. In 2018 the vendor was acquired by Adobe for $1.7 billion and is now part of the Adobe Experience Cloud.

With the vendor’s focus on e-commerce, the Magento BI platform isn’t designed to compete as a standalone tool against the likes of Microsoft Power BI, Qlik, Tableau and other leading BI vendors. Instead, it’s designed to slot in with Magento’s e-commerce platform and is intended for existing Magento customers.

“I love the BI angle Magento is taking here,” said Mike Leone, a senior analyst at Enterprise Strategy Group. “I would argue that many folks that utilize their commerce product are by no means experts at analytics. Magento will continue to empower them to gain more data-driven insights in an easy and seamless way. It is enabling businesses to take the next step into data-driven decision making without adding complexity.”

Similarly, Nicole France, principal analyst at Constellation Research, noted the importance of enhancing the BI capabilities of Magento’s commerce customers.

“This kind of reporting on commerce systems is undoubtedly useful,” she said. “The idea here seems to be reaching a wider audience than the folks directly responsible for running commerce. That means putting the right data in the appropriate context.”

The updated Magento BI platform comes with 13 data visualization templates, now including bubble charts, and over 100 reports.

A sample bubble chart from Magento shows an organization’s customer breakdown by state.

In addition, it comes with enhanced sharing capabilities. Via email, users can schedule reports to go out to selected recipients on a one-time basis or any repeating schedule they want. They can also keep track of the relevancy of the data with time logs and take all necessary actions from a status page.

“It finds the insights merchants want,” said Daniel Rios, product manager at Adobe. “It brings BI capabilities to merchants.”

Matthew Wasley, product marketing manager at Adobe, added: “Now there’s a better way to share insights that goes right to the inbox of a colleague and is part of their daily workflow.

“They can see the things they need to see — it bridges the gap,” Wasley said. “It’s an email you actually want to open.”

According to Wasley, the Magento BI platform provides a full end-to-end data stack that serves customers from the data pipeline through the data warehouse and ultimately to the dashboard visualization layer.

While some BI vendors offer products with similar end-to-end capabilities, others offer only one layer and need to be paired with other products to help a business client take data from its raw form and develop it into a digestible form.

“We’re ahead of the curve with Magento,” Wasley said.

He added that the end-to-end capability of the Magento BI tool is something other vendors are trying to put together through acquisitions. Though he didn’t name any companies specifically, Google with its purchase of Looker and Salesforce with its acquisition of Tableau are two that fit the mold.


Still, the Magento BI tool isn’t designed to compete on the open market against vendors who specialize in analytics platforms.

“We see our BI as a differentiator for our commerce platform,” said Wasley. “Standalone BI is evolving in itself. It’s tailored, and differentiates our commerce product.”

Moving forward, like the BI tools offered by other vendors, the Magento BI platform will become infused with more augmented intelligence and machine learning capabilities with innovation enhanced by Magento’s envelopment into the Adobe universe.

“We’re seeing how important data is across Adobe,” said Wasley. “All together, it’s meant to … make better use of data. Because of the importance of data across Adobe, we’re able to innovate a lot faster over the next 6 – 12 months.”

And presumably, that means further enhancement of the Magento BI platform for the benefit of the vendor’s e-commerce customers.
