Tag Archives: Intelligence

Microsoft integration adds more AI to Tibco analytics suite

Tibco continues to add to the augmented intelligence capabilities of its analytics platform, most recently revealing that Tibco Spotfire and Tibco Data Science now support Microsoft Azure Cognitive Services.

Azure Cognitive Services is a set of Microsoft services that enables application developers to embed AI and machine learning capabilities in their software. Spotfire, meanwhile, is Tibco’s flagship business intelligence tool for data visualization, while Data Science is a BI tool focused less on visualization and more on in-depth data analysis. The two can be used together or independently of one another.
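
For readers unfamiliar with how Azure Cognitive Services is consumed, the sketch below shows a single call to the Text Analytics sentiment operation from Python. It is only a minimal illustration of the kind of capability developers embed, not Tibco's actual Spotfire or Data Science integration, and the endpoint and subscription key are placeholders for values a provisioned Cognitive Services resource would supply.

```python
import requests

# Placeholders -- a real Cognitive Services resource supplies its own endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-subscription-key>"

def score_sentiment(text: str) -> dict:
    """Send one document to the Text Analytics v3.0 sentiment operation."""
    response = requests.post(
        f"{ENDPOINT}/text/analytics/v3.0/sentiment",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/json",
        },
        json={"documents": [{"id": "1", "language": "en", "text": text}]},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(score_sentiment("The quarterly dashboard made the review painless."))
```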

Tibco, founded in 1997 and based in Palo Alto, Calif., is adding support for Azure Cognitive Services following other AI investments in its analytics platform. In January 2020, the vendor added to the natural language generation capabilities of Spotfire via an integration with Arria NLG Studio for BI, and in the fall of 2019 it unveiled new products and added to existing ones with the credo of AI everywhere.

Meanwhile, the vendor’s addition of native support for Azure Cognitive Services, revealed June 2, comes after Tibco expanded the multi-cloud capabilities of its analytics platform through an integration with Microsoft Azure late in 2019; it already had an integration with Amazon Web Services and supports Google Cloud, among other cloud service providers.

“We don’t believe that AI is a marketing tool or a marketing term,” said Matt Quinn, Tibco’s COO. “We see that AI can actually be used as a foundational element in people’s systems, and so working with Microsoft, doing this integration, is all about us being able to use our own technology, inside of our own products, as a foundational layer.”

An organization’s data is displayed on a sample Tibco dashboard.

AI, meanwhile, is an area where Tibco should focus, according to Rick Sherman, founder and managing partner of Athena IT Solutions.

“With Spotfire, AI is definitely where they should be,” he said. “AI, machine learning and data science is where they’re great. They’re geared to sophisticated users, and if you’re doing a deeper dive, doing serious visualizations, Tibco is a way you want to go.”

Beyond simply adding a new integration, Tibco’s move to enable application developers to embed AI and machine learning capabilities by using Azure Cognitive Services continues the vendor’s process of expanding its analytics platform.

While some longtime BI vendors have struggled to keep their platforms current, Tibco, after losing some momentum in the early 2000s, has remained among the top vendors with a suite of BI tools still considered innovative.

Tibco’s platform is entirely cloud-based, which allows the vendor to deliver new and upgraded features without rolling out a major update each time, and its partnership strategy gives it the ability to embed products such as Azure Cognitive Services and Arria NLG Studio for BI without developing them in-house.

“Tibco has really evolved into a much more partner-centric company,” Quinn said. “We realize we are part of a broader ecosystem of tools and technologies, and so these partnerships that we’ve created are pretty special and pretty important, and we’ve been really happy with the bidirectional nature of those, especially the relationship with Microsoft. It’s clear that they have evolved as we have evolved.”

As for the motivation behind adding Azure Cognitive Services to the Tibco analytics platform, Quinn said it’s simply about making data scientists more productive.

Customers, he added, were asking for the integration, while Tibco had a preexisting relationship with Microsoft that made adding Azure Cognitive Services a natural fit.

“Data scientists use all sorts of tools from all different walks of life, and because of our integration heritage we’re really good at integrating those types of things, so what we’re doing is we’re opening up the universe of all the Microsoft pieces to this data science group that just wants to be more productive,” Quinn said. “It enhances the richness of the platform.”

Similarly, Sherman said that the new integration is a positive move for data scientists.

Tibco’s acquisitions in recent years, such as its 2018 purchase of Scribe Software and its 2019 purchase of SnappyData, helped advance the capabilities of Tibco’s analytics platform, and now integrations are giving it further powers.

“They’re doing some excellent things,” Sherman said. “They’re aiming at deeper analytics, digging deeper into data science and data engineering, and this move to get their analytics closer to their data science makes a heck of a lot of sense.”

In the coming months, Quinn said, Tibco plans to continue adding integrations to expand the capabilities of its analytics platform. In addition, ease of use will be a significant focus for the vendor.

Meanwhile, ModelOps — the lifecycle of model development — will be a new area of emphasis for Tibco.

“ModelOps is really the set of things you have to do to take a model, whether it’s AI or just plain data science, and move it into the real world, and then how do you change it, how do you evolve it, who needs to sign off on it,” Quinn said. “For Tibco it’s great because it really brings together the data science piece with the hardcore engineering stuff that people have known us for.”

Cisco acquisition of ThousandEyes has many user benefits

Cisco plans to acquire internet intelligence vendor ThousandEyes. The acquisition, announced this week, would give Cisco customers better visibility into internet connections among the data center, branch offices and multiple cloud providers.

The Cisco acquisition is expected to be complete by the end of October. After the transaction closes, Cisco’s software-defined WAN customers would be the first to benefit from the combined company, said Shamus McGillicuddy, an analyst at Enterprise Management Associates.

Like most SD-WANs, Cisco’s Viptela product provides users with solid visibility into the virtual network, or overlay, between the SD-WAN at the branch and cloud applications. With ThousandEyes, Cisco would also provide intelligence on the performance of the underlay, which in this case is the public internet.

“Cisco has the opportunity to offer integrated visibility into both layers of a hybrid network,” McGillicuddy said.

ThousandEyes monitors the performance of traffic paths over the internet through software agents running in dozens of facilities owned by AWS, Google Cloud and Microsoft Azure. The vendor also has agents in the data centers of colocation partners, such as Equinix and Cogent, in more than 150 cities.

ThousandEyes customers also deploy agents on the virtual private clouds companies create to run applications on the infrastructure of public clouds. As a result, the vendor’s customers can view internet performance along the path from the software running on a cloud provider to the end user.

ThousandEyes and Cisco AppDynamics

Cisco is likely to provide some integration between ThousandEyes technology and AppDynamics, McGillicuddy said. The latter is Cisco’s product for monitoring the performance of applications and computing infrastructure in the data center.

ThousandEyes has many customers using its technology and AppDynamics as their primary performance management products, the company said.

McGillicuddy said Cisco could also package ThousandEyes with Cisco’s Duo Security two-step authentication service and its AnyConnect VPN software. The combination would provide Cisco customers with tools for improving application access for people working from home during the COVID-19 pandemic.

“This is something that is so important right now with so many people working from home,” McGillicuddy said. “I urge Cisco to focus on this opportunity.”

Cisco did not disclose how much it was paying for ThousandEyes, but Bloomberg reported the cost of the Cisco acquisition was nearly $1 billion. Cisco planned to fold the company into the networking services business unit.

ThousandEyes CEO and co-founder Mohit Lad will become general manager of the ThousandEyes operation, with CTO and co-founder Ricardo Oliveira continuing to lead product development.

With new Garage project Trove, people can contribute photos to help developers build AI models – Microsoft Garage

Every day, developers and researchers are finding creative ways to leverage AI to augment human intelligence and solve tough problems. Whether they’re training a computer vision model that can spot endangered snow leopards or helping us do our business expenses more easily when we scan pictures of receipts, they need a lot of quality pictures to do it. Developers usually crowdsource these large batches of pictures by enlisting the help of gig workers to submit photos, but often, these calls for photos feel like a black box. Participants have little insight into why they’re submitting a photo and can feel like their time was wasted when their submissions are rejected without explanation. At the same time, developers can find that these sourcing projects take a long time to complete due to lower quality and less diverse inputs.

We’re excited to announce that Trove, a Microsoft Garage project, is exploring a solution that can enhance the experience and agency for both parties. Trove is a marketplace app that allows people to contribute photos to AI projects that developers can then use to train machine learning models. Interested parties can request an invite to join the experiment as a contributor or developer. Trove is currently accepting a small number of participants in the United States on both Android and iOS.

A marketplace that puts transparency and choice first

Today, most data collection is passive, with many people unaware that their data is being collected or not making a real-time, active choice to contribute their information. And even those who contribute more directly to model training projects are often not provided the greater context and purpose of the project; there’s little to no feedback loop to correct and align data submissions to better fit the needs of the project.

For people who rely on this data gig work as an important source of income, this rejection experience can leave them feeling frustrated, without any agency to make better submissions or earn a higher return on their time investment. With machine learning being a critical step in unlocking advancements from speech to image recognition, there’s an important opportunity to increase the quality of data while making sure that contributors have the clarity and choice they need to participate in the process.

The Trove team has found a way to overcome these tough tradeoffs in a marketplace solution that emphasizes greater communication, context, and feedback between developers and project participants. “There’s a better way we can do this. You can have the transparency of how your data is being used and actually want to opt in to contribute to these projects and advance science and AI,” shares Krishnan Raghupathi, the Senior Program Manager for Trove. “We’d love to see this become a community where people are a key part of the project.”

To read more about key features and how Trove works for developers and contributors, check it out on the Garage Workbench.

Aspiring to higher quality data and increased contributor agency

The team behind Trove was originally inspired by thought leaders exploring how we can embrace the need for a large volume of data to enable AI advancements, while providing more agency to contributors and recognizing the value of their data. “We wanted to explore these concepts through something concrete,” shared Christian Liensberger, the lead Principal Program Manager on the project. “We decided to form an incubation team and build something that could show how things could be different.”

In creating Trove, the incubation team had to think through principles that would guide them as they brought such an experience to life. They believe that the best framework to produce the higher quality data needed to train these AI models involves connecting content creators to AI developers more directly. Trove was built with a design and approach that focuses on four core principles:

  • Transparency: See all the projects available, details about who is posting them, and how your data will be used
  • Control: Decide which projects you want to contribute to, and control when and how much you contribute
  • Enrichment: Learn directly from AI developers how your contributions are valuable, and see how your participation will advance AI projects
  • Connection: Communicate with AI developers to stay informed on projects you contributed to

“I love working on this project, it’s a continuous shift between the user need for privacy and control, and professionals’ need for data to innovate and create new products,” said Devis Lucato, Principal Engineering Manager for Trove. “We’re pushing the boundaries of all the technologies that we touch, exploring new features and challenging decisions determined by the status quo.”

Before releasing this experiment to external users, the team piloted Trove with Microsoft employees from across the US. While Trove is still in an experimental phase, the team is excited for even more feedback. “Our solution is still a bit rough around the edges, but we want to hear from the community about what we should focus on next,” shares Christian. Trinh Duong, the Marketing Manager on the project added, “My favorite part about working on this has been how much the app incorporates users into the experience. We want to invite our users to reach out and join us as true participants in the creation of this concept.”

The team is welcoming feedback from experiment participants here, and is enthusiastic for the input of users who are as passionate about the principles of transparency, control, enrichment, and connection as they are.

Request an invite and share your feedback

Trove is available to try in the United States upon request while room in the experiment remains. Request an invite to join the experiment, or request to add an ML project to the experiment.

Data Science Central co-founder talks AI, data science trends

The data science field has changed greatly with the advent of AI. Artificial intelligence has enabled the rise of citizen data scientists and the automation of data scientists’ workloads, while also increasing the need for more skilled data scientists.

Vincent Granville, co-founder of Data Science Central, a community and resource site for data specialists, expects to see an increase in AI and IoT in data science over the next few years, even as AI continues to change the data science field.

In this Q&A, Granville discusses data science trends, the impact of AI and IoT on data scientists, how organizations and data scientists will have to adapt to increased data privacy regulations, and the evolution of AI.

Data Science Central was acquired by TechTarget on March 4.

Will an increase in citizen data scientists due to AI, as well as an increase of more formal data science education programs, help fix the so-called data scientist shortage?

Vincent Granville: I believe that we will see an increase on two fronts. We will see more data science programs being offered by universities, perhaps even doctorates in addition to master’s degrees, as well as more bootcamps and online training aimed at practitioners working with data but lacking some skills, such as statistical programming or modern techniques like deep learning — something old that became popular recently due to the computational power now available to train and optimize these models.

There is also a parallel trend that will increase, consisting of hiring professionals not traditionally thought of as data scientists, such as physicists, who have significant experience working with data. This is already the case in fintech, where these professionals learn the new skills required on the job. Along with corporations training staff internally by sending selected employees to tech and data bootcamps, this will help increase the pipeline of potential recruits for the needed positions.

Also, AI itself will help build more tools to automate some of the grunt work, like data exploration, that many data scientists do today, currently eating up to 80% of their time. Think of it as AI to automate AI.

Similarly, how will, or how has, data science changed with the advent of AI that can automate various parts of the data science workflow?

Granville: We will see more automation of data science tasks. In my day-to-day activities, I have automated as much as I can, or outsourced or used platforms to do a number of tasks — even automating pure mathematical work such as computing integrals or finding patterns in number sequences.
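
As a small, concrete illustration of the sort of automation Granville describes, a symbolic math library can evaluate definite integrals with no manual calculus. This sketch uses the open source SymPy package and is not tied to any specific tool he mentions:

```python
from sympy import symbols, integrate, sin, exp, oo

x = symbols("x")

# Definite integrals evaluated symbolically -- no by-hand calculus required.
print(integrate(x * exp(-x), (x, 0, oo)))   # prints 1
print(integrate(sin(x) / x, (x, 0, oo)))    # prints pi/2
```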

The issue is resistance by employees to use such techniques, as they may perceive it as a way to replace them. But the contrary is true: Anything you do [manually] that can be automated actually lowers your job security. A change in mentality must occur for further adoption of automated data science for specific tasks, simple or not so simple, such as the creation of taxonomies, or programs that write programs.

The trend probably started at least 15 years ago, with the advent of machine-to-machine communications, using APIs and the internet at large for machines, aka robots, to communicate among themselves, and even make decisions. Now, with a huge amount of unexploited sensor data available, it even has a term of its own: IoT.

An example is this: eBay purchases millions of keywords on Google; the process, including predicting the value and ROI and setting the pricing for keywords, is fully automated. There is a program at eBay that exchanges info with one running at Google to make this transparent, including keyword purchasing, via programmed APIs. Yet eBay employs a team of data scientists and engineers to make sure things run smoothly and are properly maintained, and the same is true at Google.

How will increased data privacy regulations and a larger focus on cybersecurity change data science?

Granville: It will always be a challenge to find the right balance. People are getting concerned that their data is worth something, more than just $20, and don’t like to see this data sold and resold time and again by third parties, or worse, hijacked or sold for nefarious purposes such as surveillance. Anything you post on Facebook can be analyzed by third parties and end up in the hands of government agencies from various countries, for profiling purposes, or detection of undesirable individuals.

Some expectations are unrealistic: You cannot expect corporations to tell what is hidden in the deep layers of their deep learning algorithms. This is protected intellectual property. When Google shows you search results, nobody, not even Google, knows how what you see — sometimes personalized to you — came up that way. But Google publishes patents about these algorithms, and everyone can check them.

The same is true with credit scoring and refusal to offer a loan. I think in the future, we will see more and more auditing of these automated decisions. Sources of bias will have to be found and handled. Sources of errors due to ID theft, for example, will have to be found and addressed. The algorithms are written by human beings, so they are no less biased than the human beings who designed them in the first place. Some seemingly innocuous decisions, such as deciding which features, or variables, to introduce in your algorithm, potentially carry a bias.

I could imagine some companies [may] relocate … or even stop doing business altogether in some countries that cause too many challenges. This is more likely to happen to small companies, as they don’t have the resources to comply with a large array of regulations. Yet we might see in the future AI tools that do just that: help your business comply transparently with all local laws. We have that already for tax compliance.

What other data science trends can we expect to see in 2020 and beyond?

Granville: We live in a world with so many problems arising all the time — some caused by new technologies. So, the use of AI and IoT will increase.

Some problems will find solutions in the next few years, such as fake news detection or robocalls, just like it took over 10 years to fix email spamming. But it is not just a data science issue: if companies benefit financially short-term from the bad stuff, like more revenue to publishers because of fake news or clickbait, or more revenue to mobile providers due to robocalls, it needs to be addressed with more than just AI.

Some industries evolve more slowly and will see benefits in using AI in the future: Think about automated medical diagnostics or personalized dosing of drugs, small lawsuits handled by robots, or even kids at school being taught, at least in part, by robots. And one of the problems I face all the time with my spell-checker is its inability to detect if I write in French or English, resulting in creating new typos rather than fixing them.

Chatbots will get better too, eventually, for tasks such as customer support, or purchasing your groceries via Alexa without even setting foot in a grocery store or typing your shopping list. In the very long term, I could imagine the disappearance of written language, replaced by humans communicating orally with machines.

Creating a more accessible world with Azure AI

At Microsoft, we are inspired by how artificial intelligence is transforming organizations of all sizes, empowering them to reimagine what’s possible. AI has immense potential to unlock solutions to some of society’s most pressing challenges.

One challenge is that, according to the World Health Organization, only 1 in 10 people with a disability globally has access to assistive technologies and products. We believe that AI solutions can have a profound impact on this community. To meet this need, we aim to democratize AI to make it easier for every developer to build accessibility into their apps and services, across language, speech, and vision.

Ahead of the upcoming Bett Show in London, we’re shining a light on how Immersive Reader enhances reading comprehension for people regardless of their age or ability, and we’re excited to share how Azure AI is broadly enabling developers to build accessible applications that empower everyone.

Empowering readers of all abilities

Immersive Reader is an Azure Cognitive Service that helps users of any age and reading ability with features like reading aloud, translating languages, and focusing attention through highlighting and other design elements. Millions of educators and students already use Immersive Reader to overcome reading and language barriers.

The Young Women’s Leadership School of Astoria, New York, brings together an incredible diversity of students with different backgrounds and learning styles. The teachers at The Young Women’s Leadership School support many types of learners, including students who struggle with text comprehension due to learning differences, or language learners who may not understand the primary language of the classroom. The school wanted to empower all students, regardless of their background or learning styles, to grow their confidence and love for reading and writing.

A teacher and student looking at a computer together.

Teachers at The Young Women’s Leadership School turned to Immersive Reader and an Azure AI partner, Buncee, as they looked for ways to create a more inclusive and engaging classroom. Buncee enables students and teachers to create and share interactive multimedia projects. With the integration of Immersive Reader, students who are dyslexic can benefit from features that help focus attention in their Buncee presentations, while those who are just learning English can have content translated into their native language.

Like Buncee, companies including Canvas, Wakelet, ThingLink, and Nearpod are also making content more accessible with Immersive Reader integration. To see the entire list of partners, visit our Immersive Reader Partners page. Discover how you can start embedding Immersive Reader into your apps today. To learn more about how Immersive Reader and other accessibility tools are fostering inclusive classrooms, visit our EDU blog.

Breaking communication barriers

Azure AI is also making conversations, lectures, and meetings more accessible to people who are deaf or hard of hearing. By enabling conversations to be transcribed and translated in real-time, individuals can follow and fully engage with presentations.

The Balavidyalaya School in Chennai, Tamil Nadu, India teaches speech and language skills to young children who are deaf or hard of hearing. The school recently held an international conference with hundreds of alumni, students, faculty, and parents. With live captioning and translation powered by Azure AI, attendees were able to follow conversations in their native languages, while the presentations were given in English.

Learn how you can easily integrate multi-language support into your own apps with Speech Translation, and see the technology in action with Translator, with support for more than 60 languages, today.
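
As a rough sketch of the translation piece, assuming a provisioned Azure Translator resource, the snippet below sends one English sentence to the Translator v3.0 REST API and requests Tamil output; the key and region are placeholders, and a live captioning pipeline would layer speech recognition on top of a call like this.

```python
import requests

# Placeholders -- an Azure Translator resource supplies the real key and region.
KEY = "<your-translator-key>"
REGION = "<your-resource-region>"

def translate(text: str, to_lang: str = "ta") -> str:
    """Translate one string with the Translator v3.0 REST API (default target: Tamil)."""
    response = requests.post(
        "https://api.cognitive.microsofttranslator.com/translate",
        params={"api-version": "3.0", "to": to_lang},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Ocp-Apim-Subscription-Region": REGION,
            "Content-Type": "application/json",
        },
        json=[{"Text": text}],
    )
    response.raise_for_status()
    return response.json()[0]["translations"][0]["text"]

if __name__ == "__main__":
    print(translate("Welcome, everyone, to the conference."))
```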

Engaging learners in new ways

We recently announced the Custom Neural Voice capability of Text to Speech, which enables customers to build a unique voice, starting from just a few minutes of training audio.

The Beijing Hongdandan Visually Impaired Service Center leads the way in applying this technology to empower users in incredible ways. Hongdandan produces educational audiobooks featuring the voice of Lina, China’s first blind broadcaster, using Custom Neural Voice. While creating audiobooks can be a time-consuming process, Custom Neural Voice allows Lina to produce high-quality audiobooks at scale, enabling Hongdandan to support over 105 schools for the blind in China like never before.

“We were amazed by how quickly Azure AI could reproduce Lina’s voice in such a natural-sounding way with her speech data, enabling us to create educational audiobooks much more quickly. We were also highly impressed by Microsoft’s commitment to protecting Lina’s voice and identity.”—Xin Zeng, Executive Director at Hongdandan

Learn how you can give your apps a new voice with Text to Speech.
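
As a rough sketch of how an application requests synthesized speech, the snippet below posts SSML to the Text to Speech REST endpoint and saves the returned audio. It uses a standard prebuilt neural voice rather than a Custom Neural Voice such as Lina's, which would require its own trained voice model, and the region and key are placeholders.

```python
import requests

# Placeholders -- an Azure Speech resource supplies the real region (e.g. "eastus") and key.
REGION = "<your-region>"
KEY = "<your-speech-key>"

SSML = """
<speak version='1.0' xmlns='http://www.w3.org/2001/10/synthesis' xml:lang='en-US'>
  <voice name='en-US-AriaNeural'>
    Chapter one. The snow began falling before dawn.
  </voice>
</speak>
"""

def synthesize(ssml: str, out_path: str = "chapter1.mp3") -> None:
    """Post SSML to the Text to Speech REST endpoint and save the returned MP3."""
    response = requests.post(
        f"https://{REGION}.tts.speech.microsoft.com/cognitiveservices/v1",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/ssml+xml",
            "X-Microsoft-OutputFormat": "audio-16khz-32kbitrate-mono-mp3",
            "User-Agent": "audiobook-sketch",
        },
        data=ssml.encode("utf-8"),
    )
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)

if __name__ == "__main__":
    synthesize(SSML)
```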

Making the world visible for everyone

According to the International Agency for the Prevention of Blindness, more than 250 million people are blind or have low vision across the globe. Last month, in celebration of the United Nations International Day of Persons with Disabilities, Seeing AI, a free iOS app that describes nearby people, text, and objects, expanded support to five new languages. The additional language support for Spanish, Japanese, German, French, and Dutch makes it possible for millions of blind or low vision individuals to read documents, engage with people around them, hear descriptions of their surroundings in their native language, and much more. All of this is made possible with Azure AI.

Try Seeing AI today or extend vision capabilities to your own apps using Computer Vision and Custom Vision.
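
To show what extending vision capabilities to an app can look like, here is a minimal call to the Computer Vision v3.2 describe operation, which returns a natural-language caption for an image, similar in spirit to what Seeing AI reads aloud. The endpoint, key and image URL are placeholders.

```python
import requests

# Placeholders -- a Computer Vision resource supplies the real endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-vision-key>"

def describe_image(image_url: str) -> str:
    """Ask the Computer Vision v3.2 describe operation for a caption of an image URL."""
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    return captions[0]["text"] if captions else "No caption returned."

if __name__ == "__main__":
    # Hypothetical image URL for illustration only.
    print(describe_image("https://example.com/street-scene.jpg"))
```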

Get involved

We are humbled and inspired by what individuals and organizations are accomplishing today with Azure AI technologies. We can’t wait to see how you will continue to build on these technologies to unlock new possibilities and design more accessible experiences. Get started today with a free trial.

Check out our AI for Accessibility program to learn more about how companies are harnessing the power of AI to amplify capabilities for the millions of people around the world with a disability.

Go to Original Article
Author: Microsoft News Center

NLP, embedded BI and data literacy top 2020 analytics trends

While 2019 in business intelligence was marked by consolidation and the incremental improvement of augmented intelligence capabilities, 2020 analytics trends are expected to include increased adoption of embedded BI and a rising level of data literacy across organizations.

The continued improvement of AI features, of course, will also be one of the 2020 analytics trends to watch.

“There’s still no generalized AI, but we’re starting to see it,” said Donald Farmer, principal at TreeHive Strategy. “It was so overhyped, but now we’re seeing generally intelligent assistants.”

Natural language processing

If there’s one AI feature that may become pervasive, it’s natural language processing (NLP).

“There’s a lot of buzz around NLP in BI platforms,” said Mike Leone, an analyst at Enterprise Strategy Group. “While it exists today, it’s still in its infancy. I think as the younger workforce continues to fill data-centric roles in organizations, there will be a growing desire to use voice technology. And I say ‘younger’ simply because that demographic is arguably the most adept at relying on voice technology in their everyday lives and will look for ways to use it to boost productivity at work too.”

By translating a vocal query into an SQL query, NLP will allow users to simply speak a data query and receive a vocal response.

It has the potential to significantly simplify data queries, lessening the need for a background in data science and opening data exploration to a wider range of business users.

But barriers still stand between the ideal of NLP and its reality, limited by the level of its machine learning capabilities.

Computers understand highly specific, unambiguous commands — code. They don’t understand human speech, and even when programmed to do so they can’t adjust for variations such as accents or imperfect syntax. Nevertheless, NLP features are appearing, and are expected to become more prominent as 2020 progresses.
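
To make the translation step concrete, here is a deliberately tiny, rule-based sketch of turning a spoken-style question into SQL. Production features such as Ask Data or SearchIQ rely on far richer parsing and semantic models; the question patterns, table and column names below are invented for illustration.

```python
import re

# A deliberately narrow grammar: "show me <measure> by <dimension> in <year>".
PATTERN = re.compile(
    r"(?:what were|show me) (?P<measure>sales|profit) by (?P<dim>region|month) in (?P<year>\d{4})",
    re.IGNORECASE,
)

def to_sql(question: str) -> str:
    """Map one recognized question shape onto a SQL query over an invented 'orders' table."""
    match = PATTERN.search(question)
    if not match:
        raise ValueError("Question not understood -- try 'show me sales by region in 2019'.")
    measure, dim, year = match["measure"].lower(), match["dim"].lower(), match["year"]
    return (
        f"SELECT {dim}, SUM({measure}) AS total_{measure} "
        f"FROM orders WHERE order_year = {year} GROUP BY {dim};"
    )

if __name__ == "__main__":
    print(to_sql("Show me sales by region in 2019"))
```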

Tableau, for example, introduced Ask Data in early 2019 and updated the tool in its November release. ThoughtSpot, meanwhile, unveiled SearchIQ in the fall of 2018. And Qlik acquired CrunchBot in early 2019 to add conversational capabilities.

“Natural language processing has been a trend the last few years, but now it’s reaching critical mass,” Farmer said.

Embedded BI

Another significant 2020 analytics trend is expected to be the expansion of embedded BI.

Eventually, analytics won’t be conducted on a standalone platform — it will be part of other commonplace business applications.

Instead of running reports — asking a data-driven question, sifting through stored data to come up with the relevant information to address the question, cleaning and preparing the data and ultimately creating a visualization based on the relevant information on a BI platform — business users will have key information delivered without ever having to ask for it, and without having to go through an IT department.

“Next-generation architecture will tap data in applications, which will make getting real-time information easier,” said Dan Sommer, global market intelligence lead at Qlik. “That will change the analytical paradigm. It was about reports once, and increasingly it will be about embedded — insights will come to you in your moment. It will make insights more consumerized — not from IT or developers. Now it will be everyone.”

Similarly, Doug Henschen, analyst at Constellation Research, pointed to embedded BI as a 2020 analytics trend to watch.

“There’s an increasingly popular saying that ‘every company is now a software company,’ but that’s an intentional overstatement pointing to what’s really a leading-edge trend,” he said.

Organizations — innovators and fast-followers — are making use of their data and finding ways to both enrich and monetize that data, he continued.

“A key enabler is embedded BI and analytics platforms that accelerate the development and delivery of data-driven and insight-driven software and services,” Henschen said. “These embedded platforms have been used primarily by independent software and services vendors to date. This more mainstream embrace of embedding is just getting started, and I think we’ll see more of it in 2020 and beyond.”

Data literacy

One more 2020 analytics trend to watch — beyond such known entities as continued migration to the cloud — is the effort to make more of the workforce data literate.

Data literacy — the ability to derive insight from data — has been expanding beyond the realm of data scientists, but it still remains the domain of a minority within most organizations rather than the majority of employees.

“Data literacy as a service [will be a 2020 analytics trend],” Sommer said. “Data literacy is the ability to read, argue and use data, and data literacy has to happen for us to move into the digital stage. Organizations will realize they need help with this.”

Now, in an attempt to increase data literacy, a group of BI vendors are doing more than merely selling their software platforms. They’re also training line-of-business workers in the language of data.

In May of 2019, Qlik, whose research showed only 24% of employees were confident in their ability to effectively utilize data, began offering a free data literacy certification program. The previous October, Tableau, which offers free data literacy training videos, introduced a certification exam for beginners looking to improve their BI skills. And in September, IBM revealed that it certified 140 new data scientists after developing a data scientist certification program in partnership with The Open Group.

Alteryx, meanwhile, operates the Alteryx for Good program, which partners with colleges and universities — Stanford, the University of California, Berkeley, the University of Michigan and Harvard among them — to incorporate data science and analytics into their curricula.

“I think we’ll see a continued emphasis on enabling the desired visibility [of data],” Leone said, “and enablement of more personas to access data and derive insights.”

Aviso introduces version 2.0 of AI-guided sales platform

Aviso announced version 2.0 of its artificial intelligence guided sales platform last week. The new version is aimed at lowering costs and reducing the time that sales reps spend working on CRM databases by providing them with AI tools that predict deal close probabilities and guide next best actions.

Algorithmic-guided selling, which uses AI technology and existing sales data to guide sellers through deals, is a new but increasingly popular technology. Nearly 51% of sales organizations have already deployed or plan to deploy algorithmic-guided selling in the next five years, according to a 2019 Gartner survey.

Aviso’s 2.0 sales platform uses AI tools to prioritize sales opportunities and analyze data from sources including CRM systems, emails, user calendars, chat transcripts and support and success tools to deliver real-time insights and suggest next best action for sales teams. The support and success tools are external offerings that Aviso’s platform can connect with, including customer support tools like Zendesk or Salesforce Service Cloud, and customer success tools like Gainsight or Totango, according to Amit Pande, vice president of marketing at Aviso.
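
As an illustration of the general idea behind deal-close scoring, and not a description of Aviso's actual models, the sketch below fits a logistic regression on a handful of invented historical deals and estimates the close probability of a new one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented history: each row is [days_in_pipeline, buyer_emails_last_30d, deal_size_k_usd].
X = np.array([
    [30, 12, 50], [90, 3, 200], [45, 20, 75], [120, 2, 300],
    [20, 15, 40], [60, 8, 120], [150, 1, 500], [35, 18, 60],
])
y = np.array([1, 0, 1, 0, 1, 1, 0, 1])  # 1 = closed-won, 0 = lost

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score an open deal on the same three signals.
new_deal = np.array([[50, 10, 90]])
print(f"Estimated close probability: {model.predict_proba(new_deal)[0, 1]:.2f}")
```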

The forecasting and sales guidance vendor claims the new version will help sales teams close 20% more deals and reduce spending on non-core CRM licenses by 30% compared with conventional CRM systems. The cost reduction calculation is based on “the number of non-core licenses that can be eliminated, as well as additional costs such as storage and add-ons that can be eliminated when underutilized or unused licenses are eliminated,” Pande said.

According to Aviso, new AI-based features in version 2.0 of its sales platform include:

  • Deal Execution Tools, a trio of tools meant to assist in finalizing deals. Bookings Timeline uses machine learning to calculate when deals will close based on an organization’s unique history. Each booking timeline also includes the top factors that influence the prediction. Opportunity Acceleration helps sales teams determine which opportunities carry the highest probability of closing early if they are pulled into the current quarter. Informed Editing is intended to limit typos and unsaved changes during entry of data. The tool gives users contextual help before they commit to edits, telling them what quarter or whose forecast they are updating. Changes and edits are automatically saved by the software.
  • Deal and Forecast Rooms enable users to do what-if analysis, use scenario modeling and automatically summarize forecast calls and deal review transcripts.
  • Coaching Rooms help sales managers improve sales rep performance with data from past and current deals and from team activity in Deal and Forecast Rooms. 
  • Nudges provide reminders for sales reps through an app on mobile devices. Nudges also offer recommendations for course corrections and potential next steps based on insights from the specific deal.

Aviso’s 2.0 sales platform is currently in beta with select customers.

Cybersecurity company FireEye has been using the Aviso platform for several years and is among the select customers. Andy Pan, director of Americas and public sector sales operations at FireEye, said the Aviso platform has helped FireEye operate in a more predictive manner through some of its new AI-driven features. “The predictive features help us review the macro business as a whole, and the deal-specific features provide guided pathways toward the inspection of deals.”

Other vendors in the sales forecasting market include Salesforce and Clari. Salesforce’s sales forecasting feature enables organizations to make forecasts specific to their needs and lets managers track their team’s performance. Clari’s product includes features such as predictive forecasting, which uses AI-based projections to estimate where the team will land at the end of the quarter, and history tracking to see who last made changes to the forecast.

The Week It Snowed Everywhere – New Zealand News Centre

NIWA and Microsoft Corp. are teaming up to make artificial intelligence handwriting recognition more accurate and efficient in a project that will support climate research.

The project aims to develop better training sets for handwriting recognition technology that will “read” old weather logs. The first step is to use weather information recorded during a week in July 1939 when it snowed all over New Zealand, including at Cape Reinga.

NIWA climate scientist Dr. Andrew Lorrey says the project has the potential to revolutionise how historic data can be used. Microsoft has awarded NIWA an AI for Earth grant for the artificial intelligence project, which will support advances in automating handwriting recognition. AI for Earth is a global programme that supports innovators using AI to support environmental initiatives related to water, climate change, sustainable agriculture and biodiversity.

Microsoft’s Chief Environmental Officer, Lucas Joppa, sees a project that could quite literally be world-changing. “This project will bring inanimate weather data to life in a way everyone can understand, something that’s more vital than ever in an age of such climate uncertainty.

“I believe technology has a huge role to play in shining a light on these types of issues, and grantees such as NIWA are providing the solutions that we get really excited about.”

Dr. Lorrey has been studying the weather in the last week of July 1939 when snow lay 5 cm deep on top of Auckland’s Mt. Eden, the hills of Northland turned white and snow flurries were seen at Cape Reinga. “Was 1939 the last gasp of conditions that were more common during the Little Ice Age, which ended in the 1800s? Or the first glimpse of the extremes of climate change thanks to the Industrial Revolution?”

Weather records at that time were meticulously kept in logbooks with entries made several times a day, recording information such as temperature, barometric pressure and wind direction. Comments often included cloud cover, snow drifts or rainfall.

“These logs are like time machines, and we’re now using their legacy to help ours,” Dr. Lorrey says.

“We’ve had snow in Northland in the recent past, but having more detail from further back in history helps us characterise these extreme weather events better within the long-term trends. Are they a one-in-80-year event, do they just occur at random, can we expect to see these happening with more frequency, and why, in a warming climate, did we get snow in Northland?”

Until now, however, computers haven’t caught up with humans when it comes to deciphering handwriting. More than a million photographed weather observations from old logbooks are currently being painstakingly entered by an army of volunteer “citizen scientists” and loaded by hand into the Southern Weather Discovery website. This is part of the global Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, which aims to produce better daily global weather animations and place historic weather events into a longer-term context.

“Automated handwriting recognition is not a solved problem,” says Dr. Lorrey. “The algorithms used to determine what a symbol is — is that a 7 or a 1? — need to be accurate, and of course for that there needs to be sufficient training data of a high standard.” The data captured through the AI for Earth grant will make the process of making deeper and more diverse training sets for AI handwriting recognition faster and easier.
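
A minimal sketch of why those training sets matter: the snippet below trains a small neural network on scikit-learn's bundled 8x8 digit images, a stand-in for transcribed logbook symbols, and reports accuracy on held-out examples. More, and cleaner, labelled examples are what push that accuracy up.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# scikit-learn's bundled 8x8 digit images stand in for labelled logbook symbols.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small neural network learns to tell a 7 from a 1 -- given enough labelled examples.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```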

“Old data is the new data,” says Patrick Quesnel, Senior Cloud and AI Business Group Lead at Microsoft New Zealand. “That’s what excites me about this. We’re finding better ways to preserve and digitise old data reaching back centuries, which in turn can help us with the future. This data is basically forgotten unless you can find a way to scan, store, sort and search it, which is exactly what Azure cloud technology enables us to do.”

Dr. Lorrey says the timing of the project is especially significant.
“This year is the 80th anniversary of The Week It Snowed Everywhere, so it’s especially fitting we’re doing this now. We’re hoping to have all the New Zealand climate data scanned by the end of the year, and quality control completed with usable data by the end of the next quarter.”

Ends.
About NIWA
The National Institute of Water and Atmospheric Research (NIWA) is New Zealand’s leading provider of climate, freshwater and ocean science. It delivers the science that supports economic growth, enhances human well-being and safety and enables good stewardship of our natural environment.
About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information contact:

Dr. Andrew Lorrey
NIWA climate scientist
Ph 09 375-2055
Mob 021 313-404
Andrea Jutson
On behalf of Microsoft New Zealand
Ph: (09) 354 0562 or 021 0843 0782
[email protected]

Tableau analytics platform gets AI, data management upgrades

LAS VEGAS — Improved augmented intelligence and data preparation capabilities are at the core of new updates to the Tableau analytics platform.

The Seattle-based vendor unveiled the enhancements on Wednesday at its annual user conference here.

Tableau revealed an upgrade to Ask Data, the Tableau analytics platform’s natural language processing tool that was introduced in February. Ask Data is now able to interpret more complex questions than it could before, and it can now do year-over-year geospatial comparisons, according to the vendor.

Tableau evolving its platform

In addition, with the recent introduction of Tableau Catalog and an upgrade to Prep Conductor, users of the Tableau analytics platform will be able to prepare data within Tableau itself, rather than cleansing it in another product before importing it into Tableau.

Finally, Tableau added Metrics, a mobile-first tool that will enable users to monitor key performance indicators, to the analytics platform.

The product moves unveiled at the conference show that Tableau is continuing to evolve its popular analytics and business intelligence platform by adding features to help end users do more with self-service, said Doug Henschen, principal analyst at Constellation Research.

“With great self-service freedom comes great responsibility to do a better job of data governance,” Henschen said. “Tableau Catalog gives Tableau customers a data access and data lineage tracking aid to help users spot the data they should be using to help avoid recreating data sets and analyses that already exist or that could easily be extended to cover new use cases.”

The host of upgrades comes a day after Tableau revealed an enhanced partnership agreement with Amazon Web Services, Modern Cloud Analytics, designed to help Tableau’s many on-premises users migrate to the cloud.

A user looks at the new features

Meanwhile, one of the self-service customers Henschen alluded to is the University of Michigan, which has nearly 100,000 potential users with 50,000 employees and 48,000 students.

While it hasn’t yet taken advantage of the burgeoning data management capabilities of the Tableau analytics platform, the school is interested in Tableau’s natural language processing capabilities.

But with nearly 100,000 potential users — from hospital staff to the history department — nothing is as simple as choosing to use one BI tool within an overall system and eschewing another.

“Our data is relatively simple enough that we don’t need to constantly pivot or join a number of different things from a number of different places together,” Matthew Pickus, senior business intelligence analyst, said of Michigan’s decision to not yet employ tools like Tableau Catalog and Prep Conductor. “We try and keep the system as enterprise as possible.”

Christopher Gardner, business intelligence senior analyst at the University of Michigan, added that the potential cost of using the data preparation tools, given the number of users across the university, is a constraint.

That said, because data across the university’s myriad departments is often kept by each department according to that department’s own method — creating data silos — data standardization is something that could be on the horizon at Michigan, the school’s BI analysts said.

“It’s starting to get talked about a little more, so it may be something we start investigating,” Gardner said.

Bringing analytics to end users

“Some of the data management tools will become much more needed in the future,” Pickus added. “We’re just trying to figure out the best way to approach it. It’s going to become more important.

“Tableau reaching down not just how to visualize your data but how to help you manage and organize your data across all the sets is going to be very helpful in the future.”

NLP, meanwhile, is something Michigan’s IT leaders see as a way to make analytics more accessible to its employees and students.

A GIF shows Tableau’s Ask Data, its natural language processing tool, in action.

But Gardner and Pickus said they want more from NLP tools than they’re currently capable of providing, whether part of the Tableau analytics platform or any other BI vendor’s suite.

“Our executives are very interested in it,” said Gardner. “They’re looking for ways to make data more accessible to users who aren’t familiar with reporting tools. To us it’s kind of frustrating, because we’ve got the reporting tools. Let’s take it a step further, and instead of just reporting let’s start doing analysis and start getting trends.”

Perhaps that’s next for the Tableau analytics platform.

SwiftStack 7 storage upgrade targets AI, machine learning use cases

SwiftStack turned its focus to artificial intelligence, machine learning and big data analytics with a major update to its object- and file-based storage and data management software.

The San Francisco software vendor’s roots lie in the storage, backup and archive of massive amounts of unstructured data on commodity servers running a commercially supported version of OpenStack Swift. But SwiftStack has steadily expanded its reach over the last eight years, and its 7.0 update takes aim at the new scale-out storage and data management architecture the company claims is necessary for AI, machine learning and analytics workloads.

SwiftStack said it worked with customers to design clusters that scale linearly to handle multiple petabytes of data and support throughput of more than 100 GB per second. That allows it to handle workloads such as autonomous vehicle applications that feed data into GPU-based servers.

Marc Staimer, president of Dragon Slayer Consulting, said throughput of 100 GB per second is “really fast” for any type of storage and “incredible” for an object-based system. He said the fastest NVMe system tests at 120 GB per second, but it can scale only to about a petabyte.

“It’s not big enough, and NVMe flash is extremely costly. That doesn’t fit the AI [or machine learning] market,” Staimer said.

This is the second object storage product launched this week with speed not normally associated with object storage. NetApp unveiled an all-flash StorageGrid array Tuesday at its Insight user conference.

Staimer said SwiftStack’s high-throughput “parallel object system” would put the company into competition with parallel file system vendors such as DataDirect Networks, IBM Spectrum Scale and Panasas, but at a much lower cost.

New ProxyFS Edge

As part of SwiftStack 7, the vendor plans to introduce a new ProxyFS Edge containerized software component next year to give remote applications a local file system mount for data, rather than having to connect through a network file-serving protocol such as NFS or SMB. SwiftStack spent about 18 months creating a new API and software stack to extend its ProxyFS to the edge.

Founder and chief product officer Joe Arnold said SwiftStack wanted to utilize the scale-out nature of its storage back end and enable a high number of concurrent connections to go in and out of the system to send data. ProxyFS Edge will allow each cluster node to be relatively stateless and cache data at the edge to minimize latency and improve performance.

SwiftStack 7 will also add 1space File Connector software in November to enable customers that build applications using the S3 or OpenStack Swift object API to access data in their existing file systems. The new File Connector is an extension to the 1space technology that SwiftStack introduced in 2018 to ease data access, migration and searches across public and private clouds. Customers will be able to apply 1space policies to file data to move and protect it.
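
To illustrate how an application written against the S3 API consumes such storage, here is a minimal boto3 sketch pointed at an S3-compatible endpoint. The endpoint URL, credentials, bucket and object names are hypothetical and are not taken from SwiftStack's documentation.

```python
import boto3

# Hypothetical endpoint and credentials for a cluster's S3-compatible API.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example-swiftstack-cluster.com",
    aws_access_key_id="<access-key>",
    aws_secret_access_key="<secret-key>",
)

# List objects in an (invented) bucket that fronts an existing filer share.
for obj in s3.list_objects_v2(Bucket="training-data").get("Contents", []):
    print(obj["Key"], obj["Size"])

# Pull one object down, e.g. to feed a GPU training job.
s3.download_file("training-data", "images/frame_000123.jpg", "/tmp/frame_000123.jpg")
```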

Arnold said the 1space File Connector could be especially helpful for media companies and customers building software-as-a-service applications that are transitioning from NAS systems to object-based storage.

“Most sources of data produce files today and the ability to store files in object storage, with its greater scalability and cost value, makes the [product] more valuable,” said Randy Kerns, a senior strategist and analyst at Evaluator Group.

Kerns added that SwiftStack’s focus on the developing AI area is a good move. “They have been associated with OpenStack, and that is not perceived to be a positive and colors its use in larger enterprise markets,” he said.

AI architecture

A new SwiftStack AI architecture white paper offers guidance to customers building out systems that use popular AI, machine learning and deep learning frameworks, GPU servers, 100 Gigabit Ethernet networking, and SwiftStack storage software.

“They’ve had a fair amount of success partnering with Nvidia on a lot of the machine learning projects, and their software has always been pretty good at performance — almost like a best-kept secret — especially at scale, with parallel I/O,” said George Crump, president and founder of Storage Switzerland. “The ability to ratchet performance up another level and get the 100 GBs of bandwidth at scale fits perfectly into the machine learning model where you’ve got a lot of nodes and you’re trying to drive a lot of data to the GPUs.”

SwiftStack noted distinct differences between the architectural approaches that customers take with archive use cases versus newer AI or machine learning workloads. An archive customer might use 4U or 5U servers, each equipped with 60 to 90 drives, and 10 Gigabit Ethernet networking. By contrast, one machine learning client clustered a larger number of lower horsepower 1U servers, each with fewer drives and a 100 Gigabit Ethernet network interface card, for high bandwidth, he said.

An optional new SwiftStack Professional Remote Operations (PRO) paid service is now available to help customers monitor and manage SwiftStack production clusters. SwiftStack PRO combines software and professional services.
