Tag Archives: third

AI for BI at the heart of third-generation analytics

AI for BI is a key tenet of the third generation of analytics.

Sometime in the middle of the current decade, features such as augmented intelligence, machine learning and natural language processing started to become key parts of business intelligence platforms.

In the years since, analytics platforms have progressed, but AI for BI still hasn’t matured to the point where analytics tools can truly free humans from the mundane tasks associated with data analysis, where data analysis is part of everyday applications rather than a stand-alone application unto itself, or where BI platforms can predict a likely outcome for humans before they even request it.

And it hasn’t gotten to the point where it’s accessible to everyone.

In September, Constellation Research released a report entitled “Augmented Analytics: How Smart Features Are Changing Business Intelligence.”

Authored by analyst Doug Henschen, the report took a deep look at the third generation of business intelligence, which Henschen estimates began around 2015. The report homes in on the data preparation, data discovery and analysis, natural language interfaces and interaction, and forecasting and prediction capabilities of the BI platforms offered by leading vendors.

Henschen discussed some of his findings about AI for BI for a two-part Q&A. In Part I, he addressed what marked the beginning of this new era and who stands to benefit most from augmented BI capabilities. In Part II, he looked at which vendors are positioned to succeed, and where the third generation of BI is headed next.

Which vendors are in the best position to succeed in this new generation of AI for BI, and why?

Doug Henschen: I think there have been companies that have been more aggressive about augmented analytics capabilities that have led the way, been first movers. IBM came out with Watson Analytics in 2014, SAP acquired KXEN in 2013, ThoughtSpot and BeyondCore got started in the middle of the decade. They were first movers, and then after some of the new capabilities emerged the fast followers in the market responded. Tableau was early with data visualization recommendations. But I don’t think any one company has augmented analytics locked up.

If not excelling across the board, who is doing well in certain areas of third-generation business intelligence?

Henschen: Across the four areas I’ve been seeing leaders — ThoughtSpot on search and natural language query, Oracle has stepped up a lot on natural language query and has a really good mobile app for natural language query support. Salesforce has done a lot with Einstein Analytics in trending and forecasting and prediction to focus on outcomes … three steps removed from the actual business action. In BI and analytics there’s this desire now, particularly as you move toward the business community, to say, ‘Don’t show me a dashboard, don’t show me a report that I have to interpret; tell me what to do.’ In that area Salesforce has been very aggressive with Einstein and Einstein Analytics. My next report is going to be on embedded analytics, which is about bringing analytics into applications, and I think that’s where we’re really going to see this idea of democratization realized.

Which vendors are in a precarious position, not adapting to this era of AI for BI quickly enough?

Henschen: Generally the market is responding. For vendors that have been more focused on reporting, augmented intelligence is not as much of a factor, so they’ve been less motivated to move into that area. That point of 2015 is when augmented analytics became a criterion that got added to the list, but it was by no means the only criterion, or the key criterion. I think we’re going to gradually see, as this stuff gets more and more powerful, it will become more and more important, and as it matures we will also see more capabilities. I graded vendors on four categories, but when some reports first started looking at augmented capabilities there was one grade. I’m sure in the future there will be more aspects of augmented capabilities to look at. It’s just the cycle of maturation. As capabilities become proven it will become more and more commonplace for every vendor to have to have that.

Beyond the four capabilities you discussed in your report, what’s something we can expect to see soon in the evolution of AI for BI?

Henschen: BI has always been about decision support, so you have these broad horizontal platforms and tools that let you analyze anything, but where the rubber meets the road is where these insights actually help customers make a decision. That’s my next report on embedded analytics — ways we’re going to be doing a better and better job of embedding analytics right into the context of decisions and transactions and applications. There are some technologies that are helping to make that happen. Microservices will let us make the delivery of insights more granular, so instead of a full-page report or a full-page dashboard we can have a micro-chart or just a KPI, or embed just that nugget of information — that nugget of insight that helps us make the decision — into the app.

There’s also this trend of no-code and low-code application development, and that will enable analysts and even business users to start developing these analytical applications or develop applications with analytics embedded within them. Those are good trends, but this is just the start of these things. The vision is always well ahead of the reality of what’s actually happening in the market.
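
To make the “nugget of insight” idea above concrete, here is a minimal, hypothetical sketch of a microservice that serves a single KPI as JSON rather than a full dashboard. The endpoint, metric, and numbers are invented for illustration and are not drawn from any vendor’s product.

# Hypothetical "nugget of insight" microservice: instead of a full dashboard,
# it exposes one KPI that a host application can embed at the point of decision.
from flask import Flask, jsonify

app = Flask(__name__)

def on_time_delivery_rate():
    # Placeholder: in practice this would query a warehouse or metrics store.
    delivered_on_time, total_shipments = 1841, 2000
    return delivered_on_time / total_shipments

@app.route("/kpi/on-time-delivery")
def on_time_delivery_kpi():
    rate = on_time_delivery_rate()
    return jsonify({
        "kpi": "on_time_delivery_rate",
        "value": round(rate, 3),
        "target": 0.95,
        "status": "below_target" if rate < 0.95 else "on_target",
    })

if __name__ == "__main__":
    app.run(port=5000)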

What’s something we’re seeing the beginnings of now that you expect to be improved?

Henschen: We’re seeing a lot of natural language query, for example, but I think that’s still really immature. One of the takeaways from my report is that we’re in the opening innings, and this stuff needs to get a lot better. The customers I talked to are getting a lot more out of natural language generation than they are out of natural language query, and it’s because query is very context-specific. As we see more sophisticated query capabilities, and we see the blending of query capabilities with understanding of intent — looking at the patterns of what people are asking, what groups of people are asking for, and learning from that — that query capability is going to become more powerful than it is today.

Editor’s note: This interview has been edited for clarity and conciseness.

For Sale – 500GB and 1TB Seagate 2.5″ hard drives

Seagate’s third generation SSHDs (solid state hybrid drives), now for both laptops and desktops, are marketed as a replacement for HDDs and serve as a good option for those otherwise considering an SSD. SSHDs aim to offer users the price-point and robust capacity of HDDs while also utilizing…

Addressing the coming IoT talent shortage – Microsoft Industry Blogs

This blog is the third in a series highlighting our newest research, IoT Signals. Each week will feature a new top-of-mind topic to provide insights into the current state of IoT adoption across industries, how business leaders can develop their own IoT strategies, and why companies should use IoT to improve service to partners and customers.

As companies survey the possibilities of the Internet of Things (IoT), one of the challenges they face is a significant growing talent shortage. Recent research from Microsoft, IoT Signals, drills down into senior leaders’ concerns and plans. Microsoft surveyed 3,000 decision-makers at companies across China, France, Germany, Japan, the United States, and the United Kingdom who are involved in IoT.

Exploring IoT skills needs at enterprises today

Most IoT challenges today relate to staffing and skills. Our research finds that only 33 percent of companies adopting IoT say they have enough workers and resources, 32 percent lack enough workers and resources, and 35 percent reported mixed results or didn’t know their resourcing situation. Worldwide, talent shortages are most acute in the United States (37 percent) and China (35 percent).

Among the 32 percent of companies struggling with IoT skills shortages, the top impediments respondents cited were a lack of knowledge (40 percent), technical challenges (39 percent), lack of budget (38 percent), an inability to find the right solutions (28 percent), and security (19 percent).

Companies will need to decide which capabilities they should buy, in the form of hiring new talent; build, in the form of developing staff competencies; or outsource, in the form of developing strategic partnerships. For example, most companies evaluating the IoT space aren’t software development or con­nectivity experts and will likely turn to partners for these services.

Adequate resourcing is a game-changer for IoT companies

Our research found that having the right team and talent was critical to IoT success on a number of measures. First, those with sufficient resources were more likely to say that IoT was very critical to their company’s future success: 51 percent versus 39 percent. Hardship created more ambivalence, with only 41 percent of IoT high performers saying IoT was somewhat critical to future success, whereas 48 percent of lower-performing companies agreed.

Similarly, companies with strong IoT teams viewed IoT as a more successful investment, attributing 28 percent of current ROI to IoT (inclusive of cost savings and efficiencies) versus 20 percent at less enabled companies. That’s likely why 89 percent of those who have the right team are planning to use IoT more in the future, versus 75 percent of those who lack adequate resources.

IoT talent shortage may cause higher failure rate

Getting IoT off the ground can be a challenge for any company, given its high learning curve, long-term commitment, and significant investment. It’s doubly so for companies that lack talent and resources. IoT Signals found that companies that lack adequate talent and resources have a higher failure rate in the proof-of-concept phase: 30 percent versus 25 percent for those with the right team. At companies with high IoT success, the initiative is led by a staffer in an IT role, such as a director of IT, a chief technology officer, or a chief information officer. With leadership support, a defined structure, and budget, these all-in IoT organizations are able to reach the production stage in an average of nine months, while those who lack skilled workers and resources take 12 months on average.

Despite initial challenges, company leaders are unlikely to call it quits. Business and technology executives realize that IoT is a strategic business imperative and will be increasingly required to compete in the marketplace. Setting up the right team, tools, and resources now can help prevent team frustration, business burnout, and leadership commitment issues.

Overcoming the skills issues with simpler platforms

Fortunately, industry trends like fully hosted SaaS platforms are reducing the complexity of building IoT programs: from connecting and managing devices to providing integrated tooling and security, to enabling analytics.

Azure IoT Central, a fully managed IoT platform, is designed to let anyone build an IoT initiative within hours, empowering business teams and other non-technical individuals to easily gain mastery and contribute. Azure includes IoT Plug and Play, which provides an open modeling language to connect IoT devices to the cloud seamlessly.
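
As a rough illustration of how little device-side code such a platform requires, below is a minimal sketch that connects a device to an IoT Central application and sends one telemetry message using Microsoft’s open source azure-iot-device Python SDK. The ID scope, device ID, and key are placeholders you would take from your own application.

# Minimal sketch: provision a device through the Device Provisioning Service
# (how IoT Central devices connect) and send one telemetry message.
# Requires: pip install azure-iot-device
import json
from azure.iot.device import IoTHubDeviceClient, Message, ProvisioningDeviceClient

ID_SCOPE = "0ne00XXXXXX"          # placeholder: from the device connection page
REGISTRATION_ID = "my-device-01"  # placeholder: device ID in the application
SYMMETRIC_KEY = "<device-key>"    # placeholder: device primary key

provisioning_client = ProvisioningDeviceClient.create_from_symmetric_key(
    provisioning_host="global.azure-devices-provisioning.net",
    registration_id=REGISTRATION_ID,
    id_scope=ID_SCOPE,
    symmetric_key=SYMMETRIC_KEY,
)
registration = provisioning_client.register()

# Connect to the IoT hub assigned by provisioning and send telemetry.
device_client = IoTHubDeviceClient.create_from_symmetric_key(
    symmetric_key=SYMMETRIC_KEY,
    hostname=registration.registration_state.assigned_hub,
    device_id=registration.registration_state.device_id,
)
device_client.connect()
device_client.send_message(Message(json.dumps({"temperature": 21.5})))
device_client.shutdown()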

Additionally, Microsoft is working with its partner ecosystem to create industry-specific solutions that help companies overcome core IoT adoption blockers, and investing in training tools like IoT School and AI Business School. Microsoft has one of the largest and fastest-growing partner ecosystems. Our more than 10,000 IoT partners provide domain expertise across industries and help address connectivity, security infrastructure, and application infrastructure requirements, allowing companies to get to value faster.

Learn more about how global companies are using IoT to drive value by downloading the IoT Signals report and reading our Transform Blog on the IoT projects that companies such as ThyssenKrupp, Bühler, Chevron, and Toyota Material Handling Group are driving.

Exten Technologies releases 3.0 version of NVMe platform

Exten Technologies has released the third generation of its HyperDynamic storage software, which was designed with the aim of bringing more resiliency, performance and management to data center customers.

New features in generation three include node-level resiliency with synchronous replicas, shared volumes with replicas for supporting parallel file systems, dual parity resiliency, and integrated drive management and hot swap.

Exten software is deployed on the storage target and does not require proprietary software on compute clients.

According to Exten, HyperDynamic 3.0 aims to improve TCP performance with Solarflare TCP acceleration, which provides TCP performance approaching that of remote direct memory access (RDMA).

It also has new features designed to simplify NVMe-oF storage management and deployment, Exten claims. These features include declustered RAID, which enables the configuration of resilient volumes that use Linux multi-path IO software to provide redundancy in both networking and storage. Exten’s interface provides node- and cluster-level telemetry. Users can also set quality-of-service limits in order to manage performance during drive or node rebuilds.

Exten Technologies is part of a batch of newer vendors making their way in the NVMe market.

Apeiron Data Systems offers a handful of NVMe storage products, including enterprise NVMe. It is NVMe over Ethernet, as opposed to over fabric, and was designed with the goal of delivering the performance and cost of server-based scale-out but with the manageability of enterprise storage.

Vendor Vexata also touts its RAID-protected NVMe and claims it delivers ultralow latency at scale. According to its website, the company was founded to provide better performance and efficiency at a lower cost than other flash storage solutions.

Exten Technologies’ HyperDynamic 3.0 is available now.

Insider preview: Windows container image

Earlier this year at Microsoft Build 2018, we announced a third container base image for applications that have additional API dependencies beyond Nano Server and Windows Server Core. Now the time has finally come, and the Windows container image is available for Windows Insiders.

Why another container image?

In conversations with IT pros and developers, a few themes came up that went beyond the nanoserver and windowsservercore container images:
Quite a few customers were interested in moving their legacy applications into containers to benefit from container orchestration and management technologies like Kubernetes. However, not all applications could be easily containerized, in some cases due to missing components like proofing support which is not included in Windows Server Core.
Others wanted to leverage containers to run automated UI tests as part of their CI/CD processes or use other graphics capabilities like DirectX which are not available within the other container images.

With the new windows container image, we’re now offering a third option to choose from based on the requirements of the workload. We’re looking forward to seeing what you will build!

How can you get it?

If you are running a container host on Windows Insider build 17704, you can get this container image using the following command:

docker pull mcr.microsoft.com/windows-insider:10.0.17704.1000

To simply get the latest available version of the container image, you can use the following command:

docker pull mcr.microsoft.com/windows-insider:latest

Please note that for compatibility reasons we recommend running the same build version for the container host and the container itself.
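
Once the image is pulled, a quick smoke test is to start a container and print the Windows version from inside it (assuming, per the note above, that your host is on a matching Insider build):

docker run --rm mcr.microsoft.com/windows-insider:10.0.17704.1000 cmd /c ver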

Since this image is currently part of the Windows Insider preview, we’re looking forward to your feedback, bug reports, and comments. We will be publishing newer builds of this container image along with the insider builds.

All the best,
Lars

New Cisco ACI fabric stretches across multiple data centers

Cisco’s third major release of its Application Centric Infrastructure lets companies run the vendor’s software-defined networking architecture across multiple data centers, excluding those of the major public cloud providers.

Introduced Thursday, Cisco ACI 3.0 can provide network services to applications running in a maximum of five data centers. Each facility can run an ACI fabric with as many as 400 leaf switches.

Cisco has aimed its latest ACI fabric upgrade at large enterprises that want to expand their use of the policy-driven form of software-based networking from a single data center to several facilities. Companies demanding multisite networking are typically at the cutting edge of technology.

A recent survey of 200 IT organizations found 90% working on networking projects that spanned multiple data centers, according to analyst firm Enterprise Management Associates, based in Boulder, Colo. More than a quarter of those companies planned to connect five data centers or more.

With ACI 3.0, Cisco is providing a competitive product to sell to those companies, said EMA analyst Shamus McGillicuddy. “Multicloud and multi-data-center fabrics are a must-have for these cutting-edge companies.”

Cisco ACI fabrics connect across data centers

Cisco is competing with virtualization vendor VMware in letting companies replicate the vendors’ respective application-centric networking environments so customers can manage a multisite configuration as one. The core of VMware’s approach is its NSX network overlay, while Cisco uses its hardware as the foundation.

Companies that want to access all the capabilities of ACI 3.0 will have to use Cisco’s Application Policy Infrastructure Controller (APIC) to build in each data center a networking fabric composed of the vendor’s Nexus 9000 switches. Once that is done, the customer can connect each fabric to an APIC-powered appliance that presents a single view of the multisite network.

From the appliance’s software console, network engineers can create and distribute application-centric traffic instructions to defined groups of switches in the form of policies. Also, management and monitoring tools can pull network and application performance data through the appliance’s APIC APIs.
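
For illustration only (this is a generic sketch, not an excerpt from Cisco’s documentation), pulling data through the APIC REST API from Python might look like the following; the controller address and credentials are placeholders.

# Sketch: authenticate to an APIC controller and list tenants via its REST API.
# Requires: pip install requests
import requests

APIC = "https://apic.example.com"  # placeholder controller address

session = requests.Session()

# Log in; APIC returns a session token as a cookie on success.
# verify=False is only acceptable in a lab with self-signed certificates.
login = {"aaaUser": {"attributes": {"name": "admin", "pwd": "<password>"}}}
session.post(f"{APIC}/api/aaaLogin.json", json=login, verify=False).raise_for_status()

# Query a managed-object class, e.g. all tenants defined on the fabric.
tenants = session.get(f"{APIC}/api/class/fvTenant.json", verify=False).json()
for obj in tenants["imdata"]:
    print(obj["fvTenant"]["attributes"]["name"])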

ACI fabric’s multisite capabilities

Across multiple sites, ACI 3.0 can keep packet delivery to one second or less, said Srinivas Kotamraju, director of ACI product management at Cisco.  Traffic to an application that suddenly goes down in one data center can be redirected to a backup in another facility without changing the IP address.

Other multisite capabilities include taking a switch offline for maintenance or troubleshooting without disrupting the traffic flow. ACI 3.0 also provides latency monitoring between endpoints, such as ports and application tiers.

ACI 3.0 extends all policy-related functionality for virtual machines and bare-metal applications to containers in multiple sites. Cisco also provides integration between Kubernetes and ACI policies. Kubernetes is an open source system for managing Linux containers.

“The container stuff is most interesting for forward-looking developers and companies,” said Dan Conde, an analyst at Enterprise Strategy Group Inc., based in Milford, Mass.

ACI 3.0 is not supported in public clouds, such as Amazon Web Services, Microsoft Azure or Google Cloud Platform. Cisco plans to extend ACI fabric capabilities into public cloud environments using their respective APIs, Kotamraju said. Cisco has not provided a timetable.

Rival VMware has started to build a bridge between a customer’s virtualized data center and Amazon. The technology, however, remains a work in progress, with very few production deployments.