Intel partner marketplace to drive ecosystem collaboration

Intel has rolled out the Solutions Marketplace in a bid to facilitate collaboration within its global partner ecosystem.

The Intel Solutions Marketplace, launched Wednesday, provides a platform for Intel partners to create virtual storefronts where they can market their businesses and products. According to Intel, partners can use the Solutions Marketplace to browse other partners’ offerings and engage one another for collaborative purposes. The Solutions Marketplace is the latest move by the company in laying the groundwork for the Intel Partner Alliance program, a revamped partner program slated to launch in the second half of 2020.

“We have this ecosystem across our partner program that spans almost the entirety of our industry, and, oftentimes, the level of collaboration needed between two points in that ecosystem — or three points or even four points in that ecosystem — is growing as the complexity of end customer demands … grows, as well. The Intel Solutions Marketplace would be the way that that industry comes together to facilitate that,” said Eric Thompson, Intel general manager of global partner enablement.

Intel built the Solutions Marketplace on its Solutions Directory, a previously established feature of the company’s IoT Solutions Alliance program. The Solutions Directory lets partners post and promote their IoT-related products and solutions, Thompson said.

Given its origins, the Solutions Marketplace is heavily focused on IoT, but Thompson said offerings will expand into other technology areas. He noted that the marketplace also carries Intel Select Solutions — data center-oriented products for running enterprise software applications such as SAP HANA.

At launch, the Solutions Marketplace has approximately 4,600 unique offerings from about 1,000 Intel partners, Thompson said.

Making partner-to-partner collaborations easier

Other vendors, such as IBM, have recently made efforts to facilitate the partner matchmaking process within their complex channel ecosystems. There are a number of factors driving the need for partner-to-partner collaborations.

Thompson said the Intel partner ecosystem ranges from ODMs, OEMs and ISVs to systems and solutions integrators, services providers and cloud service providers. Customer demand for advanced IT solutions increasingly requires channel firms to combine their skill sets and expertise in joint engagements.

When Intel designed the Solutions Marketplace, the company sought partner feedback on the challenges typically involved in collaborations, Thompson said. Intel partners cited issues such as finding the right companies to connect with and identifying the right people in those companies to contact. “Our intent was to build features into the marketplace to help solve some of those challenges with collaboration,” he noted.

Each virtual storefront lets an Intel partner display a listing of its offerings as well as a detailed profile of its business and targeted industries, focus areas and geographic markets. Partners can be contacted by potential collaborators directly through the storefronts. Additionally, a dashboard gives partners insight into user visits to their storefronts, lead management functions and reporting on lead statistics, Thompson said.

With the Solutions Marketplace launched, Thompson noted that Intel will also continue to host face-to-face partner matchmaking events where partners can learn about one another’s companies and forge alliances.

Intel’s vision for the Solutions Marketplace also extends beyond partner-to-partner collaboration. The company aims to drive end customers to the Solutions Marketplace — for example, by shepherding customers to the platform from the Intel.com website, Thompson said. “We see this as a good opportunity for us to help connect end customers … to partners across that variety of solution spaces,” he said.

Other news

  • Rackspace, a cloud and managed service provider, augmented its Service Blocks portfolio of packaged services for cloud environments. The portfolio now includes Container Services Journey, a Service Block to help customers develop container strategies and containerized apps; Hybrid Transformation with VMware Cloud on AWS, which offers tools and expertise for transitioning to hybrid cloud with VMware Cloud on AWS; and Data Modernization, aimed at strengthening customers’ analytics processes, Rackspace said. Rackspace this week also closed its acquisition of Onica, an AWS Premier Consulting Partner and AWS Managed Service Provider.
  • IT management software vendor SolarWinds released the latest version of its N-central remote management and monitoring tool. N-central 12.2 adds network topology mapping capabilities, as well as features for disk encryption, automation and patching, SolarWinds said.
  • NTT Data, an IT services provider based in Tokyo, will resell GoodData’s analytics platform under a new agreement between the companies. NTT Data will also use GoodData’s technology in its iQuattro industrial IoT platform.
  • NTT Data Services, a Plano, Texas, division of NTT Data Corp., signed a definitive agreement to acquire Flux7, an IT services provider and AWS Premier Consulting Partner. Flux7’s expertise includes cloud implementation and migration, automation, and DevOps consulting services, according to NTT Data.
  • Cost and security are key barriers impeding SMBs’ cloud migration, an Insight Enterprises survey found. Fifty-six percent of the 408 SMB IT decision-maker respondents cited cloud costs as an obstacle, while 50% of those polled identified security requirements. In other findings, the Insight report said 95% of respondents have either implemented digital transformation initiatives or plan to do so within the next year, but 49% rate integrating new technology with legacy systems as very or extremely challenging.
  • Cloud managed service provider Faction introduced a free educational series for companies adopting VMware Cloud on AWS. Dubbed the “6-Step Blueprint for Success,” the program offers business and technical best practices.
  • MSP360, formerly CloudBerry Lab, rolled out macOS and iOS releases of MSP360 Remote Assistant, a freeware remote access and control offering. The Lewes, Del., company said the Apple-oriented releases will make it easier for MSPs to support customers from macOS computers as well as iOS and iPadOS devices.
  • InterVision, an IT services provider with headquarters in Santa Clara, Calif., and St. Louis, said it has obtained Premier Consulting Partner status within the AWS Partner Network.
  • Wipro, an IT consulting and business process services company, has unveiled cloud Security Operations Center (SOC) services using Microsoft Azure Sentinel. Azure Sentinel is a security information and event management offering. Wipro will provide managed cloud SOC services with integrated AI and orchestration capabilities as part of the Microsoft relationship. Wipro will also use its HOLMES AI platform to measure risk factors against compliance standards, according to the company.
  • CloudCheckr, a cloud management platform provider, rolled out a global partner enablement program. The Business Partner Program offers business expertise, sales enablement tools and cloud technology to support MSPs and resellers building cloud service practices, CloudCheckr said.
  • Coronet, a small business data breach platform provider, is partnering with Coalition, a cyber insurance provider for SMBs. The arrangement lets Coronet’s customers obtain Coalition’s cyber insurance products.
  • Identity services provider GlobalSign has signed up Impression, a Johannesburg, South Africa-based solutions provider, to its Certified Regional Partner program.
  • The Internet of Things Security Services Association (IoTSSA) named Robin Miller as its director of channel. Miller will oversee IoTSSA’s industry engagement as the organization develops cybersecurity education resources for MSPs and managed security service providers, IoTSSA said.

Market Share is a news roundup published every Friday.

Contact center agent experience needs massive overhaul

Gone are the days when turnover rates greater than 40% among contact center agents were considered acceptable.

Leading organizations are revamping the contact center agent experience to improve business metrics such as operational costs, revenue and customer ratings, and a targeted agent program gives companies a competitive advantage, according to the Nemertes 2019-20 Intelligent Customer Engagement research study of 518 organizations.

The problems

CX leaders participating in the research pointed to several issues responsible for a failing contact center agent experience:

  • Low pay. In some organizations it’s at minimum wage, despite requirements for bachelor’s degrees and/or experience.
  • Dead-end job. Organizations typically do not have a growth path for agents. They expect them to last 18 months to two years, and there always will be a revolving door of agents coming and going.
  • Lack of customer context. Agents find it difficult to take pride in their work when they don’t have the right tools. Without CRM integrations, AI assistance and insightful agent desktops, it is difficult to delight customers.
  • Cranky customers. Agents also find it difficult to regularly interact with dissatisfied customers. With a better work environment, more interaction channels, better training, more analytics and context, they could change those attitudes.
  • No coaching. Because supervisors are busy interviewing and hiring to keep backfilling the agents who are leaving, they rarely have time to coach the agents they have. What’s more, they don’t have the analytics tools — from contact center vendors such as Avaya, Cisco, Five9, Genesys and RingCentral, or from pure-play vendors such as Clarabridge, Medallia and MaritzCX — to provide performance insight.

The enlightenment

Those in the contact center know this has been status quo for decades, but that is starting to change.

One of the big change drivers is the addition of a chief customer officer (CCO). Today, 37% of organizations have a CCO, up from 25% last year. The CCO is an executive-level individual with ultimate responsibility for all customer-facing activities and strategy to maximize customer acquisition, retention and satisfaction.

The CCO has budget, staff and the attention of the entire C-suite. As a result, high agent turnover rates are no longer flying under the radar. Once CCOs bring the issue to CEOs and CFOs, those executives invest resources into turning around the turnover rates.

Additionally, organizations value contact centers more today, with 61% of research participants saying the company views the contact center as a “value center” versus a “cost center.” Four years ago, that figure was reversed, with two-thirds viewing the contact center as a cost center.

Companies are adding more outbound contact centers, targeting sales or proactive customer engagement — such as customer check-ups, loyalty program invitations and discount offers — and they are supporting new products and services. This helps to explain why, despite the growth in self-service and AI-enabled digital channels, 44% of companies actually increased the number of agents in 2019, compared to 13% who decreased, 40% who were flat and 3% unsure.

The solution

Research shows there are five common changes organizations are now making to improve the contact center agent experience and reduce the turnover rate — now at 21%, down from 38% in 2016. These changes include:

  • Improved compensation plan. Nearly 47% of companies are increasing agent compensation, compared with the 7% decreasing it, with raises ranging from 22% to 28%. Average agent compensation is $49,404 and is projected to rise to at least $60,272 by the end of 2020.
  • Investment in agent analytics. About 24% of companies are using agent analytics today, with another 20.2% planning to use the tools by 2021. Agent analytics provides data on performance to help with coaching and improvement, in addition to delivering real-time screen pops to help agents on the spot during interactions with customers. Those using analytics see a 52.6% improvement in revenue and a 22.7% decrease in operational costs.
  • Increases in coaching. By delivering data from analytics tools, supervisors have a better picture of areas of success and those that need improvement. By using a product such as Intradiem Contact Center RPA, they can automate the scheduling of training and coaching during idle times.
  • Addition of gamification. Programs that inject competitiveness inspire agents by awarding badges for bragging rights, weekly gift cards for top performance and monthly cash bonuses. Such rewards improve agents’ loyalty to the company and reduce turnover.
  • Development of career path. Successful companies are developing a solid career path with advancement into marketing, product development and supervisory roles in the contact center or CX apps/analysis.

Developing a solid game plan that provides agents with the compensation, support and career path they deserve will drastically reduce turnover rates. In a drastic example, one consumer goods manufacturing company reduced agent turnover from 88% to 2% with a program that addressed the aforementioned issues. More typically, companies are seeing 5% to 15% reductions in their turnover rates one year after developing such a plan.

Enterprise IT weighs pros and cons of multi-cloud management

Multi-cloud management among enterprise IT shops is real, but the vision of routine container portability between clouds has yet to be realized for most.

Multi-cloud management is more common as enterprises embrace public clouds and deploy standardized infrastructure automation platforms, such as Kubernetes, within them. Most commonly, IT teams look to multi-cloud deployments for workload resiliency and disaster recovery, or as the most reasonable approach to combining companies with loyalty to different public cloud vendors through acquisition.

“Customers absolutely want and need multi-cloud, but it’s not the old naïve idea about porting stuff to arbitrage a few pennies in spot instance pricing,” said Charles Betz, analyst at Forrester Research. “It’s typically driven more by governance and regulatory compliance concerns, and pragmatic considerations around mergers and acquisitions.”

IT vendors have responded to this trend with a barrage of marketing around tools that can be used to deploy and manage workloads across multiple clouds. Most notably, IBM’s $34 billion bet on Red Hat revolves around multi-cloud management as a core business strategy for the combined companies, and Red Hat’s OpenShift Container Platform version 4.2 updated its Kubernetes cluster installer to support more clouds, including Azure and Google Cloud Platform. VMware and Rancher also use Kubernetes to anchor multi-cloud management strategies, and even cloud providers such as Google offer products such as Anthos with the goal of managing workloads across multiple clouds.

For some IT shops, easier multi-cloud management is a key factor in Kubernetes platform purchasing decisions.

“Every cloud provider has hosted Kubernetes, but we went with Rancher because we want to stay cloud-agnostic,” said David Sanftenberg, DevOps engineer at Cardano Risk Management Ltd, an investment consultancy firm in the U.K. “Cloud outages are rare, but it’s nice to know that on a whim we can spin up a cluster in another cloud.”
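
That cloud-agnostic posture is straightforward to picture in code. The sketch below is illustrative only, with hypothetical kubeconfig context names; it uses the official Kubernetes Python client to push one provider-neutral Deployment to clusters running in two different clouds.

```python
from kubernetes import client, config

# Hypothetical kubeconfig context names for clusters in two clouds.
CONTEXTS = ["aws-cluster", "azure-cluster"]

# A minimal, provider-neutral Deployment; nothing in it references a
# specific cloud, so the same object can be created on any cluster.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "demo-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo-app"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.25")]
            ),
        ),
    ),
)

for ctx in CONTEXTS:
    # Switching kubeconfig contexts is all it takes to target another cloud.
    config.load_kube_config(context=ctx)
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )
```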

Multi-cloud management requires a deliberate approach

With Kubernetes and VMware virtual machines as common infrastructure templates, some companies use multiple cloud providers to meet specific business requirements.

Unified communications-as-a-service provider 8×8, in San Jose, Calif., maintains IT environments spread across 15 self-managed data centers, plus AWS, Google Cloud Platform, Tencent and Alibaba clouds. Since the company’s business is based on connecting clients through voice and video chat globally, placing workloads as close to customers’ locations as possible is imperative, and this makes managing multiple cloud service providers worthwhile. The company’s IT ops team keeps an eye on all its workloads with VMware’s Wavefront cloud monitoring tool.

 “It’s all the same [infrastructure] templates, and all the monitoring and dashboards stay exactly the same, and it doesn’t really matter where [resources] are deployed,” said Dejan Deklich, chief product officer at 8×8. “Engineers don’t have to care where workloads are.”

Multiple times a year, Deklich estimated, the company uses container portability to move workloads between clouds when it gets a good deal on infrastructure costs, although it doesn’t move them in real time or spread apps among multiple clouds. Multi-cloud migration also only applies to a select number of 8×8’s workloads, Deklich said.

“If you’re in [AWS] and using RDS, you’re not going to be able to move to Oracle Cloud, or you’re going to suffer connectivity issues; you can make it work, but why would you?” he said. “There are workloads that can elegantly be moved, such as real-time voice or video distribution around the world, or analytics, as long as you have data associated with your processing, but moving large databases around is not a good idea.”

Maintaining multi-cloud portability also requires a deliberate approach to integration with each cloud provider.

“We made a conscious decision that we want to be able to move from cloud to cloud,” Deklich said. “It depends on how deep you go into integration with a given cloud provider — moving a container from one to the other is no problem if the application inside is not dependent on a cloud-specific infrastructure.”
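
In practice, that independence usually comes down to keeping provider specifics out of the application itself. A minimal sketch, assuming the database endpoint is supplied through the environment rather than baked into the image:

```python
import os

# The container reads its database endpoint from the environment instead of
# hard-coding a provider-specific hostname such as an RDS endpoint, so the
# same image runs unchanged in any cloud; only the deployment config differs.
DATABASE_URL = os.environ.get(
    "DATABASE_URL",
    "postgresql://localhost:5432/app",  # local development fallback
)

print(f"Connecting to {DATABASE_URL}")
```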

The ‘lowest common denominator’ downside of multi-cloud

Not every organization buys in to the idea that multi-cloud management’s promise of freedom from cloud lock-in is worthwhile, and the use of container portability to move apps from cloud to cloud remains rare, according to analysts.

“Generally speaking, companies care about portability from on-premises environments to public cloud, not wanting to get locked into their data center choices,” said Lauren Nelson, analyst at Forrester Research. “They are far less cautious when it comes to getting locked into public cloud services, especially if that lock in comes with great value.”

In fact, some IT pros argue that lock-in is preferable to missing out on the value of cloud-specific secondary services, such as AWS Lambda.

“I am staunchly single cloud,” said Robert Alcorn, chief architect of platform and product operations at Education Advisory Board (EAB), a higher education research firm headquartered in Washington, D.C. “If you look at how AWS has accelerated its development over the last year or so, it makes multi-cloud almost a nonsensical question.”

For Alcorn, the value of integrating EAB’s GitLab pipelines with AWS Lambda outweighs the risk of lock-in to the AWS cloud. Connecting AWS Lambda and API Gateway to Amazon’s SageMaker for machine learning has also represented almost a thousandfold drop in costs compared to the company’s previous container-based hosting platform, he said.
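
The article doesn’t describe EAB’s pipeline internals, but the Lambda-to-SageMaker pattern itself is simple to sketch. Assuming a hypothetical endpoint name and a JSON payload schema, a handler wired behind API Gateway might look like this:

```python
import json

import boto3

# Hypothetical endpoint name; the article does not describe EAB's model.
ENDPOINT_NAME = "example-model-endpoint"

runtime = boto3.client("sagemaker-runtime")

def lambda_handler(event, context):
    # With API Gateway's proxy integration, the request body arrives as a string.
    payload = event.get("body") or "{}"
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=payload,
    )
    prediction = json.loads(response["Body"].read())
    return {"statusCode": 200, "body": json.dumps(prediction)}
```

Because Lambda bills per invocation and SageMaker hosts the model, there is no always-on container fleet to pay for, which is consistent with the cost drop Alcorn describes.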

Even without the company’s interest in Lambda integration, the work required to keep applications fully cloud-neutral isn’t worth it for his company, Alcorn said.

“There’s a ceiling to what you can do in a truly agnostic way,” he said. “Hosted cloud services like ECS and EKS are also an order of magnitude simpler to manage. I don’t want to pay the overhead tax to be cloud-neutral.”

Some IT analysts also sound a note of caution about the value of multi-cloud management for disaster recovery or price negotiations with cloud vendors, depending on the organization. For example, some financial regulators require multi-cloud deployments for risk mitigation, but the worst case scenario of a complete cloud failure or the closure of a cloud provider’s entire business is highly unlikely, Forrester’s Nelson wrote in a March 2019 research report, “Assess the Pain-Gain Tradeoff of Multicloud Strategies.”

Splitting cloud deployments between multiple providers also may not give enterprises as much of a leg up in price negotiations as they expect, unless the customer is a very large organization, Nelson wrote in the report.

The risks of multi-cloud management are also manifold, according to Nelson’s report, from high costs for data ingress and egress between clouds to network latency and bandwidth issues, broader skills requirements for IT teams, and potentially double the resource costs to keep a second cloud deployment on standby for disaster recovery.

Of course, value is in the eye of the beholder, and each organization’s multi-cloud mileage may vary.

“I’d rather spend more for the company to be up and running, and not lose my job,” Cardano’s Sanftenberg said.

Analyst forecasts the next big things in BI

Consolidation among business intelligence vendors is driven by what’s perceived to be the next big things in BI, and that was the case during the run of merger and acquisition deals in the first half of 2019.

According to Wayne Eckerson, founder and principal consultant of Eckerson Group, self-service analytics was a key part of what made Looker and Tableau attractive to Google and Salesforce, respectively. When the next consolidation wave hits, according to Eckerson, augmented intelligence could be a big driver, as could cloud-based BI tools — and they will be viewed as the next big things in BI.

Eckerson has more than 25 years of experience in the BI software market and is the author of two books — Secrets of Analytical Leaders: Insights from Information Insiders and Performance Dashboards: Measuring, Monitoring, and Managing Your Business.

In the second part of a two-part Q&A, Eckerson talks about the driving forces behind the recent merger and acquisition deals, self-service analytics, and what excites him about the future of BI. In the first part, Eckerson discusses the divide between enterprises that use data and those that don’t, as well as the importance of DataOps and data strategies and how they play into the data divide.

Among other trends, a wave of consolidation over the last six to 12 months has left fewer vendors but ones with more end-to-end capabilities. What do you see as the next big things in BI that might spark the next wave?

Eckerson: It definitely goes in cycles — we’ve seen this consolidation before. The last big one was in 2007-08 when the three biggest BI players — Business Objects, Cognos, Hyperion — were bought by large application vendors SAP, IBM and Oracle, respectively. Usually these cycles are based on the advent of new technology that’s come into the market. In 2002, we moved from client-server to the web, and now we’re in the age of the cloud and self-service, and Looker and Tableau caught the self-service wave with visualization and desktop tools. The next big disruption to the BI market is the cloud. We’re seeing a lag between when these new BI technologies fully mature on these new platforms and when they get purchased and the market consolidates. If we’re to project out, maybe we’ll see some consolidation around cloud-based, AI-based BI tools where things are much more automated, things are in the cloud, and maybe it’s all embedded and you won’t even notice the BI tools. That’s probably the next wave in five or 10 years.

One thing that jumps out about first Google’s acquisition of Looker and then even more so with Salesforce’s purchase of Tableau is the price. Why are companies suddenly paying so much for BI vendors?

Eckerson: They’ve always gone for a premium, but now the premium is in the billions and not the hundreds of millions. We’re in this data-driven age now, and these are the tools that the business users touch and feel and use, so that maybe gives them a higher premium than middleware or database technology that’s behind the scenes. Tableau has been a meteor, and they probably sold at the right time for them. They’re under duress now from competitors. I think it’s just a testimony to how much data is front and center to the way businesses operate in today’s environment.

Ease of use to make data available to the citizen data scientist has been a significant push. Do you see self-service analytics taking over, or will there always be some things that are just too big and complex for average users and self-service BI will just be part of the picture?

Eckerson: Self-service is an interesting topic because there’s been so much frustration with IT, the IT bottleneck and delivering new applications for analytics that businesses wanted self-service just to get away from IT, but in the end what you really want is a blend of both. There are things that are too complex for the average business person to create on their own. If you want to build an enterprise unit for everybody, no business unit is going to do that alone, so you’d need a central group just for that. And then every division has some complex custom apps that need to be built, so you’ll need a corporate development team to build cutting-edge applications that will really help the company compete. On a day-to-day basis every business unit needs its core data analysts and data scientists to be looking into data to help optimize decisions, help optimize business processes, respond creatively and quickly to events as they happen on the ground, to win business, to avoid losing business, to manage risk — all that stuff. The self-service is really the agile, innovative arm of the business, whereas the corporate IT team is the run-the-business operational side that will build stuff that’s needed on a long-term basis. You need both sides to operate effectively.

As you analyze the BI industry, what are the next big things in BI that get you excited?

Eckerson: I am excited about AI for BI — it’s really transforming the way people are using data to make decisions, and it’s going to transform these BI tools. Before you needed a hypothesis of what to look for when you’re doing an analysis, and now the tools will dig into the data for you. They’ll do thousands of drill-downs in a matter of seconds and expose and surface only the most relevant correlations for you to look at. That’s pretty interesting. DataOps is pretty interesting, because that will fix the back end — the data that’s being delivered into these analytical tools. I think time-series analytics is the next big wave that we’ll see hit the marketplace. Especially as the internet of things and big data take hold, companies can use time-series analytics to automate decisions. The intersection of time-series analytics, AI and cloud-based computing with its infinite storage and elasticity — the combination of those things is going to bring about a sea change. There’s a lot to be excited about in our space.
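
As a rough illustration of the automated drill-downs Eckerson describes, the pandas-based sketch below (illustrative only, not any vendor's implementation) correlates every numeric column pair in a dataset and surfaces only the strongest relationships:

```python
import numpy as np
import pandas as pd

def top_correlations(df: pd.DataFrame, n: int = 5) -> pd.Series:
    # Correlate every numeric column pair, then surface only the strongest
    # relationships, a simplified stand-in for automated insight discovery.
    corr = df.select_dtypes("number").corr().abs()
    # Mask everything but the upper triangle so each pair appears once.
    keep = np.triu(np.ones(corr.shape, dtype=bool), k=1)
    return corr.where(keep).stack().sort_values(ascending=False).head(n)

# Example: surface the five most-correlated metric pairs in a sales dataset.
# df = pd.read_csv("sales.csv"); print(top_correlations(df))
```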

Editors’ note: This interview has been edited for clarity and conciseness.

Using visualizations and analytics in media content

BOSTON — Among countless online newspapers and journals, blogs, videos and social media feeds, the modern digital consumer has a dizzying amount of media sources to choose from.

As content creators vie for consumer attention, some organizations have turned to data visualization and advanced analytics in media to gain an advantage.

Visualizing data analytics in media

Take, for example, Condé Nast, a U.S.-based mass media company whose 19 brands attract around 150 million consumers.

With a diverse portfolio that includes The New Yorker, Wired and Teen Vogue, the media company needs to capture the attention of numerous social groups and niches around the world. Condé Nast has found that interactive charts and graphs seem to appeal to the inquisitiveness of most types of consumers.

Compared with static images, interactive visualizations “introduce a whole new level [to content], and increase time spent” on content by consumers, said Danielle Carrick, a data visualization designer and developer at Condé Nast, during a presentation this week at the 2018 Data Visualization Summit.

Carrick showed examples of colorful, easy-to-read charts and graphs. Large gray and red bars with moveable sliders on the entertainment and culture site Glamour plainly illustrated the disparity between male and female Oscar nominees since 1928.

On Teen Vogue, an in-depth interactive scatterplot of tweets from @realDonaldTrump splashed red dots across the screen. Each visualization, though in itself an example of analytics in media, was different.

“Same type of data, totally different way to look at it,” Carrick said of the visualizations.

Danielle Carrick of Condé Nast speaks at the 2018 Data Visualization Summit in Boston this week.

Static still around

The benefits of consistently changing the way data sets are illustrated are twofold, Carrick said. This varied approach gives consumers new and fresh ways to interact with different data sets, and also enables her and her team to be creative.

Carrick noted that despite the increased use of interactive visuals, static graphs and images are far from being phased out.

Static visuals still are used most often, and are developed separately by each brand, rather than by a team working directly under the Condé Nast flag. Understandably, interactive data sets are harder to create, and require input from the local editor, writer and design team working on the content piece.

There’s a lot of communication, Carrick said, and ultimately, it’s up to the brand to decide if it will use the visual.

“They’re not going to publish something they don’t think their readers are interested in,” she said.

Internally, the team employs Qlik software for analytics in media; Qlik recently revamped its visualization capabilities to better compete with rival self-service BI vendor Tableau.

And while Carrick admitted that more tracking needs to be done to measure the results of using interactive visuals, they seem to both draw in more consumers and keep them on the webpage longer.

Ad analytics

Visualizations aren’t the only ways organizations are using analytics in media, however.

In a separate presentation at the parallel 2018 Big Data Innovation Summit, Carla Pacione, senior director of data and systems at Comcast Spotlight, talked about how advanced analytics plays a role in the telecommunication conglomerate’s advertising efforts. In particular, Pacione highlighted the importance of digital metrics, which she claimed “really took advertising to a whole new level.”

Thanks to new and updated technologies in TV and digital metrics, including embedding a pixel in commercials that can capture household and engagement data, organizations like Comcast can better measure metrics today and gain deeper insights, Pacione said.
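
The mechanics of such a measurement pixel are simple, even if production systems are far more elaborate. A toy sketch, using only the Python standard library and hypothetical parameter names, serves a 1x1 GIF and logs whatever identifiers the ad markup appends:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

# A 1x1 transparent GIF, the classic tracking-pixel payload.
PIXEL = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
    b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
    b"\x00\x02\x02D\x01\x00;"
)

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical query parameters; real systems define their own schema.
        params = parse_qs(urlparse(self.path).query)
        print("impression:", params.get("household"), params.get("ad"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8080), PixelHandler).serve_forever()
```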

Comcast is piloting more advanced “household addressable TV advertising” — the ability to send more targeted and relevant ads to different households watching the same TV program.

Pacione noted that Comcast uses third-party organizations to track purchases and predict future purchases, but it is this improved ability to measure metrics that has enabled such advances in media advertising analytics.

With so many different ways of consuming media, Pacione said it will be important for media partners to work together to share information and advice and ultimately better target consumers.

Already, she said, “we’re starting to see that sharing in the industry because there’s just so much to learn.”

The 2018 Data Visualization Summit and the 2018 Big Data Innovation Summit were held Sept. 11 to 12 at the Renaissance Boston Waterfront Hotel.

New tech trends in HR: Josh Bersin predicts employee experience ‘war’

LAS VEGAS — Among fresh tech trends in HR, one that may garner the most interest is a new layer of software — which superstar analyst Josh Bersin called an employee experience platform — that will fit between core HR and talent management tools.

Bersin said he expects employee experience to become the next-generation employee portal — in other words, the go-to application for modern workers who need HR-based information. Vendors are lining up to address the need, he added.

“There is going to be a holy war for [what] system your employees use first,” said Bersin, an independent analyst who founded Bersin by Deloitte. Although his quote served as hyperbole, it nonetheless stuck with attendees here at the 2018 HR Technology Conference & Exposition.

“He hit home,” said Rita Reslow, senior director of global benefits at HR software vendor Kronos, based in Lowell, Mass. “We have all these systems, and we keep buying more.” But she wondered aloud when one product would tie her systems together for employees.

No vendor has achieved a true employee experience platform, Bersin told a room packed with 900 or so attendees at the conference on Tuesday. However, ServiceNow, PeopleDoc — which Ultimate Software acquired in July — and possibly IBM appear to have a head start, he added.

Tech trends in HR point to team successes

Bersin, who plans to release an extensive report about 2019 tech trends in HR, said software development within the industry reflects a shift in management that steers away from employee engagement and company culture in favor of increased team performance.

Unless a recession hits, “I think the focus of the tech market for the next couple of years … is on performance, productivity and agility,” he said.

The shift to productivity will require future technology to simplify work life, said Cliff Howe, manager of enterprise applications at Cox Enterprises, a communications and media company in Atlanta. “Our employees are being inundated,” Howe said. “We don’t want to hit our employees with too much [technology].”

Bersin suggested that HR software buyers consider the following tips when evaluating new human capital management products:

  • Shop around for vendors that focus on your company’s particular market. For example, if your organization exhibits a compliance-based culture, find a vendor that mirrors that approach.
  • Evaluate the “personality of the vendor,” he said. As an example, determine if the vendor’s reps listen to your decision-makers and help them. If the answer is no, it may be time to drop that vendor from consideration.

AI auditing, real-time payrolls needed in future

In other upcoming tech trends in HR, Bersin pegged AI as a quickly growing field that smart HR departments will learn how to monitor and audit in the future. That notion was on the minds of many at the HR Technology Conference, for which TechTarget — the publisher of SearchHRSoftware — is a media partner.

AI innovation has increased rapidly in the last two years. Today, even small HR software vendors with three to five engineers can use technology from Google or IBM, combine it with open source options and scale a new product on the cloud quickly, Bersin said. HR professionals will need to adjust their skills in order to better understand why AI software makes its decisions, which is an area not fully grasped yet, he added.

Howe agreed AI has grown beyond wish-list status. “AI will be a requirement, rather than a shiny object,” he said.

Bersin also noted that software will need to reflect a possible switch to a continuous payroll model — perhaps as often as daily. Younger workers, some of whom might not have bank accounts, have increased their demands to be compensated in real time, and this request is not just for the gig economy, he said.

Convenience: Driver of BI innovation

Allaa “Ella” Hilal is among that rare breed of computer experts who straddle the academic and commercial worlds. As director of data at Ottawa-based Shopify, Hilal oversees data product development for the e-commerce company’s international and larger merchants, also known as Plus customers. She is also an adjunct associate professor in the Centre for Pattern Analysis and Machine Intelligence at the University of Waterloo in Ontario, where she earned a Ph.D. in electrical and computer engineering.

An expert in data intelligence, wireless sensor networks and autonomous systems, Hilal is among the featured speakers at the Real Business Intelligence Conference on June 27 to 28 in Cambridge, Mass. Here, Hilal discusses what’s driving business intelligence (BI) innovation today and some of the pitfalls companies should be aware of.

What is driving BI innovation today?

Ella Hilal: First of all, in this day and age, companies are creating more and more products to deliver customer convenience. This convenience ends up saving time, which ties to money. When we become more efficient, whether it’s in our IT systems or in our daily commute, we gain moments that we can spend on something else. We can have more time with our families and loved ones, or even gain more time or resources to do the things we love or care about.

There is this immediate need and craving for more efficiency and convenience from the customer side. And businesses all are aware of this craving. They are trying to think about what they can do with the data that exists within the systems or data being collected from IoT, which they know is valuable. The power of BI lies in the fact that it can take all of these different data sources and derive valuable insights to drive business decisions and data products that empower customers and the business in general.

There are many methodologies of how you can apply this to your business, and I plan to discuss some methodologies during my talk at the Real Business Intelligence Conference.

Companies have been doing business intelligence for a long time; they’ve had to figure out which data is useful and which is not for their businesses. What’s different about capitalizing on data generated from technologies like IoT and smart systems?

Hilal: Generally, only 12% of company data that is analyzed today is critical to a business — the rest is either underutilized or untapped. If we think we’re doing such a good job with the analytics we have today, imagine if you apply these efforts across the entire data available in your business. At Shopify, we work to identify the pain points of running a business and use data to provide value to the merchants so they have a better experience as entrepreneurs.

So, there is huge value we can mine and surface. And when we talk about advanced analytics, we’re not talking about just basic business analytics; we’re talking also about applying AI, machine learning, prediction, forecasting and even prescriptive analytics.

Most CIOs are acutely aware that AI and advanced analytics should be part of a BI innovation strategy. But even big companies are having trouble finding skilled people to do this work.

Hilal: It’s a problem every company will face, because the skilled data scientist is still scarce compared to the need. One challenge is that the people who have the technical abilities to do this strong analytical work don’t always have the business acumen that is needed for an experienced data scientist. They might be very smart in doing sophisticated analysis, but if we don’t tie that with business acumen, they fail to communicate the business value and enable the decision-makers with useful insights. Furthermore, the lack of business acumen makes it challenging to build data products you can utilize or sell. So, you need to build the right kind of team.

Community and university collaborations are one of the strongest approaches that big companies are adopting; you can see that Google, Uber and Shopify, for example, are all partnering with university research labs and reaping the benefits from a technical perspective. They have the technical team and the business acumen team, which then brings the work in-house to focus on data analytics products. So, you get to bridge the gap between this amazing research initiative and the productization of the results.

Another benefit is that with these partnerships, researchers with very strong technical AI and statistical backgrounds can also develop business acumen, because they are working closely with product managers and production teams. This is definitely a longer-term strategy. Wearing my research hat, I can say that universities are also working hard to introduce programs with a mix of computer science and machine learning, programs with a good mix of the old pillars of data science and new approaches.

So, companies need to come up with new frameworks for capitalizing on data. Are there pitfalls companies want to keep in mind?

Hilal: You’ll hear me say this time and time again: We all need to have a sense of responsible innovation. We’re in this industrial race to build really good products that can succeed in the market, and we need to keep in mind that we are building these products for ourselves, as well as for others.

When we create these products, it is the distributed responsibility of all of us to make sure that we embed our morals and ethics in them, making sure they are secure, they are private, they don’t discriminate. At Shopify, we are always asking ourselves, ‘Will this close or open a door for a merchant?’ It is not enough that our products are functional; they have to maintain certain ethical standards, as well.

We’ve reported on how the IoT space may pose a threat because developers are under such pressure to get these products to market that considerations like security and ethics and who owns the data are an afterthought.

Hilal: We should not be putting anything out there that we wouldn’t want in our own homes. But this is not just about AI or IoT. Whether it is a piece of software or hardware system, we need to make sure that security is not a bolt-on, or that privacy is fixed after the fact with a new policy statement — these things need to be done early on and need to be thought of before and throughout the production process.

DRaaS solution: US Signal makes rounds in healthcare market

A managed service provider’s disaster-recovery-as-a-service offering is carving a niche among healthcare market customers, including Baystate Health System, a five-hospital medical enterprise in western Massachusetts.

The DRaaS solution from US Signal, an MSP based in Grand Rapids, Mich., is built on Zerto’s disaster recovery software, US Signal’s data center capability and the company’s managed services. The offering is designed to work in VMware vCenter Server and Microsoft System Center environments. One target market is healthcare.

“We have several healthcare facilities … all across the Midwest using this solution,” said Jerry Clark, director of cloud sales development at US Signal. The DRaaS solution meets HIPAA standards, according to the company.

Clark said many hospitals — and organizations in other industries, for that matter — are searching for ways to avoid the investment in duplicate hardware traditional DR approaches require. With DRaaS, hardware becomes the service provider’s issue. Instead of paying for hardware upfront, the customer pays a monthly management fee to the DRaaS provider. The approach has expanded the channel opportunity in DR.

“Enterprises … run into the same situation: ‘Do we spend all this Capex on disaster recovery hardware that may or may not ever get used?'” Clark noted. “A DRaaS solution makes it much more economical.”

One-third of the respondents to TechTarget’s IT Priorities survey identified disaster recovery as an area for budget growth.

Baystate Health adopts DRaaS solution

US Signal found an East Coast customer, Baystate Health, based in Springfield, Mass., through VertitechIT, a US Signal consulting partner located in nearby Holyoke, Mass.

VertitechIT helped Baystate Health launch a software-defined data center initiative. The implementation uses the entire VMware stack across three active data centers. The three-node arrangement provides local data replication, but David Miller, senior IT director and CTO at Baystate Health, said an outage in 2016 knocked out all three sites — contrary to design assumptions — for 10 hours.

Miller said his organization had been looking into some form of remote replication and high availability but had yet to land a good solution. The downtime event, however, increased the urgency of finding one.

“We realized we had to do something now rather than later,” Miller said.

VertitechIT introduced US Signal to Baystate Health. The companies met in VertitechIT’s corporate office, where US Signal proposed its DRaaS offering. For the service, US Signal deploys Zerto’s IT Resilience Platform, specifically Zerto Virtual Manager and Virtual Replication Appliance. The software installed in the customer source environment replicates data writes for each protected virtual machine to the DR target site, in this case US Signal’s Grand Rapids data center. An MPLS link connects Baystate Health to the Michigan facility.

The remote replication service provides the benefit of geodiversity, according to the companies. Baystate Health’s data centers are all in the Springfield area.

(Video: The CIO of Christian Brothers Services discusses the company’s infrastructure partnership with US Signal.)

US Signal’s DRaaS solution also includes a playbook, which documents the steps Baystate Health IT personnel should take to fail over to the disaster recovery site in the event of an outage. In addition, US Signal’s DRaaS package provides two annual DR tests. The DRaaS provider also tests failover before the DR plan goes into effect and documents that test in the playbook, Clark noted.

Miller said the DR service, which went live about a year ago, provides a recovery point objective (RPO) of “less than a couple of minutes” for Baystate Health’s PeopleSoft system, one of the healthcare provider’s tier-one applications. The recovery time objective (RTO) is less than two hours. RPO and RTO characteristics differ according to the application and its criticality.
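
For readers unfamiliar with the two terms: RPO bounds how much recent data a failover can lose, while RTO bounds how long recovery takes. A small sketch of the RPO arithmetic, with made-up timestamps:

```python
from datetime import datetime

def achieved_rpo(last_replicated_write: datetime, failure_time: datetime):
    # RPO is the window of potentially lost data: the gap between the newest
    # write that reached the DR site and the moment the source site failed.
    return failure_time - last_replicated_write

# Example: the last replicated write landed 90 seconds before the outage,
# within the "less than a couple of minutes" objective cited above.
print(achieved_rpo(datetime(2019, 6, 1, 3, 0, 0),
                   datetime(2019, 6, 1, 3, 1, 30)))  # 0:01:30
```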

Initially, the DRaaS solution covered a handful of apps, but the list of protected systems has expanded over the past 12 months, Miller said.

A DRaaS ‘showcase’

Myles Angell, executive project officer at VertitechIT, said the Baystate Health deployment has become “a showcase” when meeting with potential clients that have similar DR challenges.

“We’re talking to other hospitals about it,” he said.

Other organizations interested in DRaaS should pay close attention to their application portfolios, however. Angell said businesses need to have a thorough understanding of applications before embarking on a DR strategy.

“Successfully building a disaster recovery option — and having confidence in the execution — relies on complete documentation of the application’s running state, dependencies and any necessary changes that would need to be executed at the time of a DR cutover,” he explained. “These pieces of information are vital to knowing how to adhere to the RTO/RPO objectives that have been defined.”

Angell said businesses may have a good understanding of their tier-one applications but may have less of a handle on their tier-three or tier-four systems. The recovery of an application that isn’t well-documented or completely understood becomes a riskier endeavor when a disaster strikes.

“The DR option may miss the objectives and targets that the business is expecting and, therefore, the company may actually be worse off due to lost time trying to scramble for the little things that were not documented,” Angell said.

IDC, Cisco survey assesses future IT staffing needs

Network engineers, architects and administrators will be among the most critical job positions to fill if enterprises are to meet their digital transformation goals, according to an IDC survey tracking future IT staffing trends.

 The survey, sponsored by Cisco, zeroed in on the top 10 technology trends shaping IT hiring and 20 specific roles IT professionals should consider in terms of expanding their skills and training. IDC surveyed global IT hiring managers and examined an estimated 2 million IT job postings to assess current and future IT staffing needs.

The survey results showed digital transformation is increasing demand for skills in a number of key technology areas, driven by the growing number of network-connected devices, the adoption of cloud services and the rise in security threats.

Intersections provide hot jobs

IDC classified the intersections of where hot technologies and jobs meet as “significant IT opportunities” for current and future IT staffing, said Mark Leary, directing analyst at Cisco Services.

“From computing and networking resources to systems software resources, lots of the hot jobs function at these intersections and take advantage of automation, AI and machine learning.” Rather than eliminating IT staff positions, he added, many of these jobs take advantage of those same technologies.

Organizations are preparing for future IT staffing by filling vacant IT positions from within rather than hiring from outside the company, then sending staff to training, if needed, according to the survey.

But technology workers still should investigate where the biggest challenges exist and determine where they may be most valued, Leary said.

“Quite frankly, IT people have to have greater understanding of the business processes and of the innovation that’s going on within the lines of business and have much more of a customer focus,” he said.

The internet of things illustrates the complexity of emerging digital systems. Any IoT implementation requires from 10 to 12 major technologies to come together successfully, and the IT organization is seen as the place where that expertise lies, Leary said.

IDC’s research found organizations place a high value on training and certifications. IDC found that 70% of IT leaders believe certifications are an indicator of a candidate’s qualifications and 82% of digital transformation executives believe certifications speed innovation and new ways to support the business.

Network influences future IT staffing

IDC’s results also reflect the changes going on within enterprise networking.

Digital transformation is raising the bar on networking staffs, specifically because it requires enterprises to focus on newer technologies, Leary said. The point of developing skills in network programming, for example, is to work with the capabilities of automation tools so they can access analytics and big data.

In 2015, only one in 15 Cisco-certified workers viewed network programming as critical to his or her job. By 2017, the proportion had risen to one in four. “This isn’t something that’s evolutionary; it’s revolutionary,” Leary said.

While the traditional measure of success was to make sure the network was up and running with 99.999% availability, that goal is being replaced by network readiness, Leary said. “Now you need to know if your network is ready to absorb new applications or that new video stream or those new customers we just let on the network.”

Leary is involved with making sure Cisco training and certifications are relevant and matched to jobs and organizational needs, he said. “We’ve been through a series of enhancements for the network programmability training we offer, and we continually add things to it,” he added. Cisco also monitors customers to make sure they’re learning about the right technologies and tools rather than just deploying technologies faster.

To meet the new networking demands, Cisco is changing its CCNA, CCNP and CCIE certifications in two different ways, Leary said. “We’ve developed a lot of new content that focuses on cybersecurity, network programming, cloud interactions and such because the person who is working in networking is doing that,” he said. The other emphasis is to make sure networking staff understands the language of other groups, such as software developers.