
AI, data analytics, recruiting tech among HR priorities, leaders say

LAS VEGAS — HR leaders at top national companies want tech that delivers insights and improves talent management. The top HR priorities included boosting candidate and employee experience through stellar technology. That was the message to vendors and attendees at the 2018 HR Technology Conference & Expo from a panel on what it takes to create top-notch HR. Improved recruiting platforms, AI, data analytics and user-driven learning platforms were all listed as important.

The HR chiefs from Accenture, BlackRock, Delta Air Lines, Johnson & Johnson and The Walt Disney Co., who appeared on a panel, discussed their technology priorities and interests. They weren't picking and choosing vendors, and they made a point of not naming any of the vendors at the conference.

But this group of global HR leaders had a clear idea of what they thought was important to conference attendees and vendors. It was a strategic, but pointed, overview of how they are using technology and what their firms want from it.

Stellar HR requires a candidate-focused recruiting system

Johnson & Johnson interviews a million people a year to hire 28,000 individuals. “So, how do you make sure that they [the candidates] have visibility [into] how they’re tracking through the process, like you would track a Domino’s pizza or a UPS or a FedEx package?” asked Peter Fasolo, executive vice president and chief human resources officer (CHRO) at Johnson & Johnson, based in New Brunswick, N.J.

At BlackRock, talent is an ongoing executive board-level discussion, said Matt Breitfelder, managing director and chief talent officer. The New York-based company is using technology to help improve the diversity of its hiring.

The firm wants diversity on its teams, so its employees are “challenging each other to think more clearly about what they’re seeing in markets,” Breitfelder said.

BlackRock is using tools in its hiring process to make sure it is “not just replicating an industry that has tended to have one way of thinking,” Breitfelder said. “We know it’s about teams, not about individual stars.”

Data analytics makes us more human

“Data analytics makes us more human, because our own data analytics shows there’s a lot of liberal arts majors who make great investors, which is very counterintuitive,” Breitfelder said. 

Delta Air Lines has begun using machine learning and AI technologies to help discover “good predictors of success” in its hiring, said Joanne Smith, the company’s executive vice president and CHRO. “That’s going to help us get smarter and smarter and smarter about hiring,” she said.
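
Smith didn't detail Delta's models, but the underlying technique is standard supervised learning: fit a classifier on historical hiring outcomes and examine which candidate attributes carry the most weight. The sketch below is a minimal, hypothetical Python illustration with scikit-learn; the feature names, data and "success" label are invented for the example, and any real-world use would require careful validation and bias auditing.

```python
# Hypothetical sketch only (not Delta's system): use a simple classifier to
# surface "predictors of success" by fitting on past hiring outcomes and
# inspecting the learned weights. All features and data here are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
feature_names = ["structured_interview_score", "work_sample_score",
                 "referral_flag", "years_experience"]

# Hypothetical history: 500 past hires with a 1/0 "succeeded" label
# (e.g., met performance goals after one year).
X = rng.normal(size=(500, len(feature_names)))
y = (0.9 * X[:, 0] + 1.2 * X[:, 1] + 0.2 * X[:, 3]
     + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Larger absolute coefficients suggest stronger "predictors of success."
coefs = model.named_steps["logisticregression"].coef_[0]
for name, weight in sorted(zip(feature_names, coefs), key=lambda t: -abs(t[1])):
    print(f"{name:>28}: {weight:+.2f}")
```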

Learning and a focus on employee experience

Learning technology was also mentioned as a priority, and Accenture explained why that is. In response to the competition in the labor market, the firm decided to go big on training employees on entirely new skills.

“We democratized all of our learning,” said Ellyn Shook, chief leadership and human resource officer at Accenture, based in Dublin. Learning “is now in real time, on demand and available to our people anytime, anywhere, any device,” she said.

Some 300,000 of Accenture's 450,000 employees have taken advantage of it in the last two years, including training in some "leading-edge technical areas that there would be no way we could have hired at that scale," Shook said.

A common theme for the conference panel was the need for consumer-like HR technologies.

“Help me do what I’m doing. Help my employees be better at what we’re doing. But have a consumer mindset to it,” said Jayne Parker, senior executive vice president and CHRO of Disney.

Federal privacy regulations usher in the age of tech lawmakers

Tech companies that have successfully lobbied against stricter privacy regulations are facing pushback from consumers on their latest campaign to curtail data privacy rights.

Big tech's call for federal regulation comes amid rising consumer demand for privacy rights, as media coverage of data breaches has exposed companies' poor management of personal information and heightened consumers' data protection concerns.

“Consumers are seeing data breaches and privacy mistakes in the news every single day, and the breaches are getting larger in scope. And the number of individuals impacted seems to be larger for every single one,” said Nicholas Merker, partner and co-chair of the data security and privacy practice at Ice Miller, based in Indianapolis. “People understand that some companies are misusing their data or not protecting their data appropriately, and it’s creating a risk for these individuals.”

Shortly after GDPR — the European law that unified data privacy protection and specified consumer rights to their personal data — went into effect last spring, California passed the California Consumer Privacy Act (CCPA) of 2018. The new state law gives consumers the right to request details about the personal data collected by the companies they do business with and to have that data deleted without being penalized with degraded service.
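
In practice, honoring those two rights comes down to two request types a business must be able to service: an access request ("what do you hold on me?") and a deletion request that must not cut off or degrade the person's service. The Python sketch below is a minimal, hypothetical illustration of that workflow; the storage layer, class names and fields are invented, not drawn from the CCPA's text or any vendor's implementation.

```python
# Hypothetical sketch of servicing the two consumer rights described above:
# disclosing what personal data a business holds on a consumer and deleting
# it on request, without penalty to service. Names and fields are invented.
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class ConsumerRecord:
    email: str
    profile: Dict[str, Any] = field(default_factory=dict)   # e.g., name, address
    activity: Dict[str, Any] = field(default_factory=dict)  # e.g., purchase history
    active_account: bool = True  # service continues even after deletion

class PrivacyRequestHandler:
    def __init__(self, store: Dict[str, ConsumerRecord]):
        self.store = store

    def access_request(self, email: str) -> Dict[str, Any]:
        """Return the specific pieces of personal data held on the consumer."""
        rec = self.store[email]
        return {"profile": rec.profile, "activity": rec.activity}

    def deletion_request(self, email: str) -> None:
        """Erase personal data but keep the account usable (no penalty to service)."""
        rec = self.store[email]
        rec.profile.clear()
        rec.activity.clear()
        rec.active_account = True  # deletion must not degrade or deny service

# Example usage with an in-memory store
store = {"a@example.com": ConsumerRecord("a@example.com",
                                         {"name": "Ada"}, {"orders": [101, 102]})}
handler = PrivacyRequestHandler(store)
print(handler.access_request("a@example.com"))
handler.deletion_request("a@example.com")
print(handler.access_request("a@example.com"))  # now empty, account still active
```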

Now, tech giants like Facebook, IBM and Microsoft are playing offense, proposing federal privacy regulations that would override the California rules.

As the fight between state and federal laws plays out, CIOs and their data privacy experts may well find themselves advising their companies on where to come down on data privacy rights.

A company's best course will likely depend, in large part, on where it does business, how it makes money and how much its customers value data privacy.

Why the push for federal law?

Tech companies with multistate operations are gunning for a federal law to avoid having to comply with up to 50 competing state jurisdictions. Experts expect other states to begin following in California's footsteps by amending or creating state privacy laws.

The CCPA has certainly set the bar for other like-minded states, said Erin Illman, co-chair of Bradley’s cybersecurity and privacy practice group and member of the North Carolina Bar Association’s Privacy and Data Security Committee.

“You’re going to see the states that have taken a forward stance in privacy start to really look at California and say, ‘Maybe we need to amend our laws that are already on the books, but maybe we also need to put forward a similar law or something that even goes farther than California,'” Illman said.

But big tech's effort to get a federal law passed is not just about sparing itself the headache of state-specific compliance, experts said; it's also about protecting profits and business models that stricter rules could threaten.

And if the GDPR is taken as a model for U.S. legislation, its immediate aftermath must also be examined, Merker said.

“The GDPR is a great example of what [strict federal privacy legislation] would do to the behavioral advertising firm, targeted advertising firm, company index firm industry — it would destroy it,”  Merker said.

“When GDPR was implemented for publicly traded companies, you saw massive drops in stock prices; you saw some companies that just no longer existed, because their practices are no longer legitimate under the GDPR.”

Data: The new dollar

Data privacy experts advise CIOs to keep a close eye on the proposed legislation and its framework, including exactly whom it seeks to regulate.

For example, one of the proposals for the federal privacy regulations defines consumers as users who have purchased something from the company. Under this definition, social media businesses like Facebook and free email services like Gmail, which do not charge for their services or sell products, would have far fewer reportable consumers than sites that sell a product or charge even a nominal fee. A $1 yearly fee would make each individual a consumer whose privacy is protected, rather than a user who falls outside the regulation's scope.

Experts noted that this distinction shows the defining characteristic of online business: Data is money.

"Personal information is the currency of the internet — more so than bitcoin, more so than the dollar. [Data] is what is being bartered for services and then sold for revenue," said Nader Henein, research director of data protection and privacy at Gartner.

“Like any other currency, it needs to be regulated. Otherwise, it loses its value, and it’s inconsistent.”

Love affair gone sour

In the face of big tech’s all-out lobbying effort for the federal law, data privacy interest groups have not hung back. Instead, they are taking advantage of growing consumer sentiment that the titans of Silicon Valley can delight customers and still not have their best interest at heart.

The inability of businesses to prevent massive data breaches that expose sensitive information has also fueled consumers' desire for more control over their personal data.

A major item on the tech companies' wish list is self-regulation and the creation of industry guidelines with no legal or financial penalty for noncompliance. Trade groups such as the U.S. Chamber of Commerce, the Internet Association and the Information Technology Industry Council are all pushing for voluntary standards.

Tech companies’ C-suites claim they know exactly what data is being collected, how it’s used and, ultimately, how to protect it. They argue self-regulation allows for flexible compliance that protects privacy and the ability to remain profitable.

Privacy advocates, on the other hand, cite years of improper data management, privacy violations and data breaches as evidence of how trust between the general public and tech businesses has been whittled away.

“There’s a lot of trust that’s been lost between the general public and between privacy advocates and business,” Illman said. “Because of that loss of trust, the concept of self-regulation is something that privacy advocates are pushing back against and saying, ‘You know, we don’t really trust you to regulate yourselves.'”

So, what's the next move in this battle? Federal privacy regulations could be a positive change if companies develop strategies that are fair and transparent and that balance the benefits for company and user.

“Internationally, America seems like we are now behind the times when it comes to privacy law,” Merker said. “All privacy advocates want America to catch up and be standing with the rest of the world.”

A quick take on the State of Hybrid Cloud survey

What does hybrid cloud mean to IT professionals, and why are so many companies using it? Microsoft conducted a survey with research firm Kantar TNS in January 2018, asking more than 1,700 respondents to chime in. Surveys were collected from IT professionals, developers, and business decision makers to identify how they perceive hybrid cloud, what motivates adoption, and what features they see as most important. Survey participants in the United States, the United Kingdom, Germany, and India were asked their thoughts about hybrid, which for the survey was defined as consisting of "private cloud or on-premises resources/applications integrated with one or more public clouds". We've created a summary infographic of the survey that you can review. A few survey highlights:

  • Hybrid is common, with 67 percent of respondents now using or planning to deploy a hybrid cloud. Many made the move recently: 54 percent of hybrid users adopted it within the past two years.
  • Cost, a consistent IT experience, and the ability to scale quickly were all given as important reasons for moving to hybrid cloud.
  • The perceived benefits of hybrid cloud, as well as some of the challenges, vary by the geographic location of respondents. For example, increased security was the top benefit cited in the United Kingdom and Germany, while the top United States benefit was better scalability of compute resources.
  • The top use case given for hybrid cloud was controlling where important data is stored, at 71 percent. Using the cloud for backup and disaster recovery was a close second, at 69 percent.

We invite you to download and share our infographic on the state of today’s hybrid cloud. For a more complete review of the State of Hybrid Cloud 2018 survey findings, watch the on-demand webinar Among the Clouds: Enterprises Still Prefer Hybrid in 2018.

For more information about hybrid networking, identity, management and data on Azure, you can also check out this new Azure Essentials segment, Integrating Azure with your on-premises infrastructure.


State of Hybrid Cloud 2018 survey

Participants for this online survey were recruited from (non-Microsoft) local market lists selected by Microsoft and the international research firm Kantar TNS, which was hired to conduct the outreach. Survey participants included IT professionals, professional developers, and business decision makers/influencers who use, are planning, or have considered a hybrid cloud deployment. Surveyed companies ranged from mid-market to enterprise (250 or more employees). The survey was conducted January 4-24, 2018. For the purposes of this survey, hybrid cloud was defined as private cloud or on-premises resources/applications integrated with one or more public clouds.

Deloitte CIO survey: Traditional CIO role doesn’t cut it in digital era

CIOs who aren’t at the forefront of their companies’ digital strategies risk becoming obsolete — and they risk taking their IT departments with them.

The message isn’t new to IT executives, who have been counseled in recent years to take a leadership role in driving digital transformation. But new data suggests CIOs are struggling to make the shift. According to a recently published global CIO survey by Deloitte Consulting, 55% of business and technology leaders polled said CIOs are focused on delivering operational efficiency, reliability and cost-savings to their companies.

Kristi Lamar, managing director and U.S. CIO program leader at Deloitte and a co-author of the report, said IT executives who are serving in a traditional CIO capacity should take the finding as a clarion call to break out of that “trusted operator” role — and soon.

“If they don’t take a lead on digital, they’re ultimately going to be stuck in a trusted operator role, and IT is going to become a back office function versus really having a technology-enabled business,” she said. “The pace of change is fast and they need to get on board now.”

Taking on digital

"Manifesting legacy: Looking beyond the digital era" is the final installment of a three-part, multiyear CIO survey series on CIO legacy. The idea was to chronicle how CIOs and business leaders perceived the role and to explore how CIOs delivered value to their companies against the backdrop of digital transformation.

In the first installment, the authors developed three CIO pattern types. They are as follows:

  • Business co-creators: CIOs drive business strategy and enable change within the company to execute on the strategy.
  • Change instigators: CIOs lead digital transformation efforts for the enterprise.
  • Trusted operators: CIOs operate in a traditional CIO role and focus on operational efficiency and resiliency, as well as cost-savings efforts.

Based on their findings, the authors concluded that CIOs should expect to move between the three roles, depending on what their companies needed at a given point in time. But this year's CIO survey of 1,437 technology and business leaders suggested that isn't happening for the most part. "We have not seen a huge shift in the last four years of CIOs getting out of that trusted operator role," Lamar said.

Indeed, 44% of the CIOs surveyed reported they lead neither the development nor the execution of their companies' digital strategy.

The inability of CIOs to break out of the trusted operator role is a two-way street. Lamar said that companies still see CIOs as — and need CIOs to be — trusted operators. But while CIOs must continue to be responsible for ensuring a high level of operational excellence, they also need to help their companies move away from what’s quickly becoming an outdated business-led, technology-enabled mindset.

The more modern view is that every company is a technology company, which means CIOs need to delegate responsibility for trustworthy IT operations and — as the company’s top technology expert — take a lead role in driving business strategy.

“The reality is the CIO should be pushing that trusted operator role down to their deputies and below so that they can focus their time and energy on being far more strategic and be a partner with the business,” she said.

Take your seat at the table

To become a digital leader, a trusted operator needs to “take his or her seat at the table” and change the corporate perception of IT, according to Lamar. She suggested they build credibility and relationships with the executive team and position themselves as the technology evangelist for the company.

“CIOs need to be the smartest person in the room,” she said. “They need to be proactive to educate, inform and enable the business leaders in the organization to be technology savvy and tech fluent.”

Trusted operators can get started by seeing any conversation they have with business leaders about digital technology as an opportunity to begin reshaping their relationship.

If they’re asked by the executive team or the board about technology investments, trusted operators should find ways to plant seeds on the importance of using new technologies or explain ways in which technology can drive business results. This way, CIOs continue to support the business while bringing to the discussion “the art of the possible and not just being an order taker,” Lamar said.

Next, become a ‘digital vanguard’

Ultimately, CIOs want to help their organizations join what Deloitte calls the "digital vanguard": companies that have a clear digital strategy and view their IT function as a market leader in digital and emerging technologies.

Lamar said organizations she and her co-authors identified as “digital vanguards” — less than 10% of those surveyed — share a handful of traits. They have a visible digital strategy that cuts across the enterprise. In many cases, IT — be it a CIO or a deputy CIO — is leading the execution of the digital strategy.

CIOs who work for digital vanguard companies have found ways to shift a percentage of their IT budgets away from operational expenses to innovation. According to the survey, baseline organizations spend on average about 56% of their budgets on business operations and 18% on business innovation versus 47% and 26% respectively at digital vanguard organizations.

Digital vanguard CIOs also place an emphasis on talent by thinking about retention and how to retool employees who have valuable institutional knowledge for the company. And they seek out well-rounded hires, employees who can bring soft skills, such as emotional intelligence, to the table, Lamar said.

Talent is top of mind for most CIOs, but digital vanguards have figured out how to build environments for continuous learning and engagement to both attract and retain talent. Lamar called this one of the hardest gaps to close between organizations that are digital vanguards and those that aren’t. “The culture of these organizations tends to embrace and provide opportunities for their people to do new things, play with new tools or embrace new technologies,” she said.

Juniper preps 400 GbE across PTX, MX and QFX hardware

Juniper plans to add 400 Gigabit Ethernet across its PTX and MX routers and QFX switches as internet companies and cloud providers gear up for the higher throughput needed to meet global demand from subscribers.

Juniper said this week it would roll out higher speed ports in the three product series over the next 12 months. The schedule is in line with analysts' predictions that vendors would start shipping 400 GbE devices this year.

Juniper will market the devices for several uses, including a data center backbone, internet peering, data center interconnect, a metro core, telecommunication services and a hyperscale data center IP fabric.

The announcement follows by a month Juniper’s release of the 400 GbE-capable Penta, a 16 nanometer (nm) packet-forwarding chipset that consumes considerably less energy than Juniper’s other silicon. Juniper designed the Penta for carriers rearchitecting their data centers to deliver 5G services.

Penta is destined for some of the new hardware, which will help Juniper meet carrier demand for more speed, said Eric Hanselman, an analyst at New York-based 451 Research.

“Juniper has such a strong base with service providers and network operators and they’re already seeing strong pressure for higher capacity,” Hanselman said. “Getting the Penta silicon out into the field on new platforms could help to move Juniper forward [in the market].”

The upcoming hardware will also use a next-generation ExpressPlus chipset and Q5 application-specific integrated circuit. The Juniper silicon will provide better telemetry and support for VXLAN and EVPN, the company said.

Cloud developers use EVPN, VXLAN and the Border Gateway Protocol to set up a multi-tenancy network architecture that supports multiple customers. The design isolates customers so data and malware can’t travel between them.
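
As a rough illustration of that isolation model: in an EVPN/VXLAN design, each tenant is typically mapped to its own VXLAN Network Identifier (VNI), and BGP EVPN route targets control which routes each tenant imports and exports. The Python sketch below is a conceptual model only, not vendor configuration; the tenant names and numbering are hypothetical.

```python
# Conceptual sketch (not Juniper configuration): each tenant gets its own VNI
# and EVPN route target, so encapsulated traffic and MAC/IP routes stay inside
# that tenant's segment. Tenant names and numbers below are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class TenantSegment:
    name: str
    vni: int           # VXLAN Network Identifier carried in the packet header
    route_target: str  # BGP EVPN route target controlling route import/export

SEGMENTS = {
    "tenant-a": TenantSegment("tenant-a", vni=10010, route_target="65000:10010"),
    "tenant-b": TenantSegment("tenant-b", vni=10020, route_target="65000:10020"),
}

def can_forward(src_tenant: str, dst_tenant: str) -> bool:
    """Frames are only delivered within a matching VNI; other VNIs are dropped."""
    return SEGMENTS[src_tenant].vni == SEGMENTS[dst_tenant].vni

print(can_forward("tenant-a", "tenant-a"))  # True: same segment, same tenant
print(can_forward("tenant-a", "tenant-b"))  # False: segments are isolated
```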

For the IP transport layer, Juniper plans to introduce in the second half of the year the 3-RU PTX10003 Packet Transport Router for the backbone, internet peering and data center interconnect applications. The hardware supports 100 and 400 GbE and plugs into an existing multirate QSFP-DD fiber connector system for a more straightforward speed upgrade. The Juniper system provides MACSec support for 160 100 GbE interfaces and FlexE support for 32 400 GbE interfaces. The upcoming ExpressPlus silicon powers the device.

Also in the second half of the year, Juniper plans to release the QFX10003 switch for the data center. The system packs 32 400 GbE interfaces into 3-RU hardware and can scale up to 160 100 GbE interfaces. The next-generation Q5 chip will power the system.

In the first half of next year, Juniper expects to release the QFX5220 switch, which will offer up to 32 400 GbE interfaces in a 1-RU system. The Q5-powered hardware also supports a mix of 50, 100 and 400 GbE for server and inter-fabric connectivity.

Finally, for wide-area network services, Juniper plans to release Penta-powered 400 GbE MPC10E line cards for the MX960, MX480 and MX240 at the start of next year.

Juniper is likely to face stiff competition in the 400 GbE market from Cisco and Arista. Initially, prices for the high-speed interfaces will be too high for many companies. However, Hanselman expects that to change over time.

“The biggest challenge with 400 GbE is getting interface prices to a point where they can open up new possibilities,” he said. “[But] healthy competition is bound to make this happen.”

Indeed, in 2017, competition for current hardware drove Ethernet bandwidth costs down to a six-year low, according to analyst firm Crehan Research Inc., based in San Francisco. By 2022, 400 GbE will account for the majority of Ethernet bandwidth from switches, Crehan predicts.

GE and Microsoft enter into their largest partnership to date, accelerating industrial IoT adoption for customers

Expanded partnership will help industrial companies capture greater intelligence from IoT and asset data and boost GE innovation across its business

SAN RAMON, Calif. & REDMOND, Wash. — JULY 16, 2018 — GE (NYSE: GE) and Microsoft Corp. (Nasdaq: MSFT) today announced an expanded partnership, bringing together operational technology and information technology to eliminate hurdles industrial companies face in advancing digital transformation projects. As part of the partnership, GE Digital plans to standardize its Predix solutions on Microsoft Azure and will deeply integrate the Predix portfolio with Azure's native cloud capabilities, including Azure IoT and Azure Data and Analytics. The parties will also co-sell and go to market together, offering end customers premier Industrial IoT (IIoT) solutions across verticals. In addition, GE will leverage Microsoft Azure across its business for additional IT workloads and productivity tools, including internal Predix-based deployments, to drive innovation across the company.

According to Gartner, companies have evolved from “talking about” to implementing IoT proofs of concept (POCs) and pilots. While POC projects tend to be easy to start, few enterprises have ramped up large-scale initiatives.* The GE-Microsoft partnership helps industrial customers streamline their digital transformations by combining GE Digital’s leading IIoT solutions that ingest, store, analyze and act on data to drive greater insight with Microsoft’s vast cloud footprint, helping customers transform their operations at the enterprise level.

Advancing Industrial IoT Applications

GE Digital’s Predix is the application development platform that equips industrial organizations with everything they need to rapidly build, securely deploy and effectively run IIoT applications from edge to cloud, turning asset data into actionable insights. Leading industrial companies such as BP, Exelon, Schindler and Maersk are using GE Digital’s solutions – including flagship applications  Predix Asset Performance Management and Predix ServiceMax – as well as thousands of Predix-based apps created by customers and partners to improve operations and efficiency of their assets. Tapping into the power of Azure will help accelerate adoption of the Predix portfolio. The partnership brings together GE Digital’s expertise in industrial data and applications with Microsoft’s enterprise cloud, helping customers speed deployment of industrial applications and achieve tangible outcomes faster, ultimately fueling growth and business innovation.

Driving Innovation across GE

GE also plans to leverage Azure across the company for a wide range of IT workloads and productivity tools, accelerating digital innovation and driving efficiencies. This partnership also enables GE's businesses to tap into Microsoft's advanced enterprise capabilities, which will support the petabytes of data managed by the Predix platform across operations such as GE's monitoring and diagnostics centers and its internal manufacturing and services programs.

Microsoft has announced 54 Azure regions across the globe, with 42 currently available – more than any other major cloud provider. Its cloud meets a broad set of international standards and compliance requirements to ensure customer solutions can scale globally. This partnership also enhances the security layer within the Predix platform, which meets the specialized requirements of industries such as aviation, power and utilities. Leveraging Azure enables GE to expand its cloud footprint globally, helping the companies' mutual customers rapidly deploy IIoT applications.

The global IoT market is expected to be worth $1.1 trillion in revenue by 2025 as market value shifts from connectivity to platforms, applications and services, according to new data from GSMA Intelligence.

“Every industrial company will have to master digital to compete in the future, connecting machines and making them more intelligent to drive efficiency and productivity,” said Bill Ruh, Chief Digital Officer, GE and CEO, GE Digital. “Our customers are asking for us to make a deeper connection between our two companies. Through this expanded partnership, Microsoft and GE are enabling customers around the world to harness the power of the Predix portfolio, including Predix Asset Performance Management, to unlock new capabilities to drive growth.”

“The industrial sector plays an important role in fueling economies around the world,” said Judson Althoff, Executive Vice President, Microsoft. “With this strategic partnership, GE and our mutual customers will benefit from a trusted platform with the flexibility and scalability to deliver unprecedented results and help advance their business potential.” 

As part of this expanded partnership, the companies will go to market together and also explore deeper integration of Predix IIoT solutions with Power BI, PowerApps and other third-party solutions, as well as integration with Microsoft Azure Stack to enable hybrid deployments across public and private clouds.

*Gartner, Hype Cycle for the Internet of Things, 2017, 24 July 2017

###

About GE Digital

GE Digital is reimagining how industrials build, operate and service their assets, unlocking machine data to turn valuable insights into powerful business outcomes. GE Digital’s Predix portfolio – including the leading Asset Performance Management, Field Service Management and MES applications – helps its customers manage the entire asset lifecycle. Underpinned by Predix, the leading application development platform for the Industrial Internet, GE Digital enables industrial businesses to operate faster, smarter and more efficiently. For more information, visit www.ge.com/digital.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

Contacts:

Amy Sarosiek, GE Digital

+1 224 239 6028

amy.sarosiek@ge.com

 

Microsoft Media Relations

WE Communications for Microsoft

(425) 638-7777

Broadcom acquisition of CA seeks broader portfolio

The out-of-the-blue Broadcom acquisition of CA Technologies has analysts scratching their heads about how the two companies’ diverse portfolios weave together strategically, and how customers might feel the impacts — beneficial or otherwise.

CA’s strength in mainframe and enterprise infrastructure software, the latter of which is a growing but fragmented market, gives chipmaker Broadcom another building block to create an across-the-board infrastructure technology company, stated Hock Tan, president and CEO of Broadcom.

But vaguely worded statements from both companies’ execs lent little insight into potential synergies and strategic short- or long-term goals of the $18.9 billion deal.

One analyst believes the deal is driven primarily by financial and operational incentives, and whatever technology synergies the two companies create are a secondary consideration for now.

“The operating margins from mainframes are very healthy and that fits very well with Broadcom’s financial model,” said Stephen Elliot, an analyst at IDC.

The bigger issue will be Broadcom’s ability to manage the diverse software portfolio of a company the size of CA. To date, Broadcom’s acquisition strategy has focused almost exclusively on massive deals for hardware companies, in areas such as storage, wireless LAN and networking. “The question is, is this too far of a reach for them? Customers are going to have to watch this closely,” Elliot said.

The overall track record of acquisitions that combine hardware-focused companies and large software companies is not good, Elliot noted. He pointed to the failures of Intel’s acquisition of LANDesk and Symantec’s purchase of Veritas.

Broadcom’s ability to manage CA’s complex and interwoven product portfolio is another concern.

“As far as I can see, Broadcom has little or no visible prior execution or knowledge about a complicated and nuanced software and technology arena such as the one CA addresses … that includes DevOps, agile and security,” said Melinda Ballou, research director for IDC’s application life-cycle management program. “Infrastructure management would be more in their line of work, but still very different.”

Broadcom’s acquisition of CA also fills a need to diversify, particularly in the aftermath of its failed attempt to buy Qualcomm earlier this year, which was blocked by the Trump administration for national security reasons.

“They need to diversify their offerings to be more competitive given they primarily focus on chips, networking and the hardware space,” said Judith Hurwitz, president and CEO of Hurwitz & Associates LLC. “CA has done a lot of work on the operational and analytics side, so maybe [Broadcom] is looking at that as a springboard into the software enablement space.”

Hurwitz does see opportunities for both companies to combine their respective products, particularly in network management and IoT security. And perhaps this deal portends more acquisitions will follow, potentially among companies that compete directly or indirectly with CA. Both Broadcom and CA have pursued growth through numerous acquisitions in recent years.

“You could anticipate Broadcom goes on a spending spree, going after other companies that are adjacent to what CA does,” Hurwitz said. “For example, there was talk earlier this year that CA and BMC would merge, so BMC could be a logical step with some synergy there.”

IT spending points to cloud-focused data center architectures

Companies are dramatically changing the architectures of their private data centers in preparation for eventually running more business applications across multiple cloud providers.

The transformational changes underway include higher network speeds, more server diversity and an increase in software-defined storage, an IHS Markit survey of 151 North American enterprises found. The strategy behind the revamping is to make the data center “a first-class citizen as enterprises build their multi-clouds.”

Companies are increasing network speeds to meet an expected rise in data flowing through enterprise data centers. A total of 68% of companies are increasing the capacity of the network fabric while 62% are buying technology to automate the movement of virtual machines and support network virtualization protocols, London-based IHS reported. The three trends are consistent with the building of cloud-based data center architectures.

IHS also found 49% of the survey respondents planned to increase spending on switches and routers — virtual and physical — to keep up with traffic flow. The top five Ethernet switch vendors were Cisco, Dell, Hewlett Packard Enterprise, Juniper and Huawei.

Companies turning to containers in new data center architectures

Companies are also increasing the use of containers to run applications in cloud computing environments. The shift is affecting the use of hypervisors, which are the platforms for running virtual machines in data centers today. IHS expects the share of servers running hypervisors to fall from 37% to 30% by 2019.

“That’s a transition away from server virtualization potentially toward software containers,” IHS analyst Clifford Grossner said. “End users are looking to use more container-[based] software.”

IHS found 77% of companies planned to increase spending on servers with the number of physical devices expected to double on average. Enterprises plan to run hypervisors or containers on 73% of their servers by 2019, up from 70% today.

“We’re seeing that progression where more and more servers are multi-tenant — that is running multiple applications,” Grossner said. “We’re seeing the density being packed tighter on servers.”

High density and multi-tenancy on servers are also attributes of cloud-focused data center architectures.

The rise of the one-socket server

Whenever possible, companies are buying one-socket servers to lower capital expenditures. IHS expects the cheaper hardware to account for 9% of corporate servers by 2019, up from 3% today.

“The one-socket server market is offering more powerful options that are able to satisfy the needs of more workloads at a better price point,” Grossner said.

Finally, IHS found an upswing in storage spending: 53% of companies planned to spend more on software-defined storage, 52% on network-attached storage and 42% on solid-state drives.

As enterprises rearchitect their data centers, they are also spending more on public cloud services and infrastructure. IDC expects spending on the latter to reach $160 billion this year, an increase of more than 23% over last year. By 2021, spending will reach $277 billion, representing an annual increase of nearly 22%.

Execs: Content management in the cloud not as easy as it looks

TORONTO — Companies like Oracle, SAP and Microsoft are pushing content management in the cloud, and they’re joined by OpenText, which announced the containerization of its systems for use on public clouds, such as Microsoft Azure, Google Cloud and AWS.

“Friends don’t let friends buy data centers.” That was OpenText CEO and CTO Mark Barrenechea’s recurring joke during his OpenText Enterprise World 2018 keynote, during which the company unveiled its cloud- and DevOps-friendly OT2 platform.

Barrenechea later clarified to reporters that while some customers are standardizing on AWS and Azure, most OpenText cloud customers are on OpenText’s private cloud. Opening OpenText apps and microservices, such as its Magellan AI tools, to the public clouds will also open up new markets for content management in the cloud, Barrenechea said.

But several speakers from the stage — including celebrity nonfiction writer and Toronto native Malcolm Gladwell — cautioned that while the cloud might bring convenience and freedom from data center upkeep, it also brings challenges.

The two most frequently mentioned were data security and process automation, along with a related issue: automating bad or unnecessarily complicated processes that should have been fixed before companies began their digital transformations.

Data security getting more complicated

The internet of things and mobile devices represent a major security vulnerability that, if left unsecured, can multiply risk and create entry points for hackers to penetrate networks. Opening up content management in the cloud — and the necessary multiplication of data transactions that comes with it — can spread that risk outside the firewall.

Persistent connectivity is the challenge for Zoll Medical's personal defibrillators, said Jennifer Bell, enterprise CMS architect at the company. Zoll Medical's IoT devices not only connect the patient to the device, but also send the data to caregivers and insurance providers in a regulatory-compliant way, which requires data security throughout.

“Security is huge, with HIPAA [Health Insurance Portability and Accountability Act] and everything,” she said.

IT leaders are just beginning to grasp the scale of risks.

At the National Institute of Allergy and Infectious Diseases (NIAID), even “smart microscopes” with which researchers take multi-gigabyte, close-up images have to check in with their manufacturer’s servers every night, said Matt Eisenberg, acting chief of NIAID’s business processes and information branch.

“Every evening, when the scientists are done with those devices, it has to phone home and recalibrate. And this is blowing the infrastructure guys away, because they’re not used to allowing this kind of bidirectional communication from something that really doesn’t look or feel like a computer or a laptop,” Eisenberg said.

Author Malcolm Gladwell delivering the keynote at OpenText Enterprise World 2018

Meanwhile, Gladwell warned that data security threats are coming from every direction, inside and outside of organizations, and from new perpetrators.

Security of content management in the cloud also came under the spotlight when Chelsea Manning and Edward Snowden were able to steal sensitive military documents and hand them over to WikiLeaks, Gladwell said.

Government data security experts are having a hard time preventing another such breach, he continued, because security threats are rapidly changing. The feds, however, haven't changed; they're stuck with Cold War-era systems and processes focused on a particular enemy and its operatives.

“It’s no longer that you have a short list of people high up that you have to worry about. Now, you have to worry about everyone,” Gladwell said. “If you have 854,000 people with top-secret clearances, I would venture to say that it’s no longer top-secret.”

Cloud: BPM boon or problem?

Content management in the cloud by way of SaaS apps can also bring process automation, AI and analytics tools to content formerly marooned in on-premises data silos. It can also extend a workforce beyond office walls, giving remote, traveling or field-based workers access to the same content their commuting co-workers get.

That’s if it’s done right.

Kyle Hufford, digital asset management director at Monster Energy, based in Corona, Calif., serves rich media content to an international marketing team that must comply with many national, state and local regulations, as well as standardized internal processes, approval trees and branding rules.

His job, he said, is opening access to Monster Energy’s sometimes-edgy content worldwide, while ensuring end users stay compliant.

The work starts with a detailed examination of how a process is done before moving it into the cloud.

“People think there [are] complexities around approvals and how to get things done,” Hufford said. “In reality, they can take a 15-step process and make it a two- or three-step process and save everybody time.”

Panelists at OpenText Enterprise World 2018, from left to right: Mark Barrenechea, OpenText CEO and CTO; Gopal Padinjaruveetil, vice president and chief information security officer at The Auto Club Group; Jennifer Bell, enterprise content management architect and analyst at Zoll Medical; Kyle Hufford, director of digital asset management at Monster Energy; and Matt Eisenberg, acting chief of the U.S. NIAID business process and information management branch.

As mature companies like SAP, Microsoft, OpenText and Oracle make big pushes into the cloud and bring their big customers along to migrate from on-premises systems, process issues like these are bound to happen, said Craig Wentworth, principal analyst for U.K.-based MWD Advisors.

Wentworth advised enterprise IT leaders to take a critical look at the vendor’s model in the evaluation stage before embarking on a project for content management in the cloud.

“I worry that, sometimes … software firms that have been around for a long time [and add] cloud are coming to it from a very different place than those who are born in the cloud,” Wentworth said. “Whilst they will be successful certainly with their existing customers, they’ve got a different slant to it.”

Enterprise digital strategy: Design use cases that create business value

Digital transformation has become a top business priority, with many companies across industries focused on transforming their systems, business models and customer engagement to ensure these digital processes create value for the organization.

This makes carving out an effective enterprise digital strategy paramount to success. But, too often, organizations focus on the technological aspects of the transformation and ignore the business side of the equation, according to speakers at the recent LiveWorx 2018 conference in Boston. When building an enterprise digital strategy, organizations should start by looking at the fundamental business problems their leaders want to solve, and then move on to exploring how they can use technology to solve them, according to LiveWorx speaker Sarah Peach, senior director of business development at Samsung Electronics.

If they start by experimenting with the capabilities of a technology, they might end up with something that works but is of no value to their business, Peach explained during her session, titled "The Next Frontier for Digital Businesses."

“Starting with the business problem — which then dictates the application and the technologies that you want to use to support that — is the approach that successful companies are taking,” she said.

Co-panelist Anand Krishnan said an enterprise digital strategy should be bucketed into three broad areas — products, platforms and partnerships — to help develop comprehensive use cases that benefit customers.

“Everyone is looking at digital touchpoints, but [should] set the focus on the journey itself, which is very important,” said Krishnan, vice president of big data and AI at Harman Connected Services. 

An enterprise digital strategy has to also meet the needs of the organization’s overall business strategy, said Jeffrey Miller, vice president of customer success and advisory services at PTC, based in Needham, Mass.

“You cannot produce a digital strategy without understanding your business strategy,” Miller said during his session, titled “Digital Transformation: Creating a Pragmatic Vision and Strategy.”

To move their digital transformation programs forward, organizations should couple their business strategy with their innovation goals and their digital strategy, then design use cases that create value for the business, he said.

The evolving enterprise digital strategy in an IoT era

As companies deal with increasing numbers of connected systems, products and people, their enterprise digital strategy should address how to bring these areas together to meet one of two primary business objectives, said Ranjeet Khanna, a co-panelist of Peach and Krishnan.

“Either create efficiencies for the use cases that they are dealing with, or create a new revenue opportunity,” said Khanna, director of product management for IoT, public key infrastructure and embedded security solutions at Entrust Datacard, based in Shakopee, Minn.

Designing connected products is no longer just about mechanical and physical design; manufacturers now have to worry about software design as well, Peach said. The manufacturing industry has therefore seen its digital strategies evolve rapidly in the last couple of years, she added.

According to a recent Capgemini study, manufacturers estimate 47% of all their products will be smart and connected by 2020.

“If you are an OEM, you are now expected to produce a smart, connected product, and all of your digital systems have to change to support that,” Peach said.

The data generated from connected products throughout their lifecycle is another big change that manufacturers deal with today, she said.

“Your digital strategy has to start at the design side and follow all the way through to the end of life of your product,” she said.