Deloitte CIO survey: Traditional CIO role doesn’t cut it in digital era

CIOs who aren’t at the forefront of their companies’ digital strategies risk becoming obsolete — and they risk taking their IT departments with them.

The message isn’t new to IT executives, who have been counseled in recent years to take a leadership role in driving digital transformation. But new data suggests CIOs are struggling to make the shift. According to a recently published global CIO survey by Deloitte Consulting, 55% of business and technology leaders polled said CIOs are focused on delivering operational efficiency, reliability and cost-savings to their companies.

Kristi Lamar, managing director and U.S. CIO program leader at Deloitte and a co-author of the report, said IT executives who are serving in a traditional CIO capacity should take the finding as a clarion call to break out of that “trusted operator” role — and soon.

“If they don’t take a lead on digital, they’re ultimately going to be stuck in a trusted operator role, and IT is going to become a back office function versus really having a technology-enabled business,” she said. “The pace of change is fast and they need to get on board now.”

Taking on digital

“Manifesting legacy: Looking beyond the digital era” is the final installment of a three-part, multiyear CIO survey series on CIO legacy. The idea was to chronicle how CIOs and business leaders perceived the role and to explore how CIOs delivered value to their companies against the backdrop of digital transformation.

In the first installment, the authors developed three CIO pattern types. They are as follows:

  • Business co-creators: CIOs drive business strategy and enable change within the company to execute on the strategy.
  • Change instigators: CIOs lead digital transformation efforts for the enterprise.
  • Trusted operators: CIOs operate in a traditional CIO role and focus on operational efficiency and resiliency, as well as cost-savings efforts.

Based on their findings, the authors decided that CIOs should expect to move between the three roles, depending on what their companies needed at a given point in time. But this year’s CIO survey of 1,437 technology and business leaders suggested that isn’t happening for the most part. “We have not seen a huge shift in the last four years of CIOs getting out of that trusted operator role,” Lamar said.

The pace of change is fast and they need to get on board now.
Kristi Lamar, managing director, Deloitte

Indeed, 44% of the CIOs surveyed reported they don’t lead digital strategy development or lead the execution of that strategy.

The inability of CIOs to break out of the trusted operator role is a two-way street. Lamar said that companies still see CIOs as — and need CIOs to be — trusted operators. But while CIOs must continue to be responsible for ensuring a high level of operational excellence, they also need to help their companies move away from what’s quickly becoming an outdated business-led, technology-enabled mindset.

The more modern view is that every company is a technology company, which means CIOs need to delegate responsibility for trustworthy IT operations and — as the company’s top technology expert — take a lead role in driving business strategy.

“The reality is the CIO should be pushing that trusted operator role down to their deputies and below so that they can focus their time and energy on being far more strategic and be a partner with the business,” she said.

Take your seat at the table

To become a digital leader, a trusted operator needs to “take his or her seat at the table” and change the corporate perception of IT, according to Lamar. She suggested they build credibility and relationships with the executive team and position themselves as the technology evangelist for the company.

“CIOs need to be the smartest person in the room,” she said. “They need to be proactive to educate, inform and enable the business leaders in the organization to be technology savvy and tech fluent.”

Trusted operators can get started by seeing any conversation they have with business leaders about digital technology as an opportunity to begin reshaping their relationship.

If they’re asked by the executive team or the board about technology investments, trusted operators should find ways to plant seeds on the importance of using new technologies or explain ways in which technology can drive business results. This way, CIOs continue to support the business while bringing to the discussion “the art of the possible and not just being an order taker,” Lamar said.

Next, become a ‘digital vanguard’

Ultimately, CIOs want to help their organizations join what Deloitte calls the “digital vanguard”: companies that have a clear digital strategy and view their IT function as a market leader in digital and emerging technologies.

Lamar said organizations she and her co-authors identified as “digital vanguards” — less than 10% of those surveyed — share a handful of traits. They have a visible digital strategy that cuts across the enterprise. In many cases, IT — be it a CIO or a deputy CIO — is leading the execution of the digital strategy.

CIOs who work for digital vanguard companies have found ways to shift a percentage of their IT budgets away from operational expenses to innovation. According to the survey, baseline organizations spend on average about 56% of their budgets on business operations and 18% on business innovation versus 47% and 26% respectively at digital vanguard organizations.

Digital vanguard CIOs also place an emphasis on talent by thinking about retention and how to retool employees who have valuable institutional knowledge for the company. And they seek out well-rounded hires, employees who can bring soft skills, such as emotional intelligence, to the table, Lamar said.

Talent is top of mind for most CIOs, but digital vanguards have figured out how to build environments for continuous learning and engagement to both attract and retain talent. Lamar called this one of the hardest gaps to close between organizations that are digital vanguards and those that aren’t. “The culture of these organizations tends to embrace and provide opportunities for their people to do new things, play with new tools or embrace new technologies,” she said.

Juniper preps 400 GbE across PTX, MX and QFX hardware

Juniper plans to add 400 Gigabit Ethernet across its PTX and MX routers and QFX switches as internet companies and cloud providers gear up for the higher throughput needed to meet global demand from subscribers.

Juniper said this week it would roll out higher-speed ports in the three product series over the next 12 months. The schedule is in line with analysts’ predictions that vendors would start shipping 400 GbE devices this year.

Juniper will market the devices for several uses, including a data center backbone, internet peering, data center interconnect, a metro core, telecommunication services and a hyperscale data center IP fabric.

The announcement comes a month after Juniper released the 400 GbE-capable Penta, a 16-nanometer (nm) packet-forwarding chipset that consumes considerably less energy than Juniper’s other silicon. Juniper designed the Penta for carriers rearchitecting their data centers to deliver 5G services.

Penta is destined for some of the new hardware, which will help Juniper meet carrier demand for more speed, said Eric Hanselman, an analyst at New York-based 451 Research.

“Juniper has such a strong base with service providers and network operators and they’re already seeing strong pressure for higher capacity,” Hanselman said. “Getting the Penta silicon out into the field on new platforms could help to move Juniper forward [in the market].”

The upcoming hardware will also use a next-generation ExpressPlus chipset and Q5 application-specific integrated circuit. The Juniper silicon will provide better telemetry and support for VXLAN and EVPN, the company said.

Cloud developers use EVPN, VXLAN and the Border Gateway Protocol to set up a multi-tenancy network architecture that supports multiple customers. The design isolates customers so data and malware can’t travel between them.
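
To make that isolation model concrete, here is a minimal sketch (plain Python, not vendor configuration; the tenant names and VNI numbers are invented): each tenant maps to its own VXLAN Network Identifier, and traffic is deliverable only between endpoints that share a VNI.

```python
# Toy model of VXLAN-based tenant isolation: each tenant gets its own
# VXLAN Network Identifier (VNI), and frames are forwarded only between
# endpoints in the same VNI. Tenant names and VNIs are illustrative.

TENANT_VNI = {"tenant-a": 10010, "tenant-b": 10020}

# Endpoint -> VNI membership, as a BGP EVPN control plane would advertise.
ENDPOINT_VNI = {
    "vm-a1": TENANT_VNI["tenant-a"],
    "vm-a2": TENANT_VNI["tenant-a"],
    "vm-b1": TENANT_VNI["tenant-b"],
}

def can_forward(src: str, dst: str) -> bool:
    """Traffic is deliverable only if both endpoints share a VNI."""
    return ENDPOINT_VNI[src] == ENDPOINT_VNI[dst]

assert can_forward("vm-a1", "vm-a2")      # same tenant: allowed
assert not can_forward("vm-a1", "vm-b1")  # cross-tenant: isolated
```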

For the IP transport layer, Juniper plans to introduce in the second half of the year the 3-RU PTX10003 Packet Transport Router for backbone, internet peering and data center interconnect applications. The hardware supports 100 and 400 GbE and plugs into an existing multirate QSFP-DD fiber connector system for a more straightforward speed upgrade. The Juniper system provides MACsec support for 160 100 GbE interfaces and FlexE support for 32 400 GbE interfaces. The upcoming ExpressPlus silicon powers the device.

Also, in the second half of the year, Juniper plans to release for the data center the QFX10003 switch. The system packs 32 400 GbE interfaces in 3-RU hardware that can scale up to 160 100 GbE. The next-generation Q5 chip will power the system.

In the first half of next year, Juniper expects to release the QFX5220 switch, which will offer up to 32 400 GbE interfaces in a 1-RU system. The Q5-powered hardware also supports a mix of 50, 100 and 400 GbE for server and inter-fabric connectivity.

Finally, for wide-area network services, Juniper plans to release Penta-powered 400 GbE MPC10E line cards for the MX960, MX480 and MX240. The vendor plans to release those products early next year.

Juniper is likely to face stiff competition in the 400 GbE market from Cisco and Arista. Initially, prices for the high-speed interfaces will be too high for many companies. However, Hanselman expects that to change over time.

“The biggest challenge with 400 GbE is getting interface prices to a point where they can open up new possibilities,” he said. “[But] healthy competition is bound to make this happen.”

Indeed, in 2017, competition for current hardware drove Ethernet bandwidth costs down to a six-year low, according to analyst firm Crehan Research Inc., based in San Francisco. By 2022, 400 GbE will account for the majority of Ethernet bandwidth from switches, Crehan predicts.

GE and Microsoft enter into their largest partnership to date, accelerating industrial IoT adoption for customers

Expanded partnership will help industrial companies capture greater intelligence from IoT and asset data and boost GE innovation across its business

SAN RAMON, Calif. & REDMOND, Wash. — JULY 16, 2018 — GE (NYSE: GE) and Microsoft Corp. (Nasdaq: “MSFT”) today announced an expanded partnership, bringing together operational technology and information technology to eliminate hurdles industrial companies face in advancing digital transformation projects. As part of the union, GE Digital plans to standardize its Predix solutions on Microsoft Azure and will deeply integrate the Predix portfolio with Azure’s native cloud capabilities, including Azure IoT and Azure Data and Analytics. The parties will also co-sell and go to market together, offering end customers premier Industrial IoT (IIoT) solutions across verticals. In addition, GE will leverage Microsoft Azure across its business for additional IT workloads and productivity tools, including internal Predix-based deployments, to drive innovation across the company.

According to Gartner, companies have evolved from “talking about” to implementing IoT proofs of concept (POCs) and pilots. While POC projects tend to be easy to start, few enterprises have ramped up large-scale initiatives.* The GE-Microsoft partnership helps industrial customers streamline their digital transformations by combining GE Digital’s leading IIoT solutions that ingest, store, analyze and act on data to drive greater insight with Microsoft’s vast cloud footprint, helping customers transform their operations at the enterprise level.

Advancing Industrial IoT Applications

GE Digital’s Predix is the application development platform that equips industrial organizations with everything they need to rapidly build, securely deploy and effectively run IIoT applications from edge to cloud, turning asset data into actionable insights. Leading industrial companies such as BP, Exelon, Schindler and Maersk are using GE Digital’s solutions – including flagship applications Predix Asset Performance Management and Predix ServiceMax – as well as thousands of Predix-based apps created by customers and partners to improve operations and efficiency of their assets. Tapping into the power of Azure will help accelerate adoption of the Predix portfolio. The partnership brings together GE Digital’s expertise in industrial data and applications with Microsoft’s enterprise cloud, helping customers speed deployment of industrial applications and achieve tangible outcomes faster, ultimately fueling growth and business innovation.

Driving Innovation across GE

GE also plans to leverage Azure across the company for a wide range of IT workloads and productivity tools, accelerating digital innovation and driving efficiencies. The partnership also enables the different GE businesses to tap into Microsoft’s advanced enterprise capabilities, which will support the petabytes of data managed by the Predix platform across deployments such as GE’s monitoring and diagnostics centers and internal manufacturing and services programs.

Microsoft has announced 54 Azure regions across the globe, with 42 currently available – more than any other major cloud provider. The Azure cloud meets a broad set of international standards and compliance requirements to ensure customer solutions can scale globally. This partnership also enhances the security layer within the Predix platform, which meets the specialized requirements of industries such as aviation, power and utilities. Leveraging Azure enables GE to expand its cloud footprint globally, helping the companies’ mutual customers rapidly deploy IIoT applications.

The global IoT market is expected to be worth $1.1 trillion in revenue by 2025 as market value shifts from connectivity to platforms, applications and services, according to new data from GSMA Intelligence.

“Every industrial company will have to master digital to compete in the future, connecting machines and making them more intelligent to drive efficiency and productivity,” said Bill Ruh, Chief Digital Officer, GE and CEO, GE Digital. “Our customers are asking for us to make a deeper connection between our two companies. Through this expanded partnership, Microsoft and GE are enabling customers around the world to harness the power of the Predix portfolio, including Predix Asset Performance Management, to unlock new capabilities to drive growth.”

“The industrial sector plays an important role in fueling economies around the world,” said Judson Althoff, Executive Vice President, Microsoft. “With this strategic partnership, GE and our mutual customers will benefit from a trusted platform with the flexibility and scalability to deliver unprecedented results and help advance their business potential.” 

As part of this expanded partnership, the companies will go to market together and also explore deeper integration of Predix IIoT solutions with Power BI, PowerApps and other third-party solutions, as well as integration with Microsoft Azure Stack to enable hybrid deployments across public and private clouds.

*Gartner, Hype Cycle for the Internet of Things, 2017, 24 July 2017

###

About GE Digital

GE Digital is reimagining how industrials build, operate and service their assets, unlocking machine data to turn valuable insights into powerful business outcomes. GE Digital’s Predix portfolio – including the leading Asset Performance Management, Field Service Management and MES applications – helps its customers manage the entire asset lifecycle. Underpinned by Predix, the leading application development platform for the Industrial Internet, GE Digital enables industrial businesses to operate faster, smarter and more efficiently. For more information, visit www.ge.com/digital.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

Contacts:

Amy Sarosiek, GE Digital

+1 224 239 6028

amy.sarosiek@ge.com

 

Microsoft Media Relations

WE Communications for Microsoft

(425) 638-7777

Broadcom acquisition of CA seeks broader portfolio

The out-of-the-blue Broadcom acquisition of CA Technologies has analysts scratching their heads about how the two companies’ diverse portfolios weave together strategically, and how customers might feel the impacts — beneficial or otherwise.

CA’s strength in mainframe and enterprise infrastructure software, the latter of which is a growing but fragmented market, gives chipmaker Broadcom another building block to create an across-the-board infrastructure technology company, stated Hock Tan, president and CEO of Broadcom.

But vaguely worded statements from both companies’ execs lent little insight into potential synergies and strategic short- or long-term goals of the $18.9 billion deal.

One analyst believes the deal is driven primarily by financial and operational incentives, and whatever technology synergies the two companies create are a secondary consideration for now.

“The operating margins from mainframes are very healthy and that fits very well with Broadcom’s financial model,” said Stephen Elliot, an analyst at IDC.

The bigger issue will be Broadcom’s ability to manage the diverse software portfolio of a company the size of CA. To date, Broadcom’s acquisition strategy has focused almost exclusively on massive deals for hardware companies, in areas such as storage, wireless LAN and networking. “The question is, is this too far of a reach for them? Customers are going to have to watch this closely,” Elliot said.

The overall track record of acquisitions that combine hardware-focused companies and large software companies is not good, Elliot noted. He pointed to the failures of Intel’s acquisition of LANDesk and Symantec’s purchase of Veritas.

Broadcom’s ability to manage CA’s complex and interwoven product portfolio is another concern.

The question is, is this too far of a reach for [Broadcom]? Customers are going to have to watch this closely.
Stephen Elliot, analyst, IDC

“As far as I can see, Broadcom has little or no visible prior execution or knowledge about a complicated and nuanced software and technology arena such as the one CA addresses … that includes DevOps, agile and security,” said Melinda Ballou, research director for IDC’s application life-cycle management program. “Infrastructure management would be more in their line of work, but still very different.”

Broadcom’s acquisition of CA also fills a need to diversify, particularly in the aftermath of its failed attempt to buy Qualcomm earlier this year, which was blocked by the Trump administration for national security reasons.

“They need to diversify their offerings to be more competitive given they primarily focus on chips, networking and the hardware space,” said Judith Hurwitz, president and CEO of Hurwitz & Associates LLC. “CA has done a lot of work on the operational and analytics side, so maybe [Broadcom] is looking at that as a springboard into the software enablement space.”

Hurwitz does see opportunities for both companies to combine their respective products, particularly in network management and IoT security. And perhaps this deal portends more acquisitions will follow, potentially among companies that compete directly or indirectly with CA. Both Broadcom and CA have pursued growth through numerous acquisitions in recent years.

“You could anticipate Broadcom goes on a spending spree, going after other companies that are adjacent to what CA does,” Hurwitz said. “For example, there was talk earlier this year that CA and BMC would merge, so BMC could be a logical step with some synergy there.”

IT spending points to cloud-focused data center architectures

Companies are dramatically changing the architectures of their private data centers in preparation for eventually running more business applications across multiple cloud providers.

The transformational changes underway include higher network speeds, more server diversity and an increase in software-defined storage, an IHS Markit survey of 151 North American enterprises found. The strategy behind the revamping is to make the data center “a first-class citizen as enterprises build their multi-clouds.”

Companies are increasing network speeds to meet an expected rise in data flowing through enterprise data centers. A total of 68% of companies are increasing the capacity of the network fabric, while 62% are buying technology to automate the movement of virtual machines and support network virtualization protocols, London-based IHS reported. These trends are consistent with the building of cloud-based data center architectures.

IHS also found 49% of the survey respondents planned to increase spending on switches and routers — virtual and physical — to keep up with traffic flow. The top five Ethernet switch vendors were Cisco, Dell, Hewlett Packard Enterprise, Juniper and Huawei.

Companies turning to containers in new data center architectures

Companies are also increasing the use of containers to run applications in cloud computing environments. The shift is affecting the use of hypervisors, which are the platforms for running virtual machines in data centers today. IHS expects the share of servers running hypervisors to fall from 37% to 30% by 2019.

“That’s a transition away from server virtualization potentially toward software containers,” IHS analyst Clifford Grossner said. “End users are looking to use more container-[based] software.”

IHS found 77% of companies planned to increase spending on servers, with the number of physical devices expected to double on average. Enterprises plan to run hypervisors or containers on 73% of their servers by 2019, up from 70% today.

“We’re seeing that progression where more and more servers are multi-tenant — that is running multiple applications,” Grossner said. “We’re seeing the density being packed tighter on servers.”

High density and multi-tenancy on servers are also attributes of cloud-focused data center architectures.

The rise of the one-socket server

Whenever possible, companies are buying one-socket servers to lower capital expenditures. IHS expects the cheaper hardware to account for 9% of corporate servers by 2019, up from 3% today.

“The one-socket server market is offering more powerful options that are able to satisfy the needs of more workloads at a better price point,” Grossner said.

Finally, IHS found an upswing in storage spending. Fully 53% of companies planned to spend more on software-defined storage, 52% on network-attached storage and 42% on solid-state drives.

As enterprises rearchitect their data centers, they are also spending more on public cloud services and infrastructure. IDC expects spending on the latter to reach $160 billion this year, an increase of more than 23% over last year. By 2021, spending will reach $277 billion, representing an annual increase of nearly 22%.

Execs: Content management in the cloud not as easy as it looks

TORONTO — Companies like Oracle, SAP and Microsoft are pushing content management in the cloud, and they’re joined by OpenText, which announced the containerization of its systems for use on public clouds, such as Microsoft Azure, Google Cloud and AWS.

“Friends don’t let friends buy data centers.” That was OpenText CEO and CTO Mark Barrenechea’s recurring joke during his OpenText Enterprise World 2018 keynote, during which the company unveiled its cloud- and DevOps-friendly OT2 platform.

Barrenechea later clarified to reporters that while some customers are standardizing on AWS and Azure, most OpenText cloud customers are on OpenText’s private cloud. Opening OpenText apps and microservices, such as its Magellan AI tools, to the public clouds will also open up new markets for content management in the cloud, Barrenechea said.

But several speakers from the stage — including celebrity nonfiction writer and Toronto native Malcolm Gladwell — cautioned that while the cloud might bring convenience and freedom from data center upkeep, it also brings challenges.

The two challenges mentioned most frequently were data security and process automation, along with a related pitfall: automating bad or unnecessarily complicated processes that should have been fixed before the digital transformation began.

Data security getting more complicated

If you have 854,000 people with top-secret clearances, I would venture to say that it’s no longer top-secret.
Malcolm Gladwell, author

The internet of things and mobile devices constitute a major security vulnerability that, if left unsecured, can multiply risk and create entry points for hackers to penetrate networks. Opening up content management in the cloud — and the necessary multiplication of data transactions that comes with it — can spread that risk outside the firewall.

Persistent connectivity is the challenge for Zoll Medical’s personal defibrillators, said Jennifer Bell, enterprise CMS architect at the company. Zoll Medical’s IoT devices not only connect the patient to the device, but also send the data to caregivers and insurance providers in a way that complies with regulations mandating data security the whole time.

“Security is huge, with HIPAA [Health Insurance Portability and Accountability Act] and everything,” she said.

IT leaders are just beginning to grasp the scale of risks.

At the National Institute of Allergy and Infectious Diseases (NIAID), even “smart microscopes” with which researchers take multi-gigabyte, close-up images have to check in with their manufacturer’s servers every night, said Matt Eisenberg, acting chief of NIAID’s business processes and information branch.

“Every evening, when the scientists are done with those devices, it has to phone home and recalibrate. And this is blowing the infrastructure guys away, because they’re not used to allowing this kind of bidirectional communication from something that really doesn’t look or feel like a computer or a laptop,” Eisenberg said.

Author Malcolm Gladwell delivering the keynote at OpenText Enterprise World 2018

Meanwhile, Gladwell warned that data security threats are coming from every direction, inside and outside of organizations, and from new perpetrators.

The security of content management in the cloud also came under the spotlight when Chelsea Manning and Edward Snowden were able to steal sensitive military documents and hand them over to WikiLeaks, Gladwell said.

Government data security experts are having a hard time preventing another such breach, he continued, because security threats are rapidly changing. The feds, however, haven’t; they’re stuck with Cold War-era systems and processes that focused on a particular enemy and their operatives.

“It’s no longer that you have a short list of people high up that you have to worry about. Now, you have to worry about everyone,” Gladwell said. “If you have 854,000 people with top-secret clearances, I would venture to say that it’s no longer top-secret.”

Cloud: BPM boon or problem?

Content management in the cloud by way of SaaS apps can also bring process automation, AI and analytics tools to content formerly marooned in on-premises data silos. It can also extend a workforce beyond office walls, giving remote, traveling or field-based workers access to the same content their commuting co-workers get.

That’s if it’s done right.

Kyle Hufford, digital asset management director at Monster Energy, based in Corona, Calif., serves rich media content to an international marketing team that must comply with many national, state and local regulations, as well as standardized internal processes, approval trees and branding rules.

His job, he said, is opening access to Monster Energy’s sometimes-edgy content worldwide, while ensuring end users stay compliant.

The work starts with detailed examination of how a process is done before moving it into the cloud.

“People think there [are] complexities around approvals and how to get things done,” Hufford said. “In reality, they can take a 15-step process and make it a two- or three-step process and save everybody time.”

Panelists at OpenText Enterprise World 2018, from left to right: Mark Barrenechea, OpenText CEO and CTO; Gopal Padinjaruveetil, vice president and chief information security officer at The Auto Club Group; Jennifer Bell, enterprise content management architect and analyst at Zoll Medical; Kyle Hufford, director of digital asset management at Monster Energy; and Matt Eisenberg, acting chief of the U.S. NIAID business process and information management branch.

As mature companies like SAP, Microsoft, OpenText and Oracle make big pushes into the cloud and bring their big customers along to migrate from on-premises systems, process issues like these are bound to happen, said Craig Wentworth, principal analyst for U.K.-based MWD Advisors.

Wentworth advised enterprise IT leaders to take a critical look at the vendor’s model in the evaluation stage before embarking on a project for content management in the cloud.

“I worry that, sometimes … software firms that have been around for a long time [and add] cloud are coming to it from a very different place than those who are born in the cloud,” Wentworth said. “Whilst they will be successful certainly with their existing customers, they’ve got a different slant to it.”

Enterprise digital strategy: Design use cases that create business value

Digital transformation has become a top business priority, with many companies across industries focused on transforming their systems, business models and customer engagement to ensure business processes create value for the organization.

This makes carving out an effective enterprise digital strategy paramount to success. But, too often, organizations focus on the technological aspects of the transformation and ignore the business side of the equation, according to speakers at the recent LiveWorx 2018 conference in Boston. When building an enterprise digital strategy, organizations should start by looking at the fundamental business problems their leaders want to solve, and then move on to exploring how they can use technology to solve them, according to LiveWorx speaker Sarah Peach, senior director of business development at Samsung Electronics.

If they start by experimenting with the capabilities of a technology, they might end up with something that works but is of no value to their business, Peach explained during her session, titled “The Next Frontier for Digital Businesses.”

“Starting with the business problem — which then dictates the application and the technologies that you want to use to support that — is the approach that successful companies are taking,” she said.

Co-panelist Anand Krishnan said an enterprise digital strategy should be bucketed into three broad areas — products, platforms and partnerships — to help develop comprehensive use cases that benefit customers.

“Everyone is looking at digital touchpoints, but [should] set the focus on the journey itself, which is very important,” said Krishnan, vice president of big data and AI at Harman Connected Services. 

An enterprise digital strategy has to also meet the needs of the organization’s overall business strategy, said Jeffrey Miller, vice president of customer success and advisory services at PTC, based in Needham, Mass.

“You cannot produce a digital strategy without understanding your business strategy,” Miller said during his session, titled “Digital Transformation: Creating a Pragmatic Vision and Strategy.”

To move their digital transformation programs forward, organizations should couple business strategy with their goals for innovation and their digital strategy, then design use cases that create value for the business, he said.

The evolving enterprise digital strategy in an IoT era

You cannot produce a digital strategy without understanding your business strategy.
Jeffrey Miller, vice president of customer success and advisory services at PTC

As companies deal with increasing numbers of connected systems, products and people, their enterprise digital strategy should address how to bring these areas together to meet one of two primary business objectives, said Ranjeet Khanna, a co-panelist of Peach and Krishnan.

“Either create efficiencies for the use cases that they are dealing with, or create a new revenue opportunity,” said Khanna, director of product management for IoT, public key infrastructure and embedded security solutions at Entrust Datacard, based in Shakopee, Minn.

Designing connected products is not just about mechanical and physical design anymore; manufacturers now have to worry about software design, too, Peach said. The manufacturing industry has therefore witnessed a rapid evolution in its digital strategy over the last couple of years, she added.

According to a recent Capgemini study, manufacturers estimate 47% of all their products will be smart and connected by 2020.

“If you are an OEM, you are now expected to produce a smart, connected product, and all of your digital systems have to change to support that,” Peach said.

The data generated from connected products throughout their lifecycle is another big change that manufacturers deal with today, she said.

“Your digital strategy has to start at the design side and follow all the way through to the end of life of your product,” she said.

Qumulo storage parts the clouds with its K-Series active archive

Scale-out NAS startup Qumulo has added a dense cloud archiving appliance to help companies find hidden value in idle data.

Known as the K-Series, the product is an entry-level complement to Qumulo’s C-Series hybrid and all-flash NVMe P-Series primary NAS arrays. The K-144T active archive target embeds the Qumulo File Fabric (QF2) scalable file system on a standard 1U server.

Qumulo, based in Seattle, didn’t disclose the source of the K-Series’ underlying hardware, but it has an OEM deal with Hewlett Packard Enterprise to package the Qumulo Scalable File System on HPE Apollo servers. Qumulo storage customers need a separate software subscription to add the K-Series archive to an existing Qumulo primary storage configuration.

“It’s routine for our customers to be storing billions of files, either tiny files generated by machines or large files generated by video,” Qumulo chief marketing officer Peter Zaballos said. “We now have a complete product line, from archiving up to blazing high performance.”

Analytics and cloud bursting

Customers can build a physical K-Series cluster with a minimum of six nodes and scale by adding single nodes. They can also replicate data from the K-Series target to an identical Qumulo storage cluster in AWS for analytics or cloud bursting. A cluster can scale to 1,000 nodes.

“There’s no need to pull data back from the cloud. You can do rendering against a tier of storage in the cloud and avoid [the expense] of data migration,” Qumulo product manager Jason Sturgeon said.

Each Qumulo storage K-Series node scales to 144 TB of raw storage: each node accepts a dozen 12 TB HDDs, plus three SSDs to capture read metadata. QumuloDB analytics collects the metadata as the data gets written. A nine-node configuration provides 1 PB of usable storage.
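
Those published figures line up with some quick arithmetic. The sketch below runs the numbers; the usable-to-raw ratio is inferred here, since Qumulo didn’t disclose its data protection overhead.

```python
# Back-of-the-envelope check on the published K-144T numbers.
# The ~77% usable-to-raw ratio is inferred, not published; it's what
# makes the two stated figures (144 TB raw/node, 1 PB usable at
# nine nodes) meet.

HDDS_PER_NODE = 12
TB_PER_HDD = 12
NODES = 9

raw_per_node = HDDS_PER_NODE * TB_PER_HDD   # 144 TB raw per node
raw_total = raw_per_node * NODES            # 1,296 TB raw across the cluster
usable_ratio = 1000 / raw_total             # ~0.77 implied by 1 PB usable

print(raw_per_node, raw_total, round(usable_ratio, 2))  # 144 1296 0.77
```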

Qumulo said it designed the K-Series arrays with an Intel Xeon D system-on-a-chip processor to reduce power consumption.

Exploding market for NFS, object archiving

Adding a nearline option to Qumulo storage addresses the rapid growth of unstructured data that requires file-based and object storage, said Scott Sinclair, a storage analyst at Enterprise Strategy Group.

“Qumulo is positioning the K-Series as a lower-cost, higher-density option for large-capacity environments,” Sinclair said. “There is a tremendous need for cheap and deep storage. Many cheap-and-deep workloads are using NFS protocols. This isn’t a file gateway that you retrofit on top of an object storage box. You can use normal file protocols.”

Those file protocols include NFS, SMB and REST-based APIs.

Sturgeon said the K-Series can handle reads at 6 Gbps and writes at 3 Gbps per 1 PB of usable capacity.

To eliminate tree walks, QF2 updates the metadata of all files associated with a folder. Process checks occur every 15 seconds to provide visibility into the amount of data stored within the directory structure, allowing storage to be accessed and queried in near real time.
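
The general technique can be sketched simply. The toy example below (an illustration of the idea only, not Qumulo’s actual QF2 implementation) keeps a running byte total in every ancestor directory at write time, so a capacity query becomes a single lookup rather than a tree walk.

```python
# Sketch of aggregate metadata kept per directory so capacity queries
# don't require tree walks. Illustrative only, not QF2's data structures.

import posixpath

usage = {}  # directory path -> total bytes stored beneath it

def add_file(path: str, size: int) -> None:
    """On write, bump the running total in every ancestor directory."""
    d = posixpath.dirname(path)
    while True:
        usage[d] = usage.get(d, 0) + size
        if d == "/":
            break
        d = posixpath.dirname(d)

add_file("/projects/renders/frame0001.exr", 50_000_000)
add_file("/projects/renders/frame0002.exr", 50_000_000)
add_file("/projects/notes.txt", 1_000)

# Querying a directory is a single lookup, not a walk:
print(usage["/projects/renders"])  # 100000000
print(usage["/projects"])          # 100001000
print(usage["/"])                  # 100001000
```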

Qumulo has garnered more than $220 million in funding, including a $93 million Series D round earlier this month. Qumulo founders Peter Godman, Aaron Passey and Neal Fachan helped develop the Isilon OneFS clustered file system, leading Isilon to an IPO in 2006. EMC paid $2.25 billion to acquire the Isilon technology in 2010.

Godman is Qumulo CTO, and Fachan is chief scientist. Passey left in 2016 to take over as principal engineer at cloud hosting provider Dropbox.

New Arista switches use Barefoot Tofino programmable chip

Arista has launched a family of switches that companies can program to perform tasks typically handled by network appliances and routers. The company claims the consolidation capabilities of the new 7170 series reduce costs and network complexity.

The programmability of the 7170 family stems from the Barefoot Networks Tofino packet processor found in the hardware. Engineers program the silicon using P4, an open source language.

Barefoot markets Tofino as an alternative to fixed-function application-specific integrated circuits. Large enterprises, cloud and communication service providers are typical users of the high-speed Barefoot Tofino chip, which processes packets at 6.5 Tbps.

Arista, which uses Broadcom and Cavium packet processors in other switches, wants to broaden the potential customer base for the Barefoot Tofino chip by coupling it with the vendor’s EOS network operating system for leaf-spine architectures. To make programming on Barefoot Tofino silicon easier, Arista provides packaged profiles that contain data plane and control plane features for specific applications. Network managers can customize the profiles using P4 and deploy them on EOS.
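
The consolidation argument is easiest to see through the match-action abstraction that P4 programs are built around. The sketch below is plain Python rather than P4, and its table entries are invented, but it shows how a single programmable pipeline stage can host rules that would otherwise live in separate routing and firewall boxes.

```python
# Toy match-action pipeline in the spirit of P4: a packet is matched
# against table entries in order, and the first hit's action runs.
# Mixing router-like and firewall-like entries illustrates how one
# programmable chip can consolidate separate appliances.

from typing import Callable, Optional

Packet = dict  # e.g. {"dst_ip": "10.0.0.5", "dst_port": 80}

def drop(pkt: Packet) -> Optional[Packet]:
    return None  # firewall-style deny

def forward(port: int) -> Callable[[Packet], Packet]:
    def action(pkt: Packet) -> Packet:
        return {**pkt, "egress_port": port}
    return action

# (match, action) pairs play the role of P4 table entries.
TABLE = [
    (lambda p: p["dst_port"] == 23, drop),                    # block telnet
    (lambda p: p["dst_ip"].startswith("10.0."), forward(1)),  # internal route
    (lambda p: True, forward(0)),                             # default route
]

def pipeline(pkt: Packet) -> Optional[Packet]:
    for match, action in TABLE:
        if match(pkt):
            return action(pkt)
    return None

print(pipeline({"dst_ip": "10.0.0.5", "dst_port": 80}))  # egress_port 1
print(pipeline({"dst_ip": "10.0.0.5", "dst_port": 23}))  # None (dropped)
```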

“We’ll have to see what sort of benefits customers derive from using the [7170] technology in real-world production environments,” said Brad Casemore, an analyst at IDC. “In theory, it certainly has the potential to handle some tasks typically addressed by routers and middleboxes.” 

Arista application profiles

Examples of the applications defined in the Arista profiles include network overlays and virtualization to offload network functions, such as traffic segmentation or tunnel encapsulation, from virtual servers.

Other profiles provide network and application telemetry for flow-level visibility, configurable thresholds and alarms, timestamping and end-to-end latency measurement. Arista also offers profiles supporting some firewall functionality and large-scale network address translation. NAT manages multiple IP addresses by mapping them to a single public IP address, which improves security and decreases the number of public IP addresses an organization needs.
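
A minimal sketch of that many-to-one translation follows (all addresses and ports are invented): outbound flows are rewritten to the shared public address, and a per-flow public port lets replies find their way back.

```python
# Toy source-NAT table: many private addresses share one public IP,
# distinguished on the return path by the public port chosen per flow.
# All addresses and ports here are illustrative.

PUBLIC_IP = "203.0.113.7"

nat_table = {}   # (private_ip, private_port) -> public_port
reverse = {}     # public_port -> (private_ip, private_port)
next_port = 40000

def translate_out(private_ip: str, private_port: int):
    """Rewrite an outbound flow to the shared public address."""
    global next_port
    key = (private_ip, private_port)
    if key not in nat_table:
        nat_table[key] = next_port
        reverse[next_port] = key
        next_port += 1
    return PUBLIC_IP, nat_table[key]

def translate_in(public_port: int):
    """Map an inbound reply back to the originating private host."""
    return reverse[public_port]

print(translate_out("192.168.1.10", 51515))  # ('203.0.113.7', 40000)
print(translate_out("192.168.1.11", 51515))  # ('203.0.113.7', 40001)
print(translate_in(40001))                   # ('192.168.1.11', 51515)
```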

“How readily those profiles are embraced and productively employed could determine the extent to which the 7170 successfully addresses the use cases Arista has identified,” Casemore said.

The 7170 series has two models. The first is a 1RU chassis that supports 32, 64 or 128 ports at 40/100 GbE, 50 GbE and 10/25 GbE, respectively. The second is a 2RU system that supports 64, 128 or 256 interfaces at 40/100 GbE, 50 GbE and 10/25 GbE, respectively. The hardware processes up to 12.8 terabits per second.

Base pricing for a 64-port system is $1,200 per port.

In March, Arista introduced two 25/100 GbE switches for cloud providers, tier-one and tier-two service providers, high-tech companies and financial institutions ready to replace 40/100 GbE switches with more powerful systems.

Arista is targeting the two switches — the 7050X3 and the 7260X3 — at different use cases. The former is an enterprise or carrier top-of-rack switch, while the 7260X3 is for leaf-spine data center networks used in large cloud environments.

SAP and Accenture collaborate on entitlement management platform

SAP and Accenture are teaming to deliver an intelligent entitlement management application intended to help companies build and deploy new business models.

Entitlement management applications help companies grant, enforce and administer customer access entitlements (usually referred to as authorizations, privileges, access rights or permissions) to data, devices and services — including embedded software applications — from a single platform.

The new SAP Entitlement Management allows organizations to dynamically change individual customer access rights and install renewal automation capabilities in applications, according to SAP. This means they can create new offerings that use flexible pricing structures.
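
Conceptually, an entitlement record ties a customer, a service and an expiry together, with renewal handled automatically. The sketch below illustrates the concept only; it is not SAP Entitlement Management’s actual data model or API, and all names are invented.

```python
# Toy entitlement store: grant, check and auto-renew customer access
# rights to services. Conceptual illustration only, not SAP
# Entitlement Management's data model or API.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Entitlement:
    customer: str
    service: str
    expires: date
    auto_renew: bool = False

store = [
    Entitlement("acme", "analytics-api", date(2018, 12, 31), auto_renew=True),
    Entitlement("acme", "premium-support", date(2018, 6, 30)),
]

def is_entitled(customer: str, service: str, today: date) -> bool:
    for e in store:
        if e.customer == customer and e.service == service:
            if today <= e.expires:
                return True
            if e.auto_renew:  # renewal automation: extend on expiry
                e.expires = today + timedelta(days=365)
                return True
    return False

today = date(2019, 1, 15)
print(is_entitled("acme", "analytics-api", today))    # True (auto-renewed)
print(is_entitled("acme", "premium-support", today))  # False (lapsed)
```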

The new platform’s entitlement management and embedded analytics integrate with SAP S/4HANA’s commerce and order management functions, which, according to SAP, can help organizations create new revenue streams and get new products and services to market faster.

Accenture will provide consulting, system development and integration, application implementation, and analytics capabilities to the initiative.

“As high-tech companies rapidly transition from stand-alone products to highly connected platforms, they are under mounting pressure to create and scale new intelligent and digital business models,” said David Sovie, senior managing director of Accenture’s high-tech practice, in a press release. “The solution Accenture is developing with SAP will help enable our clients to pivot to as-a-service business models that are more flexible and can be easily customized.”

SAP and Accenture go on the defense

SAP and Accenture also unveiled a new platform that provides digital transformation technology and services for defense and security organizations.

The digital defense platform is based on S/4HANA, contains advanced analytics capabilities and allows greater use of digital applications by military personnel. It includes simulations and analytics applications intended to help defense and security organizations plan and run operations efficiently and respond quickly to changing operating environments, according to SAP and Accenture.

“This solution gives defense agencies the capabilities to operate in challenging and fast-changing geo-political environments that require an intelligent platform with deployment agility, increased situational awareness and industry-specific capabilities,” said Antti Kolehmainen, Accenture’s managing director of defense business, in a press release.

The platform provides data-driven insights intended to help leaders make better decisions, and it enables cross-enterprise data integration in areas like personnel, military supply chain, equipment maintenance, finances and real estate.

IoT integration will enable defense agencies to connect devices that can collect and exchange data. The digital defense platform technology is available to be deployed on premises or in the cloud, according to the companies.

“The next-generation defense solution will take advantage of the technology capabilities of SAP S/4HANA and Accenture’s deep defense industry knowledge to help defense agencies build and deploy solutions more easily and cost-effectively and at the same time enable the digital transformation in defense,” said Isabella Groegor-Cechowicz, SAP’s global general manager of public services, in a press release.

New application and customer experience tool for SAP environments

AppDynamics (a Cisco company) has unveiled a new application and customer experience monitoring software product for SAP environments.

AppDynamics for SAP provides visibility into SAP applications and customer experiences via code-level insights into customer taps, swipes and clicks, according to AppDynamics. This helps companies understand the performance of SAP applications and databases, as well as the code impact on customers and business applications.

To satisfy customer expectations, [the modern enterprise] needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.
Thomas Wyatt, chief strategy officer, AppDynamics

“The modern enterprise is in a challenging position,” said Thomas Wyatt, AppDynamics’ chief strategy officer, in a press release. “To satisfy customer expectations, it needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.”

AppDynamics for SAP allows companies to collaborate around business transactions, using a unit of measurement that automatically reveals customers’ interactions with applications. They can then identify and map transactions flowing between each customer-facing application and systems of record — SAP ERP or CRM systems that include complex integration layers, such as SAP Process Integration and SAP Process Orchestration.

AppDynamics for SAP includes ABAP code-level diagnostics and native ABAP agent monitoring that provide insights into SAP environments through code and database performance monitoring, dynamic baselines, and transaction snapshots when performance deviates from the norm. It also includes intelligent alerting to IT based on health rules and baselines that are automatically set for key performance metrics on every business transaction. Intelligent alerting policies integrate with existing enterprise workflow tools, including ServiceNow, PagerDuty and JIRA.
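
The mechanics behind dynamic baselining can be sketched in a few lines. The example below shows the general technique only; the window size and three-standard-deviation threshold are assumptions, not AppDynamics’ actual health-rule defaults.

```python
# Sketch of dynamic-baseline alerting: learn a response-time baseline
# from recent samples and flag transactions that deviate sharply.
# Window size and threshold are illustrative assumptions.

import random
from collections import deque
from statistics import mean, stdev

WINDOW = 100      # recent samples kept per transaction type
THRESHOLD = 3.0   # alert when > 3 standard deviations above baseline

samples = deque(maxlen=WINDOW)

def record(response_ms: float) -> bool:
    """Record a response time; return True if it breaches the baseline."""
    breach = False
    if len(samples) >= 30:  # need enough history for a stable baseline
        baseline, spread = mean(samples), stdev(samples)
        breach = response_ms > baseline + THRESHOLD * spread
    samples.append(response_ms)
    return breach

random.seed(1)
for _ in range(60):
    record(random.gauss(200, 10))   # normal traffic around 200 ms

print(record(205))  # False: within the learned norm
print(record(900))  # True: alert/snapshot-worthy deviation
```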

This means that companies can understand dependencies across the entire digital business and baseline, identify, and isolate the root causes of problems before they affect customers. AppDynamics for SAP also helps companies to plan SAP application migrations to the cloud and monitor user experiences post-migration, according to AppDynamics.