Category Archives: Enterprise IT news


2018 MIT Sloan CIO Symposium: A SearchCIO guide


Today’s enterprise can be divided into two groups: the departments that are acquiring advanced digital capabilities and those that are lagging behind. This bifurcation of digital prowess was evident at the 2018 MIT Sloan CIO Symposium, where we asked CIOs and digital experts to expound on the factors driving digitalization at enterprises and the barriers holding them back. Not surprisingly, the departments that are customer-facing, such as marketing, are leading the digital transformation charge.

While the transition to a digitalized enterprise is happening at varied speeds, the need to develop a viable digital business model is universally recognized. Indeed, this year’s event was all about taking action — it is no longer enough just to have a vision for digital transformation, and the conference underscored that: sessions featured leading CIOs, IT practitioners, consultants and academics from across the globe dispensing hard-won advice on planning and executing a future-forward digital transformation strategy.

In this SearchCIO conference guide, experience the 2018 MIT Sloan CIO Symposium by delving into our comprehensive coverage. Topics include building an intelligent enterprise, talent recruitment, the expanding CIO role and integration of emerging technologies like AI, machine learning, cloud and more.

To view our complete collection of video interviews filmed at this year’s event, please see our video guide: “MIT CIO 2018 videos: Honing a digital leadership strategy.”

1. Thriving in a digital economy

Digital transformation strategy and advice

Implementing a digital transformation strategy requires a clear set of objectives, IT-business alignment, recruitment of the right talent, self-disruption and building what experts call an “intelligent enterprise,” among other things. In this section, the pros discuss the intricacies of leading the digital transformation charge.

2. Technology transformation

Utilizing emergent tech like AI, machine learning and cloud

Every digital transformation requires a future-forward vision that takes advantage of up-and-coming tools and technologies. In this section, academics and IT executives discuss the enterprise challenges, benefits, questions and wide-ranging potential that AI, machine learning, edge computing, big data and more bring to the enterprise.

3. Evolving CIO role

The CIO’s ever-expanding role in a digital world

Digital transformation not only brings with it new technologies and processes, it also brings new dimensions and responsibilities to the CIO role. In this section, CIOs and IT executives detail the CIO’s place in an increasingly digital, threat-laden and customer-driven world and offer timely advice for staying on top of it all.


Interviews filmed on site

During the 2018 MIT Sloan CIO Symposium, SearchCIO staff had the pleasure of conducting several one-on-one video interviews with consultants and IT executives on the MIT campus in Cambridge, Mass. Below is a sampling of the videos.

A link to our full collection of videos filmed at the 2018 MIT Sloan CIO Symposium can be found at the top of this guide.

Driving digital transformation success with Agile

In this SearchCIO video, Bharti Airtel CIO Harmeen Mehta, an MIT Sloan CIO Leadership Award winner, explains why implementing Agile methodologies can help organizations scale their digital transformation projects.

Finalized TLS 1.3 update has been published at last

The finalized version of TLS 1.3 was published last week following a lengthy draft review process.

The Internet Engineering Task Force (IETF) published the latest version of the Transport Layer Security protocol, used for internet encryption and authentication, as RFC 8446 on Friday, Aug. 10, 2018, after starting work on it in April 2014. The final draft, version 28, was approved in March. It replaces the previous standard, TLS 1.2, which was published in RFC 5246 in August 2008. Originally based on the Secure Sockets Layer protocol, TLS has been revised significantly in the new version.

“The protocol [TLS 1.3] has major improvements in the areas of security, performance, and privacy,” IETF wrote in a blog post.

Specifically, TLS 1.3 “provides additional privacy for data exchanges by encrypting more of the negotiation handshake to protect it from eavesdroppers,” compared with TLS 1.2, IETF explained. “This enhancement helps protect the identities of the participants and impede traffic analysis.”

TLS 1.3 also has forward secrecy by default, so past communications will stay secure even if long-term keys are compromised in the future, according to IETF.

“With respect to performance, TLS 1.3 shaves an entire round trip from the connection establishment handshake,” IETF wrote in its blog post announcing the finalized protocol. “In the common case, new TLS 1.3 connections will complete in one round trip between client and server.”

As a result, TLS 1.3 is expected to be faster than TLS 1.2. It also removes outdated cryptography, such as the RSA key exchange, 3DES and static Diffie-Hellman, eliminating the weaknesses behind vulnerabilities that plagued TLS 1.2, such as FREAK and Logjam.
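As an illustrative sketch, not taken from the IETF announcement, a client can refuse anything older than TLS 1.3 using Python's standard ssl module (Python 3.7+ built against OpenSSL 1.1.1 or later); the host name in the usage comment is a placeholder:

```python
import ssl

def make_tls13_context() -> ssl.SSLContext:
    """Build a client context that refuses anything older than TLS 1.3."""
    ctx = ssl.create_default_context()  # certificate and hostname checks stay on
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx

# Usage (hypothetical host): the handshake will fail against servers
# that only speak TLS 1.2 or older.
#
# with socket.create_connection(("example.org", 443)) as sock:
#     with make_tls13_context().wrap_socket(sock, server_hostname="example.org") as tls:
#         print(tls.version())  # "TLSv1.3"
```

Because the minimum version is pinned on the context, no per-connection checks are needed; any downgrade attempt simply aborts the handshake.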

“Although the previous version, TLS 1.2, can be deployed securely, several high profile vulnerabilities have exploited optional parts of the protocol and outdated algorithms,” IETF wrote. “TLS 1.3 removes many of these problematic options and only includes support for algorithms with no known vulnerabilities.”

And, as Mozilla explained in a blog post, “TLS 1.3 is designed in cooperation with the academic security community and has benefitted from an extraordinary level of review and analysis. This included formal verification of the security properties by multiple independent groups; the TLS 1.3 RFC cites 14 separate papers analyzing the security of various aspects of the protocol.”

TLS 1.3 has already been widely deployed, according to Mozilla. The Firefox and Google Chrome browsers have draft versions deployed, with final version deployments on the way. And Cloudflare, Google and Facebook have also partially deployed the protocol.

McAfee threat research team uncovers healthcare security risks

Recent discoveries by a well-known cybersecurity vendor highlight new healthcare security risks.

A research team at McAfee found that open network jacks and weak network protocols connecting IoT medical devices are susceptible to a “man-in-the-middle” attack that could be hazardous to a patient’s health. The team’s findings were published in a blog post last week.

Healthcare security risks have become more prevalent with the advent of IoT and connected medical devices. “Prior to the early 2000s, we didn’t see medical devices on the hospital’s production network very often,” said Mac McMillan, president of CynergisTek, a cybersecurity consulting firm. “Since then, the worlds of medical technology and connectivity have exploded. We see, in the provider space and in the headlines, that this is a real problem.

“These devices run commercial, off-the-shelf operating systems, frequently cut down or old versions, and so cannot be protected in the same way a PC or server can be. They are every bit as likely as any other networked device to be attacked or hacked. Many of the devices also store [electronic personal health information] and now you’ve had a breach, too. On top of that, they are used to deliver therapeutic and diagnostic services to patients, so it is much more than a security issue; it is quality of care and a direct patient care issue.”

Protocol reverse engineered


The McAfee research highlights the concern about patient care. A member of the research team reverse engineered a protocol that allows communications between a standard patient monitor and a central monitoring station and found that he easily could “emulate a patient monitor from his computer and make it think it was talking to a central monitoring station, as well as actually inject, or spoof, patient vitals into that conversation,” said Steve Povolny, McAfee’s head of advanced threat research.

Because the protocol “is unauthenticated, unencrypted [and] it’s sent in the clear over this internal network, it’s relatively easy for an attacker, or a man in the middle of that connection, to be able to control the vitals that are transmitted between devices,” Povolny said.

In these scenarios, he said, an attacker could gain access to the devices’ communications protocol through an open jack in a hospital room or by gaining access to a hospital’s internal network.


The danger, Povolny said, is that an attacker could manipulate a healthy patient’s vitals to indicate “some sort of arrhythmia or a change in heart rate that requires medical attention,” which could lead to a doctor administering unnecessary medication.

“A lot of these problems are not cutting-edge and can be mitigated quite easily,” Povolny said, “but the impact scenario is pretty powerful.”

Mitigating security risks not difficult

To mitigate healthcare security risks, he said, “vendors can implement basic authentication for devices so they are not broadcasting patient information in the clear over the wire.” Basic encryption and authentication, he added, would be a great step for these protocols.
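The “basic authentication” Povolny describes could be as simple as a shared-key message tag. The sketch below is illustrative only; the key, field names and wire format are invented rather than drawn from any real device protocol. Each vitals reading carries an HMAC-SHA256 tag, so a spoofed or altered reading fails verification:

```python
import hmac
import hashlib
import json

# Hypothetical per-device secret, provisioned to both the patient monitor
# and the central monitoring station at install time.
KEY = b"per-device-secret-provisioned-at-install"

def sign_vitals(vitals: dict) -> bytes:
    """Serialize a vitals reading and append an HMAC-SHA256 tag."""
    payload = json.dumps(vitals, sort_keys=True).encode()
    tag = hmac.new(KEY, payload, hashlib.sha256).hexdigest().encode()
    return payload + b"|" + tag

def verify_vitals(message: bytes) -> dict:
    """Reject any reading whose tag does not match the payload."""
    payload, _, tag = message.rpartition(b"|")
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("spoofed or tampered reading")
    return json.loads(payload)
```

An attacker on the wire who injects fabricated vitals cannot produce a valid tag without the key, so the spoofed reading is dropped instead of displayed. Transport encryption (TLS) would still be needed to keep the readings confidential.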

Hospitals, he said, should ensure that medical devices are isolated, be aware of whether or not they allow open jacks in a room and be educated about security issues “and know how to respond in a scenario where they see anomalies like these.”

“In a real-world setting,” McMillan said, “the process is called a security risk assessment and it starts by identifying all the digital assets — hardware, data, operating systems, software. Once identified, you have to risk rate them, so you know what needs how much protection and how quickly. That risk rating can’t just be based on the fact that there are known vulnerabilities; it has to be based on the likelihood of someone exploiting that vulnerability and then the impact that exercising that vulnerability has on patient care and the business.

“If a network-connected blood pressure cuff has a weakness, [it’s] probably not urgent, but if someone can shut down an anesthesia machine and they do it during surgery, you have a different level of risk. This can’t be just about finding problems, it has to be about figuring out what the real problems are and then fixing those.”

As for the McAfee research into healthcare security risks, Povolny said, “We’re hoping that some of this research continues to elevate and eliminate the problems that we’re seeing … and hopefully convinces some of the vendors and medical systems and implementers to address these problems before they become the next headline.”

Scale-out Qumulo NAS qualifies latest Dell EMC PowerEdge servers

Qumulo today added a hardware option for its customers by qualifying its scale-out NAS software to run on Dell Technologies’ PowerEdge servers. That leaves open the possibility that Qumulo will gain customers on Dell EMC servers at the expense of Dell EMC Isilon’s clustered NAS platform.

Qumulo is nearly two years into an OEM deal with Dell EMC archrival Hewlett Packard Enterprise. HPE rebrands and sells Qumulo’s scale-out NAS software on its servers. There is no joint go-to-market agreement between Qumulo and Dell EMC, which is a NAS market leader. The qualification means customers can purchase PowerEdge hardware from their preferred Dell EMC resellers and install Qumulo NAS software on the box.

Dell qualified Qumulo NAS software to run on dual-socket 2U Dell EMC PowerEdge R740xd servers.

“There are a lot of customers who build private clouds on Dell hardware. We’re now in a position where they can choose our software to build their computing,” Qumulo chief marketing officer Peter Zaballos said.

Dell EMC’s 14th-generation PowerEdge servers are equipped with about 20% more NVMe flash capacity than the R730 models. One of the use cases cited by Dell EMC is the ability to use a single PowerEdge 14G node to power its IsilonSD Edge virtual NAS software, which competes with Qumulo storage.

Will Qumulo on PowerEdge compete with Dell EMC Isilon NAS?

The Qumulo File Fabric (QF2) file system scales to support billions of files and hundreds of petabytes. QF2 is available on Qumulo C-Series hybrid arrays, all-flash P-Series or preinstalled on HPE Apollo servers. Customers also may run it as an Elastic Compute Cloud instance to burst and replicate in AWS.

Qumulo NAS gear is sold mostly to companies in media and entertainment and other sectors with large amounts of unstructured data.

Zaballos said QF2 on PowerEdge isn’t a direct attempt to displace Isilon. The goal is to give Dell EMC shops greater flexibility, he said.

“We’re looking to build the biggest footprint in the market. Between Dell and HPE, that’s about 40% of the server market for data centers,” Zaballos said.

Qumulo competes mainly with Isilon and NetApp’s NAS products and has won customers away from Isilon. Pressure on traditional NAS vendors is also coming from several file system-based cloud startups, including Elastifile, Quobyte, Stratoscale and WekaIO.

Qumulo founders Peter Godman, Aaron Passey and Neal Fachan helped develop the Isilon OneFS clustered file system, which paved the way for Isilon’s initial public offering in 2006. EMC bought Isilon for $2.25 billion in 2010; EMC itself was later acquired as part of the Dell-EMC merger announced in 2015.

Qumulo CEO Bill Richter was president of the EMC Isilon division for three years. He joined Qumulo in 2016.

Greg Schulz, an analyst with Server StorageIO, based in Stillwater, Minn., likened the Qumulo-PowerEdge configuration to Dell EMC’s “co-opetition” OEM agreement with hyper-converged vendor Nutanix.

“Qumulo NAS has been focused on high-performance, big-bandwidth file serving, which may not play well in environments that have many smaller files and mixed workloads. That’s an area Isilon has adapted to over the years. The other obstacle is getting [beyond] large elephant-hunt deals into broader markets, and getting traction with Dell servers can help them fill gaps in their portfolio,” Schulz said.

Ron Pugh, vice president for Dell EMC OEM sales in North America, said it’s not unusual for potential competitors to rely on Dell hardware products.

“If you look deeply inside the Dell Technologies portfolio, some of our customers can be considered competitors. Our OEM program is here to be a building block for our customers, not to build competing products,” Pugh said.

Dell EMC also sells Elastifile cloud-based NAS on its servers and is an Elastifile strategic investor.

Qumulo: AI tests on P-Series flash

Qumulo this week also previewed upcoming AI enhancements to its P-Series that enable faster prefetching of application data in RAM. The enhancements are due to roll out in September. Grant Gumina, a Qumulo senior product manager, said the initial AI enhancements will improve performance of the all-flash P-Series. Proofs of concept are underway with media customers, Gumina said.

“A lot of studios are using SANs to power primarily file-based workloads in each playback bay. The performance features in QF2 effectively means they can install a NAS for the first time and move to a fully Ethernet-based environment,” Gumina said.

File storage vendors NetApp and Pure Storage recently added flash systems built for AI, incorporating Nvidia hardware.

Industrial cloud moving from public to hybrid systems

The industrial cloud runs largely in the public domain currently, but that may be about to change.

Over the next few years, manufacturers will move industrial cloud deployments from the public cloud to hybrid cloud systems, according to a new report from ABI Research, an Oyster Bay, N.Y., research firm that specializes in industrial technologies. Public cloud accounts for almost half of the industrial IoT market share in 2018 (49%), while hybrid cloud systems have just 20%. But by 2023 this script will flip, according to the report, with hybrid cloud systems making up 52% of the IIoT market and public cloud just 25%.

The U.S.-based report surveyed vice presidents and other high-level decision-makers at manufacturing firms of various types and sizes, according to Ryan Martin, ABI Research principal analyst. Its main focus was the industrial IoT cloud and manufacturers’ predisposition to technology adoption.

According to the report, the industrial cloud encompasses the entirety of the manufacturing process and unifies the digital supply chain. This unification can lead to a number of benefits. Companies can streamline internal and external operations through digital business, product, manufacturing, asset and logistics processes; use data and the insights generated to enable new services; and improve control over environmental, health and safety issues.

Changing needs will drive move to hybrid systems

Historically, most data and applications in the IoT resided on premises, often in proprietary systems, but as IoT exploded the public cloud became more prevalent, according to Martin. 

The cloud, whether public or private, made sense because it offers a centralized location for storing large amounts of data and computing power at a reasonable cost, but organizational needs are changing, Martin said. Manufacturers are finding that a hybrid approach makes sense because it’s better to perform analytics on the device or activity that’s generating the data, such as equipment at a remote site, than to perform analytics in the cloud.


“There’s a desire to keep certain system information on site, and it makes a lot of business sense to do that, because you don’t want to be shipping data to and from the cloud every time you need to perform a query or a search because you’re paying for that processing power, as well as the bandwidth,” Martin said. “Instead it’s better to ship the code to the data for processing then shoot the results back to the edge. The heavy lifting for the analytics, primarily for machine learning types of applications, would happen in the cloud, and then the inferences or insights would be sent to a more localized server or gateway.”
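The pattern Martin describes, shipping distilled results rather than raw data, can be sketched in a few lines. The sample readings and summary fields here are invented for illustration:

```python
import json

def summarize_at_edge(samples):
    """Aggregate raw sensor samples locally; only this summary leaves the site."""
    return {
        "count": len(samples),
        "mean": sum(samples) / len(samples),
        "min": min(samples),
        "max": max(samples),
    }

# Hypothetical temperature samples collected at a remote site.
samples = [20.1, 20.4, 19.8, 35.2, 20.0]
summary = summarize_at_edge(samples)

# A few dozen bytes go to the cloud instead of the full raw stream,
# saving both bandwidth and cloud processing charges.
payload = json.dumps(summary)
```

The cloud side would train models on these summaries and push the resulting inference code back down to the edge gateway, matching the "ship the code to the data" flow Martin describes.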

Providers like AWS and Microsoft Azure will likely carry the bulk of the cloud load, according to Martin, but several vendors will be prominent in providing services for the industrial cloud.

“There will be participation from companies like SAP, as well as more traditional industrial organizations like ABB, Siemens, and so forth,” Martin said. “Then we have companies like PTC, which has recently partnered with Rockwell Automation, doing aggregation and integration, and activation to the ThingWorx platform.”

The industrial cloud will increasingly move from public cloud to hybrid cloud systems.
The hybrid cloud market for IIoT will double by 2023.

Transformation not disruption

However, companies face challenges as they move to implement the new technologies and systems that comprise the hybrid industrial cloud. The most prominent challenge is to implement the changes without interrupting current operations, Martin said.

“It will be a challenge to bring all these components like AI, machine learning and robotics together, because their lifecycles operate on different cadences and have different stakeholders in different parts of the value chain,” Martin said. “Also they’re producing heterogeneous data, so there needs to be normalization of mass proportion, not just for the data, but for the application providers, partners and supplier networks to make this all work.”

The overall strategy should be about incremental change that focuses on transformation over disruption, he explained.

“This is analogous to change management in business, but the parallel for IIoT providers is that these markets in manufacturing favor those suppliers whose hardware, software and services can be acquired incrementally with minimal disruption to existing operations,” he said. “We refer to this as minimal viable change. The goal should be business transformation; it’s not disruption.”

New Dell EMC 100 GbE switch tailored for east-west traffic

Dell EMC has introduced a high-density 100 Gigabit Ethernet switch aimed at service providers and large enterprises that need more powerful hardware to support a growing number of cloud applications and digital business initiatives.

Dell EMC launched the Z9264F open networking switch this week, listing its target customers as hyperscale data center operators, tier-one and tier-two service providers, and enterprises. The Dell EMC 100 GbE switch is designed for leaf-spine switching architectures.

“Dell’s new, high-performance, low-latency 100 GbE switch is ideally suited for large enterprises and service providers,” said Rohit Mehra, analyst at IDC. “The continued growth of cloud applications that require high-performance, east-west traffic-handling capabilities will likely be one of the key drivers for this class of switches to see increased traction.”

Indeed, Dell EMC, Cisco, Hewlett Packard Enterprise (HPE) and Juniper Networks are counting on an increase in data center traffic to sell their 100 GbE switches. So far, demand for the hardware has been robust. In the first quarter, revenue from 100 GbE gear grew nearly 84% year over year to $742.5 million, according to IDC. Port shipments increased almost 118%.

The Dell EMC 100 GbE switch is 2RU hardware with 64 ports of 100 GbE, which can be broken out into lower-speed 25 GbE and 50 GbE ports. Options for 10 GbE and 40 GbE ports are also available. Broadcom’s 6.4 Tbps StrataXGS Tomahawk II chip powers the switch.

Dell EMC, along with rival HPE, is marketing its support for third-party network operating systems as a differentiator for its switches. Dell EMC is selling the Z9264F with the enterprise edition of its network operating system (NOS), called OS10, or with operating systems from Big Switch Networks, Cumulus Networks, IP Infusion or Pluribus Networks.

Other options for the Dell 100 GbE switch include the open source edition of OS10 and either the Metaswitch network protocol stack or the Quagga suite of open source applications for managing routing protocols. Finally, Dell EMC will sell just the hardware with several open source applications, including Quagga and the OpenSwitch or SONiC NOS.

The starting price for the Z9264F, without an operating system or optics, is $45,000.

Trends in the 100 GbE market


Several trends are driving the 100 GbE market. Service providers are redesigning their data centers to support software-based network services, including 5G and IoT. Also, financial institutions are providing services to customers over a growing number of mobile devices.

Meanwhile, cloud companies that provide infrastructure or platform as a service are buying more hardware to accommodate a significant increase in companies moving application workloads to the cloud. In 2017, public cloud data centers accounted for the majority of the $46.5 billion spent on IT infrastructure products — server, storage and switches — for cloud environments, according to IDC.

In the first quarter, original design manufacturers accounted for almost 30% of all infrastructure hardware and software sold to public cloud providers, according to Synergy Research Group, based in Reno, Nev. Dell EMC had a 5% to 10% share, which was the same size share as Cisco and HPE.

As a switch supplier, Dell EMC is a smaller player. The company is not one of the top five vendors in the market, according to IDC. Nevertheless, Dell EMC is a major supplier of open networking to the small number of IT shops buying the technology.

“While open networking is not mainstream yet in the enterprise, providing choice in terms of the complete hardware and software stack is something that large enterprises and service providers have started to look at favorably,” Mehra said.

Infosec mental health support and awareness hits Black Hat 2018

LAS VEGAS — Rather than continue being reactive to social issues, Black Hat 2018 took steps to be more proactive in addressing and bringing awareness to the topic of infosec mental health.

The Black Hat conference set up a “self-care” lounge for attendees and included two complementary sessions: one covering the infosec mental health issues of depression and burnout, and another on how the cybersecurity community can be a source of aid for those suffering from post-traumatic stress disorder (PTSD).

During “Mental Health Hacks: Fighting Burnout, Depression and Suicide in the Hacker Community,” speakers Christian Dameff, emergency medicine physician and clinical informatics fellow at the University of California, San Diego, and Jay Radcliffe, cybersecurity researcher at Boston Scientific, shared personal stories of depression and burnout, as well as ways to identify symptoms in oneself or in co-workers.

Radcliffe noted that the widely acknowledged skills gap could be a contributing factor of infosec mental health issues. 

“With global staffing shortages in information security, we’re seeing departments that should have 10 people work with five. And that increases stress,” said Radcliffe, adding that infosec workers can even have a “hero complex” that leads to taking on more work than is healthy.

Radcliffe said workers and employers should keep an eye out for common symptoms, including, “feeling cynical, no satisfaction from accomplishments, dreading going to work and no work-life balance.” He suggested options such as speaking to counselors, therapists and psychologists, and also being mindful that workers take vacations and managers ensure time off is encouraged.

In the talk, “Demystifying PTSD in the Cybersecurity Environment,” Joe Slowik, adversary hunter at Dragos Inc., expanded on those topics and talked about how working in the infosec community helped him deal with PTSD from his military service in Afghanistan.

Slowik was careful to point out that PTSD should not be confused with burnout, depression or other infosec mental health issues because, as he wrote via email, certain “solutions or mitigations that may be appropriate for one, [may not be for] others.”

“For example, it is likely advisable to tell someone to step away from work for a bit to combat burnout — but in the case of PTSD where an individual may gain empowerment or agency from doing work they love/are successful at, such a step may in fact be counterproductive (it is for me),” Slowik wrote. “Similarly, for depression, treatment may simply be a combination of taking time away, medication, and some degree of therapy, whereas successful treatment of PTSD requires more intensive interventions and likely must be ongoing and continuing to be effective. Combining all of these into the same category means very real mistakes can be made, which at best leave a situation unresolved, and at worst exacerbate it.”

Slowik added that being in the infosec community was “empowering” because it allowed him “to do well at doing good.”


“One of the more pernicious aspects of PTSD is a loss of agency deriving from a moment of helplessness when one’s life/integrity was placed in severe danger or risk — re-experiencing this event leaves one feeling worthless and helpless in the face of adversity,” Slowik wrote. “Information security work has allowed me to reclaim a sense of agency by having direct, measurable, recognizable impact in meaningful affairs, and at least for me has been instrumental in moving beyond past trauma.”

The talks showed two sides of the security community that don’t often get talked about: how the work can be both the cause of — and the remedy for — infosec mental health issues.

The attendance for the two talks was noticeably lower than for the more technical talks. It is unclear if this was due to poor marketing, unreasonable expectations for attendance, or the social stigmas surrounding mental health issues.

Slowik said he was grateful for those who attended and noted that the lower attendance could also be attributed to his talk being “the first scheduled talk the morning after Black Hat’s infamous parties.”

“Numbers are irrelevant, as conversations after the presentation made it clear this really reached members of the audience,” Slowik wrote. “My only hope is that this talk, along with other items from the Black Hat Community track, are made publicly available since so many good lessons and observations were made in this forum and these should be shared with the wider information security community.”

Microsoft Skype for Business update fixes Mac bugs

The latest software patch for on-premises Skype for Business eliminates bugs and adds features for users that run the Microsoft platform on Mac OS, narrowing an already minimal gap between the Mac and Windows clients.

For Mac users, the Skype for Business update lets delegates — users designated to receive someone else’s calls — create and edit a meeting on behalf of a colleague. Also, users can now be made a delegate even if their account isn’t part of an organization’s enterprise voice plan.

Microsoft has enabled video-based screen sharing for Mac users, the result of a next-generation screen-sharing protocol that the vendor added to Skype for Business earlier this year. The new system is faster and more reliable than the traditional method and works better in low-bandwidth conditions.

The Skype for Business update, available for download now, also fixes several bugs on the Mac client, including a flaw that prevented users from joining a meeting hosted by someone outside their organization.

Microsoft seems to announce updates to the Mac client more quickly than it does for other changes to the Skype for Business platform, and describes Mac upgrades in more detail, said Jim Gaynor, a vice president of the consulting group Directions on Microsoft, based in Kirkland, Wash.

“There are still a few gaps between SfB Mac and Windows clients, most around some of the advanced call control features, file upload/sharing, and the ability to upload PowerPoint decks for online presentations,” Gaynor said. “But they’re fairly minimal.”

Skype for Business 2015 server nears its end of life

The improvements to the Mac client were among roughly 40 enhancements released as part of Microsoft’s biannual update to the Skype for Business 2015 server.

This summer’s Skype for Business update introduces location-based routing for Skype for Business mobile clients. The feature gives businesses more control when steering calls between VoIP and PSTN endpoints based on geography.

Microsoft is expected to stop releasing feature updates and bug fixes for the 2015 server in fall 2020, the end of the typical five-year lifespan for the product.

The vendor recently published a preview of the 2019 server, which is due out by year’s end. That server will extend support for on-premises Skype for Business for at least another five years, primarily to serve large organizations that are not ready to migrate to Skype’s cloud-based successor, Microsoft Teams.

The 2019 server will encourage businesses to host some telephony and messaging features in the cloud. Meanwhile, Microsoft Teams, a team collaboration app similar to Slack, will soon replace Skype for Business Online within the cloud-based Office 365 suite.

LinkedIn Sales Navigator refresh adds deals pipeline

A LinkedIn Sales Navigator refresh adds a deals management feature, smoother search experience and mobile deal pages to the social media giant’s social sales platform.

The revamp adds an array of new ways to search, manipulate and process LinkedIn’s vast troves of personal and consumer data, along with data from CRM systems, and puts LinkedIn in a better position to monetize the information. The update comes off a hot quarter for LinkedIn, which reported June-quarter revenue of $1.46 billion, up 37% from Q2 2017.

These upgraded features represent the next step in AI-assisted sales and marketing campaigns in which B2B companies mash up their own customer data with information on LinkedIn.

Microsoft banking on LinkedIn revenue

Microsoft bought LinkedIn in June 2016 for $26.2 billion. Microsoft doesn’t always spell out how AI drives the automation of sales-centric search tools in Sales Navigator, a premium LinkedIn offering that also integrates LinkedIn data with CRM platforms such as Salesforce and Dynamics CRM, but some experts have noted how AI subtly manifests itself in search results.

The LinkedIn Sales Navigator refresh was unveiled in a blog post by Doug Camplejohn, vice president of products for LinkedIn Sales Solutions.

The new “Deals” web interface extracts and imports sales pipeline data from the user’s CRM system and enables users to update pipelines considerably faster, Camplejohn said in the post about the LinkedIn Sales Navigator refresh.

“Reps can now update their entire pipeline in minutes, not hours,” he wrote.
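The speedup comes from editing many pipeline records in one pass instead of one CRM form at a time. The following Python sketch illustrates that bulk-edit idea; the record layout and function are assumptions for illustration, not LinkedIn's or any CRM vendor's API.

```python
# Illustrative sketch (not LinkedIn's API): a Deals-style view flattens
# CRM opportunities into rows a rep can edit in bulk, then merges the
# changed fields back into the pipeline in a single pass.
def apply_bulk_edits(opportunities, edits):
    """Merge per-deal edits (dict keyed by opportunity id) into pipeline rows."""
    updated = []
    for opp in opportunities:
        changes = edits.get(opp["id"], {})
        updated.append({**opp, **changes})  # edited fields override originals
    return updated

pipeline = [
    {"id": "opp-1", "stage": "Discovery", "amount": 50000},
    {"id": "opp-2", "stage": "Proposal", "amount": 120000},
]
# One edit pass moves opp-1 forward; untouched deals pass through unchanged.
edited = apply_bulk_edits(pipeline, {"opp-1": {"stage": "Negotiation"}})
```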

Adobe Sign connector added

Meanwhile, a new feature in Deals, “Buyer’s Circle,” pulls in and displays opportunity role information to streamline the B2B buying process. Users can see if any “key players” such as decision-maker, influencer or evaluator, are missing from deals, according to LinkedIn.
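Conceptually, a check like Buyer's Circle is a set difference between the roles a deal needs and the roles its contacts already cover. Here is a minimal Python sketch of that idea; the role names mirror those mentioned above, but the data model and function are invented for illustration.

```python
# Hedged sketch of a Buyer's Circle-style check: given the contacts
# attached to a deal, report which key buying roles are still missing.
KEY_ROLES = {"decision-maker", "influencer", "evaluator"}

def missing_roles(deal_contacts):
    """Return the key roles not yet covered by any contact on the deal."""
    covered = {c["role"] for c in deal_contacts}
    return KEY_ROLES - covered  # set difference: required minus covered

# A deal with only a decision-maker attached still lacks two roles.
contacts = [{"name": "A. Buyer", "role": "decision-maker"}]
```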


The vendor called another new function in the LinkedIn Sales Navigator refresh — Office 365 integration — “Sales Navigator in your inbox.”

“We all live in email,” the blog post said. “Now you can take Sales Navigator actions and see key insights without ever leaving your Outlook for Web inbox.”

LinkedIn also touted what it called a “new search experience” in the Sales Navigator update, saying the redesigned search function surfaces results pages faster and makes them easier to navigate.

Also as part of the LinkedIn Sales Navigator refresh, LinkedIn added mobile-optimized lead pages for salespeople working on mobile devices. LinkedIn also named Adobe Sign as the fourth partner in its Sales Navigator Application Platform (SNAP); other SNAP partners include Salesforce, Microsoft Dynamics and SalesLoft.

IBM DS8882F converges array and mainframe in one rack

Talk about converged infrastructure — IBM just embedded an all-flash array inside mainframe server racks.

IBM today launched a rack-mounted IBM DS8882F array for IBM Z ZR1 and LinuxOne Rockhopper II “skinny” mainframes that rolled out earlier in 2018. The 16U DS8882F is the smallest of IBM’s high-end DS8880 enterprise storage family designed for mainframes. The new mainframes install in a standard 19-inch rack. The IBM DS8882F array inserts into the same rack and scales from 6.4 TB to 368.64 TB of raw capacity.

The IBM DS8882F is part of a large IBM storage rollout that features mostly software and cloud storage updates, including the following:

  • IBM Spectrum Protect 8.1.6 data protection software now supports automatic tiering to object storage and ransomware protection for hypervisor workloads. The software generates email warnings pointing to where an infection may have occurred. Spectrum Protect supports Amazon Web Services, IBM Cloud and Microsoft Azure.
  • IBM Spectrum Protect Plus 10.1.2 virtual backup now supports on-premises IBM Cloud Object Storage, IBM Cloud and AWS S3. It also supports VMware vSphere 6.7, encryption of vSnap repositories, and IBM Db2 databases.
  • IBM Spectrum Scale 5.0.2 added file audit logging, a watch folder and other security enhancements, along with a GUI and automated recovery features. Spectrum Scale on AWS now enables customers to use their own AWS license and supports a single file system across AWS images.
  • The IBM DS8880 platform supports IBM Cloud Object Storage and automatically encrypts data before sending it to the cloud.
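Automatic tiering policies like the one described above typically move backup data from fast disk to cheaper object storage once it passes an age threshold. The Python sketch below illustrates that selection step; the threshold, record layout and function names are assumptions for illustration, not Spectrum Protect's actual interface.

```python
# Minimal sketch of an age-based tiering policy: backups older than a
# cutoff that still live on the disk tier are candidates for the object
# storage tier. Field names and the 30-day default are invented.
from datetime import datetime, timedelta

def select_for_tiering(backups, now, min_age_days=30):
    """Return backups old enough to move from disk to the object tier."""
    cutoff = now - timedelta(days=min_age_days)
    return [b for b in backups if b["tier"] == "disk" and b["created"] <= cutoff]
```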

The products are part of IBM’s third large storage rollout this year. It added an NVMe FlashSystem 9100 and Spectrum software in July, and cloud-based analytics and block-based deduplication in May.

Steve McDowell, senior technology analyst at Moor Insights & Strategy, said IBM has become the most aggressive of the large storage vendors when it comes to product delivery.

“IBM storage is marching to a cadence and putting out more new products faster than its competitors,” McDowell said. “We’re seeing announcements every quarter, and their products are extremely competitive.”

IBM ended a string of 22 straight quarters of declining storage revenue in early 2017 and put together four quarters of growth until declining again in the first quarter of 2018. IBM’s storage focus has been around its Spectrum software family and all-flash arrays.

IBM’s focus on footprint

McDowell called the IBM DS8882F “a nice piece of hardware.”

“The zSeries is moving toward a more standard rack, and this fits right in there with almost 400 TB of raw capacity in a 19-inch rack,” he said. “It’s about capacity density and saving floor space. If I can put a zSeries and a rack-mounted storage unit side by side, it makes a nice footprint in my data center.”

“The days of an EMC VMAX spanning across your data center are gone. With flash, it’s how many terabytes or petabytes I can put into half a rack and then co-locate all of that with my servers.”

Eric Herzog, chief marketing officer for IBM storage, said reducing the footprint was the main driver behind putting the array inside the mainframe rack.

“We created a mini-array that literally screws into the same 19-inch mainframe rack,” Herzog said. “This frees up rack space and floor space, and gives you a smaller, lower-cost entry point.”

Competing in a crowded market

IBM’s DS8880 series competes with the Dell EMC PowerMax — the latest version of the VMAX — and the Hitachi Vantara Virtual Storage Platform as mainframe storage platforms.

IBM storage revenue rebounded to grow in the second quarter this year, but the market remains crowded.

IBM’s Herzog said the storage market “is fiercely competitive in all areas, including software. It’s a dog-eat-dog battle out there. Software is just as dog-eat-dog as the array business now, which is unusual.”

The new products are expected to ship by the end of September.