
Juniper Contrail battles Cisco ACI, VMware NSX in the cloud

SAN FRANCISCO — Juniper Networks has extended its Contrail network virtualization platform to multicloud environments, competing with Cisco and VMware for the growing number of enterprises running applications across public and private clouds.

The Juniper Contrail Enterprise Multicloud, introduced this week at the company’s NXTWORK conference, is a single software console for orchestrating, managing and monitoring network services across applications running on cloud-computing environments. The new product, which won’t be available until early next year, would compete with the cloud versions of Cisco’s ACI and VMware’s NSX.

Also at the show, Juniper announced that it would contribute the codebase for OpenContrail, the open source version of the software-defined networking (SDN) overlay, to The Linux Foundation. The company said the foundation’s networking projects would help drive OpenContrail deeper into cloud ecosystems.

Contrail Enterprise Multicloud stems, in part, from the work Juniper has done over several years with telcos building private clouds, Juniper CEO Rami Rahim told analysts and reporters at the conference.

“It’s almost like a bad secret — how embedded we have been now with practically all — many — telcos around the world in helping them develop the telco cloud,” Rahim said. “We’ve learnt the hard way in some cases how this [cloud networking] needs to be done.”

Is Juniper’s technology enough to win?

Technologically, Juniper Contrail can compete with ACI and NSX, IDC analyst Brad Casemore said. “Juniper clearly has put considerable thought into the multicloud capabilities that Contrail needs to support, and, as you’d expect from Juniper, the features and functionality are strong.”


However, Juniper will need more than good technology when competing for customers. A lot more enterprises use Cisco and VMware products in data centers than Juniper gear. Also, Cisco has partnered with Google to build strong technological ties with the Google Cloud Platform, and VMware has a similar deal with Amazon.

“Cisco and VMware have marketed their multicloud offerings aggressively,” Casemore said. “As such, Juniper will have to raise and sustain the marketing profile of Contrail Enterprise Multicloud.”

Networking with Juniper Contrail Enterprise Multicloud

Contrail Enterprise Multicloud comprises networking, security and network management. Companies can buy the three pieces separately, but the new product lets engineers manage the trio through the software console that sits on top of the centralized Contrail controller.

For networking in a private cloud, the console relies on a virtual network overlay built on top of abstracted hardware switches, which can be from Juniper or a third party. The system also includes a virtual router that provides links to the physical underlay and Layer 4-7 network services, such as load balancers and firewalls. Through the console, engineers can create and distribute policies that tailor the network services and underlying switches to the needs of applications.

Contrail Enterprise Multicloud capabilities within public clouds, including Amazon Web Services, Google Cloud Platform and Microsoft Azure, are different because the provider controls the infrastructure. Network operators use the console to program and control overlay services for workloads through the APIs made available by cloud providers. The Juniper software also uses native cloud APIs to collect analytics information. 
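The pattern at work here is common to any overlay controller that cannot touch the provider's hardware: discover workloads and pull telemetry through the cloud's native APIs, then apply overlay policy on top. The sketch below illustrates only that first step and is not Juniper code; it uses the AWS SDK for Python (boto3) to inventory running instances, and the collect_workload_inventory helper is a hypothetical name.

# Illustrative sketch only -- not Juniper Contrail code.
# Shows how an overlay controller might use a cloud provider's native API
# (here, AWS EC2 via boto3) to discover the workloads it needs to manage.
import boto3

def collect_workload_inventory(region="us-east-1"):
    # Return basic metadata for running instances; an overlay controller
    # could map these onto virtual networks and feed them to analytics.
    ec2 = boto3.client("ec2", region_name=region)
    inventory = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}])
    for page in pages:
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                inventory.append({
                    "instance_id": inst["InstanceId"],
                    "private_ip": inst.get("PrivateIpAddress"),
                    "vpc_id": inst.get("VpcId"),
                    "tags": {t["Key"]: t["Value"] for t in inst.get("Tags", [])},
                })
    return inventory

if __name__ == "__main__":
    for workload in collect_workload_inventory():
        print(workload["instance_id"], workload["private_ip"], workload["vpc_id"])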

Other Juniper Contrail Enterprise Multicloud capabilities

Network managers can use the console to configure and control the gateway leading to the public cloud and to define and distribute policies for cloud-based virtual firewalls.

Also accessible through the console is Juniper’s AppFormix management software for cloud environments. AppFormix provides policy monitoring and application and software-based infrastructure analytics. Engineers can configure the product to handle routine networking tasks.

The cloud-related work of Juniper, Cisco and VMware is a recognition that the boundaries of the enterprise data center are being redrawn. “Data center networking vendors are having to redefine their value propositions in a multicloud world,” Casemore said.

Indeed, an increasing number of companies are reducing the amount of hardware and software running in private data centers by moving workloads to public clouds. Revenue from cloud services rose almost 29% year over year in the first half of 2017 to more than $63 billion, according to IDC.

Juniper Junos Space Security Director gets automation boost

SAN FRANCISCO — Juniper Networks has made its security products more responsive to threats, thereby reducing the amount of manual labor required to fend off attacks.

On Tuesday at the Juniper NXTWORK conference, the company introduced “dynamic policy management” in Junos Space Security Director, the central software console for Juniper network security, which manages the vendor’s firewalls and enforces security policies on Juniper’s EX and QFX switches.

The latest improvement to Junos Space Security Director lets security pros define variables that will trigger specific rules in Juniper SRX Series next-generation firewalls. For example, if a company is under a ransomware attack that has planted malware in employees’ PCs, then Director could activate rules restricting access to critical applications that handle sensitive data. The rules could also tell firewalls to cut off internet access for those applications.
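Juniper did not publish the rule syntax, but the general shape of dynamic policy management can be sketched: rule sets are defined ahead of time and keyed to threat conditions, and a detection event activates the matching set instead of waiting for a human to push firewall changes. The Python sketch below is purely conceptual; the event names, rule fields and apply_rules hook are hypothetical stand-ins, not Junos Space Security Director's actual interface.

# Conceptual sketch of dynamic policy management -- not Junos Space code.
# Rule sets are defined ahead of time and keyed to threat conditions; a
# detection event selects and applies the matching set automatically.

PREDEFINED_RULES = {
    "ransomware_detected": [
        {"action": "deny", "source": "infected-hosts", "dest": "finance-apps"},
        {"action": "deny", "source": "finance-apps", "dest": "internet"},
    ],
}

def apply_rules(firewall, rules):
    # Placeholder for pushing rules to a firewall through its management API.
    for rule in rules:
        print(f"[{firewall}] pushing rule: {rule}")

def on_threat_event(event_type, firewalls):
    # Called by the detection pipeline; activates the matching rule set.
    rules = PREDEFINED_RULES.get(event_type)
    if not rules:
        print(f"no dynamic policy defined for {event_type}")
        return
    for fw in firewalls:
        apply_rules(fw, rules)

if __name__ == "__main__":
    # A ransomware detection fans out restrictive rules in seconds,
    # instead of waiting hours for manual firewall changes.
    on_threat_event("ransomware_detected", ["srx-campus-01", "srx-dc-01"])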

The new Junos Space Security Director features can lower the response time to security threats from hours to minutes, said Mihir Maniar, vice president of security product management at Juniper, based in Sunnyvale, Calif. “It’s completely dynamic, completely user-intent-driven.”

Vendors trending toward automated security threat response

Automating the response to security threats is a trend among vendors, including Juniper rival Cisco. Companies can configure products to take specific actions against threats, which eliminates the time security pros would otherwise spend manually deploying new firewall rules.


“You have to mitigate very quickly and not just inform somebody and hope for the best,” said Dan Conde, an analyst at Enterprise Strategy Group, based in Milford, Mass. “Manual procedures do not work very quickly.”

But the ultimate goal, which eludes vendors today, is to have products that detect and mitigate threats on their own and then continue to monitor the network to ensure the steps taken were successful.

Vendor marketing tends to play down the fact that the level of automation is rudimentary, which has led to confusion over the definition of automation across different products. “Automation means 10 different things to 10 different people,” Conde said.

Juniper network security stronger with new SRX4600 firewall

Juniper has integrated a new firewall with the latest iteration of Junos Space Security Director. The SRX4600, a 1RU appliance with 80 Gbps of throughput, is designed to protect data flowing in the multicloud environments found in an increasing number of companies.

Juniper also unveiled at NXTWORK an on-premises malware detection appliance that uses analytics and remediation technology built by Cyphort, which Juniper acquired this year. Cyphort developed security analytics that spot malware based on abnormal activity in the network.

The new Advanced Threat Prevention Appliance in Juniper’s network security portfolio is designed for companies with “strict data sovereignty requirements,” the company said. The on-premises hardware has been certified by ICSA Labs, an independent division of Verizon that tests and certifies security and health IT products.

Salesforce Quip gets a facelift

SAN FRANCISCO — Salesforce launched a major overhaul of the Quip collaboration tool it acquired in July 2016.


The core concept behind the new version is organizing everything related to a project in one tab.

This, Salesforce hopes, can reduce the friction of clicking through the different browser tabs for email, chat, cloud storage, shared spreadsheets and other apps a company may have integrated into its CRM platform.

First launched in 2013, Salesforce Quip enables users to collaboratively chat and work on shared documents and spreadsheets. Salesforce calls the latest update, announced at the annual Dreamforce user conference, the Salesforce Quip Collaboration Platform. It enables users to bring a wide variety of live applications onto a single canvas.

A project manager can customize the widgets associated with a project and provide team members with the permissions required to make changes. All the updates to this page can be automatically reflected in the appropriate Salesforce database in an auditable and, if necessary, reversible manner.

Focus on a single canvas

Collaborative interfaces are certainly not new, but the team behind Quip has a lot of experience in launching some of the most successful apps on the web, including Google Maps, FriendFeed and the Google App Engine.

The team leveraged this experience to create a core set of Quip widgets called Live Apps, as well as an API that enables third-party developers to add new widgets to the platform. The individual apps were developed by DocuSign, Lucidchart, New Relic Inc., Altify and others. Now that the platform is live, more apps are expected to be developed. Current Salesforce apps include Salesforce records, calendars, Kanban boards, shared documents and chat.

A screenshot of the Salesforce Quip dashboard and mobile features

The Altify app enables teams to include a widget to map out the relationships inside a customer opportunity. The New Relic app enables a marketing team to track website performance during big events, like Black Friday sales, so that the sales and engineering teams can collaboratively make changes during the campaigns.

A project manager can also create a Quip workbook that best matches their team’s process. A single workbook can include a marketing budget, marketing goals and marketing documents, all in one place.

Collaborating on a better film

Salesforce Quip is used by 29,000 employees at 21st Century Fox Inc. to manage film production, sales and marketing. Creatives use it to track scripts or call sheets associated with TV and movie productions. All changes are made to a document of record in one place so that everyone is working on the same version. This reduces the burden of trying to weave changes made to different versions of a document into the master.

What’s particularly intriguing is the level of granularity with which participants can reference data in the apps. For example, 21st Century Fox producers use Quip for reviewing film dailies, and they can tie a chat to an arrow pointing to a specific object in a video frame. This saves them time because everyone involved can look at the exact video frame in the footage without having to open another window and manually look for it.

Creating a new experience layer to drive process

Salesforce Quip represents an example of driving better workflow by improving the user experience layer.



“The experience could be a customer, employee or partner experience,” said Paul Gaynor, partner at PricewaterhouseCoopers LLC, at Dreamforce. “A focus on the experience layer allows enterprises not to focus so much on the process, [but on] how to bring about engagement. Twenty years ago, enterprises talked about process. Now, we have moved to engagement. If I create the right engagement mechanism, the process is a byproduct of that.”

The key is to hide the complexity from users.

“Behind the scenes, we want to apply AI, machine learning and the capability to bring multiple data repositories together, either in the public or private cloud, and have them merge,” Gaynor said. “If I create the right enablement, then the process naturally follows.”

Turning business into a team sport

“Complex enterprise selling is a team sport,” said Anthony Reynolds, CEO of Altify, referring to the difficulty of a company selling its products or services to large organizations.

It’s too easy for teams on all kinds of projects to get bogged down in the minutiae and friction of moving between different apps. The promise of Quip is to make any enterprise process a team sport. The idea of a team sitting around a single screen related to a campaign sounds a lot more exciting than separate individuals trying to keep up with a flurry of emails, chats and various app notifications.

Leading sales organizations are starting to adopt a more collaborative approach to selling to larger customers. Account-based marketing (ABM) has emerged as a way of customizing the marketing message to address the unique needs of all the stakeholders in a target opportunity. But this requires a high level of collaboration among all the employees involved in customizing the marketing communication and sales strategy for the target customer.


“A company can’t really be successful with their ABM strategy unless it is tightly coupled with an account-based selling strategy,” Reynolds explained. “Account-based marketing starts with [a] better understanding of a company’s unique needs to enable a custom engagement. Altify allows an organization to cleanly execute the handoff from marketing to sales teams so they can effectively position value, connect to power and get a deal done.”

Salesforce Quip is still in its early phases compared to traditional communication channels, like email and chat. Reynolds estimates that about 10% of Altify’s customers are using Quip today, while another 25% are exploring it.

Note: TechTarget offers ABM and project intelligence data and tool services.

DevOps value stream mapping plots course at Nationwide

SAN FRANCISCO — After a decade of change, Nationwide Insurance sees DevOps value stream mapping as its path to achieve IT nirvana, with an orderly flow of work from lines of business into the software delivery pipeline.

Since 2007, Nationwide Mutual Insurance Co., based in Columbus, Ohio, has streamlined workflows in its corporate groups according to Lean management principles, among its software developers with the Agile method and in its software delivery pipeline with DevOps. Next, it plans to bring all those pieces together through an interface that creates a consistent model of how tasks are presented to developers, translated into code and deployed into production.

That approach, called value stream mapping, is a Lean management concept that originated at Toyota to record all the processes required to bring a product to market. Nationwide uses a feature called Landscape View in Tasktop Technologies’ Integration Hub model-based software suite to create its own record of how code artifacts flow through its systems, as part of an initiative to quicken the pace of software delivery.

Other DevOps software vendors, such as XebiaLabs and CollabNet, offer IT pros information about the health of the DevOps pipeline and its relationship to business goals. But Tasktop applies the Lean management concept of value stream mapping to DevOps specifically.

“It’s a diagram that shows all your connectivity and shows the flow of work,” said Carmen DeArdo, the technology director responsible for the software delivery pipeline at Nationwide, in an interview at DevOps Enterprise Summit here last week. “You can see how artifacts are flowing … What we’re hoping for in the future is more metrics and analytics around things like lead time.”

DevOps value stream mapping boosts pipeline consistency

Before Landscape View, Nationwide used Tasktop’s Sync product to integrate the tools that make up its DevOps pipeline. These tools include the following:

  • IBM Rational Doors Next Generation and Rational Team Concert software for team collaboration;
  • HP Quality Center — now Micro Focus Quality Center Enterprise — for defect management;
  • Jenkins, GitHub and IBM UrbanCode for continuous integration and continuous delivery;
  • ServiceNow for IT service management;
  • New Relic and Splunk for monitoring;
  • IBM’s ChangeMan ZMF for mainframe software change management; and
  • Microsoft Team Foundation Server for .NET development.

One Tasktop Sync integration routes defects from HP Quality Center directly into a backlog for Agile teams in Rational Team Concert. Another integration feeds requirements in IBM Doors Next Generation into HP Quality Center to generate automated code tests.

However, the business still lacked a high-level understanding of how its products were brought to market, especially where business requirements were presented to the DevOps teams to be translated into software features and deployed.

Without that understanding, teams unsuccessfully tried to hasten software delivery with additional developers and engineers. However, that didn’t get to the root of delays in the creation of business requirements. Other attempts to bridge that gap with whiteboards, team meetings and consultants produced no sustainable improvements, DeArdo said.

The Landscape View value stream mapping software tool, however, presents a more objective view than anecdotal descriptions in a team meeting of how work flows to the DevOps team, from artifacts to deployments and incident responses. The software also helps the DevOps team understand lessons learned from incidents and apply them to application development backlogs.

Landscape View’s objective analysis of the DevOps pipeline, complete with its flaws, forces the IT team to set aside biases and misunderstandings and think about process improvement in a new way, DeArdo said. “It’s one thing to talk about value stream, and another to show a picture of what it could look like when things are connected.”

A screenshot of Tasktop Integration Hub’s Landscape View feature, which helps Nationwide with DevOps value stream mapping.

A more accurate sense of how its processes work will help Nationwide more effectively improve those processes, DeArdo said. For example, the company has already amended how product defects move to the developer backlog, from an error-prone manual process that relied on email messages to an automated set of handoffs between software APIs.
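The article doesn't spell out the APIs involved, but the shape of such an automated handoff is simple: read new defects from the tracking tool's REST interface and create matching backlog items through the Agile tool's API, instead of relaying them by email. The sketch below is a generic illustration using Python's requests library; the endpoints, field names and token are hypothetical, not the actual HP/Micro Focus, IBM or Tasktop interfaces.

# Generic illustration of an automated defect-to-backlog handoff.
# The endpoints and payload fields are hypothetical; real integrations
# (for example, through Tasktop Sync) map fields between the actual tool APIs.
import requests

DEFECT_TRACKER_URL = "https://defects.example.com/api/defects"   # hypothetical
BACKLOG_URL = "https://agile.example.com/api/workitems"          # hypothetical

def fetch_new_defects(session):
    # Pull defects that have not yet been handed off to the backlog.
    resp = session.get(DEFECT_TRACKER_URL, params={"status": "new"})
    resp.raise_for_status()
    return resp.json()

def push_to_backlog(session, defect):
    # Create a backlog work item that mirrors the defect.
    payload = {
        "title": f"[Defect {defect['id']}] {defect['summary']}",
        "type": "bug",
        "priority": defect.get("severity", "medium"),
        "link": defect.get("url"),
    }
    resp = session.post(BACKLOG_URL, json=payload)
    resp.raise_for_status()
    return resp.json()["id"]

if __name__ == "__main__":
    with requests.Session() as session:
        session.headers["Authorization"] = "Bearer <token>"  # placeholder
        for defect in fetch_new_defects(session):
            item_id = push_to_backlog(session, defect)
            print(f"defect {defect['id']} -> backlog item {item_id}")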

DevOps to-do list and wish list still full

DevOps value stream mapping doesn’t mean Nationwide’s DevOps work is done. The company aims to use infrastructure as code more broadly and bring that aspect of IT under GitHub version control, as well as migrate more on-premises workloads to the public cloud. And even with the addition of value stream mapping software as an aid, it still struggles to introduce companywide systems thinking to a traditionally siloed set of business units and IT disciplines.

“We don’t really architect the value stream around the major DevOps metrics, [such as] frequency of deployment, reducing lead time or [mean time to resolution],” DeArdo said. “Maybe we do, in some sense, but not as intentionally as we could.”

To address this disparity, Nationwide will tie traditionally separate environments, which include a mainframe, into the same DevOps pipeline as the rest of its workloads.


“We don’t buy in to the whole [bimodal] IT concept,” DeArdo said, in reference to a Gartner term that describes a DevOps approach limited to new applications, while legacy applications are managed separately. “[To say DevOps] is just for the cool kids, and if you’re on a legacy system, you need not apply, sends the wrong message.”

DeArdo would like Tasktop to extend DevOps value stream mapping on Integration Hub with the ability to run simulations of different value stream models to see what will work best. He’d also like to see more metrics and recommendations from Integration Hub to help identify what’s causing bottlenecks in the process and how to resolve them.

“Anything that has a request and a response and an SLA [service-level agreement] has a target on its back to be automated from a value stream perspective,” he said. “How can we make it self-service and improve it? If you can’t see it, you’re only touching part of the elephant.”

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Container security platforms diverge on DevSecOps approach

SAN FRANCISCO — Container security platforms have begun to proliferate, but enterprises may have to watch the DevSecOps trend play out before they settle on a tool to secure container workloads.

Two container security platforms released this month — one by an up-and-coming startup and another by an established enterprise security vendor — take different approaches. NeuVector, a startup that introduced an enterprise edition at DevOps Enterprise Summit 2017, supports code and container-scanning features that integrate into continuous integration and continuous delivery (CI/CD) pipelines, but its implementation requires no changes to developers’ workflow.

By contrast, a product from the more established security software vendor CSPi, Aria Software Defined Security, allows developers to control the insertion of libraries into container and VM images that enforce security policies.

There’s still significant overlap between these container security platforms. NeuVector has CSPi’s enterprise customer base in its sights, with added support for noncontainer workloads and Lightweight Directory Access Protocol. CSPi’s Aria Software Defined Security includes network microsegmentation features for policy enforcement, which are NeuVector’s primary focus. And while developers inject CSPi’s security libraries into machine images, they aren’t expected to become security experts: Enterprise IT security pros set the policies the software enforces, and a series of wizards guides developers through the library integration process.

Both vendors also agree on this: Modern IT infrastructures with DevOps pipelines that deliver rapid application changes require a fundamentally different approach to security than traditional vulnerability detection and patching techniques.

There’s definitely a need for new security techniques for containers that rely less on layers of VM infrastructure to enforce network boundaries, which can negate some of the gains to be had from containerization, said Jay Lyman, analyst with 451 Research.

However, even amid lots of talk about the need to “shift left” and get developers involved with IT security practices, bringing developers and security staff together at most organizations is still much easier said than done, Lyman said.

NeuVector 1.3 captures network sessions automatically when container threats are detected, a key feature for enterprises.

Container security platforms encounter DevSecOps growing pains

As NeuVector and CSPi product updates hit the market, enterprise IT pros at the DevOps Enterprise Summit (DOES) here this week said few enterprises use containers at this point, and the container security discussion is even further off their radar. By the time containers are widely used, DevSecOps may be more mature, which could favor CSPi’s more hands-on developer strategy. But for now, developers and IT security remain sharply divided.


“Everyone needs to be security-conscious, but to demand developers learn security and integrate it into their own workflow, I don’t see how that happens,” said Joan Qafoku, a risk consulting associate at KPMG LLP in Seattle who works with an IT team at a large enterprise client also based in Seattle. That client, which Qafoku did not name, gives developers a security-focused questionnaire, but security integration into their process goes no further than that.

NeuVector’s ability to integrate into the CI/CD pipeline without changes to application code or the developer workflow was a selling point for Tobias Gurtzick, security architect for Arvato, an international outsourcing services company based in Gütersloh, Germany.

Still, this integration wasn’t perfect in earlier iterations of NeuVector’s product, Gurtzick said in an interview before DOES. Gurtzick’s team polled an API every two minutes to trigger container and code scans with previous versions. NeuVector’s 1.3 release includes a new webhooks notification feature that more elegantly triggers code scans as part of continuous integration testing, without the performance overhead of polling the API.
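The improvement Gurtzick describes is the classic polling-versus-webhook trade-off: rather than asking the scanner's API every two minutes whether anything changed, the CI system is notified the moment a relevant event occurs. A minimal sketch of such a receiver, assuming a Flask endpoint and a hypothetical trigger_scan helper (this is not NeuVector's actual payload format), might look like this:

# Minimal webhook receiver sketch (Flask). Not NeuVector's actual API or
# payload format -- just the polling-versus-webhook pattern described above.
from flask import Flask, jsonify, request

app = Flask(__name__)

def trigger_scan(image):
    # Hypothetical hook into the CI pipeline's container/code scan step.
    print(f"triggering scan for image: {image}")

@app.route("/hooks/scan", methods=["POST"])
def on_scan_event():
    event = request.get_json(force=True)
    # React only to events that should start a scan, such as a new image build.
    if event.get("type") == "image.pushed":
        trigger_scan(event.get("image", "unknown"))
        return jsonify({"status": "scan triggered"}), 202
    return jsonify({"status": "ignored"}), 200

if __name__ == "__main__":
    # No two-minute polling loop: the registry or scanner calls this
    # endpoint only when something happens.
    app.run(port=8080)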

“That’s the most important feature of the new version,” Gurtzick said. He also pointed to added support for detailed network session snapshots that can be used in forensic analysis. CSPi’s Aria Software Defined Security offers a similar feature in its first release.

While early adopters of container security platforms, such as Gurtzick, have settled for themselves how developers and IT security should bake security into applications, the overall market has been slower to take shape as enterprises hash out that collaboration, Lyman said.

“Earlier injection of security into the development process is better, but that still usually falls to IT ops and security [staff],” Lyman said. “Part of the DevOps challenge is aligning those responsibilities with application development. Eventually, we’ll see more developer involvement in security, but it will take time and probably be pretty painful.”

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Salesforce-Google integration bolsters CRM, analytics functionality

SAN FRANCISCO — Salesforce and Google cozied up a little more closely on several new product integrations: One offers a cloud alternative to Amazon Web Services, and another pushes Salesforce deeper into G Suite, Google’s subscription business applications cloud, a move that may send shock waves through the thriving ecosystem of partners that already provide small-business CRM for those apps.

The Salesforce-Google integration includes the naming of Google Cloud as Salesforce’s latest preferred public cloud for international customers, joining AWS, with which Salesforce partnered earlier this year.

The Salesforce-Google integration, announced earlier this month at Salesforce’s Dreamforce user conference, also includes Salesforce Lightning for Gmail and Google Sheets, as well as Quip Live Apps for Google Drive and Google Calendar. Salesforce’s Sales and Marketing Clouds will both have Google Analytics 360 embedded.

As part of the partnership, Google will also be offering existing Salesforce customers a one-year free trial of G Suite.

Salesforce is no stranger to partnering with other enterprise tech companies — even if some products compete — to offer its customers an enhanced experience, and that appears to be the reason behind the Salesforce-Google integration.

“In the past, [Salesforce] has announced some pretty big partnerships that turned out to maybe be not so big, but this is from a different angle,” said Michael Fauscette, chief research officer for G2 Crowd, based in Chicago. “They have a go-to-market strategy together.”

Integration to help SMBs


Small- and medium-sized businesses will benefit from the Salesforce-Google integration, because many of them already use Gmail and G Suite, according to analysts, and the Salesforce tie-in could make the San Francisco-based company attractive as a CRM option.

“I have a client, and the No. 1 challenge is adoption: They have these great tools and insights, but if people aren’t in there and feeding the engine and taking action, it doesn’t matter,” said Lisa Hager, global head of Salesforce practices for Mumbai-based Tata Consultancy Services. “But if I go in to get my mail and the Salesforce platform prepopulates my email and spreadsheets, I’m more likely to go into that tool.”

Voices.com, which works with brands to find voice actors for campaigns and is based in London, Ont., has been a Salesforce customer for 12 years, and its CEO, David Ciccarelli, was enthusiastic about some of the Salesforce-Google integrations.

“Salesforce is a great system of record, but where it can improve upon is mass editing,” he said. “So, being able to one-click export from Salesforce into something manipulative like Google Sheets, make changes and one-click import back — that’s where you’ll see huge time savings.”

Marketing is where the data is

Salesforce has made a concerted effort to increase market share for its Marketing Cloud to match that of Sales Cloud and Service Cloud, and it looks to do that through data.

Just weeks after releasing B2B Lead Analytics for Facebook, Salesforce is embedding Google Analytics 360 into Marketing Cloud, giving marketers insights from two of the leading sources of customer data on the internet.

“If you’re not in a separate data silo and you can ingest the Google Analytics 360 data on website visitors and keyword ad buys, with that integrated, you don’t have to take that data out and manually process it,” said Cindy Zhou, principal analyst for Constellation Research. “And there’s always data lost when you have to move it from one place to another. So, having it embedded natively will help you get deeper insights, and you can still apply Einstein on top to do audience segmentation and analysis.”

Ciccarelli of Voices.com said, as a smaller business, a license for Google Analytics 360 was always too much to budget for, but with it integrated into Salesforce at no additional cost, smaller companies will be able to receive enterprise-level insights.

Salesforce adds another storage cloud

The news of Salesforce adding another preferred public cloud for international expansion comes just months after Salesforce formed a similar partnership with AWS. The addition of Google Cloud is to address customer needs, according to Ryan Aytay, executive vice president for business development and strategic accounts for Salesforce.

“AWS continues to be an important part of our infrastructure, so nothing’s changing there,” Aytay said. “We’re just adding another preferred cloud and moving forward to address customer needs.”

Google and AWS are two of the three leaders in the cloud space, with the other being Microsoft Azure. Salesforce CEO Marc Benioff has been outspoken about his abhorrence of former partner Microsoft, so it’s unlikely Salesforce will partner with the Redmond, Wash.-based company anytime soon.

The move toward Google could have been a response to customers’ demands, according to Hager, as AWS is costly when it comes to cloud storage.

“Just being able to have that option of Google storage instead of AWS is important; I had three clients this morning complaining about the cost of AWS,” Hager said. “If you’re storing a lot of documents on Salesforce, it can get expensive. So, integrating with Google is a nice option.”

There’s some potential overlap among the integrated products, especially between Salesforce’s Quip and Google’s G Suite, but Salesforce executives aren’t worried, with Aytay noting that Salesforce itself uses both products internally.

Zhou can see the products coexisting, but there’s also some “friendly competition” between G Suite and Quip, with the Salesforce product being a good alternative for companies creating contracts or requests for proposal.

Several of the Salesforce-Google integrations are already on the market, including Lightning for Gmail and integrations with Calendar and Google Drive, with deeper integrations rolling out in 2018, according to the press release. Quip Live Apps integration with Google Drive is expected to be generally available in the first half of 2018 for $25 per user, per month with any Quip Enterprise License. And the integrations between Salesforce and Google Analytics 360 are expected in the first half of 2018 at no additional cost to licensed customers.

DevOps transformation in large companies calls for IT staff remix

SAN FRANCISCO — A DevOps transformation in large organizations can’t just rely on mandates from above that IT pros change the way they work; IT leaders must rethink how teams are structured if they want them to break old habits.

Kaiser Permanente, for example, has spent the last 18 months trying to extricate itself from 75 years of organizational cruft through a consumer digital strategy program led by Alice Raia, vice president of digital presence technologies. With the Kaiser Permanente website as its guinea pig, Raia realigned IT teams into a squad framework popularized by digital music startup Spotify, with cross-functional teams of about eight engineers. At the 208,000-employee Kaiser Permanente, that’s been subject to some tweaks.

“At our first two-pizza team meeting, we ate 12 pizzas,” Raia said in a session at DevOps Enterprise Summit here. Since then, the company has settled on an optimal number of 12 to 15 people per squad.

The Oakland, Calif., company decided on the squads approach when a previous model with front-end teams and systems-of-record teams in separate scrums didn’t work, Raia said. Those silos and a focus on individual projects resulted in 60% waste in the application delivery pipeline as of a September 2015 evaluation. The realignment into cross-functional squads has forced Kaiser’s website team to focus on long-term investments in products and faster delivery of features to consumers.

IT organizational changes vary by company, but IT managers who have brought about a DevOps transformation in large companies share a theme: Teams can’t improve their performance without a new playbook that puts them in a better position to succeed.


At Columbia Sportswear Co. in Portland, Ore., this meant new rotations through various areas of focus for engineers — from architecture design to infrastructure building to service desk and maintenance duties, said Scott Nasello, senior manager of platforms and systems engineering, in a presentation.

“We had to break the monogamous relationships between engineers and those areas of interest,” Nasello said. This resulted in surprising discoveries, such as when two engineers who had sat next to each other for years discovered they’d taken different approaches to server provisioning.

Short-term pain means long-term gain

In the long run, the move to DevOps results in standardized, repeatable and less error-prone application deployments, which reduces the number of IT incidents and improves IT operations overall. But those results require plenty of blood, sweat and tears upfront.

“Prepare to be unpopular,” Raia advised other enterprise IT professionals who want to move to DevOps practices. During Kaiser Permanente’s transition to squads, Raia had the unpleasant task of informing executive leaders that IT had to slow down its consumer-facing work to shore up its engineering practices, at least at first.

Organizational changes can be overwhelming, Nasello said.

“There were a lot of times engineers were running on empty and wanted to tap the brakes,” he said. “You’re already working at 100%, and you feel like you’re adding 30% more.”

IT operations teams ultimately can be crushed between the contradictory pressures of developer velocity on the one hand and a fear of high-profile security breaches and outages on the other, said Damon Edwards, co-founder of Rundeck Inc., a digital business process automation software maker in Menlo Park, Calif.

Damon Edwards, co-founder of Rundeck Inc., shares the lessons he learned from customers about how to reduce the impact of DevOps velocity on IT operations.

A DevOps transformation means managers must empower those closest to day-to-day systems operations to address problems without Byzantine systems of escalation, service tickets and handoffs between teams, Edwards said.

Edwards pointed to Rundeck customer Ticketmaster as an example of an organizational shift toward support at the edge. A new ability to resolve incidents in the company’s network operations center — the “EMTs” of IT incident response — reduced IT support costs by 55% and the mean time to response from 47 minutes to 3.8 minutes on average.

“Silos ruin everything — they’re proven to have a huge economic impact,” Edwards said.

And while DevOps transformations pose uncomfortable challenges to the status quo, some IT ops pros at big companies hunger for a more efficient way to work.

“We’d like a more standardized way to deploy and more investment in the full lifecycle of the app,” said Jason Dehn, a systems analyst at a large U.S. retailer that he asked not be named. But some lines of business at the company are happy with the status quo, where they aren’t entangled in day-to-day application maintenance.

“Business buy-in can be the challenge,” Dehn said.

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

By aligning sales and marketing, Mizuho OSI could sell faster

SAN FRANCISCO — To take on larger competitors with more resources, medical equipment manufacturer Mizuho OSI had to create a faster track from lead generation to sales.

To identify leads and close sales faster, the Union City, Calif., company broke down its internal silos, deciding that aligning its sales and marketing departments would be its best bet.

“We had a gap in collaboration,” said Greg Neukirch, vice president of sales and marketing at Mizuho OSI, during a session at Dreamforce 2017 this week. “We needed to be smarter and faster and improve our customer experience beyond what we did in the past.”

Neukirch added that the company did extensive research to see which software tools could aid in aligning sales and marketing. It ultimately chose Salesforce for CRM and Salesforce Pardot for marketing automation.

“We had a sales team wanting more and a marketing team trying to give more, and we looked at how we could leverage Salesforce and Pardot to close the gap between those two functions,” Neukirch said.

Mizuho OSI adopted Salesforce in February 2016 and Pardot a year later, working to ensure close collaboration between the sales and marketing departments.

Bringing sales, marketing together


Breaking down internal silos is a common challenge for businesses, because sales and marketing departments have historically had different objectives. But as buyers have become more educated throughout the purchasing process, aligning sales and marketing is a strategy that can bring a company more customers. It’s not an easy process, however.

“There was skepticism in our sales department,” Neukirch said. “They didn’t know the products or understand why they needed to do something different. But it was up to us to help communicate that value.”

New Salesforce Sales Cloud features are designed to make it easier for customers to better align sales and marketing. With the Lightning Data feature, for example, companies can discover and import new potential customers, according to Brooke Lane, director of product management for Sales Cloud.

“In today’s setting, we want to quickly close deals and also better understand customers,” Lane said. “With [the new feature] Campaign Management, it can help you show the impact of marketing activities on the sales pipeline. We want to continue bridging Salesforce and Pardot so you’re not troubled with tasks.”

Addressing implementation challenges

Mizuho OSI’s transition to a more efficient, modern customer journey — one that shortened the time for a prospect to become a customer — hasn’t come without challenges.

“Sales can’t do things on its own,” said Chris Lisle, director of North American sales at Mizuho OSI. “But the biggest hurdle was getting sales to adopt a new tool.”

Mizuho OSI ran into some hurdles during the implementation — mainly the time it takes to successfully change how the organization is run.

“We took time to identify the problems we wanted to solve — mainly that our customer journey was outdated,” said Kevin McCallum, director of marketing at Mizuho OSI. “We needed an aggressive timeline for our deployment, but however long you think it’ll take, it takes longer than that.”

But by aligning sales and marketing departments at the start of the project, Mizuho OSI was able to start modernizing its customer journey.

“Sales had full visibility with what we were doing and what we were working on and helped through the journey,” McCallum said.

Neukirch agreed, calling the alignment essential.

“To get that collaboration and see the departments come together, we were able to move faster,” Neukirch said.

And while the company is still aligning sales and marketing, it has seen anecdotal benefits of the change.

“What we did in the last nine months exceeded our expectations,” Neukirch said. “We were following that vision and executing on the deliverables and making sure we kept focus with how the customer could interact with us better and faster, so we’d have the opportunity to outpace the folks we’re in market against.”

Experts: Time is nigh for a Salesforce Lightning migration

SAN FRANCISCO — Two years ago at Dreamforce, Salesforce unveiled Lightning, a platform-wide upgrade that paved the way for other new features, including the abundance of Einstein products. It also — although not overtly — paved the way for a future Salesforce Lightning migration as Salesforce Classic fades away.

And that future may be here now. The topic of a Salesforce Lightning migration was a popular one on the opening day of Dreamforce Monday — two different Lightning migration sessions were at capacity during the morning, with dozens of attendees turned away in both cases.

“At the end of the day, Salesforce won’t be doing anything on the old style anymore,” said Alan Lepofsky, a principal analyst at Constellation Research. “If a customer is hesitant to move to Lightning, I don’t want to imply that Salesforce is pushing customers along, but everything forward will be Lightning.”

At a Service Cloud roadmap session at Dreamforce, that sentiment was solidified by Salesforce executives.

“Some features aren’t possible in Salesforce Classic,” said Jon Aniano, senior vice president of product management for Service Cloud.

Sessions focusing on how to prepare for a Salesforce Lightning migration were popular during the first day at Dreamforce in San Francisco Monday.

‘A stealth deployment’

The lack of new features in Salesforce Classic was a big reason why CommScope, a network infrastructure provider based in Hickory, N.C., recently migrated from Salesforce Classic to Lightning.

The company began using Salesforce in 2012. This past spring, it launched its Salesforce Lightning migration, giving itself six months to complete the adoption, said Danelle Lockwood, an analyst of sales operations at CommScope.


“If we wanted to use new features, it was in Lightning,” Lockwood said. “Salesforce isn’t doing anything in Classic — everything is in Lightning.

“We gave ourselves six months, but it wasn’t needed,” she added. “A lot of things worked during the migration.”

To ensure processes would still work after the migration to Lightning, Lockwood said CommScope was able to test everything before turning the switch on.

“We moved everything to production, and then didn’t turn it on,” Lockwood said. “It was a stealth deployment and we retested everything.”

The move was a result of Salesforce slowly phasing out Salesforce Classic, all but forcing customers to launch a Salesforce Lightning migration.

Training your workforce

New features will most likely be available only on Lightning, but another impediment keeping longtime users from a Salesforce Lightning migration is the training that a new user interface can require.

But if an organization gives its workforce enough time and freedom to explore Lightning, Lockwood said, users can mostly train themselves.

“We didn’t do much training. We sent out a two-minute introduction video that focused on the parts that were new,” Lockwood said. “When we did our testing, we had our users do it and had them go in and create an opportunity and add a contact. We didn’t tell them how to, so that we could figure out where they had problems and tailor the little bits of training toward those problems.”

End-user security requires a shift in corporate culture

SAN FRANCISCO — An internal culture change can help organizations put end-user security on the front burner.

If an organization only addresses security once a problem arises, it’s already too late. But it’s common for companies, especially startups, to overlook security because it can get in the way of productivity. That’s why it’s important for IT departments to create a company culture where employees and decision-makers take security seriously when it comes to end-user data and devices.

“Security was definitely an afterthought,” said Keane Grivich, IT infrastructure manager at Shorenstein Realty Services in San Francisco, at last week’s BoxWorks conference. “Then we saw some of the high-profile [breaches] and our senior management fully got on board with making sure that our names didn’t appear in the newspaper.”

How to create a security-centric culture

Improving end-user security starts with extensive training on topics such as what data is safe to share and what a malicious website looks like. That forces users to take responsibility for their actions and understand the risks of certain behaviors.

Plus, if security is a priority, the IT security team will feel like a part of the company, not just an inconvenience standing in users’ way.

“Companies get the security teams they deserve,” said Cory Scott, chief information security officer at LinkedIn. “Are you the security troll in the back room or are you actually part of the business decisions and respected as a business-aligned person?”


When IT security professionals feel that the company values them, they are more likely to stick around as well. With the shortage of qualified security pros, retaining talent is key.

Keeping users involved in the security process helps, too. Instead of locking down a PC when a user accesses a suspicious file, for example, IT can send the user a message asking whether they performed the action. If the user confirms accessing the file, IT knows no one is impersonating them; if not, IT knows there is an intruder and must act.
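That verification loop is straightforward to express in code: confirm with the user out of band before treating the activity as an incident. The sketch below is conceptual only; the send_confirmation and open_incident helpers are hypothetical placeholders for whatever messaging and incident tooling a team actually uses.

# Conceptual sketch of the user-confirmation flow described above.
# The messaging and incident helpers are hypothetical placeholders.

def send_confirmation(user, action):
    # Ask the user out of band (chat, push or email) whether they did this.
    answer = input(f"{user}, did you {action}? (y/n) ")
    return answer.strip().lower() == "y"

def open_incident(user, action):
    print(f"INCIDENT: possible impersonation of {user} ({action})")

def handle_suspicious_action(user, action):
    if send_confirmation(user, action):
        # The user owns the action; no lockdown, no lost productivity.
        print(f"{user} confirmed '{action}'; no further action needed")
    else:
        # The user denies it; assume an intruder and escalate.
        open_incident(user, action)

if __name__ == "__main__":
    handle_suspicious_action("jdoe", "download finance_report.xlsx")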

To keep end-user security top of mind, it’s important to make things such as changing passwords easy for users. IT can make security easier for developers as well by setting up security frameworks that they can apply to applications they’re building.

It’s also advisable to take a blameless approach when possible.

“Finger-pointing is a complete impediment to learning,” said Brian Roddy, an engineering executive who oversees the cloud security business at Cisco, in a session. “The faster we can be learning, the better we can respond and the more competitive we can be.”

Don’t make it easy for attackers

Once the end-user security culture is in place, IT should take steps to shore up the simple things.

Unpatched software is one of the easiest ways for attackers to enter a company’s network, said Colin Black, COO at CrowdStrike, a cybersecurity technology company based in Sunnyvale, Calif.

IT can also make it harder for hackers by adding extra security layers such as two-factor authentication.