Transforming IT infrastructure and operations to drive digital business

It’s time for organizations to modernize their IT infrastructure and operations to not just support, but to drive digital business, according to Gregory Murray, research director at Gartner.

But to complete that transformation, organizations need to first understand their desired future state, he added.

“The future state for the vast majority of organizations is going to be a blend of cloud, on prem and off prem,” Murray told the audience at the recent Gartner Catalyst conference. “What’s driving this is the opposing forces of speed and control.”

From 2016 to 2024, the percentage of new workloads that will be deployed through on-premises data centers is going to plummet from about 80% to less than 20%, Gartner predicts. During the same period, cloud adoption will explode — going from less than 10% to as much as 45% — with off-premises, colocation and managed hosting facilities also picking up more workloads.

IT infrastructure needs to provide capabilities across these platforms, and operations must tackle the management challenges that come with it, Murray said.

How to transform IT infrastructure and operations

Once organizations have defined their future state — and Murray urged them to start by developing a public cloud strategy to determine which applications will run in the cloud — they should begin modernizing their infrastructure.

“Programmatic control is the key to enabling automation and automation is, of course, critical to addressing the disparity between the speed that we can deliver and execute in cloud, and improving our speed of execution on prem,” he said. 

Organizations will also need developers with the skills to take advantage of that programmatic control, he said, and standardization is another piece of the automation equation when modernizing the infrastructure to gain speed.

“We need to standardize around those programmatic building blocks, either by using individual components of software-defined networking, software-defined compute and software-defined storage, or by using a hyper-converged system.”

Hyper-converged systems reduce the complexity of establishing programmatic control and help create a unified API for the infrastructure, he said.

Organizations also need to consider how to uplevel their standardization, according to Murray, and this is where containers come into play. A container is an atomic unit of deployment that is specific to an application, and it abstracts away many of the dependencies and complications that come with moving an application independently of its operating system, he explained.

“And if we can do that, now I have a construct that I can standardize around and deploy into cloud, into on prem, into off prem and give it straight to my developers and give them the ability to move quickly and deploy their applications,” he said.

Hybrid is the new normal

To embrace this hybrid environment, Murray said organizations should establish a fundamental substrate to unify these environments.

“The two pieces that are so fundamental that they precede any sort of hybrid integration is the concept of networks — specifically your WAN and WAN strategy across your providers — and identity,” Murray said. “If I don’t have fundamental identity constructs, governance will be impossible.”

Organizations looking to modernize their networks for hybrid capabilities should turn to SD-WAN, Murray said. SD-WAN provides software-defined control that extends outside the data center and allows a programmatic, automated approach to WAN connectivity that helps keep the hybrid environment working together, he explained.

But getting that framework of governance in place across a hybrid environment requires a layered approach, Murray said. “It’s a combination of establishing principles, publishing the policies and using programmatic controls to bring as much cloud governance as we can.”
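Murray’s layered model (principles, published policies, programmatic controls) lends itself to a policy-as-code sketch. The Python below is illustrative only; the policy names, rules and request fields are invented, not drawn from Gartner or any particular product.

```python
# Minimal policy-as-code sketch: published policies become data, and
# programmatic control becomes a function every provisioning request
# must pass through. All names and rules here are hypothetical.

POLICIES = {
    "allowed_regions": {"us-east", "us-west", "on-prem-dc1"},
    "require_encryption": True,
    "max_instance_size": 16,  # vCPUs
}

def check_request(request: dict) -> list[str]:
    """Return a list of policy violations for a provisioning request."""
    violations = []
    if request.get("region") not in POLICIES["allowed_regions"]:
        violations.append(f"region {request.get('region')!r} not allowed")
    if POLICIES["require_encryption"] and not request.get("encrypted", False):
        violations.append("encryption at rest is required")
    if request.get("vcpus", 0) > POLICIES["max_instance_size"]:
        violations.append("instance size exceeds policy maximum")
    return violations

print(check_request({"region": "eu-central", "encrypted": True, "vcpus": 8}))
# → ["region 'eu-central' not allowed"]
```

The point of the pattern is that the same check runs identically against cloud, on-prem and off-prem provisioning paths, which is what makes the governance layer portable.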

Murray also hinted that embracing DevOps is the first step in “a series of cultural changes” that organizations will need in order to truly modernize IT infrastructure and operations. Even for those that aren’t operating at agile speed, operations still needs to get out of the business of managing tickets and delivering resources, and move to a self-service environment in which operations and IT broker the services, he added.

Organizations also need a monitoring framework in place to gain visibility across the environment. Embracing AIOps — which applies big data, data analytics and machine learning — can help organizations become more predictive and more proactive in their operations, he added.
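As a toy illustration of what “more predictive and proactive” can mean in practice, the sketch below flags metric samples that deviate sharply from the norm. It is a crude statistical stand-in for the machine learning an AIOps platform would apply; the latency data and threshold are invented.

```python
# Flag outliers in a stream of operational metrics using a z-score test,
# a simple stand-in for ML-based anomaly detection in an AIOps pipeline.
from statistics import mean, stdev

def flag_anomalies(samples: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of samples more than `threshold` standard
    deviations from the mean of the series."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(samples) if abs(x - mu) / sigma > threshold]

latencies_ms = [12.0, 11.5, 12.3, 11.9, 12.1, 95.0, 12.2, 11.8]
print(flag_anomalies(latencies_ms, threshold=2.0))  # → [5]
```

A real AIOps deployment would learn seasonal baselines and correlate across many signals, but the workflow is the same: raise the anomaly before a human notices the symptom.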

Kubernetes networking expands its horizons with service mesh

Enterprise IT operations pros who support microservices face a thorny challenge with Kubernetes networking, but service mesh architectures could help address their concerns.

Under traditional methods, Kubernetes networking faces performance bottlenecks. Centralized network resources must handle an order of magnitude more connections once the user migrates from VMs to containers. And because containers appear and disappear much more frequently than VMs do, managing those connections at scale can quickly create confusion on the network; stale information inside network management resources can even misdirect traffic.

IT pros at KubeCon this month got a glimpse at how early adopters of microservices have approached Kubernetes networking issues with service mesh architectures. These network setups are built around sidecar containers, which act as a proxy for application containers on internal networks. Such proxies offload networking functions from application containers and offer a reliable way to track and apply network security policies to ephemeral resources from a centralized management interface.

Proxies in a service mesh handle short-lived connections between microservices better than traditional networking models do. Service mesh proxies also expose telemetry that IT admins can’t get from other Kubernetes networking approaches, such as transmission success rates, latencies and traffic volume on a container-by-container basis.
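The pattern is easier to see in miniature. The sketch below wraps service calls the way a sidecar proxy intercepts network traffic, recording per-service call counts, failures and latency. Real proxies such as Envoy or Linkerd do this transparently at the network layer; everything here, including the class and service names, is a simplified invention.

```python
# Toy telemetry-collecting proxy: every call is forwarded and its
# outcome and latency recorded, mimicking what a sidecar does per service.
import time
from collections import defaultdict

class TelemetryProxy:
    def __init__(self):
        self.stats = defaultdict(lambda: {"calls": 0, "failures": 0, "total_ms": 0.0})

    def call(self, service: str, fn, *args):
        """Forward a request to `fn`, recording outcome and latency."""
        start = time.perf_counter()
        rec = self.stats[service]
        rec["calls"] += 1
        try:
            return fn(*args)
        except Exception:
            rec["failures"] += 1
            raise
        finally:
            rec["total_ms"] += (time.perf_counter() - start) * 1000

    def success_rate(self, service: str) -> float:
        rec = self.stats[service]
        return 1.0 - rec["failures"] / rec["calls"] if rec["calls"] else 1.0

proxy = TelemetryProxy()
proxy.call("pricing", lambda x: x * 2, 21)   # one success
try:
    proxy.call("pricing", lambda: 1 / 0)     # one failure
except ZeroDivisionError:
    pass
print(proxy.success_rate("pricing"))  # → 0.5
```

Because the application code never touches `stats`, the same per-service success rates and latencies are available for every container without instrumenting the apps themselves, which is the core appeal of the sidecar model.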

“The network should be transparent to the application,” said Matt Klein, a software engineer at San Francisco-based Lyft, which developed the Envoy proxy system to address networking obstacles as the ride-sharing company moved to a microservices architecture over the last five years.

“People didn’t trust those services, and there weren’t tools that would allow people to write their business logic and not focus on all the faults that were happening in the network,” Klein said.

With an Envoy sidecar proxy, each of Lyft’s services only had to understand its local portion of the network, and the application’s language no longer factored into its function. At the time, only the most demanding web applications required proxy technology such as Envoy, but now the complexity of microservices networking makes service mesh relevant to more mainstream IT shops.

The National Center for Biotechnology Information (NCBI) in Bethesda, Md., has laid the groundwork for microservices with a service mesh built around Linkerd, which was developed by Buoyant. The bioinformatics institute used Linkerd to modernize legacy applications, some as much as 30 years old, said Borys Pierov, a software developer at NCBI.

Any app that uses the HTTP protocol can be pointed at the Linkerd proxy, which gives NCBI engineers improved visibility into, and control over, advanced routing rules in the legacy infrastructure, Pierov said. While NCBI doesn’t use Kubernetes yet — it uses HashiCorp Consul and the CoreOS rkt container runtime instead of Kubernetes and Docker — service mesh will be key to container networking on any platform.
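That “point the app at the proxy” pattern can be sketched as follows: the client always opens its TCP connection to a local proxy and names the logical destination in the request, which the proxy uses for routing, retries and telemetry. The host, port and service names below are invented; Linkerd’s actual configuration differs.

```python
# Sketch of proxy-mediated HTTP routing: the TCP endpoint is always the
# local mesh proxy, while the Host header names the logical service.
# The address and service names are hypothetical.

PROXY_ADDR = ("localhost", 4140)  # hypothetical local service-mesh proxy

def build_request(service: str, path: str) -> tuple[tuple[str, int], str]:
    """Return (tcp_endpoint, http_request). The connection goes to the
    proxy; the Host header carries the logical destination it routes on."""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {service}\r\n"
        f"Connection: close\r\n\r\n"
    )
    return PROXY_ADDR, request

endpoint, req = build_request("gene-lookup", "/v1/genes/BRCA1")
print(endpoint)             # → ('localhost', 4140)
print(req.splitlines()[1])  # → Host: gene-lookup
```

Because only the proxy knows where “gene-lookup” currently lives, the application needs no service-discovery logic of its own, which is what lets a 30-year-old HTTP app join the mesh unmodified.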

“Linkerd gave us a look behind the scenes of our apps and an idea of how to split them into microservices,” Pierov said. “Some things were deployed in strange ways, and microservices will change those deployments, including the service mesh that moves us to a more modern infrastructure.”

Matt Klein, software engineer at Lyft, presents the company’s experiences with service mesh architectures at KubeCon.

Kubernetes networking will cozy up with service mesh next year

Linkerd is one of the best-known and most widely used of the multiple open source service mesh projects in various stages of development. Envoy, however, has gained prominence because it underpins a fresh approach to the centralized management layer, called Istio. This month, Buoyant also introduced a better-performing, more efficient successor to Linkerd, called Conduit.

It’s still too early to declare any of these projects the winner. The Cloud Native Computing Foundation (CNCF) has invited Istio’s developers, which include IBM, Microsoft and Lyft, to make Istio a CNCF project, CNCF COO Chris Aniszczyk said at KubeCon. But Buoyant will also formally present Conduit to the CNCF next year, and multiple projects could coexist within the foundation, Aniszczyk said.

Kubernetes networking challenges led Gannett’s USA Today Network to create its own “terrible, over-orchestrated” service mesh-like system, in the words of Ronald Lipke, senior engineer on the USA Today platform-as-a-service team, who presented on the organization’s Kubernetes experience at KubeCon. HAProxy and the Calico network management system have supported Kubernetes networking in production so far, but there have been problems under this system with terminating nodes cleanly and removing them from Calico quickly so traffic isn’t misdirected.

Lipke likes the service mesh approach, but it’s not yet a top priority for his team at this early stage of Kubernetes deployment. “No one’s really asking for it yet, so it’s taken a back seat,” he said.

This will change in the new year. The company plans to rethink the HAProxy approach to reduce its cloud resource costs and improve network tracing for monitoring purposes. It has done proof-of-concept evaluations of Linkerd and plans to look at Conduit, Lipke said in an interview after his KubeCon session.

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Procurement transformation a main focus at CPO Rising Summit

BOSTON — Corporate procurement and supply chain operations must undergo a modern digital transformation, or their companies will be left behind.

This procurement transformation will be driven by real-time processes and next-generation technologies that allow procurement professionals to see what’s ahead and react immediately to any changes in the conditions, according to Tom Linton, chief procurement officer and supply chain officer for Flex, a company that designs and builds intelligent devices for a variety of industries.

Linton spoke at the CPO Rising Summit, a conference for procurement and supply chain professionals sponsored by the research firm Ardent Partners.

“We have to operate in real time and have systems and business processes that operate in real time, because the velocity of the business is going to continue to get faster,” Linton said. “Everything, whether you’re looking at technology or medicine or information systems, is moving faster. If we can’t communicate or conduct business in real time, we actually consider ourselves failing or falling behind.”

Every generation of every product today is smarter than the one that came before, Linton explained, and the average generational change is just nine months. Procurement needs to keep up with this increase in intelligence and start to take advantage of the new opportunities.

“How do we operate in an age of intelligence?” Linton asked. “How do we operate in a world which is not about the internet of things, because the things themselves are getting more intelligence? How do you develop a system of intelligence in procurement that helps us identify where we are in this progression?”

Visualization helps show where you’re going

One way to do this is through visualization, which presents information in more digestible ways for procurement.

“What if everything you need to know about your business is available to you in the same time that you can open Uber on your smartphone?” Linton asked.

Flex built a procurement environment called Flex Pulse, which uses a 100-foot wall of interactive monitors to display up to 58 applications that show what’s going on with purchases and transactions in real time, according to Linton.

“The idea with Flex Pulse is to take that data and actually make it actionable,” Linton said. “It’s not doing anything truly different; it’s just taking information and restructuring it to make it more digestible for the users.”
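The restructuring Linton describes (raw transactions in, a digestible view out) can be illustrated with a few lines of code. The transaction data and field names below are invented, not taken from Flex Pulse.

```python
# Restructure raw purchase records into a compact, actionable summary:
# total spend per commodity, largest first. Data is illustrative only.
from collections import defaultdict

transactions = [
    {"commodity": "memory", "supplier": "A", "usd": 120_000},
    {"commodity": "memory", "supplier": "B", "usd": 80_000},
    {"commodity": "displays", "supplier": "C", "usd": 350_000},
]

def spend_by_commodity(txns):
    totals = defaultdict(int)
    for t in txns:
        totals[t["commodity"]] += t["usd"]
    # Sort largest first so the most actionable line sits at the top.
    return sorted(totals.items(), key=lambda kv: -kv[1])

print(spend_by_commodity(transactions))
# → [('displays', 350000), ('memory', 200000)]
```

The transformation does nothing clever with the data itself; as Linton says, the value comes from restructuring it so a buyer can act on it at a glance.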

The need to accelerate this procurement transformation was echoed at a subsequent expert panel.

Need to build trust in transactions

Mike Palackdharry, president and CEO of Aquiire, a Cincinnati-based B2B purchasing and supply chain process technology company, said real-time and next-generation technologies will drive the transformation.

“Things like blockchain, machine learning, AI and natural language processing are all about increasing the speed, the transparency and the trust within the supply chain. And all of that is about real time and how we create communications between buyers and sellers in real time, where we can trust the transaction and the accuracy of the data,” Palackdharry said.

The ultimate goal will be to provide systems that guide buyers where the organization wants them to go.

“It’s about how you use all of this real-time information that you’re gathering to guide your users to the items that you want them to buy,” said Paul Blake, technology product marketing leader for GEP, a provider of procurement technology in Clark, N.J. “It’s not just about cost savings; it’s about all the value you can bring into the supply chain and how we guide the users to those items.”

Procurement software will need to be fully functional so users can do everything they need to do, but its underlying complexity must sit beneath a simple user experience, according to Blake.

“Increasingly, because of our changing expectations and innovations in technology, it has to be able to be used in the same way as all the other technologies around us,” Blake said. “The user experience, ease of use, seamless and formless interface with the technology is a major driving force in what’s going to deliver value in the future. It’s simplicity and complexity represented in the single whole — difficult to achieve, but that’s where I see it going today.”

The future is now — maybe

However, Blake cautioned the procurement transformation may not happen in the immediate future.

“In the 1990s, there were major corporations that said, ‘We think we need software that helps us to buy stuff more effectively.’ And today, there are still corporations saying the same thing,” Blake said. “There’s enormous inertia in the corporate world toward adopting new technologies, not because there isn’t the will to do something or the technology isn’t there, but because it’s extremely difficult to change. If you have a supertanker of a mammoth corporation, you need 100 miles to slow down and change direction.”

The procurement transformation is interesting and has potential, but real time may not be quite ready for the real world of procurement today, according to conference attendee Lynn Meltzer, director of sourcing for Staples, the office supply retailer based in Framingham, Mass.

Staples transitioned from a largely paper- and spreadsheet-based procurement system to Coupa, a cloud-based procurement SaaS platform, in the past year, Meltzer said.

“If you are just now getting a procure-to-pay system and you’re working to pull in your processes and your data and get there, then the timeline is highly compressed from where you are today to what they’re saying about the next 10 years,” she said. “It doesn’t mean that it can’t happen; you’ve just got to show the value and senior management fully buys in.”

It will be important to define the next step on the procurement transformation journey, said Jaime Steele, Staples’ senior director of procurement operations, and that probably won’t involve advanced AI or blockchain yet.

“The next step, not only for us but in the procurement industry, is that you’ve got to punch this out to every system and company next,” Steele said. “So, the realistic next step might be a simple chatbot, and nobody has done that well yet, so you need to solve the more basic things first.”

Meltzer agreed that certain basic things need to be taken care of before procurement organizations can use technology like blockchains.

“When you think about blockchain, you can’t move yourself to that until you figure how you can get that into a place where a robot can grab it or AI can figure out how to make some kind of decision on it,” she said. “I think those are some of the things that need to get sorted through, and it’s going to take a little bit of time. I would probably put it in five to 10 years, but I don’t see full automation getting in there anytime soon.”

Are companies with a SOC team less likely to get breached?

Companies outsource functions of security operations centers. But most agree that management of strategic activities — security planning, alignment to the business, performance assessments — should stay in-house.

Are companies that have information security operations centers (SOCs) less likely to get breached? That data is hard to come by. Target did not respond to automated warnings about suspicious activity during its 2013 breach. The retailer’s SOC manager left in October of that year; the breach occurred in November and was publicly acknowledged by Target on December 19, 2013, after Brian Krebs reported it on his Krebs on Security blog. According to reports by Bloomberg Businessweek and others, alerts issued by FireEye malware detection were noted by Target’s security staff in India but then ignored by the SOC team in the United States.

Today, Target runs a 24/7 Cyber Fusion Center at its Northern Campus in Brooklyn Park, Minnesota. A recent job posting for an event analyst noted that the future SOC team member would work with the company’s Cyber Threat Intelligence team and participate in “cyber hunt activities” as needed, in addition to security information and event management, log management and a host of other duties to assess and detect cyberthreats in the retailer’s global operations.

In this issue, technology journalist Steve Zurier looks at information security operations centers and reports on tools integration, future automation and SOC team staffing — in May, he covered the role of threat hunters in modern SOCs. What is it going to take to improve SOC capabilities going forward? A 2017 SANS Institute report found that lack of visibility is a major problem, especially detection of unknown threats. Of the 309 IT professionals surveyed worldwide, 61% indicated that their security operations were centralized, but only 32% reported close integration between the SOC team and the network operations center. Better information sharing and automation of SOC performance metrics — 69% of those surveyed who compile metrics said they must do a lot of the data collection and analysis manually — could help take security operations to the “next level,” according to SANS.
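As a small example of the metric automation SANS points to, the sketch below computes mean time to detect from event records instead of compiling it by hand. The field names and timestamps are illustrative.

```python
# Compute mean time to detect (MTTD) from security event records,
# one of the SOC metrics often assembled manually. Data is invented.
from datetime import datetime

events = [
    {"id": 1, "occurred": "2017-11-01T02:00:00", "detected": "2017-11-01T02:45:00"},
    {"id": 2, "occurred": "2017-11-03T10:00:00", "detected": "2017-11-03T10:15:00"},
]

def mean_time_to_detect_minutes(evts) -> float:
    deltas = [
        (datetime.fromisoformat(e["detected"]) - datetime.fromisoformat(e["occurred"]))
        .total_seconds() / 60
        for e in evts
    ]
    return sum(deltas) / len(deltas)

print(mean_time_to_detect_minutes(events))  # → 30.0
```

Fed automatically from a SIEM export, a calculation like this replaces the manual collection and analysis that 69% of the SANS respondents reported doing.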

Vulnerability management and patch management are also getting increased scrutiny at many organizations after the Equifax breach and global ransomware attacks that some speculate could have been avoided. CISO James Ringold looks at risk-based vulnerability management strategies and explains why investing in this process is worth consideration.

Two security leaders who moved to the private sector after working on cybersecurity initiatives in Washington, D.C., during the Obama administration are also profiled this month: Phyllis Schneck, managing director of Promontory Financial Group, now an IBM company, and Alissa Johnson, the CISO at Xerox.

“I learned that there really isn’t a lot of difference between there and here,” Johnson said. “Xerox has no nuclear secrets, but hackers are still attacking us and trying to get data using the same tools and technology.”

Nuance global communication and collaboration transform with Office 365

Today’s post was written by Craig Preston, IT vice president of Infrastructure and Operations at Nuance Communications.

You might not know it, but our technology is embedded in a whole lot of devices and products that folks use every day. There’s a good chance that Nuance developed the predictive texting feature on your smartphone and smartwatch. Our speech recognition software is used by most automakers to turn a voice command into driving directions, and it’s what lets you ask your TV remote to change the channel.

What we develop here is people-focused and empowering. Once you start using it, you realize how easy and effortless it can be, and soon you can’t imagine living without it.

Just like Office 365 cloud computing.

We have more than 8,000 employees who need to communicate and collaborate across 63 global offices. When we got to the point where we could put our on-premises solution out to pasture, we wanted something different that simply worked, that employees would use because it was easy and not tethered to certain devices and technologies.

We did a cost-modeling exercise comparing Office 365 against our on-premises Microsoft Exchange 2010, Office, and SharePoint platforms, as well as our Skype for Business solution. It wasn’t hard at all to sort out that Office 365 would save us a lot of money over the next five years. We even considered Google at one point, but we really liked Office 365’s integrated services as part of Microsoft’s cloud offerings.

All employees’ distribution lists and public folders from our on-premises environment were moved over to Office 365 in just 90 days. We used Office 365 ProPlus to acquire the latest Office apps. We deployed Exchange Online, OneDrive for Business, SharePoint Online, and Planner. We’re also using Microsoft Teams, Skype for Business Online, and Yammer.

We planned thoughtfully for deployment and training, so the process was low-impact and practically seamless. In fact, migrating from on-premises Exchange and Skype to Office 365 Exchange Online and Skype for Business Online was so transparent that most users didn’t even notice it happened.

Our Office 365 deployment is just the first phase of moving to the Microsoft Cloud. We are checking out other Microsoft Cloud offerings to see what makes sense and when.

For Nuance, Skype for Business is one of the most popular Office 365 services. We had been using an audio and web conferencing system patched together from multiple vendors. It wasn’t easy to manage, it didn’t always work that well, and compared to Office 365, it cost us several million dollars more a year to operate.

Skype for Business conferencing, with real-time collaborating and screen sharing, is incredibly simple to use. The interface is intuitive and familiar in its design. We’ve had very few requests for help using it, and we are far from nostalgic for what we used to have in place.

Team collaboration and mobility have noticeably improved. I hardly ever use my desk phone anymore. Skype for Business plays nice with other technology in our IT infrastructure. Its telephone and desktop video conferencing are super quick and simple to initiate.

With just one click, I am on a conference call with my colleagues in our office in Germany. There’s no lag time; the connection is clear. And it eliminates a good amount of our former third-party costs. We’re now planning to evaluate our on-premises PBX solution for replacement with the Phone System feature in Office 365.

With this great experience under our belts, we are proceeding to eliminate the need for personal and departmental on-premises file shares and SharePoint sites. We are moving all personal data to OneDrive for Business, and all departmental file shares and SharePoint sites to SharePoint Online.

I don’t even get charged for additional storage or for the clients needed to use SharePoint Online, because it’s all included in Office 365. We can make it better and easier for employees to access data from anywhere, on any device. I won’t miss on-premises storage and backup costs. I have so many other things to spend my IT budget on.

Folks from around the company have been using SharePoint Online, OneDrive for Business, and Microsoft Teams to stay productive when they’re on the road. Some employees even prefer to use Office 365 Portal when they are at the office. The other day, I co-authored a document with a colleague from a different office. The content was targeted to a group of employees who travel a lot for work. As soon as we were done with the document, I saved it to a shared team site, which made it immediately accessible from mobile devices. Things like this are really big productivity gains for us.

If something were to go wrong with our email and collaboration systems, I’d be the first to hear about it. I needed something that just works as advertised and won’t make the hair on my head turn prematurely gray. I place a lot of trust in the Microsoft Cloud for good reason—it has earned it. The folks at Microsoft are forward-thinking and built a cloud that IT folks like me will use with confidence.

Here at Nuance, the security of our networks and systems is absolutely critical. As part of our Office 365 subscription, we’re taking advantage of advanced security features that support compliance requirements. Microsoft’s security roadmap is solid and smart. And that’s not very common these days.

Back when we started, we talked about the transitional aspect to the Microsoft Office 365 platform. Now that we have been using it for a while, I’d say it’s been more like transformational.

—Craig Preston