
Five questions to ask before purchasing NAC products

As network borders become increasingly difficult to define, and as pressure mounts on organizations to allow many different devices to connect to the corporate network, network access control is seeing a significant resurgence in deployment.

Often positioned as a security tool for the bring your own device (BYOD) and internet of things (IoT) era, network access control (NAC) is also increasingly becoming a very useful tool in network management, acting as a gatekeeper to the network. It has moved away from being a system that blocks all access unless a device is recognized, and is now more permissive, allowing for fine-grained control over what access is permitted based on policies defined by the organization. By supporting wired, wireless and remote connections, NAC can play a valuable role in securing all of these connections.

Once an organization has determined that NAC will strengthen its security posture, the next step is to weigh the purchasing criteria for choosing the right NAC product for its environment. NAC vendors provide a dizzying array of information, and it can be difficult to differentiate between their products.

When you’re ready to buy NAC products and begin researching your options — and especially when speaking to vendors to determine the best choice for your organization — consider the questions and features outlined in this article.

NAC device coverage: Agent or agentless?

NAC products should support all devices that may connect to an organization’s network. This includes many different configurations of PCs, Macs, Linux devices, smartphones, tablets and IoT-enabled devices. This is especially true in a BYOD environment.

NAC agents are small pieces of software installed on a device that provide detailed information about the device — such as its hardware configuration, installed software, running services, antivirus versions and connected peripherals. Some can even monitor keystrokes and internet history, though that presents privacy concerns. NAC agents can either run scans as a one-off — dissolvable — or periodically via a persistently installed agent.

If the NAC product uses agents, it’s important that they support the widest variety of devices possible, and that other devices can use agentless NAC if required. In many cases, devices will require the NAC product to support agentless implementation to detect BYOD and IoT-enabled devices and devices that can’t support NAC agents, such as printers and closed-circuit television equipment. Agentless NAC allows a device to be scanned by the network access controller and be given the correct designation based on the class of the device. This is achieved with aggressive port scans and operating system version detection.
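To illustrate the kind of fingerprinting described above, here is a rough sketch using the python-nmap wrapper (an assumption on our part; vendors use their own, far richer scanning engines). The classify_device function, the device classes and the scanned address are all invented for the example.

```python
# Rough sketch of agentless fingerprinting with the python-nmap wrapper
# (pip install python-nmap; needs the nmap binary, and root for -O).
# A real NAC engine uses much richer classification than this.
import nmap

def classify_device(ip):
    nm = nmap.PortScanner()
    nm.scan(hosts=ip, arguments="-O -sS")  # OS detection + TCP SYN scan
    if ip not in nm.all_hosts():
        return "unreachable"
    matches = nm[ip].get("osmatch", [])
    if not matches:
        return "unknown"
    name = matches[0]["name"].lower()  # highest-accuracy guess first
    # Map the fingerprint to a coarse class that a policy can act on
    if "printer" in name:
        return "printer"
    if "android" in name or "ios" in name:
        return "mobile"
    return "workstation"

print(classify_device("192.168.1.50"))  # placeholder address
```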

Agentless NAC is a key component in a BYOD environment, and most organizations should look at this as a must-have when buying NAC products. Gathering information via an agent will, of course, provide more detail on a device, but an agent-only approach isn't viable on a modern network that must support many different devices.

Does the NAC product integrate with existing software and authentication?

This is a key consideration before you buy an NAC product, as it is important to ensure it supports the type of authentication that best integrates with your organization’s network. The best NAC products should offer a variety of choices: 802.1x — through the use of a RADIUS server — Active Directory, LDAP or Oracle. NAC will also need to integrate with the way an organization uses the network. If the staff uses a specific VPN product to connect remotely, for example, it is important to ensure the NAC system can integrate with it.
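As a concrete illustration of the RADIUS piece, the sketch below uses the pyrad library to send an Access-Request the way a NAC gateway might. The server address, shared secret and credentials are placeholders, and a real 802.1x deployment would run EAP rather than this bare password check.

```python
# Sketch of a RADIUS Access-Request, as a NAC gateway might issue one,
# using the pyrad library. Server address, secret and credentials are
# placeholders; "dictionary" is a standard FreeRADIUS dictionary file.
import pyrad.packet
from pyrad.client import Client
from pyrad.dictionary import Dictionary

srv = Client(server="10.0.0.5", secret=b"sharedsecret",
             dict=Dictionary("dictionary"))

req = srv.CreateAuthPacket(code=pyrad.packet.AccessRequest,
                           User_Name="alice", NAS_Identifier="nac-gateway")
req["User-Password"] = req.PwCrypt("placeholder-password")

reply = srv.SendPacket(req)
if reply.code == pyrad.packet.AccessAccept:
    print("access granted")   # apply the user's network policy
else:
    print("access rejected")  # quarantine or block the endpoint
```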

Supporting many different security systems that do not integrate with one another can cause significant overhead. A differentiator between the different NAC products is not only what type of products they integrate with, but also how many systems exist within each category.

Consider the following products that an organization may want to integrate with, and be sure that your chosen NAC product supports the products already in place:

1. Security information and event management

2. Vulnerability assessment

3. Advanced threat detection

4. Mobile device management

5. Next-generation firewalls

Does the NAC product aid in regulatory compliance?

NAC can help achieve compliance with many different regulations and standards, such as the Payment Card Industry Data Security Standard (PCI DSS), HIPAA, the International Organization for Standardization's ISO 27002 and National Institute of Standards and Technology (NIST) guidelines. Each of these stipulates certain controls regarding network access that should be implemented, especially around BYOD, IoT and rogue devices connecting to the network.

By continually monitoring network connections and performing actions based on the policies set by an organization, NAC can help with compliance with many of these regulations. These policies can, in many cases, be configured to match those of the compliance regulations mentioned above. So, when buying NAC products, be sure to have compliance in mind and to select a vendor that can aid in this process — be it through specific knowledge in its support team or through predefined policies that can be tweaked to provide the compliance required for your individual business.
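One way to picture those tweakable, compliance-derived policies is as plain data that the NAC engine evaluates per device. The sketch below is hypothetical; every field name and threshold is invented, not taken from any regulation's actual text.

```python
# Hypothetical policy-as-data sketch; field names and thresholds are
# invented for illustration only.
PCI_LIKE_POLICY = {
    "require_antivirus": True,
    "max_days_since_patch": 30,
    "blocked_device_classes": {"unknown"},
}

def is_compliant(device, policy):
    if policy["require_antivirus"] and not device.get("antivirus_running"):
        return False
    if device.get("days_since_patch", 9999) > policy["max_days_since_patch"]:
        return False
    if device.get("device_class") in policy["blocked_device_classes"]:
        return False
    return True

device = {"antivirus_running": True, "days_since_patch": 12,
          "device_class": "workstation"}
print(is_compliant(device, PCI_LIKE_POLICY))  # True: grant access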

What is the true cost of buying an NAC product?

The price of NAC products can be the most significant consideration, depending on the budget you have available for procurement. Most NAC products are charged per endpoint (device) connected to the network. On a large network, this can quickly become a substantial cost. There are often also hidden costs with NAC products that must be considered when assessing your purchase criteria.

Consider the following costs before you buy an NAC product; a rough cost sketch follows the list:


1. Add-on modules. Does the basic price give organizations all the information and control they need? NAC products often carry hidden costs: the basic package may not provide all the functionality required, and add-on modules can run into tens of thousands of dollars on a large network. Look closely at what the basic package includes, investigate how the organization will actually use the NAC system, and remember that specific integrations may cost extra.

2. Upfront costs. Are there any installation charges or initial training that will be required? Be sure to factor these into the calculation, on top of the per-endpoint price.

3. Support costs. What level of support does the organization require? Does it need one-off or regular training, or does it require 24/7 technical support? This can add significantly to the cost of NAC products.

4. Staff time. While not a direct cost of buying NAC products, consider how much monitoring an NAC system requires. Time will need to be set aside not only to learn the NAC system, but also to manage it on an ongoing basis and respond to alerts. Even the best NAC systems require trained staff, so that when problems occur, people are available to address them.
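Putting those four line items together makes the per-endpoint sticker price look like only part of the bill. A back-of-the-envelope sketch, with every figure invented for illustration:

```python
# Illustrative first-year NAC cost estimate; all numbers are made up,
# and real quotes vary widely by vendor and deployment size.
endpoints = 5000
per_endpoint_license = 12.0      # per endpoint, per year
addon_modules = 25000.0          # e.g. integration add-ons, per year
upfront = 15000.0                # installation and initial training
support = 20000.0                # 24/7 support contract, per year
admin_hours_per_week = 10        # ongoing management and alert triage
hourly_staff_cost = 60.0

year_one = (endpoints * per_endpoint_license + addon_modules
            + upfront + support
            + admin_hours_per_week * 52 * hourly_staff_cost)
print(f"Estimated first-year cost: ${year_one:,.0f}")
```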

NAC product support: What’s included?

Support from the NAC manufacturer is an important consideration, both for the success of the rollout and for assessing the total cost. Some of the questions that should be asked are:

  1. What does the basic support package include?
  2. What is the cost of extended support?
  3. Is support available at all times?
  4. Does the vendor have a significant presence in the organization's region? For example, some NAC providers are primarily U.S.-based and may not provide the same level of support to an organization based in EMEA.
  5. Is on-site training available and included in the license?

Support costs can significantly drive up the cost of deployment and should be assessed early in the procurement process.

What to know before you buy an NAC system

When it comes to purchasing criteria for network access control products, an NAC system must not only be capable of detecting all the devices connected to an organization's network, but also integrate as seamlessly as possible with the systems already in place. The cost of attempting to shoehorn existing processes and systems into an NAC product that does not offer integration can quickly skyrocket, even if the initial price is on the cheaper side.

NAC should also work for the business, not against it. In the days when NAC products only supported 802.1x authentication and blocked everything by default, it was seen as an annoyance that stopped legitimate network authentication requests. But, nowadays, a good NAC system provides seamless connections for employees, third parties and contractors alike — and to the correct area of the network to which they have access. It should also aid in regulatory compliance, an issue all organizations need to deal with now.

Assessing NAC products comes down to the key questions highlighted above. They are designed to help organizations determine what type of NAC product is right for them, and accordingly aid them in narrowing their choices down to the vendor that provides the product that most closely matches those criteria.

Once seldom used by organizations, endpoint protection is now a key part of IT security, and NAC products have a significant part to play in that. For an attacker, a well-implemented and well-managed NAC deployment can mean the difference between a full network compromise and total attack failure.

Microsoft customers and partners envision smarter, safer, more connected societies

Organizations around the world are transforming for the digital era, changing how businesses, cities and citizens work. This new digital era will address many of the problems created in the earlier agricultural and industrial eras, making society safer, more sustainable, more efficient and more inclusive.

But an infrastructure gap is keeping this broad vision from becoming a reality. Digital transformation is happening faster than we expected — but only in pockets. Microsoft and its partners seek to help cities and other public infrastructures close the gaps, with advanced technologies in the cloud, data analytics, machine learning and artificial intelligence (AI).

Microsoft’s goal is to be a trusted partner to both public and private organizations in building connected societies. This summer, an IDC survey named Microsoft the top company for trust and customer satisfaction in enabling smart-city digital transformations.

Last week at a luncheon in New York City, Microsoft and executives from three organizations participating in the digital transformation shared how they are helping to close the infrastructure gap.


Arnold Meijer, TomTom’s strategic business development manager, at the Building Digital Societies salon lunch. (Photo by John Brecher)

TomTom NV, based in Amsterdam, traditionally focused on providing consumers with personal navigation. Now, “the need for locations surpasses the need for navigation — it’s everywhere,” said Arnold Meijer, strategic business development manager. “Managing a fleet of connected devices or ordering a ride from your phone — these things weren’t possible five years ago. We’re turning to cloud connectivity and the Internet of Things as tools to keep our maps and locations up to date.”

Sensors from devices and vehicles on the road deliver condition and usage data that highway planners, infrastructure managers and fleet operators need to make well-informed decisions.

Autonomous driving is directly in TomTom’s sights, a way to cut down on traffic accidents, one of the top 10 causes of death worldwide, and to reduce emissions through efficient routing. “You probably won’t own a vehicle 20 years from now, and the one that picks you up won’t have a driver,” Meijer said. “If you do go out driving yourself, it will be for fun.”

With all that time freed up from driving, travelers can do something else such as relax or work. Either option presents new business opportunities for companies that offer entertainment or enable productivity for a mobile client, who is almost certainly connected to the internet. “There will be new companies coming out supporting that, and I definitely foresee Microsoft and other businesses active there,” Meijer said.

“Such greatly eased personal transport may decrease the need to live close to work or school, changing settlement patterns and reducing the societal impacts of mobility. All because we can use location and cloud technology,” he added.


George Pitagorsky, CIO for the New York City Department of Education Office of School Support Services. (Photo by John Brecher)

The New York City Department of Education is using Microsoft technology extensively in a five-year, $25-million project that will tell parents their children’s whereabouts while the students are in transit, increase use of the cafeterias and provide access to information about school sports.

The city’s Office of Pupil Transportation provides rides to more than 600,000 students per day, with more than 9,000 buses and vehicles. For a preliminary version of the student-tracking system, the city has equipped its leased buses with GPS devices.

“When the driver turns on the GPS and signs in his bus, we can find out where it is at any time,” said George Pitagorsky, executive director and CIO for the department’s Office of School Support Services. If parents know what bus their child is on, they can more easily meet it at the stop or be sure to be there when the child is brought home.

A next step will be GPS units that don’t require driver activation. To let the system track not just the vehicle but its individual occupants, drivers will still need to register students into the GPS when they get on the bus.

“Biometrics like facial recognition that automate check-in when a student steps onto a bus — we’re most likely going to be there, but we’re not there yet,” Pitagorsky said.

Further out within the $25-million Illumination Program, a new bus-routing tool will replace systems developed more than 20 years ago, allowing the creation of more efficient routes, making course corrections to avoid problems, easily gathering vehicle-maintenance costs and identifying problem vehicles.

Other current projects include a smartphone app to advise students of upcoming meal choices in the school cafeterias, with an eye to increasing cafeteria use, enhancing students’ nutritional intake and offering students a voice in entree choices. The department has also created an app that displays all high school sports games, locations and scores.

A new customer-relations management app will let parents update their addresses and request special transport services on behalf of their children, with no more need to make a special visit to the school to do so. A mobile app will allow parents and authorized others to locate their children or bus, replacing the need for a phone call to the customer service unit. And business intelligence and data warehousing will get a uniform architecture, to replace the patchwork data, systems and tools now in place.


Christy Szoke, CMO and co-founder of Fathym. (Photo by John Brecher)

Fathym, a startup in Boulder, Colorado, is directly addressing infrastructure gaps through a rapid-innovation platform intended to harmonize disparate data and apps and facilitate Internet of Things solutions.

“Too often, cities don’t have a plan worked out and are pouring millions of dollars into one solution, which is difficult to adjust to evolving needs and often leads to inaccessible, siloed data,” said co-founder and chief marketing officer Christy Szoke. “Our philosophy is to begin with a small proof of concept, then use our platform to build out a solution that is flexible to change and allows data to be accessible from multiple apps and user types.” Fathym makes extensive use of Azure services but hides that complexity from customers, she said.

To create its WeatherCloud service, Fathym combined data from roadside weather stations and sensors with available weather models to create a road weather forecast especially for drivers and maintenance providers, predicting conditions they’ll find precisely along their route.

“We’re working with at least eight data sets, all completely different in format, time intervals and spatial resolutions,” said Fathym co-founder and CEO Matt Smith. “This is hard stuff. You can’t have simplicity on the front end without a complicated back-end system, a lot of math, and a knowledgeable group of different types of engineers helping to make sense of it all.”

Despite the ease that cloud services have brought to application development, Smith foresees a need for experts to wrangle data even 20 years from now.

“When people say, ‘the Internet of Things is here’ and ‘the robots are going to take over,’ I don’t think they have the respect they should have for how challenging it will remain to build complex apps,” Smith said.

Added Szoke, “You can’t just say ‘put an AI on it’ or ‘apply machine learning’ and expect to get useful data. You will still need creative minds, and data scientists, to understand what you’re looking at, and that will continue to be an essential industry.”

Micro data centers garner macro hype and little latency

LAS VEGAS — The rapid growth of data in many organizations is piquing IT interest in edge computing.

Edge computing places data processing closer to data sources and end users to reduce latency, and micro data centers are one way to achieve that. Micro data centers, also called edge data centers, are typically modular devices that house all infrastructure elements, from servers and storage to uninterruptible power supply and cooling. With data centers receiving a flood of information from IoT devices, the two concepts took center stage for IT pros at Gartner’s data center conference last week in Las Vegas.

“As we start to have billions of things that are expecting instantaneous response and generating 4k video, the traffic is such that it doesn’t make sense to take all of that and shove it into a centralized cloud,” said Bob Gill, a research vice president at Gartner.
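A toy example of that logic: process readings at the edge and forward only compact summaries upstream, handling anomalies locally. The window size, threshold and field names below are illustrative assumptions, not any vendor's design.

```python
# Toy edge-aggregation sketch: summarize raw readings locally and send
# only the summary upstream instead of every sample.
import statistics

def summarize_window(readings):
    """Reduce a window of raw samples to a compact summary record."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
    }

raw_window = [21.0, 21.2, 20.9, 35.7, 21.1]  # one sensor, five samples
summary = summarize_window(raw_window)

# Ship three numbers to the cloud instead of every raw sample, and
# handle the anomaly (the 35.7 spike) with a local, low-latency alert.
if summary["max"] > 30:
    print("local alert:", summary)
```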

Gill compared the importance of the edge to the effect that U.S. President Dwight D. Eisenhower’s highway system had on declining communities in the Midwest. Towns otherwise cut off from commerce could easily connect to more promising cities. Similarly, organizations can place a micro data center in whichever location will maximize its value: an office, warehouse, factory floor or colocation facility.

“If you build the infrastructure based only on the centralized cloud models, data centers and colocation facilities without thinking about the edge, we’re going to find ourselves in three to four years with suboptimal infrastructure,” Gill said.

Edge computing enables data to be processed closer to its source.

A growing market

The market for micro data centers is still small, but it’s growing quickly. By 2021, 25% of enterprise organizations will have deployed a micro data center, according to Gartner.


Some organizations are ahead of the edge computing game. Frank Barrett, IT operations director for Spartan Motors in Charlotte, Mich., inherited multiple data centers from a company acquisition. Now, the company has two primary data centers in Michigan and Indiana, and three smaller, micro data centers in Nebraska. All of the data centers currently have traditional hub-and-spoke networking infrastructure, but Barrett is considering improving the networks to better support the company’s edge data centers. He is currently in the process of moving to one provider for all of the company’s services, as well as updating switches, routers and firewalls throughout the enterprise.

“Latency is a killer,” Barrett said. “We’ve got a lot of legacy systems that people in different locations need access to. It can be a nightmare for those remote locations, regardless of how much bandwidth I throw between sites.”

Other organizations interested in adopting this technology have a choice among three types of micro data center providers. Infrastructure providers offer a one-stop shop, with a single vendor for all hardware. Facilities specialists are often hardware-agnostic and provide a range of modular options, though the hardware itself may need to be purchased separately. Regional providers are highly focused in a given region, providing strong local customer service. But a smaller business base can lead to less stability for those providers, with a higher risk of acquisitions and mergers, said Jeffrey Hewitt, a research vice president at Gartner.

One data center engineer at an audit company for retail and healthcare services is interested in the facilities provider approach because his company has a dedicated, in-house IT team to handle the other aspects of a data center. The engineer requested anonymity because he wasn’t authorized to speak to the media.

“With the facilities [option], you can install whatever you want,” he said. “Most offices have a main distribution facility, so they already have a circuit coming in, cooling in place and security. We don’t need any of that; it’d just be a dedicated rack for the micro data center.”

Micro data centers, not micro problems

Since micro data centers are often in a different physical location than a company’s traditional data center, IT needs to ensure that the equipment is secure, reliable and able to operate without constant repairs, said Daniel Bowers, a research director at Gartner. Industrial edge data centers in particular need to ruggedize the equipment so that it can withstand elements such as excessive vibrations and dust.

The distributed nature of micro data centers means that management is another concern, said Steven Carlini, senior director of data center global solutions at Schneider Electric, an energy management provider based in France.

“You don’t want to dispatch service to thousands of sites; you want the notifications that you get to be very specific,” he said. “Hopefully you can resolve an issue without sending someone on site.”

Vendor lock-in is a concern, particularly with all-in-one providers such as Dell EMC, HPE and Hitachi. It’s important to choose the right vendor from the start, which can be overwhelming due to an oversaturated market. The reality is that micro data centers have been around for years. At last year’s conference, Schneider Electric and HPE unveiled the HPE Micro Datacenter, but Schneider Electric had already offered micro data centers for at least three years before that. This year, the company introduced Micro Data Center Xpress, which allows customers or partners to configure IT equipment before installing the system.

Hewitt recommends a four-step process to choose a micro data center vendor: Identify the requirements, score the vendors based on strengths and weaknesses, use those to create a shortlist and negotiate a contract with at least two comparable vendors.
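The scoring step lends itself to a simple weighted matrix. A hypothetical sketch, with requirements, weights and vendor scores all invented for illustration:

```python
# Weighted vendor-scoring sketch for Hewitt's process: score vendors
# against weighted requirements, then shortlist the top two. All
# names, weights and scores below are invented.
REQUIREMENTS = {"ruggedization": 0.4, "remote_management": 0.35,
                "local_support": 0.25}

vendors = {
    "Vendor A": {"ruggedization": 4, "remote_management": 3, "local_support": 5},
    "Vendor B": {"ruggedization": 5, "remote_management": 4, "local_support": 2},
    "Vendor C": {"ruggedization": 3, "remote_management": 5, "local_support": 4},
}

def weighted_score(scores):
    return sum(REQUIREMENTS[req] * scores[req] for req in REQUIREMENTS)

shortlist = sorted(vendors, key=lambda v: weighted_score(vendors[v]),
                   reverse=True)[:2]
print(shortlist)  # negotiate with at least two comparable vendors
```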

Third-party E911 services expand call management tools

Organizations are turning to third-party E911 services to gain management capabilities they can’t get natively from their IP telephony provider, according to a report from Nemertes Research.

IP telephony providers may offer basic 911 management capabilities, such as tracking phone locations, but organizations may have needs that go beyond phone tracking. The report, sponsored by telecom provider West Corporation, lists the main reasons why organizations would use third-party E911 services.

Some organizations may deploy third-party E911 management for call routing to ensure an individual 911 call is routed to the correct public safety answering point (PSAP). Routing to the correct PSAP is difficult for organizations with remote and mobile workers. But third-party E911 services can offer real-time location tracking of all endpoints and use that information to route to the proper PSAP, according to the report.
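Conceptually, that routing step is a location-to-PSAP lookup. The sketch below is purely hypothetical; real E911 services resolve an endpoint's location against authoritative PSAP boundary data, not a static table.

```python
# Hypothetical illustration of location-based PSAP routing: map an
# endpoint's last known location to the answering point that serves it.
PSAP_BY_AREA = {
    "manhattan": "PSAP-NYC-1",
    "jersey_city": "PSAP-HUDSON-NJ",
}

def route_911_call(endpoint):
    area = endpoint.get("last_known_area")
    # Fall back to a default answering point if the location is stale
    return PSAP_BY_AREA.get(area, "PSAP-DEFAULT")

remote_worker_phone = {"user": "alice", "last_known_area": "jersey_city"}
print(route_911_call(remote_worker_phone))  # PSAP-HUDSON-NJ
```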

Many larger organizations have multivendor environments that may include multiple IP telephony vendors. Third-party E911 services offer a single method of managing location information across endpoints, regardless of the underlying telephony platform.

The report also found third-party E911 management can reduce costs for organizations by automating the initial setup and maintenance of 911 databases in the organization. Third-party E911 services may also support centralized call routing, which could eliminate the need for local PSTN connections at remote sites and reduce the operating and hardware expenses at those sites.

Genesys unveils Amazon integration

Contact center vendor Genesys, based in Daly City, Calif., revealed an Amazon Web Services partnership that integrates AI and Genesys’ PureCloud customer engagement platform.

Genesys has integrated PureCloud with Amazon Lex, a service that lets developers build natural language, conversational bots, or chatbots. The integration allows businesses to build and maintain conversational interactive voice response (IVR) flows that route calls more efficiently.

Amazon Lex helps IVR flows better understand natural language by enabling IVR flows to recognize what callers are saying and their intent, which makes it more likely for the call to be directed to the appropriate resource the first time without error.
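For a sense of what that intent recognition looks like in code, here is a minimal sketch using boto3's Lex runtime client (the Lex V1 post_text API). The bot name, alias and caller utterance are placeholders, and Genesys' actual integration is internal to PureCloud.

```python
# Minimal sketch of Lex intent recognition for an IVR flow, via the
# boto3 lex-runtime client (Lex V1 API). Bot name, alias and the
# caller's utterance are placeholders.
import boto3

lex = boto3.client("lex-runtime", region_name="us-east-1")

response = lex.post_text(
    botName="BankingIVR",        # hypothetical bot
    botAlias="prod",
    userId="caller-5551234",
    inputText="I want to check my account balance",
)

# The recognized intent drives call routing on the first attempt
print(response["intentName"])   # e.g. "CheckBalance"
print(response["dialogState"])  # e.g. "ReadyForFulfillment"
```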

The chatbot integration also allows organizations to consolidate multiple interactions into a single flow that can be applied over different self-service channels. This reduces the number of call flows that organizations need to maintain and can simplify contact center administration.

The chatbot integration will be available to Genesys customers in 2018.

Conference calls face user, security challenges

A survey of 1,000 professionals found that businesses in the U.S. and U.K. are losing $34 billion due to delays and distractions during conference calls, a significant increase from $16 billion in a 2015 survey.

The survey found employees waste an average of 15 minutes per conference call getting it started and dealing with distractions. More than half of respondents said distractions have a moderate-to-major negative effect on productivity, enthusiasm to participate and the ability to concentrate.

The survey was conducted by remote meetings provider LoopUp and surveyed 1,000 professionals in the U.S. and U.K. who regularly participate in conference calls at organizations ranging from 50 to more than 1,000 employees.

The survey also found certain security challenges with conference calls. Nearly 70% of professionals said it’s normal to discuss confidential information over a call, while more than half of respondents said it’s normal to not know who is on a call.

Users are also not fully comfortable with video conferencing, according to the survey. Half of respondents said video conferencing is useful for day-to-day calls, but 61% still prefer to use the phone to dial in to conference calls.

DevOps transformation in large companies calls for IT staff remix

SAN FRANCISCO — A DevOps transformation in large organizations can’t just rely on mandates from above that IT pros change the way they work; IT leaders must rethink how teams are structured if they want them to break old habits.

Kaiser Permanente, for example, has spent the last 18 months trying to extricate itself from 75 years of organizational cruft through a consumer digital strategy program led by Alice Raia, vice president of digital presence technologies. With the Kaiser Permanente website as its guinea pig, Raia realigned IT teams into a squad framework popularized by digital music startup Spotify, with cross-functional teams of about eight engineers. At the 208,000-employee Kaiser Permanente, that’s been subject to some tweaks.

“At our first two-pizza team meeting, we ate 12 pizzas,” Raia said in a session at DevOps Enterprise Summit here. Since then, the company has settled on an optimal number of 12 to 15 people per squad.

The Oakland, Calif., company decided on the squads approach when a previous model with front-end teams and systems-of-record teams in separate scrums didn’t work, Raia said. Those silos and a focus on individual projects resulted in 60% waste in the application delivery pipeline as of a September 2015 evaluation. The realignment into cross-functional squads has forced Kaiser’s website team to focus on long-term investments in products and faster delivery of features to consumers.

IT organizational changes vary by company, but IT managers who have brought about a DevOps transformation in large companies share a theme: Teams can’t improve their performance without a new playbook that puts them in a better position to succeed.


At Columbia Sportswear Co. in Portland, Ore., this meant new rotations through various areas of focus for engineers — from architecture design to infrastructure building to service desk and maintenance duties, said Scott Nasello, senior manager of platforms and systems engineering, in a presentation.

“We had to break the monogamous relationships between engineers and those areas of interest,” Nasello said. This resulted in surprising discoveries, such as when two engineers who had sat next to each other for years discovered they’d taken different approaches to server provisioning.

Short-term pain means long-term gain

In the long run, the move to DevOps results in standardized, repeatable and less error-prone application deployments, which reduces the number of IT incidents and improves IT operations overall. But those results require plenty of blood, sweat and tears upfront.

“Prepare to be unpopular,” Raia advised other enterprise IT professionals who want to move to DevOps practices. During Kaiser Permanente’s transition to squads, Raia had the unpleasant task of informing executive leaders that IT must slow down its consumer-facing work to shore up its engineering practices — at least at first.

Organizational changes can be overwhelming, Nasello said.

“There were a lot of times engineers were running on empty and wanted to tap the brakes,” he said. “You’re already working at 100%, and you feel like you’re adding 30% more.”

IT operations teams ultimately can be crushed between the contradictory pressures of developer velocity on the one hand and a fear of high-profile security breaches and outages on the other, said Damon Edwards, co-founder of Rundeck Inc., a digital business process automation software maker in Menlo Park, Calif.

Damon Edwards, co-founder of Rundeck Inc., shares the lessons he learned from customers about how to reduce the impact of DevOps velocity on IT operations.

A DevOps transformation means managers must empower those closest to day-to-day systems operations to address problems without Byzantine systems of escalation, service tickets and handoffs between teams, Edwards said.

Edwards pointed to Rundeck customer Ticketmaster as an example of an organizational shift toward support at the edge. A new ability to resolve incidents in the company’s network operations center — the “EMTs” of IT incident response — reduced IT support costs by 55% and cut the mean time to response from 47 minutes to 3.8 minutes.

“Silos ruin everything — they’re proven to have a huge economic impact,” Edwards said.

And while DevOps transformations pose uncomfortable challenges to the status quo, some IT ops pros at big companies hunger for a more efficient way to work.

“We’d like a more standardized way to deploy and more investment in the full lifecycle of the app,” said Jason Dehn, systems analyst for a large U.S. retailer that he asked not to be named. But some lines of business at the company are happy with the status quo, where they aren’t entangled in day-to-day application maintenance.

“Business buy-in can be the challenge,” Dehn said.

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Industrial IoT adoption rates high, but deployment maturity low

Industrial organizations have embraced IoT; that much is clear. But a new study found current deployments aren’t very advanced yet — though that will come in time, the organizations said.

Bsquare’s 2017 Annual IIoT Maturity Study surveyed more than 300 senior-level employees with operational responsibilities from manufacturing, transportation, and oil and gas companies with annual revenues of more than $250 million.

Eighty-six percent of respondents said they have IIoT technologies in place, with an additional 12% planning to deploy IIoT within the next year. And of the 86% of industrial organizations that have completed IoT adoption, 91% said the IIoT deployments were important to business operations.

However, while IoT is catching on, most industrial organizations are still in the early stages.

The state of IoT adoption in industrial organizations

The study outlined five levels of IoT adoption: device connectivity and data forwarding, real-time monitoring, data analytics, automation and on-board intelligence.

Seventy-eight percent of survey respondents, with transportation leading the pack, self-identified their companies at the first stage, transmitting sensor data to the cloud for analytics, and 56%, again with transportation in the lead, reached the second stage, monitoring sensor data in real time for visualization.


Dave McCarthy, senior director of products at Bsquare, said he had predicted the gap between the first two stages, so that came as no surprise. What really surprised him, however, was the small gap between the second and third stages: Forty-eight percent of respondents said they were using data analytics for insight, predictions and optimization with applied analytics such as machine learning or artificial intelligence.

“What it indicates to me,” McCarthy said, “is that people who have gone down the visualization route have figured out, to some degree, some use of the data they’re collecting, and they know that analytics is going to play a part in helping them understand more closely what that data is going to mean for them.”

McCarthy wasn’t surprised to see the drop in the fourth and fifth stages: Twenty-eight percent said they were automating actions across internal systems with their IoT deployments, and only 7% had reached the edge analytics level.

“Just as expected, there’s a large drop-off from people doing analytics to people who are automating the results,” McCarthy said. “And in my mind, the highest amount of ROI comes when you can get to those levels.”

IIoT Maturity Model: Maturity of IoT adoption in industrial organizations

IIoT adoption and satisfaction

Not reaching the highest levels of ROI isn’t deterring IoT adoption, though: Seventy-three percent of respondents said they expect to increase IIoT deployments over the next year, with higher IoT adoption rates in transportation and manufacturing (85% and 78%, respectively) than oil and gas (56%). Additionally, 35% of all industrial organizations believe they will reach the automation stage, and 29% are aiming to reach the real-time monitoring stage in the same time period.

Nor is ROI always calculated by the organizations using IIoT technologies the same way analysts and vendors calculate it, McCarthy noted. Respondents cited machine health (90%) and logistics (67%) goals as top IoT adoption drivers, while lowering operating costs came in at just 24%.

“The number one motivation that all operations-oriented companies have is improving and increasing uptime of their equipment,” McCarthy said. “I hear this over and over again. They know they eventually have to do maintenance on equipment and take things down for repairs, but it is so much more manageable when they can get ahead of that and plan for it.”

“The reality for these types of businesses is that if there are plant shutdowns or line shutdowns that last for extended periods of time, they often don’t have the ability to make up that loss in production,” McCarthy added. “You can’t just run another shift on a Saturday to pick up the slack. Oftentimes the value of the product they’re producing far outweighs the cost of operating the equipment. What this indicates to me is, ‘I’ll spend more if that means I can keep that line running because of the production value.'”

With or without traditional ROI, the majority of survey respondents said they were happy with the results they’re seeing: Eighty-four percent said their products and services were extremely or very effective, with the transportation sector seeing a 96% satisfaction rate.

Additionally, 99% of oil and gas, 98% of transportation and 90% of manufacturing organizations said IIoT would have a significant impact on their industry at a global level. Perhaps those predictions of IIoT investments reaching $60 trillion in the next 15 years and the number of internet-connected IIoT devices exceeding 50 billion by 2020 will become a reality.

GDPR requirements put end-user data in the spotlight

As organizations around the world prepare for major new data privacy rules to take effect, their biggest challenge is taking stock of data and how they use it.

The General Data Protection Regulation (GDPR), which goes into effect in May 2018, governs the storage and processing of individuals’ personal data. For IT departments, this regulation means they must review their handling of employees’ and customers’ information to ensure it meets new security requirements. Endpoint management products play an important role in helping IT get ready for GDPR requirements, but many of these tools don’t yet have all the capabilities they need, experts said.

“It’s going to be a huge risk if the organization is not able to control data that’s part of GDPR,” said Danny Frietman, co-founder of MobileMindz, an enterprise mobility consultancy in the Netherlands. “A lot of companies will not be able to cope with the magnitude of that change.”

GDPR is a European Union (EU) regulation that aims to protect EU residents’ data, but it has worldwide ramifications. U.S.-based companies that have branches in the EU, use consultants based in the EU or have customers in the EU, for example, will all have to comply. GDPR would come into play for most organizations when it comes to protecting their employees’ and customers’ personally identifiable information (PII), such as home address, IP address or bank account details.

The role of endpoint management tools in GDPR

Some of the end-user computing (EUC) technologies IT can use to ensure GDPR compliance include information and identity management and enterprise mobility management (EMM).

Mobile and desktop management tools allow administrators to implement the following technologies and features:

  • encryption;
  • multifactor authentication;
  • application blacklisting;
  • per-user security policies; and
  • alerts that identify noncompliant activities.

Specifically for mobile devices, IT can use capabilities such as the following:

  • remote wipe to remove a user’s information once they leave the company;
  • containerization to separate personal and corporate information and ensure that IT only accesses the identifiable data it really needs; and
  • threat defense tools to be proactive about potential breaches.

MobileMindz, for instance, uses Apperian for mobile application management and adopted its enterprise app store to ensure all employees’ apps that deal with sensitive data are secure, Frietman said.

But EMM tools lack the ability to allow for clear and efficient logging, reporting and auditing of what personal data an organization has. That’s the bigger challenge for IT, said Frietman, whose firm is preparing clients and itself for GDPR.

“This is a huge opportunity for EMM vendors,” he said. “It could solve a lot of questions for customers.”

VMware, for one, has aimed over the past year to improve upon its existing data-reporting capabilities in Workspace One, a company spokesperson said. Workspace One Intelligence, announced at this year’s VMworld, can help IT document information for GDPR requirements by gaining deeper insight into its data and running reports based on historical and future big data. It should be generally available before the regulation goes into effect in May, the spokesperson said.

Businesses list their concerns about the upcoming GDPR.

Preparing a paper trail

The biggest change EUC administrators will need to enact to comply with GDPR requirements is around governance and data inventory — an approach to managing information that’s based on clear processes and roles. The regulation requires the entities that collect personal data be able to identify exactly what data they have, whose it is, why they have it, the purpose of keeping it and what they are going to do with it.

Clear documentation of all data will be key, said Chris Marsh, research director at 451 Research.

“You can point to that straight away if anyone came to you, and you can say, ‘This is what the purpose was, and here’s what we’re doing with the data,'” Marsh said.

Organizations should also develop clear, written security and compliance policies that state who has access to what data and how they can use it. Can a human resources manager view employees’ bank account information? Can IT administrators view GPS location from a user’s mobile device? Can a salesperson who deals with customer information share data from a corporate app to a personal one?

“We are living with decentralized data, and companies should have thought about the impact of that data a while ago,” Frietman said.

How the GDPR works


GDPR differs from its predecessor, the Data Protection Directive, in that it has tighter requirements for documenting and defining what data an organization processes and why. It also has a stricter definition of consent, which says companies must get “freely given, specific, informed and unambiguous” agreement from individuals to process their data. In addition, authorities that regulate GDPR will do so in standard fashion across the EU, rather than enforcing the regulation differently in each member state.

But what makes GDPR so complex is its wide-ranging classification of what constitutes personal data. The European definition of personal data is much wider than the U.S. definition of PII. It can even include biometric data, political opinions, health information, sexual orientation, trade union membership and more.

“Those things, according to the European view, are particularly susceptible to misuse in discrimination against individuals,” said Tatiana Kruse, of counsel at global law firm Dentons.

GDPR includes dozens of requirements and suggested security guidelines for how to comply. For instance, certain companies must appoint a data protection officer and report breaches to authorities. They may also have to take data privacy into account when building IT systems and applications by using technologies such as pseudonymization, which masks data so it can’t be attributed to a specific person — an approach called privacy by design.
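A minimal sketch of pseudonymization, assuming a keyed-hash (HMAC) approach: the pseudonym stays stable so records can still be linked, but re-identifying the person requires the separately stored key. The key and record fields below are placeholders.

```python
# Pseudonymization sketch: replace a direct identifier with a keyed
# hash. Same input always yields the same pseudonym, so records link;
# reversing it requires the key, which must be stored separately.
import hashlib
import hmac

SECRET_KEY = b"store-this-key-separately"  # placeholder; keep out of the dataset

def pseudonymize(identifier):
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"employee_id": pseudonymize("alice@example.com"),
          "department": "sales"}
print(record)
```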

But the GDPR requirements do not include many specific security measures that IT must implement; a lot of the law will be figured out in litigation as regulators check into companies’ compliance, said Joseph Jerome, a policy counsel on the Privacy and Data Project at the Center for Democracy and Technology in Washington, D.C.

“Everyone needs to be inventorying their personal data and take a broad characterization of this,” Jerome said. “If you’re putting things in writing, that’s good. GDPR is going to lead to lots and lots of documentation.”

Oil and gas IT leaders drilling for AI benefits

When Gartner analyst Peter Sondergaard advised organizations to put digital technologies at the heart of their business strategies, he put no small emphasis on artificial intelligence. AI would help businesses struggling to make sense of a sea of data collected by sensors and other devices hooked into the rapidly growing internet of things; the application of AI in digital projects could lead to whole new business models and even aid in the uphill battle known as cybersecurity in the 21st century.

“It will be an essential defense, creating a continuously adaptive risk and trust response,” Sondergaard said at Gartner’s annual Symposium/ITxpo in Orlando, Fla., earlier this month.

No head of IT today would say no to state-of-the-art security — or any AI benefits — and by 2020, Gartner predicts AI will be a top priority for 30% of CIOs.

Two and a half years before that date, though, AI is still a developing technology that many organizations — and indeed whole industries — are just starting to evaluate. At the Gartner event, SearchCIO talked to IT leaders in the conservative oil and gas industry, where the allure of new business value is slowly wearing down a traditional reluctance to change.

‘Digital laggard’

Bill Schneider, vice president of IT at Pioneer Energy Services, said oil and gas has historically been “a laggard in digital.” And tech investments slowed in 2014, when the price of oil started its precipitous, two-year decline. “So we’ve got a lot of ground to make up,” Schneider said.

The San Antonio-based company provides drilling and well services for oil and gas companies in the U.S. and Colombia, and it has sensors affixed to wells and field equipment “which generate a tremendous amount of data,” Schneider said.

But just a fraction of the data coursing through the internet of things (IoT) and collected by the company is analyzed, he said — and that presents a huge opportunity.

“There’s just a mountain of information to be harnessed there,” Schneider said. “And so part of our focus is first understanding what we have, and then trying to go back and figure out, OK, how can we apply AI to that to help understand how to improve our services.”

As Pioneer Energy Services assesses AI technologies — and AI benefits — it is now working with its customers to determine what they’d like those data-enhanced services to look like.

“They’re clearly here trying to figure out their digital journey. And so we need to be able to partner with them to figure that out — what’s important to them,” he said.

Gartner analyst Peter Sondergaard speaks during the Gartner Symposium/ITxpo kickoff on Oct. 2.

Optimizing refineries with AI

Yael Urman is another IT leader from the oil and gas sector. She’s director of applications at PBF Energy, a petroleum refining company in Parsippany, N.J. Like Pioneer Energy Services, her company also deploys a vast network of connected sensors and pulls in tons of data on field equipment — but that’s it.

“We collect the data, but we don’t do much more with that,” Urman said.

With AI, though, the company could. Every few years, she explained, PBF Energy does big, expensive replacements, or “turnarounds,” of its equipment. And it uses historical data and statistics to determine what gets replaced.

“Whatever you did over the last 30 years, more or less, that’s what you continued to do,” she said. “So if you know that the life of the equipment is five years, every five years you replace it.”

But by integrating historical equipment data with the vast amounts of IoT data on how certain machines are holding up — and putting AI to work on all of it — the company could make decisions on whether to replace specific parts, Urman said. The analysis could show, for example, that a piece of machinery is in particularly bad shape after three years instead of the expected five. That would save the company the cost of it failing or breaking down over two years.

“Or maybe you can keep these specific parts for seven years, because you have all the data about these pieces of equipment,” she said.
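A hedged sketch of that decision logic (not PBF Energy's actual method): scale the historical replacement interval by a live, sensor-derived condition score. All numbers and field names are invented for illustration.

```python
# Illustrative predictive-replacement check: combine the historical
# life expectancy with a condition score derived from IoT analytics.
HISTORICAL_LIFE_YEARS = 5.0

def replacement_due(part):
    # A part degrading faster than history suggests gets flagged early;
    # a healthy part may safely run past the historical interval.
    adjusted_life = HISTORICAL_LIFE_YEARS * part["condition_score"]
    return part["age_years"] >= adjusted_life

pump = {"age_years": 3.0, "condition_score": 0.55}  # from sensor analytics
print(replacement_due(pump))  # True: replace at year 3, not year 5
```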

The company is gathering information about how AI is being used in the refining business — Urman recently learned about a company in Australia using IBM Watson for a similar aim — and how to eventually attain AI benefits. For now, it’s “small steps,” but she has her eyes trained on the road ahead.

“We believe that we can do better than any other company in terms of optimizing refineries,” she said, pointing to the company’s aggressive automation of business processes such as accounts payable, invoices and others. “We can be even more efficient, and we can even optimize better if we add a digital aspect.”

For a look at how tight the labor market is for AI talent, read this SearchCIO blog post.

End-user security requires a shift in corporate culture

SAN FRANCISCO — An internal culture change can help organizations put end-user security on the front burner.

If an organization only addresses security once a problem arises, it’s already too late. But it’s common for companies, especially startups, to overlook security because it can get in the way of productivity. That’s why it’s important for IT departments to create a company culture where employees and decision-makers take security seriously when it comes to end-user data and devices.

“Security was definitely an afterthought,” said Keane Grivich, IT infrastructure manager at Shorenstein Realty Services in San Francisco, at last week’s BoxWorks conference. “Then we saw some of the high-profile [breaches] and our senior management fully got on board with making sure that our names didn’t appear in the newspaper.”

How to create a security-centric culture

Improving end-user security starts with extensive training on topics such as what data is safe to share and what a malicious website looks like. That forces users to take responsibility for their actions and understand the risks of certain behaviors.

Plus, if security is a priority, the IT security team will feel like a part of the company, not just an inconvenience standing in users’ way.

“Companies get the security teams they deserve,” said Cory Scott, chief information security officer at LinkedIn. “Are you the security troll in the back room or are you actually part of the business decisions and respected as a business-aligned person?”


When IT security professionals feel that the company values them, they are more likely to stick around as well. With the shortage of qualified security pros, retaining talent is key.

Keeping users involved in the security process helps, too. Instead of locking down a PC when a user accesses a suspicious file, for example, IT can send the user a message asking whether he performed the action. If the user confirms accessing the file, IT knows no one is impersonating him. If he did not, IT knows there is an intruder and must act.

To keep end-user security top of mind, it’s important to make things such as changing passwords easy for users. IT can make security easier for developers as well by setting up security frameworks that they can apply to applications they’re building.

It’s also advisable to take a blameless approach when possible.

“Finger-pointing is a complete impediment to learning,” said Brian Roddy, an engineering executive who oversees the cloud security business at Cisco, in a session. “The faster we can be learning, the better we can respond and the more competitive we can be.”

Don’t make it easy for attackers

Once the end-user security culture is in place, IT should take steps to shore up the simple things.

Unpatched software is one of the easiest ways for attackers to enter a company’s network, said Colin Black, COO at CrowdStrike, a cybersecurity technology company based in Sunnyvale, Calif.

IT can also make it harder for hackers by adding extra security layers such as two-factor authentication. 
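As one small example of such a layer, here is a sketch of time-based one-time passwords using the pyotp library (assumed installed via pip install pyotp); the secret would normally be provisioned once per user and stored server-side.

```python
# Minimal two-factor sketch: time-based one-time passwords (TOTP).
import pyotp

secret = pyotp.random_base32()   # provisioned once per user
totp = pyotp.TOTP(secret)

print(totp.now())                # 6-digit code the user's app displays
print(totp.verify(totp.now()))   # server-side check -> True
```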

The financial services industry banks on the Microsoft Cloud for digital transformation

Financial services organizations are at a transformational tipping point. Faced with fierce market pressures – nimble disruptors, complex regulations, digital native customers – technology transformation in the industry is essential, and it’s increasingly becoming a competitive edge. Firms leading the charge into this new era are transforming customer experiences, fostering a new culture of work, optimizing operations and driving product innovation, and they are using Microsoft cloud, AI and blockchain technologies in their journey to become digital leaders of the future. TD Bank, Sumitomo Mitsui and Crédito Agrícola are among the organizations that have embraced this digital transformation happening in every part of the world.

Next week at Sibos, the premier financial services industry event, Microsoft CEO Satya Nadella, I and others from Microsoft will speak alongside other innovative business leaders driving transformation. Here are some of the scenarios and solutions Microsoft customers and partners are implementing along their journey:

Fostering a new culture of work

Fostering a digital culture is a paradigm shift, and one on which financial institutions must embark. Jumpstarting this culture shift means designing a workplace where every work style can thrive – one that harnesses digital intelligence to improve experiences, unleashes creativity and enables mobility while keeping organizations, people and information secure. With modern tools, employees from the first line to the back office can do their best work.

State Bank of India, a 200-year-old institution, is working with Microsoft to create a modern workplace for its 263,000 employees across 23,423 branches servicing more than 500 million customer accounts, including some in the most remote locations of the country. SBI has adopted Office 365 to enhance communication and collaboration among its workforce, and believes that Microsoft is delivering the best productivity experience for its employees while ensuring the highest standards of security, compliance and adherence to regulatory requirements.

Emirates NBD is building more creative and collaborative workspaces for its employees, and has selected Office 365 as well as Surface Hub to deliver on its digital transformation efforts.

Transforming customer experiences

Customer expectations have changed and will continue to evolve at the speed of technology. The imperative for financial services firms is to engage customers in a way that is natural, tailored and delightful at every turn. The most innovative firms will create experiences that harness data from customers to derive actionable insights and deliver greater market success.

TD Bank is known for legendary customer service and aims to match their marquee in-person experience through digital wherever customers may be – at home, at work or on-the-go. With more than 12 million digitally active customers and expectations evolving at the speed of technology, TD has turned to Microsoft Azure and data platform services to help deliver on their promise of legendary service at every touchpoint.

Sumitomo Mitsui Banking Corporation is transforming their employee workstyles and enhancing their customer experience with the Microsoft Cloud. In addition to building a secure and highly productive office environment utilizing Azure and Office 365, SMBC has also introduced an interactive automatic response system for customer service, powered by the Microsoft Cognitive Toolkit and AI services.

Optimizing operations

Shifting line-of-business applications and other capabilities to the cloud is a foundational step along the digital transformation journey that enables banks to save big and reinvest dollars in more innovative, value-add banking services. It also enables firms to be more agile like their industry disruptor counterparts.

UBS is using Azure to power its risk management platform, technology that requires enormous computing power, to run millions of calculations daily on demand. The result — speeding calculation time by 100 percent, saving 40 percent in infrastructure costs, gaining nearly infinite scale within minutes — means the firm can have more working capital on hand and employees can make quicker, more informed decisions for their clients.

Société Générale is using Azure’s high-performance computing capabilities to power the credit value adjustment (CVA) intraday calculation, a critical front-office function, enabling cost-effective, almost limitless on-demand computing power.

Driving product innovation

Driving product and business model innovation allows organizations to harness data as a strategic asset, shifting from hindsight to foresight, automating manual processes, delivering personalization to customers, and innovating with new business models, services, products and experiences to differentiate and capture emerging opportunities.

Crédito Agrícola considers open banking a future standard for the financial services industry. The bank’s digital strategy on open banking seeks to leverage the cloud to provide seamless and frictionless interactions that delight its customers, while complying with regulation. Crédito Agrícola is using Microsoft Azure to deliver open banking capabilities such as an API management gateway, security and identity, as well as analytics and insights, with plans to extend its core banking services to third-party service providers.

Microsoft is collaborating with The Monetary Authority of Singapore and The Association of Banks in Singapore to support Project Ubin 2, using Azure to explore the use of blockchain for the clearing and settlement of payments and securities, a key milestone in Singapore’s ambition of becoming a Smart Financial Center.

For the financial services industry, a firm’s deliberate and strategic move to the cloud hinges on security, privacy and regulatory compliance. Microsoft is committed to earning the trust of our financial services customers. We engage with regulators in key markets to share information about our services, the related controls and our approach to enabling customer compliance. We also take input from leading banks and insurers across the globe on an ongoing basis to improve our offerings to help meet regulatory requirements. We do all of this to continue building our customers’ trust in cloud technology.

It’s incredible and humbling to be on this transformational journey with so many ambitious digital leaders. Come visit us at booth No. C46 if you’re in Toronto or follow our stories at our press site. You can also join my session at the Innotribe, “Creating Space for Innovation,” on Tuesday, Oct. 17, and see Microsoft CEO Satya Nadella provide the Sibos closing plenary on Thursday, Oct. 19. Satya’s plenary will also be livestreamed here.

