Enterprise IT news

Colorado builds API integration platform to share data, speed services

Integrating data across an organization’s departments can be challenging. There are invariably technical and cultural hurdles to clear.

The state of Colorado’s ambitious plan to integrate data across dozens of agencies by using APIs had to contend with another issue: It was being launched under intense media scrutiny.

“We typically think of IT as a back-office function,” said Jon Gottsegen, chief data officer for the state of Colorado. Not so in Colorado. A deeply troubled benefits eligibility system — more than a decade in development and blamed for making improper payments — had put IT in the limelight, and not in a good way, he said. The newspaper term “above the fold” became part of his vocabulary, Gottsegen grimly joked.

Work on the state’s new API integration platform began in 2017 with a major transformation of the infamous benefits system. Partnering with Deloitte, IT rewrote the system’s code, migrated services to Salesforce and AWS, and used APIs to drive integration into various databases, Gottsegen said. This helped reduce the amount of time to determine eligibility for benefits from days to minutes — a major boon for state employees.

Today, the API integration platform — while still a work in progress — has dramatically sped up a number of state processes and is paving the way for better data sharing down the road, Gottsegen said.

Speaking at the recent MuleSoft Connect conference in San Francisco, Gottsegen shared the objectives of Colorado’s API integration strategy, the major challenges his team has encountered and the lessons learned.

People, of course, should come first in projects of this scope: Delivering better services to the people of Colorado was aim No. 1 of his team’s API integration platform, Gottsegen said. Security, data governance and corporate culture also demand attention.

Becoming the ‘Amazon of state services’

The task before Gottsegen and his group was to create a process for rolling out APIs that work seamlessly across dozens of different agencies. “Ideally, we want to be the Amazon of state services,” he said of IT’s grand mission.

Developers had to learn how to connect systems to databases that were regulated in different ways. Gottsegen’s team spent a lot of time putting together a comprehensive platform, which was important for integration, he said. It was also important to deliver the APIs in a way that they could be easily consumed by the various state agencies. One goal was to ensure that new APIs were reusable.

Part of the work also involved looking at how services relate to each other. For example, if someone is getting Medicaid, there is a good chance they are also eligible for housing services. The API platform had to support the data integration that helps automate these kinds of cross-agency processes, he said.
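
A minimal sketch of what such a cross-agency check could look like appears below. The base URL, endpoints, field names and eligibility rule are invented for illustration; they are not Colorado’s actual APIs.

```python
# Hypothetical sketch of the cross-agency automation described above. The
# base URL, endpoints, field names and eligibility rule are invented for
# illustration; they are not Colorado's actual APIs.
import requests

STATE_API = "https://api.state.example.gov"  # placeholder base URL

def likely_housing_eligible(resident_id: str, token: str) -> bool:
    """Flag a Medicaid enrollee who may also qualify for housing services."""
    headers = {"Authorization": f"Bearer {token}"}

    # One reusable API exposes Medicaid enrollment status...
    medicaid = requests.get(
        f"{STATE_API}/medicaid/v1/enrollment/{resident_id}",
        headers=headers,
        timeout=10,
    ).json()
    if not medicaid.get("enrolled", False):
        return False

    # ...and another exposes the housing agency's intake criteria, so the
    # cross-agency rule runs in minutes instead of a negotiated data exchange.
    criteria = requests.get(
        f"{STATE_API}/housing/v1/criteria",
        headers=headers,
        timeout=10,
    ).json()
    return medicaid.get("household_income", 0) <= criteria.get("income_limit", 0)
```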

Getting the API program off the ground was not just about solving the technical problems. When communicating with technology personnel across agencies, Gottsegen said it was important to convey that the API integration platform is about better serving the residents of Colorado.

Learning from contractors

IT did not go it alone. Gottsegen said the state worked with a variety of contractors to speed up its API development process. This included working with MuleSoft to roll out a more expansive API management tier. IT also hired some contractors with integration expertise to kick-start the project. But he added that it was important to ensure the knowledge involved in building the APIs is retained after a contract ends.

“We want our teams to sit next to those contractors to ensure the knowledge of those contractors gets internalized. There have been many cases where state workers did not know how to maintain something after the contractor has left,” he said.

Good metrics, communication critical to API integration success

Before Gottsegen’s team launched a formal API integration program, no one was tracking how long it took agencies to set up a working data integration process. Anecdotal examples of problems would emerge, including stories of agencies that spent over a year negotiating how to set up a data exchange.

The team now has formal metrics to track time to implementation, but the lack of historical data precludes a precise measure of how much the new API integration platform has sped up data exchange.

In any case, expediting the data exchange process is not just about having a more all-encompassing integration tier, Gottsegen stressed. Better communication between departments is also needed.

As it rolls out the API integration platform, IT is working with the agencies to identify any compliance issues and find a framework to address them.

Centralizing security

Each agency oversees its own data collection and determines where it can be used, Gottsegen said. There are also various privacy regulations to conform to, including HIPAA and IRS 1075.

“One of the reasons we pursued MuleSoft was so we could demonstrate auditable and consistent security and governance of the data,” he said.

Navigating the distinctions between privacy and security is a big challenge, he said. Each agency is responsible for ensuring restrictions on how its data is used; the task is not assigned to a centralized government group because each agency is the expert on its own privacy regulations. At the same time, Gottsegen’s group can build better security into the API integration mechanisms used to exchange data between agencies.

To provide API integration security, Gottsegen created a DevOps toolchain run by a statewide center for enablement. This included a set of vetted tools and practices that agencies could adopt to speed the development of new integrations (dev) and the process of pushing them into production (ops) safely.
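
In practice, a vetted toolchain often boils down to automated gates like the hypothetical pre-deployment check sketched below; the spec format and policy rules are invented for illustration.

```python
# Hypothetical pre-deployment gate of the kind a center for enablement could
# vet and share: block promotion of an integration whose API spec skips
# required controls. The spec format and policy rules are invented here.
def policy_violations(spec: dict) -> list[str]:
    """Return the policy problems that should block a push to production."""
    problems = []
    if not spec.get("authentication"):
        problems.append("API must require authentication")
    if not spec.get("audit_logging"):
        problems.append("data access must be audit-logged")
    if spec.get("min_tls", 0.0) < 1.2:
        problems.append("TLS 1.2 or later is required")
    return problems

# Example spec for a hypothetical integration; an empty result means it may ship.
spec = {
    "name": "dmv-address-lookup",
    "authentication": "oauth2",
    "audit_logging": True,
    "min_tls": 1.2,
}
assert policy_violations(spec) == []
```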

Gottsegen said the group is developing practices to build capabilities that can be adopted across the state, but progress is uneven. He said the group has seen mixed results in getting buy-in from agencies.

Improving data quality across agencies

Gottsegen’s team has also launched a joint agency interoperability project for the integration of over 45 different data systems across the state. The aim is to build a sturdy data governance process across groups. The first question being addressed is data quality, in particular to ensure a consistent digital ID of citizens. “To be honest, I’m not sure we have a quality measure across the state,” Gottsegen said.

Gottsegen believes that data quality is not about being good or bad, but about fitness for use. It is not easy to articulate which data sets are appropriate for use across agencies.

“Data quality should be a partnership between agencies and IT,” he said. His team often gets requests to integrate data across agencies. The challenge is how to provide the tools to do that. The agencies need to be able to describe the idiosyncrasies of how they collect data in order to come up with a standard. Down the road, Gottsegen hopes machine learning will help improve this process.

Building trust with state IT leaders

A lot of state initiatives are driven from the top down. But, if workers don’t like a directive, they can often wait things out until a new government is elected. Gottsegen found that building trust among IT leaders across state agencies was key in growing the API program. “Trust is important — not just in technology changes, but in data sharing as well,” he said.

And face-to-face connections matter. In launching its API integration platform, he said, it was important for IT leaders across organizations to learn each other’s names and to meet in person, even when phone calls or video conferences might be more convenient.

As for the future, Gottsegen has a vision that all data sharing will eventually happen through API integrations. But getting there is a long process. “That might be 10 years out — if it happens. We keep that goal in mind while working with our collaborators to build things out.”

Bluescape releases newest version of its mobile app

Bluescape has launched its newest mobile app to enable users to access their content on the go.

The app, available in the Apple App Store and Google Play store, connects to Bluescape workspaces from mobile devices, such as cellphones or tablets. According to the vendor, it enables users to give presentations without a laptop by launching a Bluescape session from the app onto larger touchscreens.

Users can also access their content and workspaces anytime, from anywhere, and can search and view content. According to Bluescape, the app provides a visual collaboration workspace that integrates day-to-day applications, content and tools.

The Bluescape platform is cloud-based software, with applications designed for workplace collaboration. It can be deployed on mobile devices and personal workstations and in huddle rooms, innovation centers, collaboration suites, conference rooms, training rooms, executive briefing centers, command centers and control centers. Search, messaging and file sharing are also built into the platform.

Bluescape lists professionals in fields such as architecture, consulting, design, filmmaking, marketing and product development as ideal users for its product, as these groups often work collaboratively and visually.

Bluescape is among the vendors offering visual collaboration software, which works hand in hand with digital collaborative whiteboards. Vendor Mural provides separate workspaces for teams and enables scaling for companywide processes, with frameworks for Agile, Lean and Design Thinking methods. Custom frameworks are also available.

Competitor Miro touts its product development, user experience research and design, and Lean and Agile capabilities, as well as its enterprise-grade security. Available integrations include Google Drive, Box, Dropbox, Slack, OneDrive and Microsoft Teams.

IBM Cloud Paks open new business for channel partners

IBM said its latest push into the hybrid and multi-cloud market is setting the stage for new channel opportunities. 

The company this week revealed IBM Cloud Paks, a new set of IBM software offerings containerized on Red Hat OpenShift. According to IBM, Cloud Paks aim to help customers migrate, integrate and modernize applications in multiple-cloud environments.

Those environments include public clouds such as AWS, Azure, Google Cloud Platform, Alibaba and IBM Cloud, as well as private clouds. The Cloud Paks launch follows on the heels of IBM closing its $34 billion Red Hat acquisition in July and is part of a broader strategy to make its software portfolio cloud-native and OpenShift-enabled.

“The strategy has been to containerize the middleware on a common Kubernetes platform. That common Kubernetes platform is Red Hat OpenShift,” said Brian Fallon, director of worldwide digital and partner ecosystems, IBM Cloud and cognitive software.

For IBM business partners, Cloud Paks offer a modular approach to solving common problems faced by customers in their journeys to cloud, he said. The company released five Cloud Pak products, addressing data virtualization; application development; integration of applications, data, cloud services and APIs; process automation; and multi-cloud management.

IBM Cloud Paks can be mixed and matched to address different customer scenarios. For example, the Cloud Pak for Applications and Cloud Pak for Integration “together represent a great opportunity for partners to help their clients move and modernize workloads to a cloud environment,” Fallon said.

Dorothy Copeland, global vice president of programs and business development for the IBM partner ecosystem, said IBM’s push into hybrid and multi-cloud products is creating new opportunities for both IBM and Red Hat partners.

“We are enabling the market to drive hybrid, multi-cloud solutions and, in that, enabling our business partners to be able to do that, as well. … There is a huge opportunity for partners, especially around areas where partners can add specific knowledge, services and management of the deployment,” she said.

Cloud Paks may also serve as an entry point for Red Hat partners to build IBM practices. Copeland noted that Red Hat partners have shown increasing interest in joining the IBM partner ecosystem since the acquisition was announced. IBM has stated it intends Red Hat to operate independently under its ownership.

Logically acquires New York-area IT services company

Logically, a managed IT services provider based in Portland, Maine, has acquired Sullivan Data Management, a 10-employee outsourced IT services firm located north of New York City.

Launched earlier this year, Logically formed from the merger of Winxnet Inc., an IT consulting and outsourcing firm in Portland, and K&R Network Solutions, a San Diego-based managed service provider (MSP). 

Christopher Claudio, Logically’s CEO, said in an April 2019 interview that the company was looking for additional acquisitions. The Sullivan Data Management deal is the first of those transactions. Claudio said two or three acquisitions may follow by the end of the year.

Sullivan Data Management fits Logically’s strategy of focusing on locations outside of top-tier metropolitan areas, where MSP competition is thickest. The company is based in Yorktown Heights in Westchester County, an hour’s drive from New York City. Sullivan Data Management serves customers in Westchester and neighboring counties.

“This is the type of acquisition we are looking for,” Claudio said.

He also pointed to an alignment between Logically’s vertical market focus and Sullivan Data Management’s local government customer base.

“We do a lot of work within the public sector,” Claudio said, noting the bulk of Sullivan Data Management’s clients are in the public safety and municipal management sectors.

Barracuda reports growth in email security market

Barracuda Networks said its email security business is booming, with a $200 million annual revenue run rate for fiscal year 2019, which ended Feb. 28.

For the first quarter of its fiscal year 2020, Barracuda said its email security product, Barracuda Sentinel, saw 440% year-over-year growth in sales bookings. In the same time frame, its email security, backup and archiving package, Barracuda Essentials, saw 46% growth in sales bookings, the vendor said.

Meanwhile, Barracuda’s business unit dedicated to MSPs reported the annual recurring revenue for its email protection business increased 122% year over year for the first quarter of fiscal year 2020.

Ezra Hookano, vice president of channels at Barracuda, based in Campbell, Calif., cited conditions of the email security market as one driver behind the company’s growth. Phishing attacks have become more sophisticated in their social-engineering tactics. Email security threats are “very specific and targeted to you and your industry, and [no one is] immune — big or small,” he said.

Hookano also pointed to Barracuda’s free threat scan tool as an important business driver. He said many Barracuda resellers are using the threat scans to drive the sales process.

“We, to this point, have never run a threat scan that didn’t come back with at least some things that were wrong in the [email] network. … About 30% to 40% of the time, something is so bad that [customers] have to purchase immediately.”

Barracuda is looking to differentiate itself from its pure-play email security competitors by tapping into its portfolio, he noted. The company’s portfolio includes web filtering and firewall products, which feed threat data into Barracuda email security.

“If I’m a pure-play email security vendor now, I no longer have the ability to be as accurate as a portfolio company,” Hookano said.

Data from Barracuda’s remote monitoring and management product, Managed Workplace, which the vendor acquired from Avast earlier this year, also bolster email security capabilities.

“Our goal in the email market … is to use all of our other products and the footprint from our other products to make our email security signatures better and to continue to build on our lead there,” he said.

Equinix cites channel in Q2 bookings, customer wins

Equinix Inc., an interconnection and data center company based in Redwood City, Calif., cited channel sales as a key driver behind second-quarter bookings.

The company said channel partners contributed more than 25% of the bookings for the quarter ended June 30. And those bookings accounted for 60% of Equinix’s new customer wins during the quarter, according to the company. Equinix’s second-quarter revenue grew 10% year over year to $1.385 billion.

In a statement, Equinix said it has “deepened its engagement with high-priority partners to drive increased productivity and joint offer creation across its reseller and alliance partners.”

Those partners include Amazon, AT&T, Microsoft, Oracle, Orange, Telstra, Verizon and World Wide Technology, according to a company spokeswoman. New channel wins in the second quarter include a deal in which Equinix is partnering with Telstra to provide cloud connectivity at Genomics England.

Equinix has also partnered with Assured DP, a Rubrik-as-a-service provider, in a disaster recovery offering.

Other news

  • Microsoft partners could see a further boost in momentum for the company’s Teams calling and collaboration platform. Microsoft said it will retire Skype for Business Online on July 31, 2021, a move that should pave the way for Skype-to-Teams migrations. Microsoft officials speaking last month at the Inspire conference cited Teams as among the top channel opportunities for Microsoft’s 2020 fiscal year. Partners expect to find business in Teams training and governance. The Skype for Business Online retirement does not affect Skype Consumer services or Skype for Business Server, according to a Microsoft blog post.
  • Google said more than 90 partners have obtained Google Cloud Partner specializations in the first half of 2019. Partners can earn specializations in areas such as applications development, cloud migration, data analytics, education, work transformation, infrastructure, IoT, location-based services, machine learning, marketing analytics and security. Partners that acquired specializations during the first half of the year include Accenture, Agosto, Deloitte Consulting and Maven Wave Partners. The specialization announcement follows the launch of the Google Cloud Partner Advantage Program, which went live July 1.
  • Mimecast Ltd., an email and data security company based in Lexington, Mass., unveiled its Cyber Alliance Program, which aims to bring together security vendors into what it terms a “cyber-resilience ecosystem.” The program includes cybersecurity technology categories such as security information and event management; security orchestration, automation and response; firewall, threat intelligence and endpoint security. Mimecast said the program offers customers and partners purpose-built, ready-to-use integrations; out-of-the-box APIs; documented guides with sample code; and tutorials that explain how to use the integrations.
  • Two-thirds of channel companies said they have changed their customer experience tactics, with 10% reporting a move to an omnichannel approach for customer interaction in the past year. Those are among the results of a CompTIA survey of more than 400 channel companies. The survey cited customer recruitment and customer retention as areas where the most respondents reported deficiencies.
  • CompTIA also revealed this week that it will acquire Metacog, an assessment, certification and training software vendor based in Worcester, Mass. CompTIA said Metacog’s technology will be incorporated into the upcoming release of its online testing platform. Metacog’s technology uses AI, big data and IoT APIs.
  • Naveego unveiled Accelerator, a tool for partners to analyze data accuracy across a variety of sources. Accelerator “is a great starting point for our partners to get a feel for just how ready a customer’s data is to participate in [data-based projects]. In general, that makes the projects have a higher degree of success, as well as go much more smoothly,” said Derek Smith, CTO and co-founder of the company. Naveego works with about 10 partners, recently expanding its roster with Frontblade Systems, H2 Integrated Solutions, Mondelio and Narwal.
  • US Signal, a data center services provider based in Grand Rapids, Mich., said its DRaaS for VMware offering is generally available. The offering, based on VMware vCloud Availability, provides disaster recovery services for multi-tenant VMware clouds.
  • Synchronoss Technologies Inc., a cloud and IoT product provider based in Bridgewater, N.J., is working with distributor Arrow Electronics to develop and market an IoT smart building offering. The IoT offering is intended for telecom operators, service providers and system integrators.
  • Distributor Synnex Corp. inked a deal with Arista Networks to provide its data center and campus networking products.
  • Peerless Networks, a telecom services provider, named Ryan Patterson as vice president of channel sales. Patterson will oversee the recruitment of new master and direct agents for Peerless’ national channel program, the company said.
  • Nintex, a process management and workflow automation vendor, hired Michael Schultz as vice president of product, channel and field marketing, and Florian Haarhaus as vice president of sales for EMEA.

Market Share is a news roundup published every Friday.

Merger and acquisition activity is altering the BI landscape

From Qlik’s acquisition of Podium Data in July 2018 through Salesforce’s purchase of Tableau Software in June 2019, the last year in BI has been characterized by a wave of consolidation.

Capped by Salesforce’s $15.7 billion acquisition of Tableau Software on June 10, 2019, and Google’s $2.6 billion purchase of Looker just four days earlier, the BI market over the last year has been marked by a wave of merger and acquisition activity.

Qlik kicked off the surge with its acquisition of Podium Data in July 2018 and subsequently made two more purchases.

“To survive, you’re going to have to reach some kind of scale,” said Rick Sherman, founder and managing partner of Athena IT Solutions, in a SearchBusinessAnalytics story in July 2019. “Small vendors are going to be bought or merge with more focused niche companies to build a more complete product.”

It was a little more than a decade ago that a similar wave of merger and acquisition activity reshaped the BI landscape, highlighted by IBM buying Cognos and SAP acquiring Business Objects.

After the flurry of spring deals, capped by the premium Salesforce paid for Tableau, the pace of merger and acquisition activity has slowed since the start of the summer. But more deals could be coming soon, as vendors with a specialized purpose seek partners with complementary capabilities in an attempt to keep pace with competitors that have already filled out their analytics stacks.

Cohesity CyberScan scans backup copies for security risks

The latest application added to Cohesity Marketplace is designed to trawl through backups to look for security vulnerabilities.

Cohesity CyberScan is a free application available on the Cohesity Marketplace. Using Tenable.io, it compares applications in a backup environment against the public Common Vulnerabilities and Exposures (CVE) database to detect possible security flaws. The admin can then address these flaws, whether the issue is an out-of-date patch, a software bug or a vulnerability around open ports.

Cohesity CyberScan doesn’t affect the production environment when it performs its scan. The application boots up a backup snapshot and performs an API call to Tenable.io in order to find vulnerabilities. This process has the added benefit of confirming whether a particular snapshot is recoverable in the first place.
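
The mechanics can be pictured with a short sketch: inventory the software found in the booted snapshot and match it against known-vulnerable versions. The inventory and the tiny CVE table below are invented for illustration; the product itself makes an API call to Tenable.io rather than keeping its own list.

```python
# Generic illustration of the scan concept: software inventoried from a booted
# backup snapshot is matched against known-vulnerable versions. The inventory
# and the tiny CVE table below are invented; the actual product makes an API
# call to Tenable.io rather than keeping its own list.
KNOWN_CVES = {
    # (package, version) -> CVE ID; real scans consult the full public feed
    ("openssl", "1.0.1f"): "CVE-2014-0160",  # Heartbleed
    ("bash", "4.3"): "CVE-2014-6271",        # Shellshock
}

def scan_snapshot(inventory: list[tuple[str, str]]) -> list[str]:
    """Return findings for packages found in the snapshot, not production."""
    return [
        f"{pkg} {ver}: {KNOWN_CVES[(pkg, ver)]}"
        for pkg, ver in inventory
        if (pkg, ver) in KNOWN_CVES
    ]

print(scan_snapshot([("openssl", "1.0.1f"), ("nginx", "1.17.0")]))
# ['openssl 1.0.1f: CVE-2014-0160']
```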

Raj Dutt, director of product marketing at Cohesity, said because the vulnerability scan happens in Cohesity’s runtime environment and not the live environment, many customers may be prompted to perform these scans. Citing a recent study performed by independent IT research firm Ponemon Institute, Dutt said 37% of respondents who suffered a security breach did not scan their environments for vulnerabilities.

“They work in an environment where the organization is expected to run 24/7/365, so essentially, there is no downtime to run these scans or make the patches,” Dutt said.

Dutt said even organizations that do perform vulnerability scans don’t do them often enough. Vulnerabilities and exposures published on the CVE database are known to bad actors, so weekly or monthly scans still leave organizations with a wide window in which they can be attacked. Dutt said since Cohesity CyberScan doesn’t interfere with the production environment, customers are free to run scans more frequently.

The Security Dashboard houses the vulnerability scan and anti-ransomware capabilities.

Phil Goodwin, a research director at IT analyst firm IDC, said there are applications that scan backup copies or secondary data, but they mostly scan to detect out-of-date drivers or other roadblocks to a successful restore or failover. Running a scan against a CVE database to look for potential security problems is unique.

Goodwin said Cohesity CyberScan is the latest example of backup vendors adding security capabilities. Security and data protection are different IT disciplines that call for different technology, but Goodwin said he has encountered customers conflating the two.

“End users are talking about data protection and security in the same sentence,” Goodwin said. “It really is two sides of the same coin.”

Security is the proactive approach of preventing data loss, while data protection and backup are reactive. Goodwin said organizations should ideally have both, and backup vendors are starting to provide proactive features such as data masking, air gapping and immutability. But Goodwin said he has noticed many vendors stop short of malware detection.

Indeed, Cohesity CyberScan does not have malware detection. Dutt stressed that the application’s use cases are focused on detecting cyberattack exposures and ensuring recoverability of snapshots. He pointed out that Cohesity DataPlatform does have anti-ransomware capabilities, and they can be accessed from the same dashboard as CyberScan’s vulnerability scan.

Cohesity CyberScan is generally available now to customers who have upgraded to the latest Cohesity Pegasus 6.4 software. The application itself is free, but customers are required to have their own Tenable license.

Capital One breach suspect may have hit other companies

A new report looking into the attacker accused in the Capital One breach discovered references to other potential victims, but no corroborating evidence has been found yet.

The FBI accused Paige Thompson of the breach; she allegedly went by the name “Erratic” on various online platforms, including an invite-only Slack channel. The Slack channel was first reported on by investigative cybersecurity journalist Brian Krebs, who noted that file names referenced in the channel suggested other organizations were potentially victims of similar attacks.

A new report on the Capital One breach by CyberInt, a cybersecurity firm based in London, built on the information discovered by Krebs. Jason Hill, lead cybersecurity researcher at CyberInt, said the company was able to gain access to the Slack channel via an open invitation link.

“This link was obtained from the now-offline ‘Seattle Warez Kiddies’ Meetup group (Listed as ‘Organized by Paige Thomson’),” Hill wrote via email. “Based on the publicly available information at the time of report completion, such as Capital One’s statement and the [FBI’s] Criminal Complaint, we were able to conduct open source intelligence gathering to fill in some of the missing detail and follow social media leads to gain an understanding of the alleged threat actor and their activity over the past months.”

According to Hill, CyberInt researchers followed the trail through a GitHub account, GitLab page and a screenshot of a file archival process shared in the Slack channel.

“The right-hand side of the screen appears to show the output of the Linux command ‘htop’ that lists current processes being executed. In this case, under the ‘Command’ heading, we can see a number of ‘tar –remove-files -cvf – ‘ processes, which are compressing data (and then removing the uncompressed source),” Hill wrote. “These files correlate with the directory listing, and potential other victims, as seen later within the Slack channel.”

Between the files named in the screenshot and the corresponding messages in the Slack channel, it appeared as though in addition to the Capital One breach, the threat actor may have stolen 485 GB of data from various other organizations. Some organizations were implied by only file names, such as Ford, but others were named directly by Erratic in messages, including the Ohio Department of Transportation, Michigan State University, Infoblox and Vodafone.

Hill acknowledged that CyberInt did not directly contact any of the organizations named, because the company policy is normally to “contact organizations when our research detects specific vulnerabilities that can be mitigated, or threats detected by our threat intelligence platform.

“However in this case, our research was focused on the Capital One breach to gain an understanding of the threat actor’s tactics, techniques and procedures (TTP) and resulted in the potential identification of additional victims rather than the identification of any specific vulnerability or ongoing threat,” Hill wrote. “Our report offered general advice for those concerned about the TTP based on these findings.”

We contacted some of the organizations either directly named or implied via file name in Erratic’s Slack channel. The Ohio Department of Transportation did not respond to a request for comment. Ford confirmed an investigation is underway to determine if the company was the victim of a data breach.

A spokesperson for Michigan State University also confirmed an investigation is underway and the university is cooperating with law enforcement authorities, but at this point there is “no evidence to suggest MSU was compromised.”

Similarly, an Infoblox spokesperson said the company was “continuing to investigate the matter, however, at this time, there is no indication that Infoblox was in any way involved with the Capital One breach. Additionally, there is no indication of an intrusion or data breach causing Infoblox customer data to be exposed.”

A Vodafone spokesperson claimed the company takes security seriously, but added, “Vodafone is not aware of any information that relates to the Capital One security breach.”

Data ethics issues create minefields for analytics teams

GRANTS PASS, Ore. — AI technologies and other advanced analytics tools make it easier for data analysts to uncover potentially valuable information on customers, patients and other people. But, too often, consultant Donald Farmer said, organizations don’t ask themselves a basic ethical question before launching an analytics project: Should we?

In the age of GDPR and like-minded privacy laws, though, ignoring data ethics isn’t a good business practice for companies, Farmer warned in a roundtable discussion he led at the 2019 Pacific Northwest BI & Analytics Summit. IT and analytics teams need to be guided by a framework of ethics rules and motivated by management to put those rules into practice, he said.

Otherwise, a company runs the risk of crossing the line in mining and using personal data — and, typically, not as the result of a nefarious plan to do so, according to Farmer, principal of analytics consultancy TreeHive Strategy in Woodinville, Wash. “It’s not that most people are devious — they’re just led blindly into things,” he said, adding that analytics applications often have “unforeseen consequences.”

For example, he noted that smart TVs connected to home networks can monitor whether people watch the ads in shows they’ve recorded and then go to an advertiser’s website. But acting on that information for marketing purposes might strike some prospective customers as creepy, he said.

Shawn Rogers, senior director of analytic strategy and communications-related functions at vendor Tibco Software Inc., pointed to a trial program that retailer Nordstrom launched in 2012 to track the movements of shoppers in its stores via the Wi-Fi signals from their cell phones. Customers complained about the practice after Nordstrom disclosed what it was doing, and the company stopped the tracking in 2013.

“I think transparency, permission and context are important in this area,” Rogers said during the session on data ethics at the summit, an annual event that brings together a small group of consultants and vendor executives to discuss BI, analytics and data management trends.

AI algorithms add new ethical questions

Being transparent about the use of analytics data is further complicated now by the growing adoption of AI tools and machine learning algorithms, Farmer and other participants said. Increasingly, companies are augmenting — or replacing — human involvement in the analytics process with “algorithmic engagement,” as Farmer put it. But automated algorithms are often a black box to users.

Mike Ferguson, managing director of U.K.-based consulting firm Intelligent Business Strategies Ltd., said the legal department at a financial services company he works with killed a project aimed at automating the loan approval process because the data scientists who developed the deep learning models to do the analytics couldn’t fully explain how the models worked.

And that isn’t an isolated incident in Ferguson’s experience. “There’s a loggerheads battle going on now in organizations between the legal and data science teams,” he said, adding that the specter of hefty fines for GDPR violations is spurring corporate lawyers to vet analytics applications more closely. As a result, data scientists are focusing more on explainable AI to try to justify the use of algorithms, he said.
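
As a rough picture of what explainable AI work can involve, the sketch below applies one generic technique, permutation importance, to a toy model; nothing in it reflects the loan system described above.

```python
# Generic sketch of one explainable-AI technique, permutation importance:
# shuffle each input column and measure how far the model's accuracy falls.
# The toy model and data are invented; this is not the loan system above.
import random

def accuracy(preds, y):
    return sum(p == t for p, t in zip(preds, y)) / len(y)

def permutation_importance(predict, X, y):
    base = accuracy(predict(X), y)
    scores = {}
    for col in range(len(X[0])):
        shuffled = [row[:] for row in X]
        vals = [row[col] for row in shuffled]
        random.shuffle(vals)
        for row, v in zip(shuffled, vals):
            row[col] = v
        # A large drop means the model leans heavily on this column.
        scores[col] = base - accuracy(predict(shuffled), y)
    return scores

# Toy model: predicts 1 when feature 0 is high; feature 1 is pure noise, so
# its importance should come out near zero.
predict = lambda X: [1 if row[0] > 0.5 else 0 for row in X]
X = [[0.9, 0.2], [0.1, 0.8], [0.7, 0.5], [0.2, 0.9]]
y = [1, 0, 1, 0]
print(permutation_importance(predict, X, y))
```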

The increased vetting is driven more by legal concerns than data ethics issues per se, Ferguson said in an interview after the session. But he thinks that the two are intertwined and that the ability of analytics teams to get unfettered access to data sets is increasingly in question for both legal and ethical reasons.

“It’s pretty clear that legal is throwing their weight around on data governance,” he said. “We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.”

Jill Dyché, an independent consultant who’s based in Los Angeles, said she expects explainable AI to become “less of an option and more of a mandate” in organizations over the next 12 months.

Code of ethics not enough on data analytics

Staying on the right side of the data ethics line takes more than publishing a corporate code of ethics for employees to follow, Farmer said. He cited Enron’s 64-page ethics code, which didn’t stop the energy company from engaging in the infamous accounting fraud scheme that led to bankruptcy and the sale of its assets. Similarly, he sees such codes having little effect in preventing ethical missteps on analytics.

“Just having a code of ethics does absolutely nothing,” Farmer said. “It might even get in the way of good ethical practices, because people just point to it [and say], ‘We’ve got that covered.'”

Instead, he recommended that IT and analytics managers take a rules-based approach to data ethics that can be applied to all three phases of analytics projects: the upfront research process, design and development of analytics applications, and deployment and use of the applications.

Cisco security flaw leads to $8.6M payout in whistleblower case

Cisco agreed to pay $8.6 million to settle a whistleblower lawsuit that accused the company of selling video surveillance software to government agencies despite knowing for years that the product suffered from critical security vulnerabilities. The settlement was the first of its kind against a tech company for alleged cybersecurity fraud.

Hackers could have used the Cisco security flaw to gain access to a customer’s local area network, potentially giving them control over physical security systems such as locks and alarms. The hackers also could have exploited the weakness to view, modify and delete video surveillance feeds and to obtain user passwords that would mask their activities.

Federal agencies that used the flawed product to manage video surveillance feeds included the Department of Defense, the Secret Service, the Department of Homeland Security, NASA and the Federal Emergency Management Agency, according to court documents unsealed Wednesday. Major airports, police departments and public transit systems had also deployed the product.

Cisco became aware of flaws in the product, called the Cisco Video Surveillance Manager, no later than May 2008 but did not issue a security advisory until July 2013, according to Cisco’s settlement agreement with 15 states and the District of Columbia. Offices of the state attorneys general provided a copy of the deal.

Cisco did not admit wrongdoing.

Cisco has made security a main selling point of its cloud products in recent years. This week’s revelations risk sullying that reputation at a time when consumers and businesses are becoming leerier of the threats posed by new technologies. The case underscores that vendors need more than just secure software — they need well-enforced protocols for responding to reported defects. 

In a blog post, Cisco said the settlement showed that software companies were increasingly being held to a higher standard on security. “In short, what seemed reasonable at one point no longer meets the needs of our stakeholders today,” said Mark Chandler, Cisco’s executive vice president and chief legal officer.

Whistleblower’s lawsuit

James Glenn, a former employee of Denmark-based Cisco partner NetDesign, sued Cisco in May 2011 on behalf of the federal government and numerous state governments who had purchased the product. Glenn acted as a whistleblower under the provisions of federal and state fraud laws that allow private citizens to file lawsuits on behalf of governments.

Glenn alerted Cisco to the vulnerabilities in October 2008. In March 2009, while Glenn was still trying to get Cisco to patch the flaws, NetDesign terminated his position because of “economic concerns,” according to the lawsuit. NetDesign did not respond to a request for comment.

Glenn first alerted federal authorities to the security issue in September 2010, asking a family member to tell the FBI that the Los Angeles International Airport was using the software. Glenn later spoke to a detective for the airport who served on the FBI’s Joint Terrorism Task Force.

The settlement marks the first instance of a citizen-initiated whistleblower lawsuit prompting the U.S. government to successfully seek a financial penalty against a tech company for cybersecurity fraud, according to Constantine Cannon LLP, a law firm that represented Glenn.

As part of the $8.6 million settlement, Cisco will pay Glenn $1.6 million. Separately, Glenn is asking a federal judge to order Cisco to reimburse him for attorneys’ fees and other costs related to bringing the action. Nevertheless, the penalty is a tiny drop in the bucket for Cisco, which brought in $49.3 billion in revenue last fiscal year.  

The settlement — representing a partial refund for those who bought the product — covers only the government agencies involved, meaning Cisco could still be subject to lawsuits by private companies that used the software, which the vendor sold between 2008 and 2014.

“My view is that there are likely international governments, as well as domestic and international private companies, who could be impacted here for sure,” said Mary Inman, an attorney for Constantine Cannon LLP’s whistleblower practice group. “I would expect to see follow-on lawsuits from class-action attorneys representing some of the private customers here.”

Cisco’s handling of the security flaw

Cisco inherited the technology behind the product through its 2007 acquisition of Broadware. Cisco released a best practices guide in 2009 that the company claims addressed the security vulnerabilities in question. However, in an interview Thursday, Glenn disputed the guide’s helpfulness. “I didn’t see a version of the guide that would have been effective in mitigating those issues,” he said.

Cisco released an advisory in July 2013, shortly after a security website posted publicly about the vulnerabilities. The company released a software update in December 2012 that eliminated the flaws, but customers were not forced to upgrade. Cisco continued to sell vulnerable versions of the product until September 2014. 

The lawsuit accused Cisco of violating the federal False Claims Act by knowingly selling a product that failed to comply with security standards for government computer systems. The company also allegedly failed to warn customers subscribed to its premium security service about the flaws.

“We’re increasingly seeing whistleblowers from around the world alerting the U.S. authorities to fraud,” Inman said. “[This is] the first of what we believe will be … many whistleblower-initiated lawsuits which are helping to hold the tech community accountable.”

Exten Technologies releases 3.0 version of NVMe platform

Exten Technologies has released the third generation of its HyperDynamic storage software, which is designed to bring stronger resiliency, higher performance and easier management to data center customers.

New features in generation three include node-level resiliency with synchronous replicas, shared volumes with replicas for supporting parallel file systems, dual parity resiliency, and integrated drive management and hot swap.

Exten software is deployed on the storage target and does not require proprietary software on compute clients.

According to Exten, HyperDynamic 3.0 aims to improve TCP performance with Solarflare TCP acceleration, which provides throughput approaching that of remote direct memory access (RDMA).

It also has new features designed to simplify NVMe-oF storage management and deployment, Exten claims. These features include declustered RAID, which enables the configuration of resilient volumes that use Linux multi-path IO software to provide redundancy in both networking and storage. Exten’s interface provides node- and cluster-level telemetry. Users can also set quality-of-service limits in order to manage performance during drive or node rebuilds.
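
The effect of such a limit can be pictured with a generic token-bucket sketch that caps the bandwidth a rebuild job may consume. It illustrates the concept only and is not Exten’s implementation.

```python
# Generic token-bucket sketch of a quality-of-service limit: cap the bandwidth
# a drive-rebuild job may consume so foreground I/O keeps its performance.
# This illustrates the concept only; it is not Exten's implementation.
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def throttle(self, nbytes: int) -> None:
        """Block until the rebuild job has budget for an I/O of nbytes."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            time.sleep((nbytes - self.tokens) / self.rate)

# Cap rebuild traffic at 50 MB/s; each 1 MB rebuild read waits for budget.
bucket = TokenBucket(rate_bytes_per_s=50e6, burst_bytes=8e6)
bucket.throttle(1 << 20)
```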

Exten Technologies is part of a batch of newer vendors making their way in the NVMe market.

Apeiron Data Systems offers a handful of NVMe storage products, including enterprise NVMe. Its approach is NVMe over Ethernet, as opposed to NVMe over Fabrics, and was designed to deliver the performance and cost of server-based scale-out with the manageability of enterprise storage.

Vendor Vexata also touts its RAID-protected NVMe and claims ultralow latency at scale. According to its website, the company was founded to provide better performance and efficiency at a lower cost than other flash storage products.

Exten Technologies’ HyperDynamic 3.0 is available now.

CloudKnox Security adds privileged access features to platform

CloudKnox Security, a vendor in identity privilege management, introduced new features to its Cloud Security Platform, including Privilege-on-Demand, Auto-Remediation for Machine Identities and Anomaly Detection.

The new features are intended to increase enterprise protection from identity and resource risks in hybrid cloud environments. According to CloudKnox Security, the release improves on its existing Just Enough Privileges Controller, which enables enterprises to reduce overprovisioned identity privileges to appropriate levels across VMware, AWS, Azure and Google Cloud.

Privileged accounts are often targets for attack, and a successful hacking attempt can result in full control of an organization’s data and assets. The 2019 Verizon Data Breach Investigations Report highlighted privileged account misuse as the top threat for security incidents and the third-leading cause of security breaches.

The Privilege-on-Demand feature from CloudKnox Security enables companies to grant privileges to users for a certain amount of time and on a specific resource on an as-needed basis. The options include Privilege-on-Request, Privilege Self-Grant or Just-in-Time Privilege that give users access to a specific resource within a set time to perform an action.

The Auto-Remediation feature can frequently and automatically dismiss unused privileges of machine identities, according to the vendor. For example, the feature can help with service accounts that perform repetitive tasks with limited privileges; when such accounts are overprovisioned, organizations are particularly vulnerable to privilege misuse.
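
A minimal sketch of the two ideas, time-bound grants and revocation of unused privileges, might look like the following; the field names and the 90-day threshold are invented, not CloudKnox’s API.

```python
# Minimal sketch of the two ideas above: time-bound grants (Privilege-on-
# Demand) and revocation of privileges that sit unused (Auto-Remediation).
# The field names and the 90-day threshold are invented, not CloudKnox's API.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Grant:
    identity: str
    action: str          # e.g. a cloud API action the identity may call
    resource: str
    expires: datetime
    last_used: Optional[datetime] = None

def active_grants(grants: list[Grant], now: datetime) -> list[Grant]:
    """Just-in-time model: a grant simply stops working at its expiry."""
    return [g for g in grants if g.expires > now]

def auto_remediate(grants: list[Grant], now: datetime,
                   unused_for: timedelta = timedelta(days=90)) -> list[Grant]:
    """Keep only privileges an identity has actually exercised recently."""
    return [g for g in grants
            if g.last_used is not None and now - g.last_used < unused_for]
```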

The Anomaly Detection feature creates risk profiles for users and resources based on data obtained by CloudKnox’s Risk Management Module. According to the vendor, the software is designed to detect abnormal user behavior, such as a profile carrying out a high-risk action for the first time on a resource it has never accessed.

The company will demonstrate the new features for the first time at Black Hat USA in Las Vegas this year. CloudKnox’s update to its Cloud Security Platform follows competitor CyberArk‘s recent updates to its own privileged access management offering, including zero-trust access, full visibility and control of privileged activities for customers, biometric authentication and just-in-time provisioning. Other market competitors that promise insider risk reduction, identity governance and privileged access management include BeyondTrust and One Identity.
