Rackspace colocation program hosts users’ legacy servers

Rackspace’s latest service welcomes users’ legacy gear into Rackspace data centers and, once it’s in place, gives the vendor a golden opportunity to sell those customers additional services.

The Rackspace Colocation program primarily targets midsize and larger IT shops that want to launch their first cloud initiative or sidestep the rising costs of operating their own internal data centers. Many of these IT shops have just begun to grapple with the realities of their first digital transformation projects. They must choose where to position key applications, from private clouds to microservices that run on Azure and Google Cloud.

Some Rackspace users run applications on customized hardware and operating systems that public clouds don't support, while others have invested heavily in hardware and want to hold onto those systems for another five years to extract their full value, said Henry Tran, general manager of Rackspace’s managed hosting and colocation business.

Customers that move existing servers into Rackspace’s data centers gain better system performance from closer proximity to Rackspace’s infrastructure. This gives Rackspace a chance to upsell those customers add-on interconnectivity and other higher-margin services.

“[The Rackspace Colocation services program] is a way to get you in the door by handling all the mundane stuff, but longer term they are trying to get you to migrate to their cloud,” said Cassandra Mooshian, senior analyst at Technology Business Research Inc. in Hampton, N.H.

Green light for greenfield colocation services

There are still many enterprise workloads that run in corporate data centers, so there are a lot of greenfield opportunities to pursue in colocation services. Roughly 60% of enterprises don’t use colos today, and the colocation market should grow around 8% annually through 2021, said Dan Thompson, a senior analyst at 451 Research. “There is still a lot of headroom for companies to migrate to colocation and/or cloud,” he said.

Other colocation service providers have expanded with various higher-margin cloud and other managed services, but Rackspace has chosen a different path.

“They’ve had hosting and cloud services for a while but are now moving in the direction of colocation,” 451 Research’s Thompson said. “This speaks loudly to the multi-cloud and hybrid cloud world we are living in.”

Rackspace’s acquisition of Datapipe in late 2017 initiated its march into colocation, giving it the ability to offer Datapipe customers capabilities and services through Microsoft’s Azure Stack, VMware Cloud on AWS and managed services on Google Cloud Platform. In turn, Rackspace gained Datapipe’s colocation services and data centers, along with a market presence on the U.S. West Coast and in Brazil, China and Russia.

Rackspace itself was acquired in late 2016 by private equity firm Apollo Global Management LLC, which gave the company some financial backing and freedom to expand its business.

Chrome site isolation arrives to mitigate Spectre attacks

Version 67 of Google Chrome enabled site isolation by default in an effort to protect users against Spectre-based attacks.

Google has been testing Chrome site isolation since version 63, but has now decided the feature is ready for prime time to help mitigate Spectre attacks. Google described Chrome site isolation as a “large change” to the browser’s architecture “that limits each renderer process to documents from a single site. As a result, Chrome can rely on the operating system to prevent attacks between processes, and thus, between sites.”

“When site isolation is enabled, each renderer process contains documents from at most one site. This means all navigations to cross-site documents cause a tab to switch processes,” Charlie Reis, a software engineer at Google, wrote in a blog post. “It also means all cross-site iframes are put into a different process than their parent frame, using ‘out-of-process iframes.’ Splitting a single page across multiple processes is a major change to how Chrome works, and the Chrome Security team has been pursuing this for several years, independently of Spectre.”
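The process-per-site model Reis describes can be sketched in a few lines of Python. This is a deliberately naive illustration, not Chrome's implementation: real Chrome keys processes on scheme plus registrable domain using the public suffix list, whereas the `site_key` heuristic below just takes the last two hostname labels, and all class and function names are invented for this sketch.

```python
from urllib.parse import urlparse

def site_key(url):
    """Naive 'site' for a URL: the last two labels of the hostname.
    (Chrome actually uses scheme + eTLD+1 via the public suffix list.)"""
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

class RendererPool:
    """Assign each document to a renderer process keyed by its site,
    so no process ever holds documents from more than one site."""
    def __init__(self):
        self.processes = {}  # site -> list of document URLs in that process

    def load(self, url):
        site = site_key(url)
        self.processes.setdefault(site, []).append(url)
        return site

pool = RendererPool()
pool.load("https://mail.example.com/inbox")
pool.load("https://example.com/home")     # same site, reuses the process
pool.load("https://attacker.test/evil")   # different site, new process
```

Under this model, a Spectre-style read from the attacker's renderer can only reach memory in its own process, which never contains the other site's documents.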

This is a major change to the previous multi-process architecture in Chrome in which there were ways to connect to other sites in the same process using iframes or cross-site pop-ups. Reis noted there are still ways an attacker could access cross-site URLs even with Chrome site isolation enabled; he warned developers to ensure “resources are served with the right MIME type and with the nosniff response header,” in order to minimize the risk of data leaks.
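The mitigation Reis recommends can be sketched as a small helper that builds response headers for a static file; the function name and structure are illustrative, but the `X-Content-Type-Options: nosniff` header itself is the real mechanism, telling the browser (and Chrome's cross-origin read blocking) not to second-guess the declared type.

```python
import mimetypes

def response_headers(path):
    """Build headers for serving a static file: an explicit
    Content-Type plus X-Content-Type-Options: nosniff, so the
    browser won't sniff and reinterpret the resource's type."""
    content_type, _ = mimetypes.guess_type(path)
    return {
        "Content-Type": content_type or "application/octet-stream",
        "X-Content-Type-Options": "nosniff",
    }

headers = response_headers("report.json")
```

With a correct type and nosniff in place, Chrome can confidently block a cross-site page from pulling the JSON into its renderer process in the first place.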

A source close to Google described the aim of Chrome site isolation as protecting the most sensitive data: even if new variants of Spectre or other side-channel attacks are discovered and succeed, Chrome will keep the data worth stealing out of reach.

Brandon Czajka, virtual CIO at Switchfast Technologies, said it’s reassuring to see Google “lead the field” by developing new features such as Chrome site isolation.

“Google’s site isolation appears to work as a means of separation. Rather than allowing Chrome to process data for all websites opened under a single renderer, site isolation separates the rendering process to limit a site’s access to user data that may have been entered on other sites (or in other words, increases confidentiality),” Czajka wrote via email. “So, while a user could still fall victim to a Spectre attack, its scope should be more limited to just the malicious site rather than affording it unlimited access.”

Chrome site isolation has been enabled for 99% of users on Windows, Mac, Linux and Chrome OS, according to Google, with Android support still in the works. However, the added protection and increased number of processes will require more system resources.

“Site isolation is a significant change to Chrome’s behavior under the hood, but it generally shouldn’t cause visible changes for most users or web developers (beyond a few known issues),” Reis wrote. “Site isolation does cause Chrome to create more renderer processes, which comes with performance tradeoffs: on the plus side, each renderer process is smaller, shorter-lived, and has less contention internally, but there is about a 10-13% total memory overhead in real workloads due to the larger number of processes.”

Czajka said while performance may be one of the most important aspects for any business, “it is just one piece of the puzzle.”

“While Google’s site isolation may require more memory, and thus may slow browser performance, it is these types of security measures that help to secure the confidentiality and integrity of user data,” Czajka wrote.

Have I Been Pwned integration comes to Firefox and 1Password

Have I Been Pwned has been helping users find out if their data was part of a data breach since 2013, and now the service will be integrated into new products from Mozilla and 1Password.

Troy Hunt, the security expert who created and runs the project, announced the new Have I Been Pwned integration and noted the partnership with Firefox will “significantly expand the audience that can be reached.”

“I’m really happy to see Firefox integrating with HIBP in this fashion, not just to get it in front of as many people as possible, but because I have a great deal of respect for their contributions to the technology community,” Hunt wrote in a blog post. “They’ve also been instrumental in helping define the model which HIBP uses to feed them data without Mozilla disclosing the email addresses being searched for.”

A key feature of both Mozilla’s new Firefox Monitor and 1Password Watchtower is that the Have I Been Pwned integration lets users search without disclosing their email addresses. Hunt said this privacy feature will work in a similar way to the k-anonymity model used by Have I Been Pwned when searching for passwords.

When searching for passwords, Have I Been Pwned matches on the first five characters of a password’s SHA-1 hash. In a data set of 500 million records, each search range returns 477 results on average, so the queried password may or may not be among them, and an attacker cannot determine from the results which password was being queried. For email addresses, Hunt searches on the first six characters of the hash against a database of over 3 billion addresses, but he added that this shouldn’t result in less secure searches.
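The client-side half of this k-anonymity scheme is simple to sketch in Python. The range-endpoint URL in the comment follows HIBP's published Pwned Passwords API; the function name is this sketch's own, and the snippet stays offline, showing only the hash-splitting step that keeps the full hash from ever leaving the client.

```python
import hashlib

def password_range_query(password):
    """Split a password's SHA-1 hash into the 5-character prefix
    that would be sent to the Pwned Passwords range API and the
    suffix that is matched locally against the returned list."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # The client would then fetch every suffix stored under this prefix,
    # e.g. GET https://api.pwnedpasswords.com/range/<prefix>, and check
    # locally whether its own suffix appears in the response.
    return prefix, suffix

prefix, suffix = password_range_query("password")
```

Because hundreds of stored hashes share any given 5-character prefix, the server learns only that the query fell somewhere in that range, never which password was checked.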

“This number [of results in each range] will grow significantly over time; more data breaches means more new email addresses means larger results in the range search. More importantly though, email addresses are far less predictable than passwords; as I mentioned earlier, if I was to spy on searches for Pwned Passwords, the prevalence of passwords in the system beginning with that hash can indicate the likelihood of what was searched for,” Hunt wrote. “But when we’re talking about email addresses, there’s no such indicator; certainly the number of breaches each has been exposed in divulges nothing in terms of which one is likely being searched for.”

Have I Been Pwned integration

Mozilla has built Have I Been Pwned integration into its Firefox Monitor tool, which will begin as an invitation-only service. Mozilla plans to invite an initial group of 250,000 people to test the feature on the web beginning next week and do a wider release later on.

1Password will include Have I Been Pwned integration in its Watchtower tool as part of the Breach Report feature. The Breach Report will let users know where an account with a user’s email address may have been compromised; show a list of websites where an item saved in 1Password might have been compromised; and show a list of breaches where a 1Password item was found, but the user has already changed the compromised data.

Currently, 1Password Watchtower is only available on the web, but 1Password expects to eventually add the service to all of its apps.

New WPA3 security protocol simplifies logins, secures IoT

Securing Wi-Fi access has long been an Achilles’ heel for users of wireless networks — especially for users of public networks, as well as for internet of things devices — but help is on the way.

Wi-Fi Alliance, the nonprofit industry group that promotes use of Wi-Fi, has begun certifying products supporting the latest version of its security protocol, the Wi-Fi Protected Access (WPA) specification, WPA3.

The new WPA3 security protocol is intended to simplify wireless authentication, especially for IoT devices, while at the same time improving security through the inclusion of new features and removal of legacy cryptographic and security protocols.

The WPA3 security protocol, announced in January, gives enterprises and individuals a better option for securing access to Wi-Fi networks. Support for WPA2 continues to be mandatory for all products in the Wi-Fi Alliance’s “Wi-Fi Certified” program, but the new WPA3 security protocol adds new capabilities for improved security, including stronger encryption and a more secure handshake.

In its press release, Wi-Fi Alliance wrote that the WPA3 security protocol “adds new features to simplify Wi-Fi security, enable more robust authentication, and deliver increased cryptographic strength for highly sensitive data markets.”

The new specification defines both an enterprise and an individual option. WPA3-Enterprise offers the “equivalent of 192-bit cryptographic strength” to protect networks that transmit sensitive data. WPA3-Personal offers password-based authentication that is more resilient against attacks, even when “users choose passwords that fall short of typical complexity recommendations,” by using a secure key establishment protocol, Simultaneous Authentication of Equals (SAE), to protect against malicious actors trying to guess passwords.

Wi-Fi Alliance also rolled out Wi-Fi Certified Easy Connect, an initiative for simplifying the secure initialization and configuration of wireless internet of things devices that have little or no display interfaces. The new program permits users to add devices to Wi-Fi networks with a different device — like a smartphone — that can scan a product quick response (QR) code.

Support for the new protocol will be made available as vendors begin incorporating it into their products. Wi-Fi Alliance members that plan to support the WPA3 security protocol include Cisco, Broadcom, Huawei Wireless, Intel and Qualcomm. A Wi-Fi Alliance spokesperson said by email that “Wi-Fi Alliance expects broad industry adoption of WPA3 by late 2019 in conjunction with the next generation of Wi-Fi based on 802.11ax standard.”

Tableau acquisition of MIT AI startup aims at smarter BI software

Tableau Software has acquired AI startup Empirical Systems in a bid to give users of its self-service BI platform more insight into their data. The Tableau acquisition, announced today, adds an AI-driven engine that’s designed to automate the data modeling process without requiring the involvement of skilled statisticians.

Based in Cambridge, Mass., Empirical Systems started as a spinoff from the MIT Probabilistic Computing Project. The startup claims its analytics engine and data platform are able to automatically model data for analysis and then provide interactive and predictive insights into that data.

The technology is still in beta, and Francois Ajenstat, Tableau’s chief product officer, wouldn’t say how many customers are using it as part of the beta program. But he said the current use cases are broad and include companies in retail, manufacturing, healthcare and financial services. That wide applicability is part of the reason why the Tableau acquisition happened, he noted.

Catch-up effort with advanced technology

In some ways, however, the Tableau acquisition is a “catch-up play” on providing automated insight-generation capabilities, said Jen Underwood, founder of Impact Analytix LLC, a product research and consulting firm in Tampa. Some other BI and analytics vendors “already have some of this,” Underwood said, citing Datorama and Tibco as examples.

Empirical’s automated modeling and statistical analysis tools could put Tableau ahead of its rivals, she said, but it’s too soon to tell without having more details on the integration plans. Nonetheless, she said she thinks the technology will be a useful addition for Tableau users.

“People will like it,” she said. “It will make advanced analytics easier for the masses.”

Tableau already has been investing in AI and machine learning technologies internally. In April, the company released its Tableau Prep data preparation software, with embedded fuzzy clustering algorithms that employ AI to help users group data sets together. Before that, Tableau last year released a recommendation engine that shows users recommended data sources for analytics applications. The feature is similar to how Netflix suggests movies and TV shows based on what a user has previously watched, Ajenstat explained.

Integration plans still unclear

Ajenstat wouldn’t comment on when the Tableau acquisition will result in Empirical’s software becoming available in Tableau’s platform, or whether customers will have to pay extra for the technology.

Empirical CEO Richard Tibbetts on its automated data modeling technology.

“Whether it’s an add-on or how it’s integrated, it’s too soon to talk about that,” he said.

However, he added that the Empirical engine will likely be “a foundational element” in Tableau, at least partially running behind the scenes, with a goal that “a lot of different things in Tableau will get smarter.”

Unlike some predictive algorithms that require large stores of data to function properly, Empirical’s software works with “data of all sizes, both large and small,” Ajenstat said. When integration does eventually begin to happen, Ajenstat said Tableau hopes to be able to better help users identify trends and outliers in data sets and point them toward factors they could drill into more quickly.

Augmented analytics trending

Tableau’s move around augmented analytics is in line with what Gartner pointed to as a key emerging technology in its 2018 Magic Quadrant report on BI and analytics platforms.

Various vendors are embedding machine learning tools into their software to aid with data preparation and modeling and with insight generation, according to Gartner. The consulting and market research firm said the augmented approach “has the potential to help users find the most important insights more quickly, particularly as data complexity grows.”

Such capabilities have yet to become mainstream product requirements for BI software buyers, Gartner said in the February 2018 report. But they are “a proof point for customers that vendors are innovating at a rapid pace,” it added.

The eight-person team from Empirical Systems will continue to work on the software after the Tableau acquisition. Tableau, which didn’t disclose the purchase price, also plans to create a research and development center in Cambridge.

Senior executive editor Craig Stedman contributed to this story.

Research claims ‘widespread’ Google Groups misconfiguration troubles

A new report claims a significant number of G Suite users have misconfigured Google Groups settings and exposed sensitive data, but the research leaves unanswered questions about the extent of the issue.

According to Kenna Security research, there is a “widespread” Google Groups misconfiguration problem wherein Groups are set to public and are exposing potentially sensitive email data that could lead to “spearphishing, account takeover, and a wide variety of case-specific fraud and abuse.” Last year, Redlock Cloud Security Intelligence also found Google Groups misconfiguration responsible for exposure of data from hundreds of accounts.

Kenna said it sampled 2.5 million top-level domains and found 9,637 public Google Groups. Of those public Groups, the researchers sampled 171 and determined 31% of those organizations “are currently leaking some form of sensitive email” with a confidence level of 90%.

“Extrapolating from the original sample, it’s reasonable to assume that in total, over 10,000 organizations are currently inadvertently exposing sensitive information,” Kenna wrote in its blog post. “The affected organizations including Fortune 500 organizations; Hospitals; Universities and Colleges; Newspapers and Television stations; Financial Organizations; and even U.S. government agencies.”
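Kenna's published figures make the sampling arithmetic easy to check. The calculation below uses only the numbers from the report (9,637 public Groups found across 2.5 million sampled domains, 171 inspected, 31% leaking); the final step up to the firm's "over 10,000" figure also involves scaling beyond the sampled domains, which isn't reproduced here.

```python
# Figures from Kenna Security's report
public_groups_found = 9_637   # public Groups across 2.5M sampled domains
sample_size = 171             # public Groups manually inspected
leak_rate = 0.31              # share of the sample leaking sensitive email

# Leaking Groups actually observed in the inspected sample
leaking_in_sample = round(sample_size * leak_rate)

# Applying the sample's leak rate to every public Group found
estimated_leaking = round(public_groups_found * leak_rate)
```

That yields roughly 3,000 leaking organizations among the Groups Kenna actually found; the jump to "over 10,000" rests on the further assumption that the 2.5 million sampled domains are a small slice of all G Suite-enabled domains.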

For context, there are currently more than 3 million paid G Suite accounts and an unknown number of free G Suite accounts, and Kenna acknowledged via email that they “do not believe [they] tested the vast majority of G Suite enabled domains.” Additionally, Google confirmed that Groups are set to private by default and an administrator would need to actively choose to make a Group public or allow other users to create public Groups.

It is unclear how many G Suite accounts are set to public, but a source close to the situation said the vast majority of Google Groups are set to private, and Google has sent out messages to users who may be affected with instructions on how to fix the Google Groups misconfiguration.

Specifics versus extrapolation         

Kenna Security’s research likened the Google Groups misconfiguration issue to the recent spate of Amazon Web Services (AWS) exposures where S3 buckets were accidentally left public.

“Ultimately, each organization is responsible for the configuration of their systems. However, there are steps that can be taken to ensure organizations can easily understand the public/private state for something as critical as internal email,” a Kenna spokesperson wrote via email. “For example, when the AWS buckets leak occurred, AWS changed its UX, exposing a ‘Public’ badge on buckets and communicated proactively to owners of public buckets. In practice, public Google Group configurations require less effort to find than public S3 buckets, and often have more sensitive information exposed, due to the nature of email.”

However, a major difference between the research from Kenna and that done by UpGuard in uncovering multiple public AWS buckets is in the details. Kenna is extrapolating from a sample to claim approximately 10,000 of the more than 3 million paid G Suite accounts (roughly 0.3%) have misconfigured Groups, and its examples of exposed emails show the potential for spearphishing attacks or fraud.

On the other hand, UpGuard specifically attributed the exposed data it found, including Republican National Committee voter rolls for 200 million individuals, info on 14 million Verizon customers, data scraped from LinkedIn and Facebook, and NSA files detailing military projects.  

Alex Calic, chief strategy and revenue officer of The Media Trust, said Google “made the right call by making private the default setting.”

“At the end of the day, companies are responsible for collaborating with their digital partners/vendors on improving and maintaining their security posture,” Calic wrote via email. “This requires developing and sharing their policies on what information can be shared on workplace communication tools like Google Groups and who can access that information, keeping in mind that — given how sophisticated hackers are becoming and the ever-present insider threat, whether an attack or negligence — there is always some risk that the information will see the light of day.”

BI vendors aim to ease visual data analysis by business users

As analytics teams look to make it easier for business users to analyze and visualize data on their own, more BI vendors are adding visual data analysis capabilities designed to meet those needs — and new software releases from GoodData and Periscope Data provide fresh evidence.

GoodData last week launched a UI design framework called Spectrum for use with its embedded BI and analytics platform. The additions include a JavaScript library that BI developers can use to build custom applications with built-in data visualizations, plus functionality for creating alert-driven dashboards and enabling end users to visually explore data sets prepared for them by data analysts.

A week earlier, Periscope Data also rolled out new features that let business users analyze curated data sets in its data modeling and analytics platform. Instead of looking at static charts and dashboards, users can now tap a visual query editor with a drag-and-drop interface to create their own data visualizations without having to do any SQL coding.

Visual data analysis has been a trending topic for years, with self-service BI tools like Tableau, Qlik Sense and Microsoft’s Power BI setting the pace. Gartner again listed interactive visual exploration as one of the core capabilities of modern BI and analytics platforms in its 2018 Magic Quadrant report on them, published in February. However, making the use of BI software in business operations more pervasive — and more user-driven — remains a challenge in many organizations.

Screenshot of sample application built with GoodData Spectrum
GoodData’s Spectrum technology adds new capabilities for embedding BI functionality and data visualizations in applications.

Room to grow on visual data exploration

For example, Fivestars, which runs a customer loyalty program for local businesses, has 10 Tableau users — but they’re data analysts who do “heavy-lifting analytics” with the software, said Matt Curl, the San Francisco company’s vice president of business operations. Periscope Data’s platform offered less of a learning curve for business users as a data visualization and dashboard tool, Curl said. On the other hand, end-user data exploration in it has largely been limited to people with SQL skills, he added.

Curl, who manages data infrastructure and analytics at Fivestars, said he sees “a lot of upside potential” for using Periscope’s new capabilities to expand visual data discovery and analysis by end users. As a trial, he built a table that holds data on the use of the Fivestars program by ZIP code; business managers can drag and drop the different data elements they want to examine into a chart template.

That gives the end users more control over the analytics process than they get in a set of dashboards Curl’s team built to track key business metrics and company initiatives. “It lets them look at what they want to look at when they want to look at it,” without having to call the data analysts for help or deal with “a wall of text in SQL,” he said.

Fivestars runs the Periscope Data platform on top of a repository from Treasure Data that pulls together data from 20 or so source systems for analysis. Curl is still assessing the new visual data analysis technology, but he expects to use it in real applications at the company, which is formally known as Five Stars Loyalty Inc. “The early signals are that people like it,” he said.

Periscope Data CEO Harry Glaser said the new features are meant to give business users “a sort of mini-Tableau” to explore data sets curated for them by data management and analytics teams. From a competitive standpoint, he added, they’re aimed at putting the San Francisco-based vendor on an equal footing for end-user analysis and visualization with Looker Data Sciences, its closest rival.

Screenshot of Periscope Data's visual data discovery tool
Business users can now drag and drop elements to create charts and other visualizations in Periscope Data’s analytics platform.

Not just a job for data scientists

GoodData’s Spectrum framework is similarly designed to enable users of its embedded BI software to do ad hoc visual data analysis and track key performance indicators (KPIs) in interactive dashboards. That’s in addition to GoodData.UI, the new JavaScript library based on React, an open source library created by Facebook. It gives developers access to thousands of visual components that can be incorporated into applications; they can also use D3.js and other visual libraries to design custom visualizations.

GoodData, also based in San Francisco, hopes Spectrum’s UI design features will grease the skids for broader use of its software by business managers and workers who aren’t particularly data savvy. “Not everyone is cut out to be a data scientist, or even a citizen data scientist,” CEO Roman Stanek said.

Those are precisely the kinds of users Zalando SE wants to reach with an analytics service that the Berlin-based online retailer is testing with about 185 product companies. Zalando, which sells shoes, clothing and beauty products in 15 European countries, has embedded GoodData’s software into the Consumer Insights service’s web portal to visualize data on sales and customer buying behavior for brand managers, marketers and other outside users.

Initially, the GoodData platform offered “a really quick way for us to visualize sort of standard information,” said Cody Alton, product manager for the service. Zalando began using GoodData.UI as a beta tester early this year, and the JavaScript library makes it easier to visualize data in different ways, according to Alton. He said that’s an important capability for his team, which is looking to make data visualizations “more playful and interactive” to better engage users of the service.

“We want to represent data in a way that isn’t complicated and seems a little fun,” Alton said. For example, that might mean “calling out one number they’re interested in, instead of listing 150,” he added. Doing so could also help Zalando justify the cost of the analytics service, which will be offered in two versions: a free one that tracks a few basic KPIs, and a paid one with a much broader set of metrics.

Microsoft Teams updates streamline search, sharing features

Microsoft has rolled out updates to Teams that improve how users share information and access apps within the service. The Microsoft Teams updates range from embedding content in conversations to handling new app integrations.

Users can now share content from apps integrated with Teams by directly embedding the information into the conversation, rather than by hyperlinking or taking a screenshot. Users can search for specific information in an app, such as Trello or YouTube, and include an interactive card with the information in a chat.

The Microsoft Teams updates also include a personal view of user apps within Teams that allows users to access the tasks, issues or requests they have been assigned. Users can view recently accessed items, such as OneNote notebooks or videos from Microsoft Stream. Microsoft also added a new app, called Who, which uses capabilities from Microsoft Office Graph to let users search for people within their organization by name or topic.

Microsoft has also integrated its automated workflow service, Flow, with Teams. The integration allows users to create and manage workflows, review approval requests and launch workflows from within Teams.

Constellation Research analyst Alan Lepofsky said in a blog post that the Microsoft Teams updates are good enhancements, but some features, such as the commands for searching and sharing information, will take some getting used to.

However, the updates could lead to increased user adoption of Teams. “The more integration, the more seamless experience, the more time you spend inside these programs,” he said.

The latest Microsoft Teams updates make sharing content easier.
A new Microsoft Teams feature allows users to embed information directly in a conversation.

Verizon and Ribbon partner for SBCs as a service

Verizon has introduced session border controllers (SBCs) as a service to help its customers secure their real-time communications.

The offering, a partnership between Verizon and Ribbon Communications, is a fully managed service that can be deployed either stand-alone or service-chained with other virtual network functions hosted by Verizon’s network service data centers. Individual virtual SBC instances are created for each customer; new SBCs can be spun up as needed to meet spikes in demand.

Virtual SBCs offer benefits for organizations by easing capital and operational expenses, allowing licensing-based subscription models and enabling integrated analytics and enterprise orchestration capabilities.

Ribbon, which was created following the October 2017 merger between Genband and Sonus, is the first vendor to offer virtual SBCs through Verizon’s virtual network services platform.

Midmarket, enterprise cloud adoption drives 8×8 revenues

Cloud communications service provider 8×8 Inc. has reported revenue growth driven by demand for cloud services in midmarket and enterprise organizations.

Service revenue reached nearly $72 million, a 20% year-over-year increase. Service revenue for midmarket and enterprise customers represented 59% of total services revenue, according to 8×8’s fiscal third quarter earnings report.

New monthly recurring revenue from midmarket and enterprise customers increased 70% year over year and comprised 65% of total bookings in the quarter. Nearly half of new monthly recurring revenue from midmarket and enterprise customers came from organizations purchasing integrated unified-communications-as-a-service and contact-center-as-a-service offerings.

Vik Verma, CEO of 8×8, attributed the spike in revenue to demands by midmarket and enterprise CIOs for integrated enterprise communication services for their employees, customers and partners. He said midmarket and enterprise bookings grew 40% year over year.