Tag Archives: Google

Google location tracking continues even when turned off

Turning off Google location tracking may not be as simple as changing one setting to “off,” according to new research.

The unexpected Google location tracking behavior on Android and iOS devices was revealed by an Associated Press (AP) investigation and confirmed by computer science researchers at Princeton University. The issue was first raised in a blog post by K. Shankari, a graduate researcher at UC Berkeley, in May 2018. Shankari kept note of prompts sent by Google to rate places or submit pictures to Google Maps, even though Google Location History was turned off on her device.

The AP investigation found that even with Google location tracking turned off, certain apps will take a timestamped snapshot of the user’s location and store that data when the user performs a search, opens Google Maps, or checks the weather.

The confusion stems from the different ways users have to control Google location tracking services. The Google Location History support page claims, “With Location History off, the places you go are no longer stored.” However, when turning off the Location History setting via a user’s Google My Activity page, a pop-up notes, “This setting does not affect other location services on your device, like Google Location Services and Find My Device. Some location data may be saved as part of your activity on other Google services, like Search and Maps.”

Turning off Google Location Services on a mobile device can cause apps to misbehave, so Google told the AP that the real fix for users would be to also turn off location tracking in Google’s “Web and App Activity” settings.

“Location History is a Google product that is entirely opt in, and users have the controls to edit, delete, or turn it off at any time. As the story notes, we make sure Location History users know that when they disable the product, we continue to use location to improve the Google experience when they do things like perform a Google search or use Google for driving directions,” a Google spokesperson wrote in an email.

Tim Mackey, technology evangelist at Synopsys, said this was an issue akin to saying “if my mother can’t figure out what it does, or how to turn it off, it’s too complicated.”

“The expectation of the consumer for an off switch is what matters most. Users who wish their location to be kept private indicate this preference through the Location History setting. That any given application might have independent settings for location-related data is how an application developer or vendor approaches the problem,” Mackey wrote via email. “When we recognize that our digital footprint is effectively a personally identifying attribute, access to that attribute becomes more valuable. This is true for malicious actors who can use location information to determine not only patterns of behavior for their targets, but know when to best commit their crime. This is also true for law enforcement seeking to identify suspects following the commission of a crime. In each of these examples, the same location and identity data can be used for good or for ill to identify an individual.”

Cisco and Google deepen collaboration partnership

Developers from Cisco and Google have been working together to build native integrations between the Cisco Webex web conferencing and team collaboration platform and the productivity apps in G Suite. The partnership should help both vendors compete against rival Microsoft.

G Suite users will soon be able to schedule and join Webex meetings from Google Calendar with one click. The integration works on Cisco video conferencing hardware and within Google Chrome, without requiring downloads or guest accounts.

Cisco Webex Teams, which competes with Slack and Microsoft Teams, lets users collaborate in real time using Google Docs, Sheets and Slides — the G Suite equivalents of Microsoft Word, Excel and PowerPoint. That eliminates the need for users to upload a revised version of a file to Webex Teams after every edit.

Developers, meanwhile, can add Cisco calling and meeting capabilities to Android apps using the Webex Teams Android software development kit. For example, a pair of augmented reality smart glasses could be connected to Webex, so the wearer can stream a video feed from the device into a web conference.

“While Google and Microsoft compete with full portfolios of personal productivity and team collaboration, Cisco only has the team collaboration elements,” said Alan Lepofsky, principal analyst at Constellation Research, based in Cupertino, Calif. “So, deeper integration between Webex and Gmail, Google Calendar and Google Drive makes a lot of sense.”

Cisco and Google plan future integrations  

While Cisco’s collaboration portfolio also integrates with the productivity tools of Microsoft Office 365, those links are based on Microsoft’s public APIs. In contrast, Cisco and Google have been working directly together to create seamless connections between their portfolios.

Public APIs “tend to be semi-limiting at times,” said Sri Srinivasan, vice president and general manager of Cisco’s team collaboration group. “With Google, it looks like one between Google and Webex.”

Cisco and Google are currently exploring ways to use Google’s AI technology within Webex for tasks such as transcription, translation, meeting notes and project management. Cisco was also one of several vendors to adopt Google’s new AI platform for contact centers last month.

Amy Chang, who replaced Rowan Trollope as head of Cisco’s $5 billion collaboration technology group in May, worked at Google for seven years before founding relationship intelligence firm Accompany.

“For Cisco, it certainly makes a great deal of sense to expand these partnerships to improve their ability to compete with Microsoft,” said Irwin Lazar, analyst at Nemertes Research, based in Mokena, Ill. “Same for Google, who lacks the broad UC [unified communications] suite, video conferencing and contact center offerings that Cisco provides.”

Google expands presence in enterprise market

Google is working on additional Google Calendar integrations with the web conferencing vendors Arkadin, GoToMeeting, LogMeIn, Dialpad, RingCentral, Vidyo and Vonage. Google also recently made its online meetings platform, Hangouts Meet, interoperable with Microsoft Skype for Business and hardware from Cisco and Polycom.

At the same time that Google is tightening partnerships with Cisco and other vendors, the consumer tech giant has been building out its collaboration portfolio with products such as Hangouts Meet and Hangouts Chat, a team collaboration app.

Last month, Google announced a beta program for a new enterprise telephony product based on WebRTC, called Google Voice for G Suite. If that platform proves successful, Google will be in a position to provide all of the core unified communications technologies that businesses require: voice, messaging and web conferencing.

It was surprising to see Google launch a stand-alone voice product, rather than position Google Voice as a virtual service within G Suite that could tie into existing enterprise telephony systems, Lazar said. The move could bring Google into closer competition with Cisco, a leading provider of business telephony.

Google’s Edge TPU breaks model inferencing out of the cloud

Google is bringing tensor processing units to the edge. At the Google Cloud Next conference in San Francisco, the company introduced Edge TPU, an application-specific integrated circuit designed to run TensorFlow Lite machine learning models on mobile and embedded devices.  

The announcement is indicative of both the red-hot AI hardware market and the growing influence machine learning is having on the internet of things and wireless devices. But the Edge TPU also gives Google a more comprehensive edge-to-cloud AI stack to compete against the likes of Microsoft, Amazon and IBM as it looks to attract a new generation of application developers.

Analysts called the move a good one. “This fills in a big gap that Google had,” said Forrester’s Mike Gualtieri.

Spotlight on model inferencing

Google’s cloud environment is a fertile ground for training AI models, a nontrivial process that requires enormous amounts of data and processing power. Once a model has been trained, it’s put into production, where it performs what’s known as inferencing: using its training to make predictions.


A growing trend is to push inferencing out to edge devices such as wireless thermostats or smart parking meters that don’t need a lot of power or even connectivity to the cloud, according to David Schatsky, managing director at Deloitte LLP. “These applications will avoid the latency that can be present when shuttling data back and forth to the cloud because they’ll be able to perform inferencing locally and on the device,” he said.

But Google customers who wanted to embed their models into edge devices had to turn to another provider — Nvidia or Intel — for that kind of functionality. Until now. The Edge TPU will give Google customers a more seamless environment to train machine learning models in its cloud and then deploy them into production at the edge.
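The deploy-to-the-edge half of that flow runs on TensorFlow Lite, the model format the Edge TPU executes. The following is a minimal, hedged sketch of on-device inferencing with an already converted model; the model file, the fake input data and the use of plain TensorFlow Lite rather than an Edge TPU-compiled model are illustrative assumptions, not Google sample code.

import numpy as np
import tensorflow as tf

# Load a model that was trained in the cloud and converted to TensorFlow Lite.
# On Edge TPU hardware, the model would additionally be compiled for the chip
# and loaded through an Edge TPU runtime or delegate.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A stand-in for a local sensor reading (e.g., from a smart thermostat).
sample = np.random.random_sample(tuple(input_details[0]["shape"])).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()  # inference happens locally; no round trip to the cloud
prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)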


It also appears to be a nod to the burgeoning relationship between AI and IoT. According to Schatsky, venture capital funding in AI-focused IoT startups outpaced funding to IoT startups overall last year. “AI is so useful in deriving insight from IoT data that it may soon become rare to find an IoT application that doesn’t use AI,” he said.

A competitive stack

They’re not just saying this is a TPU and you can run it on the edge. No, they’re saying this is a fundamentally new chip designed specifically for inferencing.
Mike Gualtieri, analyst, Forrester

The Edge TPU is in the same vein as an announcement Microsoft made last year with Project Brainwave, a deep learning platform that converts trained models to run more efficiently on Intel’s Field-Programmable Gate Arrays than on GPUs, according to Gualtieri. “There is a fundamental difference in training a model versus inferencing a model,” he said. “Google recognizes this. They’re not just saying this is a TPU and you can run it on the edge. No, they’re saying this is a fundamentally new chip designed specifically for inferencing.”

Indeed, Gualtieri said, the Edge TPU makes Google more competitive with Microsoft, Amazon and even IBM, all of which made moves to differentiate between model training and model inferencing sooner. “This is an effort, I believe, for Google to make its cloud more attractive, oddly by saying, well, yes, we have the cloud, but we also have the edge — the non-cloud,” he said.

James Kobielus, lead analyst at SiliconAngle Wikibon, also sees the Edge TPU as a strategic move. He called the Edge TPU an example of how the internet giant is creating a complete AI stack of hardware, software and tools for its customers while adding a force multiplier to compete against other vendors in the space.


“Google is making a strong play to build a comprehensive application development and services environment in the cloud to reach out to partners, developers and so forth to give them the tools they need to build the new generation of apps,” he said.

Kobielus highlighted the introduction of the Edge TPU software development kit as another example of how Google is planning to compete. The dev kit, which is still in beta and available to only those who apply for access, shows a “great effort” to convince developers to build their apps on the Google cloud and to catch up to Amazon and Microsoft, both of which have a strong developer orientation, he said. “They needed to do this — to reach out to the developer market now while the iron is hot,” he said.

What is the Google AI stack missing? It’s too soon to tell, both Kobielus and Gualtieri said. But with innovation in AI happening at breakneck speed, companies should see this as a part of an evolution and not an end point.

“Different applications are going to require even different chips,” Gualtieri said. “Google is not behind on this. It’s just what’s going to happen because there may be very data-heavy applications or power requirements on smaller devices. So I would expect a whole bunch of different chips to come out. Is that a gap? I would say no because of maturity in this industry.”

HYCU moves beyond Nutanix backup with Google Cloud support

HYCU is now well-versed in the Google cloud.

HYCU, which began with a backup application built specifically for hyper-converged vendor Nutanix, today launched a service to back up data stored on Google Cloud Platform (GCP).

HYCU sprang up in June 2017 as an application sold by Comtrade that offered native support for Nutanix AHV and for Nutanix customers using VMware ESX hypervisors. The Comtrade Group spun off HYCU into a separate company in March 2018, with both the new company and the product under the HYCU brand.

Today, HYCU for Google became available through the GCP Marketplace. It is an independent product from HYCU for Nutanix, but there is a Nutanix angle to the Google backup: GCP is Nutanix’s partner for the hyper-converged vendor’s Xi public cloud services. HYCU CEO Simon Taylor said his team began working on Google backup around the time Nutanix revealed plans for Xi in mid-2017. HYCU beat Nutanix out of the gate, launching its Google service before any Nutanix Xi Cloud Services became generally available.

“We believe Nutanix is the future of the data center, and we place our bets on them,” Taylor said. “Everyone’s been asking us, ‘Beyond Nutanix, where do you go from here?’ We started thinking of the concept of multi-cloud. We see people running fixed workloads on-prem, and if it’s dynamic, they’ll probably put it on a public cloud. And Google is the public cloud that’s near and dear to Nutanix’s heart.”

HYCU backup for GCP is integrated into Google Cloud Identity and Access Management, installs without agents and backs up data to Google Buckets. The HYCU service uses native GCP snapshots for backup and recovery.
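HYCU has not published its implementation, but the native primitive it builds on is the standard Compute Engine disk snapshot. As a rough sketch of what triggering one of those snapshots looks like through Google's Python API client (the project, zone, disk and snapshot names are hypothetical, and this is illustrative rather than HYCU's actual code):

from googleapiclient import discovery

# Uses Application Default Credentials for authentication.
compute = discovery.build("compute", "v1")

# Ask Compute Engine to snapshot a persistent disk in place.
operation = compute.disks().createSnapshot(
    project="my-project",
    zone="us-east1-b",
    disk="my-data-disk",
    body={"name": "my-data-disk-snapshot-001"},
).execute()

print(operation["status"])  # the snapshot is created asynchronously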

Subbiah Sundaram, vice president of products at HYCU, based in Boston, said HYCU provides application- and clone-consistent backups, and it allows single-file recovery. Sundaram said because HYCU takes control of the snapshot, data transfers do not affect production systems.

Sundaram said HYCU for GCP was built for Google admins, rather than typical backup admins.

“When customers use the cloud, they think of it as buying a service, not running software. And that’s the experience we want them to have,” Sundaram said. “It’s completely managed by us. We create and provision the backup targets on Google and manage it for you.”

HYCU for GCP uses only GCP to stage backups, backing up data in different Google Cloud zones. Sundaram said HYCU may add support for backing up to other clouds or to on-premises targets in future releases, but the most common request from customers so far is to back up to other GCP zones.

HYCU charges for data protected, rather than total storage allocated for backup. For example, a customer allocating 100 GB to a virtual machine with 20 GB of data protected is charged for the 20 GB. List price for 100 GB of consumed virtual machine capacity starts at $12 per month, or 12 cents per gigabyte, for data backed up every 24 hours. The cost increases for more frequent backups. Customers are billed through the GCP Marketplace.
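Put another way, billing tracks the protected footprint rather than the allocated disk. A quick back-of-the-envelope check of the quoted example at the 24-hour backup tier (list price as reported above):

price_per_gb_month = 12.00 / 100   # $12 per 100 GB of protected data per month

allocated_gb = 100                 # disk capacity given to the VM (not billed)
protected_gb = 20                  # data actually protected (billed)

monthly_cost = protected_gb * price_per_gb_month
print(f"${monthly_cost:.2f} per month")  # $2.40 per month at daily backups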

Industry analysts pointed out HYCU is a brand-new company in name only. Its Comtrade legacy gives HYCU 20-plus years of experience in data protection and monitoring, over 1,000 customers and hundreds of engineers. That can allow it to move faster than typical startups.

“They’re a startup that already has a ton of experience,” said Christophe Bertrand, senior data protection analyst for Enterprise Strategy Group in Milford, Mass. “When you’re a small organization, you have to make strategic calls on what to do next. So, now, they’re getting into Google Cloud, which is evolving to be more enterprise-friendly. Clearly, backup and recovery is one of the functions you need to get right for the enterprise. Combined with the way Nutanix supports Google, it’s a smart move for HYCU.”

Steven Hill, senior storage analyst for 451 Research, agreed that GCP support was a logical step for Nutanix-friendly HYCU.

We believe Nutanix is the future of the data center, and we place our bets on them.
Simon Taylor, CEO, HYCU

“Nutanix partnering with Google is a good hybrid cloud play. So, theoretically, what you’re running on-prem runs exactly the same once it’s on Google Cloud,” he said. “HYCU comes in and says, ‘We can do data protection and backup and workload protection that just fits seamlessly in with all of this. Whether you’re on Google Cloud or whether you’re on Nutanix via AHV or Nutanix via ESX, it’s all the same to us.'”

Taylor said HYCU is positioned well for when Nutanix makes Xi available. Nutanix has said its first Xi Cloud Service will be disaster recovery. “We will be delivering for Xi,” Taylor said. “You can imagine that will require a much closer bridge between these two different products. Once Xi is available, we’ll be fast on their heels with a product that will support both purpose-built backup for Nutanix and purpose-built recovery for Xi in a highly integrated fashion.”

Although HYCU for GCP can protect data for non-Nutanix users, Taylor said HYCU remains as dedicated to building a business around Nutanix as ever. He emphasized that HYCU develops its software independently from Nutanix and Google, although he is determined to have a good working relationship with both.

“We believe data protection should be an extension of the platform it serves, not a stand-alone platform,” he said.

Still, Taylor flatly denied his goal is for HYCU to become part of Nutanix.

“Right now, we want to build a brand,” he said. “This is about building a business that matters, not about a quick exit.”

For Sale – Google Pixelbook (Chromebook) i5/128GB/8GB – less than 3 months old!

Folks,

Mint condition Google Pixelbook i5 128GB/8GB Chromebook for sale. Purchased on 14th May 2018 and will be supplied with a copy of the Google Store receipt for warranty purposes.

In full working order, a cracking piece of kit!

Price and currency: £610
Delivery: Delivery cost is included within my country
Payment method: Bank transfer, PPG or cash on collection
Location: London
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference


BigQuery ML moves machine learning into Google BigQuery

Google has released a beta version of BigQuery ML, new software that lets users build some machine learning models inside the Google BigQuery cloud data warehouse with standard SQL commands.

BigQuery ML eliminates the need to move data sets from Google BigQuery to a separate tool to develop and train analytical models. Its SQL support also opens the machine learning process to SQL-savvy data analysts who might not be versed in more-advanced languages like R, Python and Scala that data scientists typically use to build machine learning models.

However, the new technology is limited in what it can do. Google said BigQuery ML initially supports only two types of models: linear regression ones that predict numerical values, such as sales forecasts, and binary logistic regression models that can be used to do two-group customer segmentation, identify email as spam and do other relatively simple classifications in data sets.

In addition, BigQuery ML is based on the standard batch variant of the gradient descent methodology that drives machine learning algorithms instead of the so-called stochastic version.

The stochastic approach “is far more common in today’s large-scale machine learning systems,” Google acknowledged in a blog post about BigQuery ML. The company added, though, that the batch variant “has numerous practical advantages” on the performance, stability and tuning of machine learning models.

Broadening the machine learning user base

BigQuery ML likely won’t convince many data scientists who analyze data stored in BigQuery to change how they build models, said Daniel Mintz, chief data evangelist at software vendor Looker Data Sciences Inc., which has teamed up with Google to enable its data modeling and analytics platform to function as a front-end tool for BigQuery ML users.

“Professional data scientists, the people who do this all the time, are going to continue to use the tools they’re most comfortable with,” Mintz said.

But, Mintz added, BigQuery ML makes it feasible for the hordes of data analysts “who know SQL but haven’t done much with machine learning yet” to start developing models without having to learn new languages or deploy additional analytics tools.

We have a lean team of data scientists, and it can get a bit challenging to support all of the [marketing] campaigns in all possible cases.
Miguel Angel Campo-Rembado, senior vice president of data science and analytics, 20th Century Fox

And, in some cases, busy data scientists may be able to speed up the model-building process to better support business needs for information by using BigQuery ML.

For example, film studio 20th Century Fox is an early user of the technology. In a keynote session at the Google Cloud Next ’18 conference in San Francisco that was streamed online, Miguel Angel Campo-Rembado, the studio’s senior vice president of data science and analytics, said its marketing team needs analytics input on a continual basis to assess advertising and promotional campaigns for movies.

“But we have a lean team of data scientists, and it can get a bit challenging to support all of the campaigns in all possible cases,” Campo-Rembado said.

Less of a machine learning maze to run

With BigQuery ML, Campo-Rembado added, his team was able to build a linear regression model in just 30 seconds to analyze movie trailers to help pinpoint audiences that should be targeted in promoting the latest Maze Runner movie released in January. All it took was adding a CREATE MODEL statement in BigQuery ML to an existing SQL query for audience analysis, he said.

That, and the ability to keep the entire process inside Google BigQuery, enabled the analytics team to quickly run the model and deliver the results to the Los Angeles-based studio’s marketers “within minutes,” according to Campo-Rembado.

At its core, BigQuery ML is a set of SQL extensions designed to support machine learning and predictive analytics. Google, which announced the technology at Google Cloud Next, said users can build machine learning models in BigQuery ML with simple SQL statements like this:

CREATE MODEL dataset.model_name
  OPTIONS(model_type='linear_reg', input_label_cols=['input_label'])
AS SELECT * FROM input_table;
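For teams that drive BigQuery from scripts or notebooks rather than the console, the same statements can be submitted through the standard BigQuery client library. Below is a minimal sketch (the dataset, model and table names are hypothetical) that trains a model and then scores new rows with ML.PREDICT, all without data leaving the warehouse:

from google.cloud import bigquery

client = bigquery.Client()

# Train a linear regression model in place with standard SQL.
client.query("""
    CREATE MODEL `my_dataset.my_model`
    OPTIONS(model_type='linear_reg', input_label_cols=['input_label'])
    AS SELECT * FROM `my_dataset.training_table`
""").result()

# Use the trained model to predict labels for new rows.
rows = client.query("""
    SELECT *
    FROM ML.PREDICT(MODEL `my_dataset.my_model`,
                    (SELECT * FROM `my_dataset.new_rows`))
""").result()

for row in rows:
    print(dict(row))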

More work to do on BigQuery ML

Google didn’t say how long the beta-testing cycle will last or when it expects to make BigQuery ML generally available.

In its blog post, the company said that it plans to do more to boost the technology’s performance and that it will explore adding support for other types of machine learning algorithms to broaden BigQuery ML’s potential uses.

Looker, based in Santa Cruz, Calif., said its integration with BigQuery ML lets analytics teams use its namesake platform to prepare data for analysis, build and run their analytical models in a Google BigQuery data warehouse, and then disseminate the resulting information to business executives and workers.

“From a user’s perspective, it’s all a Looker front end,” Mintz said. “BigQuery ML is running under the hood, but it looks like one tool to users.” He added that BigQuery ML is the first tool Looker has seen that directly integrates machine learning capabilities into a data warehouse’s SQL interface.

Google Cloud security adds data regions and Titan security keys

Multiple improvements for Google Cloud security aim to help users protect data through better access management, more data security options and greater transparency.

More than half of the security features announced are either in beta or part of the G Suite Early Adopter Program, but in total the additions should offer better control and transparency for users.

The biggest improvement in Google Cloud security comes in identity and access management. Google has developed its own Titan multi-factor physical security key — similar to a YubiKey — to protect users against phishing attacks. Google previously reported that there have been no confirmed account takeovers in more than one year since requiring all employees to use physical security keys, and, according to a Google spokesperson, Titan keys have already been among the keys available to employees.

The Titan security keys are FIDO keys that include “firmware developed by Google to verify its integrity.” Google announced it is offering two models of Titan keys for Cloud users: one based on USB and NFC and one that uses Bluetooth in order to support iOS devices as well. The keys are available now to Cloud customers and will come to the Google Store soon. Pricing details have not been released.

“The Titan security key provides a phishing-resistant second factor of authentication. Typically, our customers will place it in front of high-value users or content administrators and root users, the compromise of whom would be much more damaging to an enterprise customer … or specific applications which contain sensitive data, or sort of the crown jewels of corporate environments,” Jess Leroy, director of product management for Google Cloud, told reporters in a briefing. “It’s built with a secure element, which includes firmware that we built ourselves, and it provides a ton of security with very little interaction and effort on the part of the user.”

However, Stina Ehrensvard, CEO and founder of Yubico, the manufacturer of YubiKey two-factor authentication keys, headquartered in Palo Alto, Calif., noted in a blog post that her company does not see Bluetooth as a good option for a physical security key.

“Google’s offering includes a Bluetooth (BLE) capable key. While Yubico previously initiated development of a BLE security key, and contributed to the BLE U2F standards work, we decided not to launch the product as it does not meet our standards for security, usability and durability,” Ehrensvard wrote. “BLE does not provide the security assurance levels of NFC and USB, and requires batteries and pairing that offer a poor user experience.”

In addition to the Titan keys, Google Cloud security will have improved access management with the implementation of the context-aware access approach Google used in its BeyondCorp network setups.

“Context-aware access allows organizations to define and enforce granular access to [Google Cloud Platform] APIs, resources, G Suite, and third-party SaaS apps based on a user’s identity, location, and the context of their request. This increases your security posture while decreasing complexity for your users, giving them the ability to seamlessly log on to apps from anywhere and any device,” Jennifer Lin, director of product management for Google Cloud, wrote in the Google Cloud security announcement post. “Context-aware access capabilities are available for select customers using VPC Service Controls, and are coming soon for customers using Cloud Identity and Access Management (IAM), Cloud Identity-Aware Proxy (IAP), and Cloud Identity.”

Data transparency and control

New features also aim to improve Google Cloud security visibility and control over data. Access Transparency will offer users a “near real-time log” of the actions taken by administrators, including Google engineers.

“Inability to audit cloud provider accesses is often a barrier to moving to the cloud. Without visibility into the actions of cloud provider administrators, traditional security processes cannot be replicated,” Google wrote in documentation. “Access Transparency enables that verification, bringing your audit controls closer to what you can expect on premises.”

In terms of Google Cloud security and control over data, Google will also now allow customers to decide in what region data will be stored. Google described this feature as allowing multinational organizations to protect their data with geo-redundancy while still following any requirements regarding where in the world data is stored.

A Google spokesperson noted via email that the onus for ensuring that regional data storage complies with local laws would be on the individual organizations.

Other Google Cloud security improvements

Google announced several features that are still in beta, including Shielded Virtual Machines (VMs), which will allow users to monitor and react to changes in the VM to protect against tampering; Binary Authorization, which will force signature validation when deploying container images; Container Registry Vulnerability Scanning, which will automatically scan Ubuntu, Debian and Alpine images to prevent deploying images that contain any vulnerable packages; geo-based access control for Cloud Armor, which helps defend users against DDoS attacks; and Cloud HSM, a managed cloud-hosted hardware security module (HSM) service.

Vendors race to adopt Google Contact Center AI

Google has released a development platform that will make it easier for businesses to deploy virtual agents and other AI technologies in the contact center. The tech giant launched the product in partnership with several leading contact center vendors, including Cisco and Genesys. 

The Google Contact Center AI platform includes three main features: virtual agents, AI-powered assistance for human agents and contact center analytics. Google first released a toolkit for building conversational AI bots in November and updated the platform this week, with additional tools for contact centers.

The virtual agents can help resolve common customer inquiries using Google’s natural language processing platform, which recognizes voice and textual inputs. Genesys, for example, demonstrated how the chatbot could help a customer return ill-fitting shoes before passing the phone call to a human agent, who could help the customer order a new pair.
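The natural language platform behind those virtual agents is Dialogflow, which Contact Center AI builds on. As a rough sketch of the kind of request a contact center integration might send to a Dialogflow agent (the project ID, session ID and sample text are hypothetical, and this google-cloud-dialogflow client usage is an illustrative assumption, not code from any of the vendors named here):

from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-gcp-project", "support-session-123")

# The caller's utterance, e.g., transcribed speech or chat text.
text_input = dialogflow.TextInput(
    text="I'd like to return a pair of shoes", language_code="en-US"
)
query_input = dialogflow.QueryInput(text=text_input)

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input}
)

print("Matched intent:", response.query_result.intent.display_name)
print("Agent reply:", response.query_result.fulfillment_text)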

Google’s agent assistance system scans a company’s knowledge bases, such as FAQs and internal documents, to help agents answer customer questions faster. The analytics tool reviews chats and call recordings to identify customer trends, assisting in the training of live agents and the development of virtual agents.

Vendors rush to adopt Google Contact Center AI

Numerous contact center vendors that directly compete with one another sent out strikingly similar press releases on Tuesday about their adoption of Google Contact Center AI. The Google platform is available through partners Cisco, Genesys, Mitel, Five9, RingCentral, Vonage, Twilio, Appian and Upwire.

“I don’t think I’ve ever heard of a launch like this, where almost every player — except Avaya — is announcing something with the same company,” said Jon Arnold, principal analyst of Toronto-based research and analysis firm J Arnold & Associates.

Avaya was noticeably absent from the list of partners. The company spent most of 2017 in bankruptcy court and was previously faulted by critics for failing to pivot to the cloud quickly enough. The company said at a conference earlier this year it was developing AI capabilities internally, said Irwin Lazar, an analyst at Nemertes Research, based in Mokena, Ill.

An Avaya spokesperson said its platforms integrated with a range of AI technologies from vendors, including Google, IBM, Amazon and Nuance. “Avaya does have a strong relationship with Google, and we continue to pursue opportunities for integration on top of what already exists today,” the spokesperson said.

Google made headlines last month with the release of Google Duplex, a conversational AI bot targeting the consumer market. The company demonstrated how the platform could pass as human during short phone conversations with a hair salon and restaurant. Google’s Contact Center AI was built on some of the same infrastructure, but it’s a separate platform, the company said.

“Google has been pretty quiet. They are not a contact center player. But as AI keeps moving along the curve, everyone is trying to figure out what to do with it. And Google is clearly one of the strongest players in AI, as is Amazon,” Arnold said.

Because it relies overwhelmingly on advertising revenue, Google doesn’t need its Contact Center AI to make a profit. Google will be able to use the data that flows through contact centers to improve its AI capabilities. That should help it compete against Amazon, which entered the contact center market last year with the release of Amazon Connect.

The contact center vendors now partnering with Google had already been racing to develop or acquire AI technologies on their own, and some highlighted how their own AI capabilities would complement Google’s offering. Genesys, for example, said its Blended AI platform — which combines chatbots, machine learning and analytics — would use predictive routing to transfer calls between Google-powered chatbots and live agents.  

“My sense with AI is that it will be difficult for vendors to develop capabilities on their own, given that few can match the computing power required for advanced AI that vendors like Amazon, Google and Microsoft can bring to the table,” Lazar said.


Physical security keys eliminate phishing at Google

Google claims it has completely eliminated successful phishing attacks against its employees through the use of physical security keys and Universal Second Factor.

Google began introducing and evaluating physical security keys in 2014 and by early 2017 all 85,000-plus Google employees were required to use them when accessing company accounts. In the time since, the company told Brian Krebs, no employee has been successfully phished.

A Google spokesperson said the decision to use the Universal Second Factor (U2F) physical security keys instead of software-based one-time-password (OTP) authentication was based on internal testing.

“We believe security keys offer the strongest protections against phishing,” a Google spokesperson wrote via email. “We did a two-year study that showed that OTP-based authentication had an average failure rate of 3%, and with U2F security keys, we experienced zero percent failure.”

Lane Thames, senior security researcher at Tripwire, based in Portland, Ore., said the main reason these software-based apps are less secure is “because attackers can potentially intercept these OTPs remotely.”

“Another issue is the bulk production of OTPs that users can store locally or even print. This is done in order to make the 2FA [two-factor authentication] process a little easier for end users or so end users can save OTPs for later use, if they don’t have access to their phones when the code is needed,” Thames wrote via email. “This is akin to a similar problem where users write passwords and leave them around their workspace.”
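The weakness Thames describes comes from how software OTPs are generated: both ends derive short-lived codes from a shared secret, so an intercepted code, or the secret itself, is enough to authenticate. A minimal illustration using the third-party pyotp library (purely to show the mechanism, not any vendor's implementation):

import pyotp

# The shared secret is provisioned to both the server and the user's app.
shared_secret = pyotp.random_base32()
totp = pyotp.TOTP(shared_secret)

code = totp.now()         # the six-digit code the user types in
print(code)

# Anyone holding the code (or the secret) within the validity window passes.
print(totp.verify(code))  # True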

However, John Callahan, CTO at Veridium, an identity and access management software vendor based in Quincy, Mass., noted that there are also benefits to users opting for 2FA via smartphone.

“Some people who use a U2F key fear losing it or damaging it. This is where biometrics can play a key role. Methods using biometrics are helping to prevent attacks,” Callahan wrote via email. “Using biometrics with the Google Authenticator app is a secure solution, because a mobile phone is always nearby to authenticate a transaction.”

Moving companies to physical security keys

Physical security keys implementing U2F were the core of Google’s Advanced Protection Program, which it rolled out as a way for high-risk users to protect their Google accounts. A physical security key, like a YubiKey, can authenticate a user when it is inserted into a computer, tapped against an NFC-capable smartphone or connected to an iOS device via Bluetooth.

Nadav Avital, threat research manager at Imperva, based in Redwood Shores, Calif., said, “in an ideal world,” more companies would require multifactor authentication (MFA).

In general, physical keys offer better security, because software-based authentication relies on a shared secret between the client and the provider that can be discovered.
Nadav Avital, threat research manager at Imperva

“In general, physical keys offer better security, because software-based authentication relies on a shared secret between the client and the provider that can be discovered. Unfortunately, most people don’t use [2FA or MFA], neither physical nor software-based, because they don’t understand the implications or because they prefer simplicity over security,” Avital wrote via email. “Clients can suffer from fraud, data theft or identity theft, while the company can suffer from reputation damage, financial damage from potential lawsuits and more.”

Richard Ford, chief scientist at Forcepoint, a cybersecurity company based in Austin, Texas, said worrying about the best way to implement 2FA might be premature, as “we still have oodles of companies still using simple usernames and passwords.”

“Getting off that simple combo to something more secure provides an immediate plus up for security. Look at your risk profile, and try and peer a little into the future,” Ford said. “Remember, what you plan today won’t be reality for a while, so you want to skate to where the puck is going. With that said, please don’t let perfect be the enemy of good.”

Petitioning the board

Experts noted that not all IT teams will have as easy a time as Google did in convincing the board to invest in making physical security keys, or another form of multifactor authentication, a requirement.

Matthew Gardiner, cybersecurity expert at Mimecast, a web and email security company based in Lexington, Mass., suggested framing the issue in terms of risk reduction.

“It is hard to quantify risk unless you have experienced a recent breach. Using MFA is not a theoretical idea; it is now a security best practice that is incredibly cheap and easy to use from a multitude of vendors and cloud service providers,” Gardiner wrote via email. “I can only assume that if organizations are still only using a single factor of authentication in support of B-to-B or B-to-E applications that they must think they have nothing of value to attackers.”

Ford said it was probably best not to spear phish the board for effect, “no matter how tempting that might be.”

“I would, however, suggest that the Google data itself can be of tremendous value. Boards understand risk in the scope of the business, and I think there’s plenty of data now out there to support the investment in more sophisticated authentication mechanisms,” Ford wrote. “Start with a discussion around Google and their recent successes in this space, and also have a reasoned — and money-based — discussion about the data you have at risk. If you arm the board with the right data points, they will very likely make the right decision.”