TensorWatch: A debugging and visualization system for machine learning

The rise of deep learning has been accompanied by ever-increasing model complexity, larger datasets, and longer training times. When working on novel concepts, researchers often need to understand why training metrics are trending the way they are. So far, the available tools for machine learning training have focused on a “what you see is what you log” approach. Because logging is relatively expensive, researchers and engineers tend to avoid it and instead rely on a few signals to guesstimate the cause of the patterns they see. At Microsoft Research, we’ve been asking important questions surrounding this very challenge: What if we could dramatically reduce the cost of getting more information about the state of the system? What if we had advanced tooling that could help researchers make more informed decisions effectively?

Introducing TensorWatch

We’re happy to introduce TensorWatch, an open-source system that implements several of these ideas and concepts. We like to think of TensorWatch as the Swiss Army knife of debugging tools, with many advanced capabilities that researchers and engineers will find helpful in their work. We presented TensorWatch at the 2019 ACM SIGCHI Symposium on Engineering Interactive Computing Systems.

Custom UIs and visualizations

The first thing you might notice when using TensorWatch is that it extensively leverages Jupyter Notebook instead of prepackaged user interfaces, which are often difficult to customize. TensorWatch provides interactive debugging of real-time training processes using either the composable UI in Jupyter Notebook or the live, shareable dashboards in JupyterLab. In addition, since TensorWatch is a Python library, you can also build your own custom UIs or use TensorWatch within the vast Python data science ecosystem. TensorWatch also supports several standard visualization types, including bar charts, histograms, and pie charts, as well as 3D variations.

With TensorWatch—a debugging and visualization tool for machine learning—researchers and engineers can customize the user interface to accommodate a variety of scenarios. Above is an example of TensorWatch running in Jupyter Notebook, rendering a live chart from multiple streams produced by an ML training application.

Streams, streams everywhere

One of the central premises of the TensorWatch architecture is that we uniformly treat data and other objects as streams. This includes files, the console, sockets, cloud storage, and even visualizations themselves. Because streams share a common interface, TensorWatch streams can listen to other streams, which enables the creation of custom data-flow graphs. Using these concepts, TensorWatch makes it trivial to implement a variety of advanced scenarios. For example, you can render many streams into the same visualization, render one stream in many visualizations simultaneously, persist a stream in many files, or not persist it at all. The possibilities are endless!
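TensorWatch's actual API differs, but the stream-listener composition described above can be sketched in plain Python. The class and method names below are illustrative only, not TensorWatch's real interface:

```python
class Stream:
    """Minimal model of a TensorWatch-style stream: anything that can
    receive values and forward them to subscribed listener streams."""
    def __init__(self):
        self.listeners = []

    def subscribe(self, listener):
        # A listener is itself a Stream, so streams can listen to
        # streams, forming an arbitrary data-flow graph.
        self.listeners.append(listener)

    def write(self, value):
        self.on_value(value)
        for listener in self.listeners:
            listener.write(value)

    def on_value(self, value):
        pass  # the base stream only forwards


class FileStream(Stream):
    """Stands in for persistence: records every value it sees."""
    def __init__(self):
        super().__init__()
        self.saved = []

    def on_value(self, value):
        self.saved.append(value)


class ChartStream(Stream):
    """Stands in for a visualization: collects points it would plot."""
    def __init__(self):
        super().__init__()
        self.points = []

    def on_value(self, value):
        self.points.append(value)


# One training stream fans out to a file and two charts simultaneously.
train = Stream()
log, chart_a, chart_b = FileStream(), ChartStream(), ChartStream()
for sink in (log, chart_a, chart_b):
    train.subscribe(sink)

for step in range(3):
    train.write((step, step * step))

print(log.saved)       # [(0, 0), (1, 1), (2, 4)]
print(chart_a.points)  # same values, consumed by a different sink
```

The same pattern runs in reverse just as easily: because every sink is itself a stream, a chart could subscribe to several training streams at once to render them in one visualization.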

TensorWatch supports a variety of visualization types. Above is an example of a TensorWatch t-SNE visualization of the MNIST dataset.

Lazy logging mode

With TensorWatch, we also introduce lazy logging mode. This mode doesn’t require explicitly logging all the information beforehand. Instead, you have TensorWatch observe the variables. Since observing is basically free, you can track as many variables as you like, including large models or entire batches during training. TensorWatch then allows you to perform interactive queries that run in the context of these variables and return streams as results. These streams can then be visualized, saved, or processed as needed. For example, you can write a lambda expression that computes the mean weight gradient in each layer of the model at the completion of each batch and sends the result as a stream of tensors that can be plotted as a bar chart.
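As a rough illustration of the lazy-logging idea — not TensorWatch's real Watcher API; the names and signatures here are invented for the sketch — observing merely stores a reference, and work happens only when a query is attached and a batch completes:

```python
class Watcher:
    """Illustrative model of lazy logging: observe() stores a cheap
    reference; nothing is computed until a query stream is attached
    and end_batch() fires."""
    def __init__(self):
        self._observed = {}
        self._queries = []  # (expr, results) pairs

    def observe(self, name, obj):
        self._observed[name] = obj  # basically free: just a reference

    def create_stream(self, expr):
        # expr is a lambda evaluated in the context of observed values
        results = []
        self._queries.append((expr, results))
        return results

    def end_batch(self):
        # Only now, and only for active queries, is any work done.
        for expr, results in self._queries:
            results.append(expr(self._observed))


w = Watcher()
weights = {"layer1": [0.2, 0.4], "layer2": [0.6, 0.8]}
w.observe("weights", weights)

# Interactive query: mean weight per layer, evaluated at each batch end.
mean_per_layer = w.create_stream(
    lambda d: {k: sum(v) / len(v) for k, v in d["weights"].items()}
)

w.end_batch()
print(mean_per_layer[0])  # approx. {'layer1': 0.3, 'layer2': 0.7}
```

Because unobserved queries cost nothing, you could observe an entire model and only pay for the per-layer reduction when a client actually subscribes.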

Phases of model development

At Microsoft Research, we care deeply about improving debugging capabilities in all phases of model development—pre-training, in-training, and post-training. Consequently, TensorWatch provides many features useful for the pre- and post-training phases as well. We lean on several excellent open-source libraries to enable many of these features, which include model graph visualization, data exploration through dimensionality reduction, model statistics, and several prediction explainers for convolutional networks.

Open source on GitHub

We hope TensorWatch helps spark further advances and ideas for efficiently debugging and visualizing machine learning, and we invite the ML community to participate in this journey via GitHub.

Go to Original Article
Author: Microsoft News Center

Quantum launches Distributed Cloud Services

Quantum Corp. has announced Distributed Cloud Services, a new line of services and storage-as-a-service offerings. According to Quantum, the line is designed to let an enterprise focus its resources on meeting business goals rather than on managing storage.

Quantum’s new Cloud-Based Analytics software powers Distributed Cloud Services. Quantum’s products are designed to send data about their environment through the Cloud-Based Analytics software, which makes them part of the Distributed Cloud.

Using the Cloud-Based Analytics software, users or Quantum’s own support team can manage and monitor environments worldwide from one central location. Users can monitor their own environments or choose to have Quantum do it through Distributed Cloud Services.

According to Quantum, the driving force behind Distributed Cloud Services was the need for businesses to create, study and develop more with fewer IT and engineering resources, which leads them to look to others to manage their data storage infrastructure.

Quantum also announced Quantum Operational Services, which it claims provides cloudlike storage with on-premises control. Users manage daily storage operations together with Quantum, with the aim of improving reliability through monitoring and analysis.

Quantum claims the key benefits of the Operational Services line are eliminating the burden that managing storage places on IT resources, reducing downtime to improve the user experience, keeping the control and security of an on-premises storage center, and maximizing storage ROI.

Lastly, Quantum has added new storage-as-a-service (STaaS) offerings to its portfolio. These are aimed at customers who prefer a pay-per-use subscription to the Operational Services features. The benefit, according to Quantum, is that users pay only for the storage they use.

Other vendors that offer STaaS include Kaminario, which recently expanded its offerings with metered usage payments, disaster recovery (DR) and service usage on the public cloud.

STaaS is generally considered a good choice for small and midsize enterprises. It can save money by eliminating the need for personnel to implement and maintain a storage infrastructure, as well as reduce DR risks and provide long-term record retention.

All new offerings are now available.

For Sale – xeon 1220v5 CPU, xeon 2603v4 CPU, g1610t CPU, ECC 2133p 16GB RAM

Discussion in ‘Desktop Computer Classifieds‘ started by dirtypaws, Jun 20, 2019.

  1. dirtypaws (Novice Member), Sheffield

    Xeon 1220V5 £90 (x2)
    Xeon 2603v4 £75
    Celeron G1610t £10
    16GB 2133p ECC RAM £80 (x2)

    Price and currency: £100
    Delivery: Delivery cost is not included
    Payment method: PayPal, bank transfer
    Location: Sheffield
    Advertised elsewhere?: Not advertised elsewhere
    Prefer goods collected?: I have no preference

Pica8 PicOS upgrade enhances network security

Pica8 has upgraded PicOS, its Linux-based network operating system, with new capabilities that address network efficiency and security.

With this release, Pica8 PicOS interoperates with existing network access control (NAC) tools, enabling fully automated network access policy enforcement in an open networking deployment. According to Pica8, automation improves operational efficiency with a simplified UI and improved security posture.

According to Forrester, 46% of information workers use personal laptops and mobile devices for work, creating an increasingly complex device landscape that must be managed through BYOD programs and increased network security.

The age of BYOD and IoT deployments creates security challenges for enterprise networking, according to Pica8. As enterprises automate, simplify and modernize their access networks, most customers already have NAC systems in place. PicOS’ NAC integration support and centralized, policy-based access control for network access points address these challenges, according to the vendor.

Building on its October 2018 support for the Dell EMC N3132PX-ON 2.5G/5G Multigig PoE switch, Pica8 added PicOS support for Cisco ISE and Aruba ClearPass, and PicOS supports the Open Network Install Environment standard, which enables it to deploy onto bare-metal switch platforms. Additionally, PicOS is automatically included with the PacketFence 9.0 release.

Pica8’s PicOS open networking operating system is offered in an enterprise edition as well as an SDN edition. PicOS Enterprise Edition installs on 1G to 100G open switches and offers the most comprehensive support, including the Debian Linux distribution, Nymble, and Pica8’s CrossFlow capability. PicOS SDN Edition includes the Debian Linux distribution and Nymble, and uses OpenFlow 1.5’s User-Defined Fields for packet inspection.

Exchange Online vs. Office 365: Which plan is a better fit?

A disconnect between IT and the end users can cause problems when it’s time to pull the plug on your on-premises Exchange Server infrastructure.

Many organizations are working through the Exchange Online vs. Office 365 decision as part of their messaging platform migration. Once you decide to move email out of the data center, the IT department must sift through numerous factors to determine whether a full Office 365 plan or just hosted email with the Exchange Online service will suffice. There is pressure to keep costs low, which can force IT to rush into a licensing decision that limits its ability to deliver crucial functionality that users require.

For organizations moving from an on-premises Exchange Server configuration to Exchange Online or the full Office 365 suite, several key areas are critical to success but don’t always get the appropriate attention until the questions from users start rolling in after the migration is done.

1. Lack of email backups presents a risky scenario

It might be hard to believe, but Office 365 does not offer a backup service to its users. Microsoft provides recovery services going back up to 44 days for deleted email or mailboxes.

Many organizations ignore the lack of backup functionality until users lose email that even Microsoft support cannot recover. Backup is critical, and every business moving to Office 365 should look at Office 365 backup subscriptions from third-party vendors such as SkyKick, Datto and Veeam for data protection.

2. Basic plans don’t offer desired security features

One other area overlooked by IT when adopting a less-expensive Office 365 license is the advanced security, which is only available as an add-on for those plans. The choice to go with some of the low-cost plans in Office 365 — Office 365 F1, Exchange Online Plan 1, Business Essentials or Business Premium — brings with it the risk of missing out on the defensive capabilities around Advanced Threat Protection and anti-phishing that help keep the organization’s data and systems safe.

3. The advanced compliance capabilities might cost you extra

Some businesses must meet compliance requirements with the eDiscovery and In-Place Hold features. An organization that selects one of the lower-end plans, such as Office 365 F1, Exchange Online Plan 1, Office 365 Business Essentials or Office 365 Business Premium, will not have these compliance features. In most cases, to gain that functionality, an organization would have to purchase additional licensing to add to each of its user licenses for those capabilities.

4. Try to account for all users’ storage requirements

Microsoft has been generous with the amount of storage it includes with most of its Office 365 plans, except for the F1 plan, which gives each user 2 GB of email storage. The reality is that the 50 GB included in the Business Essentials plan is plenty for many users, who might have had a much smaller mailbox quota on premises.

However, one commonly overlooked area is archiving. The basic Office 365 subscriptions — Office 365 F1, Exchange Online Plan 1, Office 365 Business Essentials and Office 365 Business Premium — do not include an archive feature. This leaves users on those plans who depended on an archiving product with their on-premises messaging platform without this safety net; those users might want to migrate archived email into their main mailbox. In addition to Exchange Online archiving, you can store third-party data, including documents stored in Box and data from Facebook, for an extra fee.

5. Not getting full use of all available services in the Office 365 suite

Many organizations do a very narrow cloud migration and just shift the messaging platform to Exchange Online. Yet moving from Exchange Online Plan 1 ($4 per user, per month) to Office 365 Business Essentials ($5 per user, per month) costs just one additional dollar per user, per month. An organization unwilling to spend that relatively small amount misses out on a number of valuable collaboration services that can help end users do more with the platform, such as Microsoft Teams, Skype for Business, SharePoint, OneDrive, Office Online apps, Forms, PowerApps, Flow and Bookings.
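To make that tradeoff concrete, here is a quick back-of-the-envelope calculation using the per-user prices cited above (the user count is hypothetical):

```python
# Per-user, per-month list prices from the article.
exchange_online_plan1 = 4.00  # $ per user/month, hosted email only
business_essentials = 5.00    # $ per user/month, full collaboration suite

users = 250  # hypothetical organization size

# Incremental cost of the upgrade that unlocks Teams, SharePoint, etc.
monthly_delta = (business_essentials - exchange_online_plan1) * users
annual_delta = monthly_delta * 12

print(f"Upgrade cost: ${monthly_delta:.2f}/month, ${annual_delta:.2f}/year")
# Upgrade cost: $250.00/month, $3000.00/year
```

For a 250-seat organization, the entire collaboration suite costs $3,000 more per year than standalone hosted email, which frames the "relatively small amount" the article describes.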

Verizon CX boss: Bots slash costs, but humans build brand loyalty

Customers appreciate the automation of formerly laborious processes, such as making payments, changing passwords and updating account information. But, a Verizon CX leader contends, for all the speed of automation and quick-response chatbots, it is humans who ultimately build brand loyalty and trust.

That was one upshot of a survey of 6,000 consumers across 15 countries that Verizon commissioned in February 2019 and released in mid-June.

Gordon Littley, managing director of Verizon’s global CX practice, said the survey confirmed what he has learned anecdotally from talking to CX leaders at Verizon’s customers: Customers need human agents to close the deal, solve their problems on the support side and build a connection to a brand that results in repeat-business loyalty.

Chatbots have limitations, Littley said. They need to be designed in a customer-centric way to avoid migrating outdated thinking onto new technology platforms, which kills the customer experience.

“We’re forcing customers to do business the way we want to do business. With IVR [interactive voice response], the whole intent of that platform was to avoid talking to customers,” Littley said. “If you view chatbots in the same way, you’re going to make the same mistakes. What you have to do is start with the customer, who says, ‘I don’t care what channel you service me on as long as it’s quick and to the point.'”

Further reinforcing the notion that automating away human employees purely to cut costs doesn’t go over well with customers, respondents demanded the human touch. In the Americas, consumers said they expect people to be involved in CX, with 44% indicating that being unable to speak to a person when contacting a brand would be the No. 1 reason to take their business to a competitor.

Globally, 59% of the Verizon CX survey respondents said email or secure messaging platforms are their preferred methods of communicating with brands. However, the next three on the list — they could choose more than one — involved human interactions: phone (54%), in person (39%) and live chat (39%).

Yet consumers also said they want technology to serve them automated, personalized experiences, including tailored and meaningful recommendations that follow them across mobile and desktop platforms as they change devices.

Consumers getting increasingly impatient with tech

The Verizon CX survey found that speed for making a transaction or getting a support question answered affects consumers’ perception of good versus bad CX. It’s not page-load speeds or server latency, but rather efficiency — the fewer the clicks and the more convenient, the better the experience, Littley said.

Technology can taint the customer experience when it denies — or at least delays — customer gratification. American consumers said slow apps (39%), having to make multiple attempts to resolve a problem (38%), a company ignoring data preferences (34%) and having to repeat information would all push them to a competitor.

European consumers want brands to focus on data security and consumer priorities more than consumers in the Americas do, because the GDPR European data privacy law has made it a bigger story there, Littley said. However, U.S. consumers are starting to demand that the companies serving them pay more attention, as high-profile breaches take place and laws such as California’s GDPR-like statute go into effect next year.

“In the [United States], privacy concerns — especially with what’s been in the news in the last 12 months — are spiking,” Littley said.

E-commerce and customer support chatbot automation may be all the tech rage, but the Verizon CX survey of 6,000 consumers confirmed what smarter brands know: Humans build trust and brand loyalty.

Consumers getting savvier about AI, ethical data use

What AI technology does and how it works might be hard for the average consumer to grasp, let alone the ethics surrounding its use. But consumers know what they don’t like when they see it, such as data breaches that cause distrust and the inconvenience of having to change passwords or even credit card numbers.

Globally, younger consumers in the Verizon CX survey expressed a willingness to share data if the tradeoff is more personalized experiences. Older generations — 55 and older — said they’d give up their data in exchange for economic benefits, such as lower prices or exclusive sales.

One of the most important messages revealed by the survey, according to Littley, was that letting consumers know exactly what is being done with their data, and sticking to your own data use policies, are key to building trust and inspiring loyalty.

“If you’re going to use data to help me in my interactions and how you deal with me, then I’m going to be OK with that, as long as I know what you’re doing,” Littley said.
