Category Archives: Microsoft Blog


Exchange Online vs. Office 365: Which plan is a better fit?

A disconnect between IT and the end users can cause problems when it’s time to pull the plug on your on-premises Exchange Server infrastructure.

Many organizations are dealing with the Exchange Online vs. Office 365 decision as part of their messaging platform migration. Once you decide to move email out of the data center, the IT department must sift through numerous factors to see whether a full Office 365 plan or standalone hosted email with the Exchange Online service will suffice. There is pressure to keep costs low, which can force IT to rush into a licensing decision that limits its ability to deliver crucial functionality that users require.

For organizations moving from an on-premises Exchange Server configuration to Exchange Online or the full Office 365 suite, several areas critical to the organization’s success don’t always get the appropriate attention until the questions from users start rolling in after the migration is done.

1. Lack of email backups presents a risky scenario

It might be hard to believe, but Office 365 does not offer a backup service to its users. Microsoft provides recovery services going back up to 44 days for deleted email or mailboxes.

Many organizations ignore the lack of backup functionality until users lose email that even Microsoft support cannot recover. Backup is critical, and every business moving to Office 365 should look at the Office 365 backup subscriptions from third-party vendors such as SkyKick, Datto and Veeam for data protection.

2. Basic plans don’t offer desired security features

Another area IT overlooks when adopting a less-expensive Office 365 license is advanced security, which is only available as an add-on for those plans. The choice to go with some of the low-cost plans in Office 365 — Office 365 F1, Exchange Online Plan 1, Business Essentials or Business Premium — brings with it the risk of missing out on the defensive capabilities around Advanced Threat Protection and anti-phishing that help keep the organization’s data and systems safe.


3. The advanced compliance capabilities might cost you extra

Some businesses must meet compliance requirements with eDiscovery and hold features. An organization that selects one of the lower-end plans, such as Office 365 F1, Exchange Online Plan 1, Office 365 Business Essentials or Office 365 Business Premium, will not have these compliance features. In most cases, to gain that functionality, an organization would have to purchase additional licensing, such as the Exchange Online Protection plan, on top of each of its user licenses.

4. Try to account for all users’ storage requirements

Microsoft has been generous with the amount of storage it includes with most of its Office 365 plans, except for the F1 plan, which gives each user 2 GB of email storage. In reality, the 50 GB included with the Business Essentials plan is plenty for many users who had a much smaller mailbox quota when their mailbox was on premises.

However, one commonly overlooked area is archiving. The basic Office 365 subscriptions — Office 365 F1, Exchange Online Plan 1, Office 365 Business Essentials and Office 365 Business Premium — do not include an archive feature. This leaves users on those plans who depended on an archiving product with their on-premises messaging platform without that safety net; those users might want to migrate archived email into their main mailbox. In addition to Exchange Online Archiving, you can store third-party data, such as documents in Box and data from Facebook, for an extra fee.

5. Not getting full use of all available services in the Office 365 suite

Many organizations do a very narrow cloud migration and just shift the messaging platform to Exchange Online. Moving from Exchange Online Plan 1 ($4 per user, per month) to Office 365 Business Essentials ($5 per user, per month) costs just an additional dollar per user, per month. An organization unwilling to spend that relatively small amount misses out on a number of valuable collaboration services that can help end users do more with the platform, such as Microsoft Teams, Skype for Business, SharePoint, OneDrive, Office Online apps, Forms, PowerApps, Flow and Bookings.


Verizon CX boss: Bots slash costs, but humans build brand loyalty

Customers appreciate the automation of formerly laborious processes, such as making payments, changing passwords and updating account information. But a Verizon CX leader contends that, beyond the speed of automation and quick-response chatbots, humans ultimately build brand loyalty and trust.

That was one upshot from a survey of 6,000 consumers across 15 countries Verizon commissioned in February 2019 and released in mid-June.

Gordon Littley, managing director of Verizon’s global CX practice, said that confirmed what he’s learned anecdotally talking to Verizon customers’ CX leaders: Customers need human agents to close the deal, solve their problems on the support side and build a connection to a brand that results in repeat-business loyalty.

Chatbots have limitations, Littley said. They need to be designed in a customer-centric way to avoid migrating outdated thinking into new technology platforms, which kills the customer experience.

“We’re forcing customers to do business the way we want to do business. With IVR [interactive voice response], the whole intent of that platform was to avoid talking to customers,” Littley said. “If you view chatbots in the same way, you’re going to make the same mistakes. What you have to do is start with the customer, who says, ‘I don’t care what channel you service me on as long as it’s quick and to the point.'”

Gordon Littley, managing director of Verizon’s global CX practice

Further reinforcing the notion that automation purely to cut the cost of human employees doesn’t go over well, customers are demanding the human touch. In the Americas, consumers said they expect people to be involved in CX, with 44% indicating that being unable to speak to a person when contacting a brand would be the No. 1 reason to send them to a competitor.

Globally, 59% of the Verizon CX survey respondents said email or secure messaging platforms are their preferred methods of communicating with brands. However, the next three on the list — they could choose more than one — involved human interactions: phone (54%), in person (39%) and live chat (39%).

Yet consumers also said they want technology to serve up automated, personalized experiences, with tailored and meaningful recommendations that follow them across mobile and desktop platforms as they change devices.

Consumers getting increasingly impatient with tech


The Verizon CX survey found that speed for making a transaction or getting a support question answered affects consumers’ perception of good versus bad CX. It’s not page-load speeds or server latency, but rather efficiency — the fewer the clicks and the more convenient, the better the experience, Littley said.

Technology can taint the customer experience when it fails — or at least delays — customer gratification. American consumers said slow apps (39%), having to make multiple attempts to resolve a problem (38%), a company ignoring data preferences (34%) and having to repeat information all would push them to a competitor.

European consumers want brands to focus more on data security than consumers in the Americas do, largely because the GDPR data privacy law has made it a bigger story there, Littley said. However, U.S. consumers are starting to demand that the companies serving them pay more attention, as high-profile breaches take place and laws such as California’s GDPR-like statute go into effect next year.

“In the [United States], privacy concerns — especially with what’s been in the news in the last 12 months — are spiking,” Littley said.

E-commerce and customer support chatbot automation may be all the tech rage, but the Verizon CX survey of 6,000 consumers confirmed what smarter brands know: Humans build trust and brand loyalty.

Consumers getting savvier about AI, ethical data use

What AI technology does and how it works might be hard for the average consumer to grasp, let alone the ethics surrounding its use. But consumers know what they don’t like when they see it, such as data breaches that cause distrust and inconvenience as passwords or even credit card numbers have to be changed.

Globally, in the Verizon CX survey, younger consumers expressed a willingness to share data if the tradeoff is more personalized experiences. Older generations — 55 and older — said they’d give up their data in exchange for economic benefits, such as lower prices or exclusive sales.

One of the most important messages revealed by the survey, according to Littley, is that letting consumers know exactly what is being done with their data, and sticking to your own data use policies, are key to building trust and inspiring loyalty.

“If you’re going to use data to help me in my interactions and how you deal with me, then I’m going to be OK with that, as long as I know what you’re doing,” Littley said.


For Sale – xeon 1220v5 CPU, xeon 2603v4 CPU, g1610t CPU, ECC 2133p 16GB RAM

Discussion in ‘Desktop Computer Classifieds‘ started by dirtypaws, Jun 20, 2019.


    Xeon 1220V5 £90 (x2)
    Xeon 2603v4 £75
    Celeron G1610t £10
    16GB 2133p ECC RAM £80 (x2)

    Price and currency: £100
    Delivery: Delivery cost is not included
    Payment method: paypal , bank transfer
    Location: sheffield
    Advertised elsewhere?: Not advertised elsewhere
    Prefer goods collected?: I have no preference




TensorWatch: A debugging and visualization system for machine learning


The rise of deep learning is accompanied by ever-increasing model complexity, larger datasets, and longer training times for models. When working on novel concepts, researchers often need to understand why training metrics are trending the way they are. So far, the available tools for machine learning training have focused on a “what you see is what you log” approach. As logging is relatively expensive, researchers and engineers tend to avoid it and rely on a few signals to guesstimate the cause of the patterns they see. At Microsoft Research, we’ve been asking important questions surrounding this very challenge: What if we could dramatically reduce the cost of getting more information about the state of the system? What if we had advanced tooling that could help researchers make more informed decisions effectively?

Introducing TensorWatch

We’re happy to introduce TensorWatch, an open-source system that implements several of these ideas and concepts. We like to think of TensorWatch as the Swiss Army knife of debugging tools with many advanced capabilities researchers and engineers will find helpful in their work. We presented TensorWatch at the 2019 ACM SIGCHI Symposium on Engineering Interactive Computing Systems.

Custom UIs and visualizations

The first thing you might notice when using TensorWatch is that it extensively leverages Jupyter Notebook instead of prepackaged user interfaces, which are often difficult to customize. TensorWatch provides interactive debugging of real-time training processes using either the composable UI in Jupyter Notebooks or the live shareable dashboards in Jupyter Lab. In addition, since TensorWatch is a Python library, you can also build your own custom UIs or use TensorWatch in the vast Python data science ecosystem. TensorWatch also supports several standard visualization types, including bar charts, histograms, and pie charts, as well as 3D variations.

With TensorWatch—a debugging and visualization tool for machine learning—researchers and engineers can customize the user interface to accommodate a variety of scenarios. Above is an example of TensorWatch running in Jupyter Notebook, rendering a live chart from multiple streams produced by an ML training application.
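The basic pattern for producing a live chart like the one above looks roughly like the sketch below. It is a minimal example assembled from the project’s documented usage; the file name, stream name and dummy training loop are illustrative, and exact signatures should be verified against the TensorWatch documentation.

import time
import tensorwatch as tw

# Create a watcher that persists streams to a log file (the file name is illustrative).
watcher = tw.Watcher(filename='training.log')

# Create a named stream that a Jupyter Notebook can subscribe to and render live.
loss_stream = watcher.create_stream(name='loss')

for step in range(100):
    loss = 1.0 / (step + 1)          # stand-in for a real training metric
    loss_stream.write((step, loss))  # log an (x, y) pair to the stream
    time.sleep(0.1)

On the Jupyter side, the same stream can then be rendered with a line visualizer, for example tw.Visualizer(loss_stream, vis_type='line').show().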

Streams, streams everywhere

One of the central premises of the TensorWatch architecture is that we uniformly treat data and other objects as streams. This includes files, console, sockets, cloud storage, and even visualizations themselves. With a common interface, TensorWatch streams can listen to other streams, which enables the creation of custom data flow graphs. Using these concepts, TensorWatch makes it trivial to implement a variety of advanced scenarios. For example, you can render many streams into the same visualization, render one stream in many visualizations simultaneously, persist a stream in many files, or not persist it at all. The possibilities are endless!

TensorWatch supports a variety of visualization types. Above is an example of a TensorWatch t-SNE visualization of the MNIST dataset.
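As a rough illustration of composing streams and visualizations, the sketch below renders two streams into the same live chart. The host parameter is how the project README describes reusing an existing plot; treat the exact argument names as assumptions to check against the current API.

import tensorwatch as tw

watcher = tw.Watcher()
train_loss = watcher.create_stream(name='train_loss')
val_loss = watcher.create_stream(name='val_loss')

# Render both streams in one chart: the second visualizer attaches to the first via host=.
plot = tw.Visualizer(train_loss, vis_type='line')
tw.Visualizer(val_loss, vis_type='line', host=plot)
plot.show()

# Anything written to either stream now updates the shared chart.
train_loss.write((0, 1.0))
val_loss.write((0, 1.2))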

Lazy logging mode

With TensorWatch, we also introduce lazy logging mode. This mode doesn’t require explicit logging of all the information beforehand. Instead, you can have TensorWatch observe the variables. Since observing is basically free, you can track as many variables as you like, including large models or entire batches during training. TensorWatch then allows you to perform interactive queries that run in the context of these variables and return streams as a result. These streams can then be visualized, saved, or processed as needed. For example, you can write a lambda expression that computes mean weight gradients in each layer of the model at the completion of each batch and sends the result as a stream of tensors that can be plotted as a bar chart.
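A sketch of the lazy-logging workflow is below, split into the training-script side and the Jupyter side. The observe call and the expr-based query follow the pattern described above; the model, batches and the exact query expression are placeholders, and the API names should be confirmed against the TensorWatch documentation.

# Training script side: make variables observable instead of logging them eagerly.
import tensorwatch as tw

watcher = tw.Watcher()
for batch_index, batch in enumerate(batches):      # batches, model and train_step are placeholders
    loss = train_step(model, batch)
    # Observation is cheap; nothing is serialized until a query asks for it.
    watcher.observe(batch_index=batch_index, model=model, loss=loss)

# Jupyter side: query the observed context and visualize the resulting stream.
client = tw.WatcherClient()
grad_stream = client.create_stream(
    expr='lambda d: [p.grad.abs().mean().item() for p in d.model.parameters()]')
tw.Visualizer(grad_stream, vis_type='bar').show()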

Phases of model development

At Microsoft Research, we care deeply about improving debugging capabilities in all phases of model development—pre-training, in-training, and post-training. Consequently, TensorWatch provides many features useful for pre- and post-training phases as well. We lean on several excellent open-source libraries to enable many of these features, which include model graph visualization, data exploration through dimensionality reduction, model statistics, and several prediction explainers for convolutional networks.
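For the pre- and post-training phases, the project README documents helper functions along the lines of the sketch below, here applied to a torchvision model. The function names are recalled from the README and should be verified against the current release; PyTorch and torchvision are assumed to be installed.

import tensorwatch as tw
import torchvision.models as models

model = models.resnet18()

# Visualize the model graph for a given input shape (pre-training exploration).
tw.draw_model(model, [1, 3, 224, 224])

# Summarize per-layer statistics such as parameter counts and multiply-accumulate operations.
tw.model_stats(model, [1, 3, 224, 224])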

Open source on GitHub

We hope TensorWatch helps spark further advances and ideas for efficiently debugging and visualizing machine learning and invite the ML community to participate in this journey via GitHub.

Go to Original Article
Author: Microsoft News Center

Cloud data management finding its place as volumes soar sky high

IDC estimates that by 2023, there will be 103 zettabytes in the global datasphere. That’s about 103 billion terabytes.

And you can bet a lot of that data will be in the cloud. Also by 2023, 30% of all the IT systems in enterprises’ data centers and edge locations will be running public cloud-sourced services, according to IDC.

There will be twice as much on the cloud in the next five years as there was in the last 10 years, said Frank Gens, senior vice president and chief analyst at IDC.

“Just because data exists doesn’t mean it’s going to be useful to you and your organization,” Gens said in a keynote at last week’s Actifio Data Driven conference in Boston, which featured a focus on cloud data management.

And with SaaS applications growing in number and popularity, it’s going to be critical for organizations to manage all that cloud data efficiently. That management will include copies of data an organization uses for backup, recovery, analytics and DevOps, among other key functions.

Beyond backup in the cloud

The cloud is a good way to manage massive amounts of data, said Phil Buckellew, general manager of IBM Cloud Object Storage. Cloud object storage, specifically, is a cheap method for large volumes.

Data analytics will be a driver of growth in object storage, Buckellew said at a panel discussion about cloud data management.

Cloud object storage is transitioning from simply a cheap place to store data to a platform for broader uses such as governance, said Archana Venkatraman, research manager at IDC.

Backup and recovery still make up a large piece of cloud uses, said Jim Donovan, senior vice president of product at cloud storage vendor Wasabi.

The market is changing, though. Copy data management pioneer Actifio is now offering what it calls “cloud data management” that handles assorted uses, as are vendors originally geared toward backup, such as Veeam and Commvault. Newer entrants to the market, Rubrik and Cohesity, also stress their cloud data management capabilities.

It’s a cloud, cloud, cloud, cloud world

Cloud adoption is nuanced, Venkatraman said. Organizations are using the best capabilities of multiple clouds.

However, when Venkatraman asked crowd members if they’re happy with their multi-cloud strategy, no one raised a hand.


Cloud provider representatives agreed that customers have a long way to go with multi-cloud and hybrid cloud data management.

“Hybrid cloud maturity is not where it should be,” said Yee-Chen Tjie, head of New England cloud sales engineering at Google.

On the plus side, customers are looking into the best ways to deploy a hybrid cloud platform, Tjie said. He estimated that almost all large enterprises have a mix of cloud and on-premises infrastructure.

“The industry is starting to recognize the importance of having a hybrid cloud strategy,” Buckellew said.

What about getting your data out of the cloud?

One major sticking point with the cloud is bringing data back.

George Crump, founder and president of IT analysis firm Storage Switzerland, said egress fees are a key sticking point.

“The cost to move data out of the cloud is big,” Crump said.

Wasabi, a newer entrant to the field, stands out among the cloud storage vendors because it doesn’t charge for egress. As a result, Crump said he thinks Wasabi has a shot in the cloud data management market.

The larger cloud storage vendors — AWS, Microsoft Azure and Google Cloud Platform, to name a few — need to figure out a better solution for egress, Crump said. If not, they will risk losing out to vendors such as Wasabi that provide easier and cheaper ways to get data out of the cloud.

And what about tape versus cloud?

Organizations need to think beyond cloud data management and protection. For one thing, the 3-2-1 rule of backup calls for three total copies of data, stored on two different media, with one copy kept off site.

Tape is one of those other media that can come in handy for organizations.

The industry is seeing a “revival” of tape, Venkatraman said. One of the reasons is it is immune to ransomware because it is offline. True, it takes a long time to recover data from a tape cartridge compared to some other forms of storage. However, if an organization gets hit with ransomware, it would likely prefer a longer recovery time to no recovery of important data.

In addition, tape capacity continues to increase every two to three years with the release of a new version of LTO. The current LTO-8 provides 30 terabytes (TB) compressed capacity in one cartridge.

“There are cases where tape can make a hell of a lot of sense,” said IBM’s Buckellew, who noted that his company sells tape.

And there are cases where moving to cloud data management makes sense. Alex Ferguson, system administrator at Abilene Christian University in Texas, said the school’s previous backup setup was tape-based.

Recovery could take hours or days.

“It was extremely burdensome,” Ferguson said.

The university is shifting to a SaaS-first philosophy and uses Actifio to protect 31 TB. Now recoveries take minutes.

“It’s been a drastic improvement for us,” Ferguson said.



Music data visualization comes to life in Tableau collection

Beyond the sound of music, the sight of it has been a reality at least since the advent of music videos. Now, with vendors such as Tableau providing the ability to turn information into images, music data visualization is a reality too.

On June 24, Tableau unveiled a collection of music data visualizations created by users of Tableau Public called Data + Music. Dig if you will the picture — as Prince sang in the lyrics to “When Doves Cry” — of analytics and song combined.

The data visualizations contained in Data + Music weren’t all done by music industry professionals, and they weren’t put to use for professional purposes, but they show how people can use the combination of data and visuals to enhance the understanding of music.

“Data visualization skills are rapidly growing,” said Pooja Gandhi, lead analytics engineer at technology skills platform vendor Pluralsight and creator of a visualization that examines Spotify’s top tracks from 2017 and rates them in such categories as Danceability, Tempo and Speechiness.

“[Data visualization can benefit] not just the music industry, but it can help any industry that deals with data,” she said. “Data is everywhere, but if you don’t know what to do with it you lose opportunities to capitalize and improve your products or services.”

Data + Music is not a business intelligence platform for the music industry. It’s an exhibit of sorts of some of the more interesting music data visualizations made by users of Tableau Public.

Beatles data visualization created by Tableau Public user Adam McCann. The new Tableau Data + Music collection showcases visualizations such as McCann’s analysis of the band’s lyrics.

One such visualization examines the song lyrics of The Beatles, the revolutionary rock ‘n’ roll group that had 21 No. 1 hits between 1964 and their eventual breakup in 1970. The images show which songs were written by which member or members of the band, the breadth of the different writers’ word choices, and what each artist chose to write about.

Another breaks down the 100 greatest heavy metal albums of all time, using data to determine that Black Sabbath’s Paranoid, released in 1970, is the best, with Metallica’s Master of Puppets (1986) ranking second.

Still other music data visualizations look at the works of mainstream artists such as Prince, Bruce Springsteen, The Cure and Fleetwood Mac.

“This was people exploring and expressing themselves,” said Taha Ebrahimi, director of Tableau Public. “But then they were also teaching what they created; other users are able to download the visualization and then reverse-engineer it to figure how it was built.”


But while Data + Music is more an art exhibit than BI platform, analytics are an important part of the music industry.

Dating back to the days when radio was the dominant means of consuming music, record company executives crunched numbers to try to figure out what was resonating with the public at that moment and what would be the next big thing.

Now, just like any other industry, BI plays a substantial role in the decision-making process.

“You have to look at streaming numbers and social media numbers,” said Keith Hagan, co-founder of the SKH Music management and PR company in New York and a veteran of close to three decades in the music business. “From those you can see where your engagement is coming from. You can track where your engagement is, and you can then get a good sense of where you should be going.”

And while Tableau doesn’t have a platform designed specifically for music companies, industry insiders do use Tableau’s platforms for their analytics needs.

“One hundred and one percent [it can be used by music companies],” Ebrahimi said. “We have many music clients.”

One of those clients is Spotify — which has a feature called Spotify Insights with music data visualizations.

“They’re trying to get people to relate to data who may not be data analysts,” Ebrahimi said.

Meanwhile, music industry BI vendors such as Chartmetric and Soundcharts cater specifically to the music industry.

Streaming services like Apple Music and Spotify similarly provide data that industry insiders use to inform their strategies, data that, unlike in the days when radio ruled, is available in real time.

“If you’re in the music business, you have to look at data,” Hagan said. He added, however, that relying solely on data when managing and promoting an artist “is not an intelligent business model. If you don’t have a strategy along with viewing the data you’re going to have a problem.”

Data + Music may be a more lighthearted project than a deep dive into BI, but it shows that greater understanding through analytics, and specifically music data visualizations, extends even to the art of sound.


For Sale – iMac 27 Retina 5k – Boxed & High Spec – Late 2015 – 2TB – R9 M395

iMac 27 Retina 5k Late 2015

Bought in Nov 2016, only 2.5 years old and hardly used.

– Radeon R9 M395 graphics with 2GB GDDR 5 memory
– 2TB Fusion drive
– i5 3.3Ghz quad core (Turbo Boost up to 3.9GHz)
– 8GB ram (user upgradable)
– 5k Retina display

Comes in original box with all accessories and packaging.

High spec model, perfect for video editing, gaming, home office, etc.

– like new condition, hardly used in home office. 1 owner from new
– no scratches or damage. Never modified, overclocked or upgraded
– screen is in perfect condition, no edge bleed, no dead pixels
– ram can be user upgraded; going to 16GB can be done for about £45
– comes in original box and all accessories such as mouse, cable and keyboard
– updated to latest MacOS, will be wiped and find my computer disabled
– selling as I have purchased a MacBook

Any questions let me know, no silly offers please. These typically sell around the £1000 mark, and this is a high end model with R9 M395 graphics. Can deliver within reason or meet in central or east London.

Price and currency: 900
Delivery: Goods must be exchanged in person
Payment method: Cash/transfer
Location: London
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected



How to copy files to Windows containers

There are three ways to copy files to Windows containers — using Docker CP, mounting a volume or downloading files from the internet — and each method has its own use cases.

You can use the docker cp command-line tool on the container host to quickly copy a set of files into a container. Application developers typically use the mounted volume method for production containers to ensure the applications hosted inside the container keep their data on a volume outside the container. Remember that containers are lightweight applications that you can recreate in a few seconds; you should never use containers to host data. Only use the Wget method if you must copy files to Windows containers from the internet without first copying them to the container host.

Use the Docker CP command line

The cp command line is part of the Docker Engine installation. If you must copy a zip file or a few other files, you can use the command below from the container host:

docker cp C:\Temp\ThisFile.ZIP myprodcontainer:/

The cp command after docker copies C:\Temp\ThisFile.ZIP from the container host machine to a running container called myprodcontainer on the C: drive. Once the zip file is available inside the container, you can extract it using the Expand-Archive command.

Mount a volume inside the container

Mounting a volume is effective because the data lives on the container host and remains available even if the container is recreated. To mount a container host volume, follow these steps.

Step 1: Create a data volume by running the command below on the container host:

docker volume create --name volume03

This command creates a volume by the name Volume03. This volume has no data and isn’t mounted to any of the host volumes yet.

To ensure the volume was created successfully, issue the docker volume ls command on the container host. The docker volume ls command lists the available volumes and their names.

Step 2: Mount the volume into a container by executing the command below:

docker run --name thisvol -it -v c:\programdata\docker\volumes\volume03:c:\volume03 microsoft/windowsservercore powershell.exe

This command starts a container named thisvol and attaches C:\ProgramData\Docker\Volumes\Volume03 on the host to C:\Volume03 inside the container.

The container host keeps all data volumes and mounted volumes in the C:\ProgramData\Docker\Volumes folder.

Download files from the internet using Wget

The third way to copy files to Windows containers is to download them from a URL. If the files are hosted on a web portal, use the Wget command (a PowerShell alias for Invoke-WebRequest) inside the container to fetch them and save them to disk, as shown in the command below:

wget -Uri https://download.thisfile.com/files/File1.zip -OutFile File1.zip -UseBasicParsing

Copy files from Windows containers to the host

Generally speaking, you won’t need to use the reverse copy method, as the data volumes are hosted outside the container, but you should know how, just in case. You can use the Docker CP command as shown below:

docker cp thiscontainer:/path/to/file /destinationhost/tofilelocation

It’s important to understand that the Docker tooling follows Linux conventions. When executing commands against the container, you must type the commands in lowercase; this isn’t the case with the Windows OS, where you can use commands in both uppercase and lowercase. Similarly, when specifying a path inside the container, use / instead of \.
