
The business benefits of enterprise data governance and MDM

With seemingly overwhelming amounts of data coming from myriad sources, the need for effective enterprise data governance strategies is of paramount importance to many organizations.

Enterprise data governance has many facets and can often intersect with master data management (MDM) efforts. That convergence was on display at Informatica’s MDM 360 and Data Governance virtual summit hosted on March 19.

The enterprise cloud data management vendor, based in Redwood City, Calif., has been particularly active in recent months, hiring a new CEO in January and expanding the company’s product portfolio with updated governance, data catalog and analytics capabilities.

“We all want tomorrow’s data yesterday, to make a decision for today,” Informatica CEO Amit Walia said during the event’s opening keynote.

Informatica’s virtual conference was among the many similar events that tech vendors have held or are planning to substitute for in-person events canceled because of the coronavirus pandemic.

One notable tech conference producer, O’Reilly Media, sponsor of the Strata Data and AI conferences, among others, said March 24 it is closing its in-person conference business altogether because of the pandemic.

Informatica CEO Amit Walia

How Hertz is mastering enterprise data governance and management

Meanwhile, with its global car rental operations, Hertz Corporation possesses a lot of data that it needs to collect and govern, for some 100 million customers and a fleet of nearly a million vehicles.

We all want tomorrow’s data yesterday, to make a decision for today.
Amit Walia, CEO, Informatica

Speaking at the virtual event, Richard Leaton, master data leader at Hertz, outlined the challenges his organization faces and the best practices for data governance and data management Hertz has used.

“The overall business objective of MDM, from an IT perspective, was a $1 billion transformation, changing our reservation system, rental system, sales engine and fleet management,” Leaton said. “If it had an electronic component to it, I think we changed it.”

As part of that effort Hertz needed to improve data quality and data governance, so there could be a single source of information for customer and fleet vehicle data.

Leaton noted that when he joined Hertz in 2017, the company had multiple sets of customer and vehicle master data sources and 30 years of mainframe-based proprietary databases. The systems were highly customized, not easy to upgrade and not uniformly governed.

Leaton emphasized that Hertz started with a process to engage all the right constituencies in the business.

“Data is an asset,” he said. “Data can have a real hard number committed to it, and when you have hard numbers associated with a data program, you’re going to have people who are helping you to make that data program successful.”

The technology should be the easy part of data transformation, Leaton said. The business processes, the buy-in and making sure the right data quality is present become the hard parts.

Enterprise data governance is the key to master data management

The first step for enabling MDM is to start with data governance, according to Leaton.

“If you don’t have your terms defined, you can’t build an MDM suite effectively,” Leaton said. “We were partway along the governance journey and started into MDM the first time and that’s where we ran into trouble.”

Hertz IT managers thought that they had defined enterprise data governance terms, but they came to realize that the terms were not agreed upon across the multiple platforms of the business.

Securing executive buy-in for defining data governance across an organization is critical, Leaton said. He also emphasized that financial metrics and business value need to be associated with the effort. Business leaders need to understand what the business will get out of a data governance effort. It’s not enough just to want good data; leaders need to define terms.

The defined terms for data governance can outline how the effort will help ensure regulatory compliance and how it will help to grow the business because all the systems talk to each other and there is better operational efficiency.

Data governance at Invesco

Rich Turnock, global head of enterprise data services at financial services firm Invesco, based in Louisville, Ky., also has a structured process for data governance.

The Invesco enterprise data platform incorporates three core steps for data governance and quality. In the planning phase, much like at Hertz, Turnock said the organization needs to define and document data requests in terms of business outcomes.

In the capture phase of data, enterprise data governance policies for mapping and cataloging data are important. For data delivery, Turnock said data output should be delivered in the agreed upon format and with preferred mechanisms that were defined up front in the planning process.

Using data to improve healthcare at Highmark Health

Using enterprise data governance and MDM best practices isn’t just about improving business outcomes. Those best practices can also improve healthcare.

Also at the Informatica virtual event, Anthony Roscoe, director of enterprise data governance at Highmark Health in Pittsburgh, explained how his organization embraced data governance and MDM. The key challenge for Highmark Health is that the organization had grown via acquisitions and ended up with multiple disparate data systems.

Operational integration of data is also part of Highmark Health’s data journey, making sure that clinical data from health systems can be correlated with health plans. It’s an approach that Roscoe said can help to streamline care decisions between the health insurance and care delivery portions of Highmark Health’s business.

The overriding goal of Highmark Health’s enterprise data platform is to take all the individual parts, find where the organization needs to gather data from so it can be organized, and ultimately govern the data so that appropriate access is in place.

“Mastering the data so that we speak a common language across the entire enterprise is key,” Roscoe said. “Speaking the same language can deliver accurate data statements and reports and other metrics across the different business units.”

Go to Original Article
Author:

For Sale – ASRock Z97, 4690k & 16gb DDR3 2400mhz

It would work together yes and I have an unused stock Intel cooler. I would need £155 + postage for that bundle, with the ssd.

I would rather keep the bundles as they are though. The Z97, k-processor and ram are better suited together. You can’t overclock and take advantage of the 4690k on an H81, so the Xeon would be better in the H81, and the DDR3 is far too fast to waste at standard 1600 speeds in the H81.

The 4690k is 3.5GHz boosting to 3.9GHz, 4 cores, 4 threads.
The Xeon is 3.5GHz boosting to 3.8GHz, 4 cores, 8 threads, so there is very little difference single core and it is much faster multi core with 8 threads and more cache.

I can do the Xeon with the H81 board & SSD for £110 delivered and you could buy RAM from CeX (UK). I think I have a GeForce 8800 I can throw in for free with it too. Need to have a hoke in my box of bits.

That would have you set up for under £145


Three Years of Microsoft Teams

Microsoft Teams at 3: Everything you need to connect with your teammates and be more productive

This week marks the third anniversary of Microsoft Teams. It’s been an incredible three years, and we’re inspired to see the way organizations across the globe are using Teams to transform the way they work. Today, we’re sharing some new Teams capabilities across a few different aspects of the Teams experience, many with a tie to meetings.

Read more

Author: Microsoft News Center

For Sale – Gaming PC: 7700k/16GB/Vega56/NVME SSD Full specs inside

So I’ve gone back to team red out of a need to do hefty video editing, and as such I’m selling my old PC, which was mostly used for gaming:

Intel Core i7 7700K (delid + liquid metal) – Cooled by a Noctua NH-D15
16GB Corsair Dominator Platinum 3200MHz C16
Asus Intel ROG STRIX Z270E Gaming
Samsung 960 EVO M.2 SSD (250GB)
Radeon Vega 56 8GB HBM2 (Purchased April 2019 so still within warranty)
Corsair Crystal 460X Midi Tower
BeQuiet PurePower 11 500W 80Plus Bronze PSU

If I spec up a PC today on OcUK at the same or lower spec parts it comes to £793, so this is available for £600 collected.

Also available are 2x WD RED 6TB NAS hard drives for £130 each. Replaced by larger capacity drives so working fine just no longer used. S.M.A.R.T tests etc all available.


Microsoft Azure Peering Services Explained

In this blog post, you’ll discover everything you need to know about Microsoft Azure Peering Services, a networking service introduced during Ignite 2019.

Microsoft explains the service within their documentation as follows:

Azure Peering Service is a networking service that enhances customer connectivity to Microsoft cloud services such as Office 365, Dynamics 365, software as a service (SaaS) services, Azure, or any Microsoft services accessible via the public internet. Microsoft has partnered with internet service providers (ISPs), internet exchange partners (IXPs), and software-defined cloud interconnect (SDCI) providers worldwide to provide reliable and high-performing public connectivity with optimal routing from the customer to the Microsoft network.

To be honest, Microsoft explained the service well, but what’s behind the explanation is much more complex. To understand Azure Peering Services and its benefits, you need to understand how peering, routing, and connectivity for internet providers work.

What Are Peering And Transit?

In the internet and network provider world, peering is the interconnection of separate, independent internet networks to exchange traffic between users within their respective networks. Peering, or partnering, is a free agreement between two providers: normally each provider pays only for its cross-connect in the datacenter and its colocation space, and neither party pays for traffic. There are exceptions, however, such as agreements between smaller and larger providers.

Normally you have the following agreements:

  • between equal providers or peering partners – traffic upload and download between the two networks is free for both parties
  • between a larger provider and a smaller provider – the smaller provider pays a fee for transit traffic to the larger network provider
  • providers that transit another network to reach a third-party network (upstream service) – the provider using the upstream pays a fee for transit traffic to the upstream provider
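As a toy model of the three settlement rules above (relationship labels and wording are mine, not taken from any real peering agreement), the logic can be sketched as:

```python
# Toy model of the three settlement rules above. Relationship labels and
# wording are illustrative, not from any real peering contract.

def who_pays(relationship: str) -> str:
    """Return the settlement outcome for a given provider relationship."""
    rules = {
        "equal-peers": "nobody pays for traffic; each side covers its own "
                       "cross-connect and colocation",
        "small-to-large": "the smaller provider pays the larger one for transit",
        "upstream-transit": "the provider using the upstream pays the "
                            "upstream provider",
    }
    return rules[relationship]

print(who_pays("equal-peers"))
```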

An agreement between two or more networks to peer is instantiated by a physical interconnection of the networks, an exchange of routing information through the Border Gateway Protocol (BGP) and, in some special cases, a formalized contractual document. These documents are called peering policies and Letters of Authorization (LOAs).

Fun Fact – As a peering partner for Microsoft, you can easily configure the peering through the Azure Portal as a free service.

As you can see in the screenshot, Microsoft is very restrictive with its routing and peering policies. That prevents unwanted traffic and protects Microsoft customers when peering for Azure ExpressRoute (AS12076).

Routing and peering policies for Azure ExpressRoute.

Now let’s talk a bit about the different types of peering.

Public Peering

Public peering is configured over the shared platform of an Internet Exchange Point. Internet Exchanges charge a port and/or membership fee for using their platform to interconnect.

If you are a small cloud or network provider with limited infrastructure, peering via an Internet Exchange is a good place to start. For big players in the market, it is also a good choice because it reaches smaller networks over a short path. The picture below shows an example of those prices, taken from the Berlin Commercial Internet Exchange pricing page.

Berlin Commercial Internet Exchange Pricing

Hurricane Electric offers a tool that gives you a peering map and more information about how a provider is publicly peered with other providers, but it will not show private peering. The picture below shows some examples for Microsoft AS 8075.

Microsoft AS 8075 peering

Private Peering

Private peering is a direct physical link between two networks, commonly made up of one or more 10GbE or 100GbE links. The connection runs from one network to the other, and each side pays a set fee to the owner of the infrastructure or colocation that is used; those costs are usually cross-connects within the datacenter. That makes private peering a good choice when you need to send large volumes of traffic to one specific network: measured per transferred gigabyte between the two networks, it is much cheaper than public peering. When peering privately with providers, you may still need to follow their peering policies.

A good provider also has a looking glass where you can get more insights into peerings, but we will look at this later on.

Transit and Upstream

When someone uses transit, the provider itself has no access to the destination network and must leverage other networks or network providers to reach the destination network and service. The providers that supply the transit are known as transit providers, with the largest networks considered Tier 1 networks. As a network provider for cloud customers like Microsoft, you don’t want any transit routing: transitive routing through other networks is normally expensive, and worse, it adds latency and an uncontrollable path between your customers and the cloud services. So, when handling cloud customers, avoid transit routing and peer directly with cloud providers, through either private or public interconnects at interconnect locations.

That is one reason why Microsoft is working with Internet Exchanges and network and internet providers to enable services like Azure Peering Service. It gives customers more control over how they reach Microsoft services, including Azure, Microsoft 365, Xbox, etc. To understand the impact, you also need to know how service providers route traffic, which is what the next part of the post covers.

How Do Internet Service Providers Route Your Traffic?

When you look at routing, there are mostly two options within a carrier network. The first is cold potato, or centralized, routing: the provider keeps traffic within its own network as long as possible before handing it to a third party. The other option is hot potato, or decentralized, routing: the provider hands traffic to the third party as quickly as possible, usually in the same metro.

The picture below illustrates the difference between hot and cold potato routing.

cold and hot potato routing differences

As you can see in the drawing, cold potato routing takes a longer path through the provider network, and therefore to your target, e.g. Microsoft.

Those routing configurations have a large impact on your cloud performance, because every kilometer of distance adds latency: roughly 1 ms for every 200 kilometers. As a result, you may see degraded voice quality during Teams meetings or synchronization issues with backups to Azure.
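A back-of-the-envelope check of that rule of thumb (the distances below are invented for illustration):

```python
# Rule of thumb from the text: roughly 1 ms of one-way latency per 200 km
# of fibre path. Distances below are invented example values.

def added_latency_ms(path_km: float, ms_per_200km: float = 1.0) -> float:
    """Estimate the one-way latency contributed by a given fibre distance."""
    return path_km / 200.0 * ms_per_200km

# A hot-potato hand-off in the same metro vs. a cold-potato detour:
hot = added_latency_ms(200)     # 1.0 ms
cold = added_latency_ms(1000)   # 5.0 ms
print(f"cold potato penalty: {cold - hot:.1f} ms")
# → cold potato penalty: 4.0 ms
```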

Microsoft has a big agenda to address that issue for its customers and the rest of the globe. You can read more about the plans in articles from Yousef Khalidi, Corporate Vice President, Microsoft Azure Networking.

Now let’s start with Peering Services and how it can change the game.

What Is Azure Peering Services and How Does It Solve the Issue?

When you look at how the service is designed, you can see that it leverages all of Microsoft’s provider peerings with AS 8075. Together with the Azure Peering Services partners, Microsoft can change the default routing and transit behavior toward its services when you use a partner provider.

Following the picture below, you can set up routing so that traffic from your network to Azure (or other networks) uses the Microsoft global backbone instead of a transit provider without any SLA.

What is Azure Peering Services

With the service enabled, performance to Microsoft services will increase and latency will be reduced, depending on the provider. As you would expect, services like Office 365 or Azure AD profit from this, but there is more. If, for example, you build your backbone on the Microsoft global transit architecture with Azure Virtual WAN and use internet connections from these providers and Internet Exchange partners, you directly boost your network performance and gain a pseudo-private network: because you now use private or public peering with route restrictions, your backbone traffic bypasses the regular internet and flows through the Microsoft global backbone from A to B.

Let me try to explain it with a drawing.

Microsoft global backbone network

In addition to better performance, you also get an additional layer of monitoring. While the regular internet is a black box regarding dataflow, performance, etc., with Microsoft Azure Peering Services you get full operational monitoring of your wide area network through the Microsoft backbone.

You can find this information in the Azure Peering Services Telemetry Data.

The screenshot below shows the launch partner of Azure Peering Services.

Launch partner of Azure Peering Services

When choosing a network provider for your access to Microsoft, you should follow this guideline:

  • Choose a provider that is well peered with Microsoft
  • Choose a provider with hot potato routing toward Microsoft
  • Don’t let price alone decide the provider; a good network costs money
  • Choose Dedicated Internet Access over a regular internet connection whenever possible
  • If possible, use local providers instead of global ones
  • A good provider always has a looking glass or can provide you with default routes between a city location and other peering partners; if not, it is not a good provider to choose

So, let’s learn about the setup of the service.

How to configure Azure Peering Services?

First, you need to understand that, as with Azure ExpressRoute, there are two sides to contact and configure.

You need to follow the steps below to establish a Peering Services connection.

Step 1: The customer provisions connectivity from a connectivity partner (no interaction with Microsoft). This gets you an internet provider that is well connected to Microsoft and meets the technical requirements for performant and reliable connectivity to Microsoft. Again, you should check the partner list.
Step 2: The customer registers locations in the Azure portal. A location is defined by the ISP/IXP name, the physical location of the customer site (state level), and the IP prefix given to the location by the service provider or the enterprise. As a service from Microsoft, you then get telemetry data such as internet route monitoring and traffic prioritization from Microsoft to the user’s closest edge location.

The registration of the locations happens within the Azure Portal.

Currently, you need to register for the public beta first. That happens with some simple PowerShell commands.

Using Azure PowerShell 

Using Azure CLI
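The original command snippets did not survive extraction. As a hedged sketch, the Azure CLI flow looks roughly like the following; the command names follow the `az peering` CLI extension as documented around the service’s preview, and the resource names are hypothetical, so verify every flag against the current documentation before use:

```shell
# Sketch only: "az peering" commands per the preview-era docs; names below
# (resource group, service name, provider) are hypothetical placeholders.
az extension add --name peering

# Discover partner providers and supported countries
az peering service provider list --output table
az peering service country list --output table

# Register a location as a Peering Service resource
az peering service create \
    --resource-group MyPeeringRG \
    --name MyPeeringService \
    --location westus \
    --peering-service-location Washington \
    --peering-service-provider ProviderName
```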

Afterward, you can configure the service using the Azure Portal, Azure PowerShell, or Azure CLI.

You can find the responsive guide here.

Once the service reaches general availability (GA), customers will also receive SLAs on the peering and telemetry service. Currently, there is no SLA and no support if you use the service in production.

Peering and Telemetry service

Closing Thoughts

From reading this article you now have a better understanding of Microsoft Azure Peering Services and its use, of peering between providers, and of routing and traffic behavior within the internet. When digging deeper into Azure Peering Services, you should now be able to develop some architectures and ideas for how to use the service.

If any of your providers are not aware of the service or of direct peering with Microsoft AS 8075, point them to http://peering.azurewebsites.net/ or have them drop an email to [email protected]

Using the BGP tools from Hurricane Electric, you can get information about providers peering with Microsoft. One thing you need to know: most of Microsoft’s 3,500 network partners peer privately with Microsoft, and the Hurricane Electric tools only observe the public peering partners.

Author: Florian Klaffenbach

Sigma analytics platform’s interface simplifies queries

In desperate need of data dexterity, Volta Charging turned to the Sigma analytics platform to improve its business intelligence capabilities and ultimately help fuel its growth.

Volta, based in San Francisco and founded in 2010, is a provider of electric vehicle charging stations, and three years ago, when Mia Oppelstrup started at Volta, the company faced a significant problem.

Because there aren’t dedicated charging stations the same way there are dedicated gas stations, Volta has to negotiate with organizations — mostly retail businesses — for parking spots where Volta can place its charging stations.

Naturally, Volta wants its charging stations placed in the parking spots with the best locations near the business they serve. But before an organization gives Volta those spots, Volta has to show that it makes economic sense, that by putting electric car charging stations closest to the door it will help boost customer traffic through the door.

That takes data. It takes proof.

Volta, however, was struggling with its data. It had the necessary information, but finding the data and then putting it in a digestible form was painstakingly slow. Queries had to be submitted to engineers, and those engineers then had to write code to transform the data before delivering a report.

Any slight change required an entirely new query, which involved more coding, time and labor for the engineers.

But then the Sigma analytics platform transformed Volta’s BI capabilities, Volta executives said.

Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.
Mia Oppelstrup, business intelligence manager, Volta Charging

“If I had to ask an engineer every time I had a question, I couldn’t justify all the time it would take unless I knew I’d be getting an available answer,” said Oppelstrup, who began in marketing at Volta and now is the company’s business intelligence manager. “Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.”

Metrics, Oppelstrup added, that she’d never be able to find on her own.

“It’s huge for someone like me who never wrote code,” Oppelstrup said. “It would otherwise be like searching a warehouse with a forklift while blindfolded. You get stuck when you have to wait for an engineer.”

Volta looked at other BI platforms — Tableau and Microsoft’s Power BI, in particular — but just under two years ago chose Sigma and has forged ahead with the platform from the 2014 startup.

The product

Sigma Computing was founded by the trio of Jason Frantz, Mike Speiser and Rob Woollen.

Based in San Francisco, the vendor has gone through three rounds of financing and to date raised $58 million, most recently attracting $30 million in November 2019.

When Sigma was founded, and ideas for the Sigma analytics platform first developed, it was in response to what the founders viewed as a lack of access to data.

“Gartner reported that 60 to 73 percent of data is going unused and that only 30 percent of employees use BI tools,” Woollen, Sigma’s CEO, said. “I came back to that — BI was stuck with a small number of users and data was just sitting there, so my mission was to solve that problem and correct all this.”

Woollen, who previously worked at Salesforce and Sutter Hill Ventures — a main investor in Sigma — and his co-founders set out to make data more accessible. They set out to design a BI platform that could be used by ordinary business users — citizen data scientists — without having to rely so much on engineers, and one that responds quickly no matter what queries users ask of it.

Sigma launched the Sigma analytics platform in November 2018.

Like other BI platforms, Sigma — entirely based in the cloud — connects to a user’s cloud data warehouse in order to access the user’s data. Unlike most BI platforms, however, the Sigma analytics platform is a low-code BI tool that doesn’t require engineering expertise to sift through the data, pull the data relevant to a given query and present it in a digestible form.

A key element of that is the Sigma analytics platform’s user interface, which resembles a spreadsheet.

With SQL running in the background to automatically write the necessary code, users can simply make entries and notations in the spreadsheet and Sigma will run the query.
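As an illustration of that idea (not Sigma’s actual engine; the table, columns, and data below are invented), a spreadsheet-style “group by location and sum” action ultimately executes as ordinary generated SQL against the warehouse:

```python
import sqlite3

# Illustration only: a pivot the user builds in a spreadsheet-like UI runs
# as generated SQL underneath. Table, columns, and rows here are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (location TEXT, week INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("SF", 1, 100.0), ("SF", 2, 150.0), ("NY", 1, 80.0)],
)

# What the user manipulates as a pivot; what executes is generated SQL:
generated_sql = """
    SELECT location, SUM(amount) AS total
    FROM sales
    GROUP BY location
    ORDER BY location
"""
print(conn.execute(generated_sql).fetchall())
# → [('NY', 80.0), ('SF', 250.0)]
```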

“The focus is always on expanding the audience, and 30 percent employee usage is the one that frustrates me,” Woollen said. “We’re focused on solving that problem and making BI more accessible to more people.”

The interface is key to that end.

“Products in the past focused on a simple interface,” Woollen said. “Our philosophy is that just because a businessperson isn’t technical that shouldn’t mean they can’t ask complicated questions.”

With the Sigma analytics platform’s spreadsheet interface, users can query their data, for example, to examine sales performance in a certain location, time or week. They can then tweak it to look at a different time, or a different week. They can then look at it on a monthly basis, compare it year over year, add and subtract fields and columns at will.

And rather than file a ticket to the IT department for each separate query, they can run the query themselves.

“The spreadsheet interface combines the power to ask any question of the data without having to write SQL or ask a programmer to do it,” Woollen said.

Giving end users power to explore data

Volta knew it had a data dexterity problem — an inability to truly explore its data given its reliance on engineers to run time- and labor-consuming queries — even before Oppelstrup arrived. The company was looking at different BI platforms to attempt to help, but most of the platforms Volta tried out still demanded engineering expertise, Oppelstrup said.

The outlier was the Sigma analytics platform.

“Within a day I was able to set up my own complex joins and answer questions by myself in a visual way,” Oppelstrup said. “I always felt intimidated by data, but Sigma felt like using a spreadsheet and Google Drive.”

One of the significant issues Volta faced before it adopted the Sigma analytics platform was the inability of its salespeople to show data when meeting with retail outlets and attempting to secure prime parking spaces for Volta’s charging stations.

Because of the difficulty accessing data, the salespeople didn’t have the numbers to prove that placing charging stations near the door would increase customer traffic.

With the platform’s querying capability, however, Oppelstrup and her team were able to make the discoveries that armed Volta’s salespeople with hard data rather than simply anecdotes.

They could now show a bank a surge in the use of charging stations near banks between 9 a.m. and 4 p.m., movie theaters a similar surge in the use just before the matinee and again before the evening feature, and grocery stores a surge near stores at lunchtime and after work.
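A surge analysis like the one described above can be sketched with invented session data: bucket charging sessions by hour of day and find the peak.

```python
from collections import Counter
from datetime import datetime

# Toy version of the analysis described above, with invented session data:
# bucket charging sessions by hour of day to reveal usage surges at a venue.
sessions = [
    datetime(2020, 3, 2, 9, 15), datetime(2020, 3, 2, 10, 5),
    datetime(2020, 3, 2, 12, 30), datetime(2020, 3, 2, 12, 45),
    datetime(2020, 3, 2, 12, 50), datetime(2020, 3, 2, 18, 10),
]

by_hour = Counter(s.hour for s in sessions)
peak_hour, peak_count = by_hour.most_common(1)[0]
print(f"busiest hour: {peak_hour}:00 with {peak_count} sessions")
# → busiest hour: 12:00 with 3 sessions
```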

They could also show that the charging stations were being used by actual customers, and not by random people charging up their vehicles and then leaving without also going into the bank, the movie theater or the grocery store.

“It’s changed how our sales team approaches its job — it used to just be about relationships, but now there’s data at every step,” Oppelstrup said.

Sigma enables Oppelstrup to give certain teams access to certain data, everyone access to other data, and importantly, easily redact data fields within a set that might otherwise prevent her from sharing information entirely, she said.

And that gets to the heart of Woollen’s intent when he helped start Sigma — enabling business users to work with more data and giving more people that ability to use BI tools.

“Access leads to collaboration,” he said.


For Sale – Parts Clear Out (Motherboards, Memory, CPUs, GPUs and Case) ***PRICE DROPS***

Due to a recent upgrade, and the need to clear some space in the garage, I’ve got the following up for sale.

ADDED:

Current build bundle – £220.00
Asus Z97 Pro Gamer
LGA 1150 – Intel Core i5 4690K

Motherboards:
MSI Z87 GD65 Used £65.00 £55.00
MSI Z170I ITX Used £95.00 £90.00

DDR3:
8GB Corsair Vengeance Pro – 2133Mhz (2x4GB) Used £35.00
16GB HyperX Savage Red – 2400Mhz (2x8GB) Used £50.00
SOLD to scott178

DDR4:
16GB Corsair Low Profile Black – 2400Mhz (2x8GB) Used £45.00

Intel Processors:
LGA 1150 – Intel Core i5-4670K Used £65.00 £55.00
LGA 1151 – Intel Core i5-6600 Used £90.00 £85.00

AMD Graphics Cards:
XFX AMD R9 390 – 8GB Used £75.00 SOLD to Jeeva

Nvidia Graphics Cards:
MSI – GTX 660Ti 2GB Used £45.00 £35.00
MSI – GTX 570 2GB Used £35.00 £25.00
Pulled from Sale

Mice:
Razer Mamba Elite 2016 Wireless Used £60.00

Cases:
Phanteks Evolve ITX Used £40.00 £35.00 (Collection Only)

Coolers:
Corsair H50 Used £40.00 £35.00
Corsair H80i Used £50.00 £45.00

Most items will be boxed in their original retail or OEM packaging.

I will be updating this thread as I discover anything else that I no longer require.

Open to offers.

Price and currency: £845
Delivery: Delivery cost is not included
Payment method: BT/PPG
Location: Oxford
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


Wanted – Looking for a laptop for as cheap as possible, must have 16gb ram. (To play Football Manager 2020)

Why do you need so much RAM for Football Manager? If of any interest I have a Toshiba Z40 ultrabook, specs are:
* Intel Core i5-4210 Processor up to 2.7Ghz
* 12GB DDR3 RAM
* 128GB Original Toshiba SSD Drive
* Windows 7 Professional 64Bit(upgraded to windows 10)
* Built-In Bluetooth
* HDMI Port
* VGA Port
* 3x USB 3.0 Ports
* 3.5mm Headphone jack
* Ethernet Port
* SD Card Reader
* Original Toshiba Battery and charger
Not sure if its enough to run FM2020?

