For Sale – Huge clearout – PCs, laptops, components, RAM/GPUs

It’s time for a serious clearout! I haven’t had time to go through each individual set of parts yet; if there’s interest, I’ll clean and test each one before sale.

I’ve been using work laptops and have been too busy with work to bother much with the hardware below! I’m still deciding what to keep and what to sell, and it will take me a little while to clear data off some machines. I do still need 2 PCs in the future, but I’m taking this opportunity as a hard reset of my lineup.

This list will be updated as and when I get time / locate items.

Laptops
Lenovo E570 i3-6006u, 4gb DDR4 (I have more RAM below if needed). 500gb HDD, DVD writer, 15.6” screen. £200
Lenovo Yoga 12 i7 / 8gb / 250gb ssd £300
IBM T41 (specs TBC)

PCs
Dell XPS 8700, i7 4770, 12gb, GTX 645. £420
Dell Studio 540s SFF PC (ideal as HTPC), C2Q Q8300, 4gb, discrete half-height GPU (can’t remember which but will confirm) £100
HP Media Centre PC M7000 (includes the removable HDD in the bay at the front!) £100
Mesh Q6600 Elite, C2Q Q6600, 4gb (I think), no GPU £80
Thermaltake build – C2Q Q6600, 7600GS, RAM TBC £80

GPUs
2x Gigabyte 7970 GHz edition – boxed – barely seen any use in the past 4 years as I’ve been using a work laptop. £80ea
Palit GTX 780 £80
Palit GTX 980ti £200

SSDs
240gb Kingston KC300 £30
500gb Crucial MX500 £50
1.92tb Sandisk Ultra 900 USBC external drive brand new boxed £600

HDDs
3tb Seagate £35
Assorted 500gb-4tb 3.5” drives (5+ drives in total)
Assorted 60-500gb 2.5” drives (10+ drives in total)

Optical drives
5.25” DVD writers
Laptop DVD writers (I’ve amassed about 25 of these, please post if needed)

RAM
1x8gb DDR4 SODIMM £25
1x16gb DDR4 SODIMM £55
4x4gb Corsair Dominator 2133mhz DDR3 £100
4x8gb Patriot DDR3 £125
Assorted DDR1/DDR2 (including ECC) e.g. 4x512mb DDR2 ECC £10

Cases
NZXT Phantom £30 (boxed, some yellowing with age but it’s easy to get this back to white)

Components
Corsair AX850 PSU with black cable set £75
Corsair red cable set for AX650/AX750/AX850 £50
Corsair AX1200 PSU with cables £150
Sabertooth X79 mobo (found the manual and driver disc but no box) £80
Rampage IV Extreme X79 mobo (boxed with OC-key and manuals) £200

Corsair H100i £60
Corsair H100 £50
I7 3960x £170
I7 3930k £60

Misc
Trendnet Powerline AV500 adapter x2 boxed, £20
Netgear EX6120 Wifi AC1200 extender £30
Plantronics Calisto P610M USB speakerphone (Skype, Zoom etc.) – brand new boxed, 2 available, £60 each.

Go to Original Article
Author:

Q&A: Recounting the rough-and-tumble history of PowerShell

To examine the history of PowerShell requires going back to a time before automation, when point-and-click administration ruled.

In the early days of IT, GUI-based systems management was de rigueur in a Windows environment. You added a new user by opening Active Directory, clicking through multiple screens to fill in the name, group membership, logon script and several other properties. If you had dozens or hundreds of new users to set up, it could take quite some time to complete this task.

To increase IT efficiency, Microsoft produced a few command-line tools. These initial automation efforts in Windows — batch files and VBScript — helped, but they did not go far enough for administrators who needed undiluted access to their systems to streamline how they worked with the Windows OS and Microsoft’s ever-growing application portfolio.

It wasn’t until PowerShell came out in 2006 that Microsoft gave administrators something that approximated shell scripting in Unix. PowerShell is both a shell — used for simple tasks such as gathering the system properties on a machine — and a scripting language to execute more advanced infrastructure jobs. Each successive PowerShell release came with more cmdlets, updated functionality and refinements to further expand the administrator’s dominion over Windows systems and their users. In some products, certain adjustments can only be made via PowerShell.
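For a flavour of those two modes, here is a minimal sketch (not from the interview): an interactive one-liner, then a scripted bulk task. The CSV layout and `New-ADUser` parameters are illustrative, and the loop assumes the ActiveDirectory module is available.

```powershell
# Shell mode: a one-off, interactive query of system properties.
Get-CimInstance -ClassName Win32_OperatingSystem |
    Select-Object Caption, Version, LastBootUpTime

# Scripting mode: the bulk user onboarding that once meant hours of clicking.
# Requires the ActiveDirectory module; the CSV columns are hypothetical.
Import-Csv -Path '.\newusers.csv' | ForEach-Object {
    New-ADUser -Name $_.Name -SamAccountName $_.Sam -Enabled $true
}
```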

Today, administrators widely use PowerShell to manage resources both in the data center and in the cloud. It’s difficult to comprehend now, but shepherding a command-line tool through the development gauntlet at a company that had built its brand on the Windows name was a daunting proposition.

Don Jones

Don Jones currently works as vice president of content partnerships and strategic initiatives for Pluralsight, a technology skills platform vendor, but he’s been fully steeped in PowerShell from the start. He co-founded PowerShell.org and has presented PowerShell-related sessions at numerous tech conferences.

Jones is also an established author and his latest book, Shell of an Idea, gives a behind-the-scenes history of PowerShell that recounts the challenges faced by Jeffrey Snover and his team.

In this Q&A, Jones talks about his experiences with PowerShell and why he felt compelled to cover its origin story.

Editor’s note: This interview has been edited for length and clarity.

What was it like for administrators before PowerShell came along?

Don Jones: You clicked a lot of buttons and wizards. And it could get really painful. Patching machines was a pain. Reconfiguring them was a pain. And it wasn’t even so much the server maintenance — it was those day-to-day tasks.

I worked at Bell Atlantic Network Integration for a while. We had, maybe, a dozen Windows machines and we had one person who basically did nothing but do new user onboarding: creating the domain account and setting up the mailbox. There was just no better way to do it, and it was horrific.

I started digging into VBScript around the mid-1990s and tried to automate some of those things. We had a NetWare server, and you periodically had to log on, look for idle connections and disconnect them to free up a connection for another user if we reached our license limit. I wrote a script to do something a human being was sitting and doing manually all day long.

This idea of automating — that was just so powerful, so tremendous and so life-affirming that it became a huge part of what I wound up doing for my job there and jobs afterward.

Do you remember your introduction to PowerShell?

Jones: It was at a time when Microsoft was being a little bit more free talking about products they were working on. There was a decent amount of buzz about this Monad shell, which was its code name. I felt this was clearly going to be the next thing and, from what they were saying, would probably replace VBScript.

I was working with a company called Sapien Technologies at the time. They produce what is probably still the most popular VBScript code editor. I said, ‘We’re clearly going to have to do something for PowerShell,’ and they said, ‘Absolutely.’ And PrimalScript was, I think, the first non-Microsoft tool that really embraced PowerShell and became part of that ecosystem.

That attracted the attention of Jeffrey Snover at Microsoft. He said, ‘We’re going to launch PowerShell at TechEd Europe 2006 in Barcelona, [Spain], and I’d love for you to come up and do a little demo of PrimalScript. We want to show people that this is ready for prime time. There’s a partner ecosystem. It’s the real deal, and it’s safe to jump on board.’

That’s where I met him. That was the first time I got to present at a TechEd and that set up the next large chapter of my career.

What motivated you to write this book?

Jones: I think I wanted to write it six or seven years ago. I remember being either at a TechEd or [Microsoft] Ignite at a bar with [Snover], Bruce Payette and, I think, Ken Hansen. You’re at a bar with the bar-top nondisclosure agreement. And they’re telling these great stories. I’m like, ‘We need to capture that.’ And they say, ‘Yeah, not right now.’

I’m not sure what really spurred me. Partly, because my career has moved to a different place. I’m not in PowerShell anymore. I felt being able to write this history would be, if not a swan song, then a nice bookend to the PowerShell part of my career. I reached out to a couple of the guys again, and they said, ‘You know what? This is the right time.’ We started talking and doing interviews.

As I was going through that, I realized the reason it’s the right time is because so many of them are no longer at Microsoft. And, more importantly, I don’t think any of the executives who had anything to do with PowerShell are still at Microsoft. They left around 2010 or 2011, so there’s no repercussions anymore.

Regarding Jeffrey Snover, do you think if anybody else had been in charge of the PowerShell project that it would have become what it is today?

Jones: I don’t think so. By no means do I want to discount all the effort everyone else put in, but I really do think it was due to [Snover’s] absolute dogged determination, just pure stubbornness.

He said, ‘Bill Gates got it fairly early.’ And even Bill Gates getting it and understanding it and supporting it didn’t help. That’s not how it worked. [Snover] really had to lead them through some — not just people who didn’t get it or didn’t care — but people who were actively working against them. There was firm opposition from the highest levels of the company to make this stop.

Because you got in close to the ground floor with PowerShell, were you able to influence any of its functionality from the outside?

Jones: Oh, absolutely. But it wasn’t really just me. It was all the PowerShell MVPs. The team had this deep recognition that we were their biggest fans — and their biggest critics.

They went out of their way to do some really sneaky stuff to make sure they could get our feedback. As Windows Vista moved into Windows 7, there was a lot of secrecy. Microsoft knew it had botched — perceptually if nothing else — the Vista release. They needed Windows 7 to be a win, and they were keeping it really close to the vest. For them to show us anything that had anything to do with Windows 7 was verboten at the highest levels of the company. Instead they came up with this idea of the “Windows Vista update,” which was nothing more than an excuse to show us PowerShell version 3 without Windows 7 being in the context.

They wanted to show us workflows. They put us in a room and they not only let us play with it and gave us some labs to run through, but they had cameras running the whole time. They said, ‘Tell us what you think.’

I think nearly every single release of PowerShell from version 2 onward had a readme buried somewhere. They listed the bug numbers and the person who opened it. A ton of those were us: the MVPs and people in the community. We would tell the team, ‘Look, this is what we feel. This is what’s wrong and here’s how you can fix it.’ And they would give you fine-print credit. Even before it went open source, there was probably more community interaction with PowerShell than most Microsoft products.

I came from the perspective of teaching. By the time I was really in with PowerShell, I wasn’t using it in a production environment. I was teaching it to people. My feedback tended to be along the lines of, ‘Look, this is hard for people to grasp. It’s hard to understand. Here’s what you could do to improve that.’ And a lot of that stuff got adopted.

Was there any desire — or an offer — to join Microsoft to work on PowerShell directly?

Jones: If I had ever asked, it probably could have happened. I had had previous dealings with Microsoft, as a contractor, that I really enjoyed.

I applied for a job there — and I did not enjoy how that went down.

My feeling was that I was making a lot more money and having a lot more impact as an independent.

What is your take on PowerShell since it made the switch to an open source project?

Jones: It’s been interesting. PowerShell 6, the first cross-platform, open source release, was a big step backward in a lot of ways. Just getting it cross-platform was a huge step. You couldn’t take the core of PowerShell and, at that point, 11 years of add-on development, and bring it all with you at once. I think a lot of people looked at it as an interesting artifact.

[In PowerShell 7], they’ve done so much work to make it more functional. There’s so much parity now across macOS, Linux and Windows. I feel the team tripled down and really delivered and did exactly what they said they were going to do.

I think a lot more people take it seriously. PowerShell is now built into the Kali Linux distribution because it’s such a good tool. I think a lot of really hardcore, yet open-minded, Linux and Unix admins look at PowerShell and — once they take the time to understand it — they realize this is what shells structurally should have been.

I think PowerShell has earned its place in a lot of people’s toolboxes and putting it out there as open source was such a huge step.

Do you see PowerShell ever making any inroads with Linux admins?

Jones: I don’t think they’re the target audience. If you’ve got a tool that does the job, and you know how to use it, and you know how to get it done, that’s fine.

We have a lot of home construction here in [Las] Vegas. I see guys putting walls up with a hammer and nails. Am I going to force you to use a nail gun? No. Are you going to be a lot faster? Yes, if you took a little time to learn how to use it. You never see younger guys with the hammer; it’s always the older guys who’ve been doing this for a long, long time.

I feel that PowerShell has already been through this cycle once. We tried to convince everyone that you needed to use PowerShell instead of the GUI, and a lot of admins stuck with the GUI. That’s a fairly career-limiting move right now, and they’re all finding that out. They’re never going to go any further. The people who picked it up, they’re the ones who move ahead.

The very best IT people reach out for whatever tool you put in front of them. They rip it apart and they try to figure out how this is going to make my job better, easier, faster, different, whatever. They use all of them.

You don’t lose points for using PowerShell and Bash. It would be stupid for Linux administrators to fully commit to PowerShell and only PowerShell, because you’re going to run across systems that have this other thing. You need to know them both.

Microsoft has released a lot of administrative tools — you’ve got PowerShell, Office 365 CLI and Azure CLI to name a few. Someone new to IT might wonder where to concentrate their efforts when there are all these options.

Jones: You get a pretty solid command-line tool in the Azure CLI. You get something that’s very purpose-specific. It’s scoped in fairly tightly. It doesn’t have an infinite number of options. It’s a straightforward thing to write tutorials around. You’ve got an entire REST API that you can fire things off at. And if you’re a programmer, that makes a lot more sense to you and you can write your own tools around that.

PowerShell sits kind of in the middle and can be a little bit of both. PowerShell is really good at bringing a bunch of things together. If you’re using the Azure CLI, you’re limited to Azure. You’re not going to use the Azure CLI to do on-prem stuff. PowerShell can do both. Some people don’t have on-prem, they don’t need that. They just have some very simple basic Azure needs. And the CLI is simpler and easier to start with.

Where do you see PowerShell going in the next few years?

Jones: I think you’re going to continue to see a lot of investment both by Microsoft and the open source community. I think the open source people have — except for the super paranoid ones — largely accepted that Microsoft’s purchase of GitHub was not inimical. I think they have accepted that Microsoft is really serious about open source software. I think people are really focusing on making PowerShell a better tool for them, which is really what open source is all about.

I think you’re going to continue to see it become more prevalent on more platforms. I think it will wind up being a high common denominator for hiring managers who understand the value it brings to a business and some of the outcomes it helps achieve. Even AWS has invested heavily in their management layer in PowerShell, because they get it — also because a lot of the former PowerShell team members now work for AWS, including Ken Hansen and Bruce Payette, who invented the language.

I suspect that, in the very long run, it will probably shift away from Microsoft control and become something a little more akin to Mozilla, where there will be some community foundation that, quote unquote, owns PowerShell, where a lot of people contribute to it on an equal basis, as opposed to Microsoft, which is still holding the keys but is very engaged and accepting of community contributions.

I think PowerShell will probably outlive most of its predecessors over the very long haul.

Wanted – Ryzen Setup (CPU / MB / RAM) & NVIDIA GPU

Hi all,

My 3570k is getting a bit long in the tooth – time to look for an upgrade. Looking for a Ryzen CPU, Motherboard and RAM combo.

Also looking for a decent Nvidia GPU – something better than an RX 470. As much as I love the card (and AMD cards overall), the driver issues and getting ReLive to work reliably are a deal breaker for me and are driving me nuts. I never had any issues with ShadowPlay, so I’m looking to go back to the green side. Don’t mind a few gens old.

Basically all I do these days is play Modern Warfare anyhow! Would like something to be able to run it on Medium or above at 70FPS+.

Not looking to spend the world, but offer up and let me know what you have.

Thank you!

For Sale – Core i7-7700K, 32GB DDR4-2400, Asus Prime B250M-K, Carbide Air 240 White, 256GB SSD

Hello! Having upgraded my PC, it’s time to create some space. Note some key components are missing (PSU and GPU as I kind of need those!) so you’ll need to obtain those separately to make a complete PC.

Case: Corsair Carbide Air 240 White with window, boxed and in excellent condition

Mainboard: Asus Prime B250M-K, with box and complete

Processor: Intel Core i7-7700K, boxed version (unused Intel sticker!)

Cooler: Arctic Freezer i11, boxed

Memory: Corsair VENGEANCE LPX 32GB (2 x 16GB) DDR4 2400MHz C16, with box

Fans: 2 x Arctic Freezer F12 PWM PST case fans with PWM sharing (pass-thru connector), one boxed

SSD: Samsung 830 256GB SATA SSD, £35 inc. P&P

Currently the CPU, cooler, RAM and mainboard are mounted in the case. I would prefer collection of these components (recognising Covid distancing) as they are, but I’ll consider splitting if there are no takers for the combination. With the case fans, I’m asking for £400.

Thanks for looking, pictures will be coming soon…

The Acid Test for Your Backup Strategy

For the first several years that I supported server environments, I spent most of my time working with backup systems. I noticed that almost everyone did their due diligence in performing backups, and most people took adequate responsibility for verifying that their scheduled backups ran without error. However, almost no one ever checked that they could actually restore from a backup — until disaster struck. I gathered a lot of sorrowful stories during those years, and I want to use those experiences to help you avert a similar tragedy.

Successful Backups Do Not Guarantee Successful Restores

Fortunately, a lot of the problems that I dealt with in those days have almost disappeared due to technological advancements. But, that only means that you have better odds of a successful restore, not that you have a zero chance of failure. Restore failures typically mean that something unexpected happened to your backup media. Things that I’ve encountered:

  • Staff inadvertently overwrote a full backup copy with an incremental or differential backup
  • No one retained the necessary decryption information
  • Media was lost or damaged
  • Media degraded to uselessness
  • Staff did not know how to perform a restore — sometimes with disastrous outcomes

I’m sure that some of you have your own horror stories.

These risks apply to all organizations. Sometimes we manage to convince ourselves that we have immunity to some or all of them, but you can’t get there without extra effort. Let’s break down some of these line items.

People Represent the Weakest Link

We would all like to believe that our staff will never make errors and that the people who need to operate the backup system have the ability to do so. However, as part of your disaster recovery planning, you must assume that you cannot predict the state or availability of any individual. If only a few people know how to use your backup application, then those people become part of your risk profile.

You have a few simple ways to address these concerns:

  • Periodically test the restore process
  • Document the restore process and keep the documentation updated
  • Give non-IT personnel knowledge of, and practice with, backup and restore operations
  • Make sure non-IT personnel know how to get help with the application

It’s reasonable to expect that you would call your backup vendor for help in the event of an emergency that prevented your best people from performing restores. However, in many organizations without a proper disaster recovery plan, no one outside of IT even knows who to call. The knowledge inside any company naturally tends to arrange itself in silos, but you must make sure to spread at least the bare minimum information.

Technology Does Fail

I remember many shock and horror reactions when a company owner learned that we could not read the data from their backup tapes. A few times, these turned into grief and loss counselling sessions as they realized that they were facing a critical — or even complete — data loss situation. Tape has its own particular risk profile, and lots of businesses have stopped using it in favour of on-premises disk-based storage or cloud-based solutions. However, all backup storage technologies present some kind of risk.

In my experience, data degradation occurred most frequently. You might see this called other things, my favourite being “bit rot”. Whatever you call it, it means the same thing: the data currently on the media is not the same data that you recorded. That can happen simply because magnetic storage media have inherent susceptibilities: no one made any mistakes; the media just didn’t last. For any media type, we can establish average failure rates, but we have absolutely no guarantees about the shelf life of any individual unit. I have seen data pull cleanly off decade-old media; I have seen week-old backups fail miserably.

Unexpectedly, newer technology can make things worse. In our race to cut costs, we frequently employ newer ways to save space and time. In the past, we had only compression and incremental/differential solutions. Now, we have tools that can deduplicate across several backup sets and at multiple levels, which often places a great deal of reliance on a single copy of a bit.
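One inexpensive guard against silent degradation is to record checksums when a backup lands on media and to verify them later, well before a restore is ever needed. A minimal PowerShell sketch, where the backup path and manifest name are hypothetical:

```powershell
# Record SHA-256 hashes of everything in the backup set (run at backup time).
$backupRoot = 'D:\Backups\2020-06'          # hypothetical backup location
Get-ChildItem -Path $backupRoot -Recurse -File |
    Get-FileHash -Algorithm SHA256 |
    Export-Csv -Path "$backupRoot\manifest.csv" -NoTypeInformation

# Later: re-hash the media and flag anything that no longer matches.
Import-Csv -Path "$backupRoot\manifest.csv" | ForEach-Object {
    $current = (Get-FileHash -Path $_.Path -Algorithm SHA256).Hash
    if ($current -ne $_.Hash) {
        Write-Warning "Possible bit rot: $($_.Path)"
    }
}
```

Any mismatch means the media no longer holds the bytes you wrote — exactly the failure that a successful backup job will never report.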

How to Test your Backup Strategy

The best way to identify problems is to break-test for weaknesses. Test restores will help you verify backup reliability and solve these problems before they matter. Simply put, you cannot know that you have a good backup unless you can perform a good restore, and you cannot know that your staff can perform a restore unless they actually perform one. For maximum effect, plan tests to occur on a regular basis.

Some tools, like Altaro VM Backup, have built-in tools to make tests easy. Altaro VM Backup provides a “Test & Verify Backups” wizard to help you perform on-demand tests and a “Schedule Test Drills” feature to help you automate the process.

If your tool does not have such a feature, you can still use it to make certain that your data will be there when you need it. It should have some way to restore a separate or redirected copy. So, instead of overwriting your live data, you can create a duplicate in another place where you can safely examine and verify it.

Test Restore Scenario

In the past, we would often simply restore some data files to a shared location and use a simple comparison tool. Now that we use virtual machines for so much, we can do a great deal more. I’ll show one example of a test that I use. In my system, all of these are Hyper-V VMs. You’ll have to adjust accordingly for other technologies.

Using your tool, restore copies of:

  • A domain controller
  • A SQL server
  • A front-end server dependent on the SQL server

On the host that you restored those VMs to, create a private virtual switch. Connect each virtual machine to it. Spin up the copied domain controller, then the copied SQL server, then the copied front-end. Use the VM connect console to verify that all of them work as expected.
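The steps above can be sketched with the Hyper-V cmdlets. The VM names and switch name below are placeholders, and the script assumes the restored copies already exist on the test host:

```powershell
# Create an isolated (private) switch so the restored copies
# cannot collide with the production network.
New-VMSwitch -Name 'RestoreTest' -SwitchType Private

# Attach each restored VM to the isolated switch.
$restoredVMs = 'DC01-restore', 'SQL01-restore', 'WEB01-restore'   # hypothetical names
foreach ($vm in $restoredVMs) {
    Connect-VMNetworkAdapter -VMName $vm -SwitchName 'RestoreTest'
}

# Bring them up in dependency order: domain controller, SQL server, front end.
foreach ($vm in $restoredVMs) {
    Start-VM -Name $vm
    Start-Sleep -Seconds 120   # crude wait; verify each via the VM console before the next
}
```

Because the switch is private, you can log on through the VM connect console and exercise the whole stack without the copies ever touching your live network.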

Create test restore scenarios of your own! Make sure that they match a real-world scenario that your organization would rely on after a disaster.


Go to Original Article
Author: Eric Siron