
How to Quickly Recover and Restore Windows Server Hyper-V Backups

Perhaps the only thing worse than having a disaster strike your datacenter is the stress of recovering your data and services as quickly as possible. Most businesses need to operate 24 hours a day, and any service outage will upset customers and lose the business money. According to a 2016 study by the Ponemon Institute, the average datacenter outage costs enterprises over $750,000 and lasts about 85 minutes, costing roughly $9,000 per minute. While your organization may operate at a smaller scale, any service downtime or data loss is going to hurt your reputation and may even jeopardize your career. This blog post covers best practices for recovering your data from a backup and bringing your services back online as fast as possible.

Automation is key when it comes to decreasing your Recovery Time Objective (RTO) and minimizing your downtime. Any manual step in the process is going to create a bottleneck. If the outage is caused by a natural disaster, relying on human intervention is particularly risky, as the datacenter may be inaccessible or remote connections may not be available. As you learn about the best practices of detection, alerting, recovery, startup, and verification, consider how you could implement each of these steps in a fully automated fashion.
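Those five steps can be chained into one automated workflow. Below is a minimal, illustrative Python sketch of such a pipeline; the stage names and handler callables are placeholders, not tied to any specific monitoring or backup product:

```python
# The five recovery stages from the text, run in order. Each handler is a
# placeholder callable returning True on success; real handlers would call
# your monitoring, backup and hypervisor tooling.
PIPELINE = ["detect", "alert", "recover", "start_services", "verify"]

def run_pipeline(handlers):
    """Run each stage in order and stop at the first failure, so operators
    know exactly which stage needs manual attention."""
    for stage in PIPELINE:
        if not handlers[stage]():
            return stage  # name of the failed stage
    return "done"

# Demo: every stage succeeds, so the pipeline runs to completion.
handlers = {stage: (lambda: True) for stage in PIPELINE}
print(run_pipeline(handlers))  # done
```

Stopping at the first failed stage, rather than pressing on, keeps a partially failed recovery from masking the real problem.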

The first way to optimize your recovery speed is to detect the outage as quickly as possible. If you have an enterprise monitoring solution like System Center Operations Manager (SCOM), it will continually check the health of your application and its infrastructure, looking for errors and other problems. Even if you have developed an in-house application and do not have access to enterprise tools, you can use Windows Task Scheduler to set up tasks that automatically check system health by scanning event logs, then trigger recovery actions. There are also many free monitoring tools, such as Uptime Robot, which alerts you any time your website goes offline.
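The scheduled health check boils down to a polling loop that only raises an alert after several consecutive failures, so a single transient error does not trigger recovery on its own. A minimal sketch in Python; the probe, threshold, and function name here are illustrative, not part of any monitoring product:

```python
import time

def detect_outage(probe, failures_needed=3, interval_s=60.0):
    """Call `probe` repeatedly; return True (raise the alert) only after
    `failures_needed` consecutive failures, or False if the service is up."""
    consecutive = 0
    while True:
        try:
            healthy = probe()  # e.g. an HTTP ping or an event-log scan
        except Exception:
            healthy = False
        if healthy:
            return False  # service is up; the scheduler re-runs this later
        consecutive += 1
        if consecutive >= failures_needed:
            return True   # sustained failure: trigger alerting and recovery
        time.sleep(interval_s)

# Demo with a fake probe, so no real endpoint is needed.
print(detect_outage(lambda: False, failures_needed=2, interval_s=0))  # True
```

In practice the probe would be whatever check your platform supports, and the scheduler (Task Scheduler, cron, SCOM) would run this on a fixed interval.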

Once the administrators have been alerted, immediately begin the recovery process.  Meanwhile, you should run a secondary health check on the system to make sure that you did not receive a false alert. This is a great background task to continually run during the recovery process to make sure that something like a cluster failover or transient network failure does not force your system into restarting if it is actually healthy. If the outage was indeed a false positive, then have a task prepared which will terminate the recovery process so that it does not interfere with the now-healthy system.
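This false-positive guard can be wired directly into the recovery workflow: re-check health before each destructive step and abort if the system turns out to be healthy. A hedged Python sketch; the step names and the health callback are illustrative:

```python
def run_recovery(steps, is_healthy):
    """Execute recovery steps in order, re-checking health before each one
    so a false alert (or a cluster failover that already fixed things)
    aborts the process instead of restarting a healthy system."""
    completed = []
    for name, action in steps:
        if is_healthy():
            return completed, "aborted: system is healthy"
        action()
        completed.append(name)
    return completed, "recovery completed"

# Demo: the first step brings the system back, so the second never runs.
state = {"healthy": False}
steps = [
    ("restore_backup", lambda: state.update(healthy=True)),
    ("restart_services", lambda: None),
]
done, status = run_recovery(steps, lambda: state["healthy"])
print(done, status)  # ['restore_backup'] aborted: system is healthy
```

Checking health between steps, rather than only once at the start, is what keeps a transient failure from forcing a full restart.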

If you restore your service and determine that there was data loss, then you will need to decide whether to accept that loss or attempt to recover from the last good backup, which can cause further downtime during the restoration. Make sure you can automatically determine whether you need to restore a full backup or whether a differential backup is sufficient to give you a faster recovery time. By comparing the timestamp of the outage to the timestamp on your backup(s), you can determine which option will minimize the impact on your business. This can be done with a simple PowerShell script, but make sure that you know how to get this information from your backup provider and pass it into your script.
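The article suggests a PowerShell script for this decision; the same timestamp-comparison logic is sketched below in Python, with made-up backup times. How you obtain the real timestamps depends entirely on your backup provider:

```python
from datetime import datetime

def pick_restore_plan(outage, full_backups, diff_backups):
    """Choose the plan that loses the least data: the newest full backup
    taken before the outage, plus the newest differential (if any) taken
    after that full backup but still before the outage."""
    fulls = [t for t in full_backups if t <= outage]
    if not fulls:
        return None  # no usable backup; escalate to a human
    base = max(fulls)
    diffs = [t for t in diff_backups if base < t <= outage]
    if diffs:
        return ("full+differential", base, max(diffs))
    return ("full-only", base, None)

# Illustrative timestamps only.
outage = datetime(2020, 7, 1, 12, 0)
fulls = [datetime(2020, 6, 28), datetime(2020, 6, 30)]
diffs = [datetime(2020, 7, 1, 6, 0)]
print(pick_restore_plan(outage, fulls, diffs))
```

With the timestamps above, the newest differential (6 a.m. on the day of the outage) sits after the latest full backup, so restoring full plus differential loses the least data.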

Once you have identified the best backup, you then need to copy it to your production system as fast as possible. A lot of organizations deprioritize their backup network since it is only used a few times a day or week. This may be acceptable during the backup process, but the network needs to be optimized during recovery. If you do need to restore a backup, consider running a script that prioritizes this traffic, such as by changing the quality of service (QoS) settings or disabling other traffic on that same network.

Next, consider the storage media to which the backup is copied before the restoration happens. Try to use your fastest SSDs to maximize the speed at which the backup is restored. If you decided to back up your data to a tape drive, you will likely get high sequential copy speeds during restoration. However, tape drives usually require manual intervention to find and mount the right tape, which should generally be avoided if you want a fully automated process.

Once your backup has been restored, you need to restart the services and applications. If you are restoring to a virtual machine (VM), you can optimize its startup time by maximizing the memory allocated to it during startup and operation. You can also configure VM prioritization to ensure that this critical VM starts first if it is competing with other VMs to launch on a host which has recently crashed. Enable QoS on your virtual network adapters to ensure that traffic flows through to the guest operating system as quickly as possible, which will speed up the time to restore a backup within the VM and also help clients reconnect faster. Whether you are running the application within a VM or on bare metal, you can also use Task Manager to raise the priority of the important processes.

Now verify that your backup was restored correctly and your application is functioning as expected by running some quick test cases. If you are confident that those tests passed, you can allow users to reconnect. If they fail, work backward through the workflow to find the point of failure, or simply roll back to the next “good” backup and try the process again.
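These quick test cases can themselves be automated so the reconnect-or-roll-back decision does not wait on a human. A minimal sketch in Python; the individual checks are placeholders, and real ones might ping the service or compare row counts against the backup manifest:

```python
def verify_restore(checks):
    """Run named post-restore smoke tests; return (ok, failed_names) so an
    automated workflow can either reopen to users or roll back."""
    failed = [name for name, check in checks if not check()]
    return (len(failed) == 0, failed)

# Placeholder checks for illustration only.
checks = [
    ("service_responds", lambda: True),
    ("row_count_matches", lambda: 1000 == 1000),
]
ok, failed = verify_restore(checks)
print(ok, failed)  # True []
```

Returning the list of failed check names, not just a boolean, gives the post-incident report described below something concrete to record.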

Any time you need to restore from a backup, it will be a frustrating experience, which is why testing throughout your application development lifecycle is critical. Any single point of failure can cause your backup or recovery to fail, which is why this testing needs to be part of your regular business operations. Once your systems have been restored, always make sure your IT department does a thorough investigation into what caused the outage, what worked well in the recovery, and what could be improved. Review the time each step took to complete and ask whether any of them should be optimized. It is also good practice to write up a formal report which can be saved and referred to in the future, even if you have moved on to a different company.

Top backup software providers like Altaro can help you throughout the process by offering backup solutions for Hyper-V, Azure, O365 and PCs, along with the Altaro API, which can be used for backup automation.

No matter how well you can prepare your datacenter, disasters can happen, so make sure that you have done all you can to try to recover your data – so that you can save your company!

Go to Original Article
Author: Symon Perriman

Information Builders rebranded IBI, updates analytics suite

Not only is Information Builders now IBI, but the vendor also is trying to modernize its analytics suite by adding augmented intelligence capabilities and expanding beyond data visualizations to become an end-to-end platform.

Information Builders was founded in 1975 and is based in New York. Well into its third decade, the vendor’s analytics platform was still considered one of the more capable business intelligence tools. In recent years, however, newer vendors such as Tableau, Qlik and ThoughtSpot have developed innovations that pushed the capabilities of their platforms beyond those of the older analytics purveyors.

Information Builders, however, has responded in recent years to changes in the market. Frank Vella took over as CEO in January 2019, replacing founder Gerry Cohen, and since then has led an overhaul of Information Builders’ capabilities with AI and the cloud as focal points along with a target audience of small and midsize businesses.

During its virtual conference last month, the vendor not only further advanced its platform with the introduction of four new features but also debuted a revamped website and revealed that it will now be known as IBI rather than Information Builders.

“It’s more than just a new logo,” said Keith Kohl, senior vice president of product management. “One of the things we always talked about as Information Builders was that we were a BI and analytics company. But the fact is, we’re a data and analytics company.”

That, Kohl continued, means that IBI has customers whose needs go well beyond analyzing data and who use its platform for data management as well.

“It’s a rebirth of the company with new messaging,” Kohl said.

An IBI dashboard displays an organization’s sales data.

As far as the new features IBI added to its analytics platform in late June — three of which are now generally available — both the cloud and AI play a prominent role.

  • Automated Insights is a new tool built on AI and machine learning, designed to help users more quickly and easily derive insights from their data.
  • Open Data Visualizations uses IBI’s Open Data Platform to help customers connect to new data sources and embed applications so they can access data in real time in order to make data-driven decisions.
  • Omni-HealthData Cloud Essentials is a SaaS-only offering, available in August, that will enable midmarket healthcare providers to better use patient data.
  • Finally, a new partnership between IBI and ASG, a provider of IT management technology, will enable customers to see key metrics and data lineage information in dashboards and reports to help support their data governance efforts.

Taken as a whole, though none of the new features are going to revolutionize analytics, they further demonstrate that IBI’s platform is again modern. More importantly, they will benefit the vendor’s customers, according to Mike Leone, senior analyst at Enterprise Strategy Group.

“Many of their customers and end users are at a tipping point when it comes to leveraging next-generation technology like AI and ML,” he said. “Combined with the need to support more end users, IBI has a recipe for success. They’ve been successful to date at easily meeting the large-scale demands of growing business and streaming data sets. Now it’s about enabling faster ramp up of next-gen technology to fuel better, faster data storytelling and eventual decision-making for more end users.”

The response from users, meanwhile — not only to the new features but also to the evolution of the IBI analytics platform over the course of many months — has been positive.


Sound Credit Union, based in Tacoma, Wash., began using Information Builders for its BI needs about two years ago, according to Martin Walker, the credit union’s vice president of digital experience and innovation. When the credit union chose the vendor’s platform, it was looking for a product that would empower end users to run their own analyses without having to go through its IT department every time they needed a report.

In addition, the credit union wanted a platform that would provide data management capabilities in addition to BI capabilities.

“With a lot of the other solutions we saw, we really would have needed to have two solutions,” Walker said. “Some were very good at presenting the data and visualizations but didn’t have the data warehouse component. We would have had to build that out separately or hire a consultant to do that for us, and with IBI we essentially got the whole puzzle.”

But while Information Builders provided the data-management and self-service capabilities Sound Credit Union was seeking, it couldn’t foresee the evolution into IBI that was to come and the capabilities the platform would enable.

“It was good in 2018, and it is fantastic today,” Walker said. “They’ve led the way for us to understand what is possible. We were in our infancy in terms of understanding how we could leverage our data, and IBI has taken us a lot further than we thought in the last two-and-a-half years.”

Regarding the rebranding of the company name after 45 years, both Walker and Leone said it is a positive development.

In fact, they said it makes sense given the direction IBI has taken since Vella became CEO.

“It almost feels natural that it comes with the changing of the guard that took place a year and a half ago,” Leone said. “Their existing customers will continue to see value and a new wave of customers will look to achieve the same if not better successes.”

Meanwhile, Walker, who spent 20 years in marketing, including 10 years with the NBA’s Seattle SuperSonics before they left for Oklahoma City, said “it’s pretty cool.”

Looking toward future IBI analytics platform updates, Kohl said IBI will focus on adding more AI capabilities, including automating repetitive tasks in the short term and more complex ones in the long term; adding features that help nontechnical users apply machine learning models to specific use cases; improving ease of use; and adding more cloud services to make the platform easier to adopt.

“I believe IBI is headed in the right direction,” Leone said. “I think as they look to place a heavy focus on enabling the masses to better achieve data-driven success, it will be well-received by both existing and potential customers alike.”


For Sale – Cheap computer – i5 3330 – gtx 960 – 8gb ddr3 £140

Hi, I have built myself a new tower, so the following is for sale. It is for sale as a complete unit only and priced for a quick sale.

Computer consists of…

Black Atx tower case with acrylic window.
Intel i5 3330, Quad core, 3rd gen, with new intel heatsink.
8gb Crucial Ballistix sport 1600mhz ddr3 memory.
Msi gtx 960 2gb graphics card.
120gb Afox ssd.
Asus P8H61-M motherboard.
Usb 3.0 card with x2 rear usb3.0 and x1 on the front of the case.
600w power supply.
2x 120mm corsair led fans…

Cheap computer – i5 3330 – gtx 960 – 8gb ddr3 £140


For Sale – I9 9900KS, 2080 Super for sale…

For sale, specs as below… only 4 months old.

CPU – INTEL i9-9900KS (5.0ghz)
RAM – Corsair 16GB VENGEANCE DDR4 3600MHz
COOLING – CORSAIR H60 2018 HYDRO SERIES + 3 Case Fans + Artic uprated Thermal CPU paste

Still under 1 years manufacturer warranty.

PRICE – £2200.

Newcastle upon tyne
Price and currency
Delivery cost included
Delivery is NOT included
Prefer goods collected?
I have no preference
Advertised elsewhere?
Advertised elsewhere
Payment method
Bank Transfer



For Sale – Intel Skull Canyon bundle

Hi and thanks for reading.

In dire need of cash atm so my one and only pc setup has to go now

I’ve only ever used it for work and gaming on my tv.

Specs are as follows:

6th generation Intel Core i7-6770HQ processor.
• Intel Iris Pro Graphics.
• Two DDR4 SO-DIMM sockets (up to 32 GB, 2133+ MHz).
• Two M.2 slots with flexible support for a 42 or 80 mm SATA or PCIe SSD.
• Intel Dual Band Wireless-AC 8260 and Bluetooth 4.2.
• SD card slot.
• Two USB 3.0 ports (including one charging port).
• Intel HD Audio via Headphone/Microphone jack.
• Consumer infrared sensor.
• Support for user-replaceable third-party lids.
• Kensington lock support.
• Back panel DC power connector (19V).
• Combo speaker/TOSLINK audio output.
• Intel Gigabit LAN.
• Two USB 3.0 ports on the back panel.
• One Mini DisplayPort version 1.2 supporting 8 channel digital audio (7.1 surround sound).
• One Thunderbolt 3 port.
• One full-size HDMI 2.0 display port supporting 8 channel audio (7.1 surround sound).

My pc will come boxed with the following installed:
1x 256gb samsung m.2 ssd installed (will be formatted before sending out, you will need to install your own OS)
1 x 16gb samsung 2400mhz ram installed

Also comes bundled with the following:

1 x samsung 256gb evo plus m.2 nvme still sealed, mint fresh in the box
1 x Official Microsoft Xbox One Controller, light grey with green accents. Good condition with box
1 x Logitech wireless K830 keyboard with trackpad, perfect for sitting on the sofa. This has recently acquired a scratch on the space bar, but you can’t feel it and it doesn’t affect usage at all. It has a backlight for use in the dark, plus Bluetooth and wireless (with dongle, included) options. Can’t find the box for this but it will be suitably packed.

I need £500 including postage for all this no offers please.


New AI tools in the works for ThoughtSpot analytics platform

The ThoughtSpot analytics platform has been available for only six years, but since 2014 the vendor has quickly gained a reputation as an innovator in the field of business intelligence software.

ThoughtSpot, founded in 2012 and based in Sunnyvale, Calif., was an early adopter of augmented intelligence and machine learning capabilities, and even as other BI vendors have begun to infuse their products with AI and machine learning, the ThoughtSpot analytics platform has continued to push the pace of innovation.

With its rapid rise, ThoughtSpot attracted plenty of funding, and an initial public offering seemed like the next logical step.

Now, however, ThoughtSpot is facing the same uncertainty as most enterprises as COVID-19 threatens not only people’s health around the world, but also organizations’ ability to effectively go about their business.

In a recent interview, ThoughtSpot CEO Sudheesh Nair discussed all things ThoughtSpot, from the way the coronavirus is affecting the company to the status of an IPO.

In part one of a two-part Q&A, Nair talked about how COVID-19 has changed the firm’s corporate culture in a short time. Here in part two, he discusses upcoming plans for the ThoughtSpot analytics platform and when the vendor might be ready to go public.

One of the main reasons the ThoughtSpot analytics platform has been able to garner respect in a short time is its innovation, particularly with respect to augmented intelligence and machine learning. Along those lines, what is a recent feature ThoughtSpot developed that stands out to you?

ThoughtSpot CEO Sudheesh Nair

Sudheesh Nair: One of the main changes that is happening in the world of data right now is that the source of data is moving to the cloud. To deliver AI-based, high-speed innovation on data, ThoughtSpot was really counting on running the data in a high-speed in-memory database, which is why ThoughtSpot was mostly focused on on-premises customers. One of the major changes that happened in the last year is that we delivered what we call Embrace. With Embrace we are able to move to the cloud and leave the data in place. This is critical because as data is moving, the cost of running computations will get higher because computing is very expensive in the cloud.

With ThoughtSpot, what we have done is we are able to deliver this on platforms like Snowflake, Amazon Redshift, Google BigQuery and Microsoft Synapse. So now with all four major cloud vendors fully supported, we have the capability to serve all of our customers and leave all of their data in place. This reduces the cost to operate ThoughtSpot — the value we deliver — and the return on investment will be higher. That’s one major change.

Looking ahead, what are some additions to the ThoughtSpot analytics platform customers can expect?

Nair: If you ask people who know ThoughtSpot — and I know there are a lot of people who don’t know ThoughtSpot, and that’s OK — … if you ask them what we do they will say, ‘search and AI.’ It’s important that we continue to augment on that; however, one thing that we’ve found is that in the modern world we don’t want search to be the first thing that you do. What if search became the second thing you do, and the first thing is that what you’ve been looking for comes to you even before you ask?


Let’s say you’re responsible for sales in Boston, and you told the system you’re interested in figuring out sales in Boston — that’s all you did. Now the system understands what it means to you, and then runs multiple models and comes back to you with questions you’ll be interested in, and most importantly with insights it thinks you need to know — it doesn’t send a bunch of notifications that you never read. We want to make sure that the insights we’re sending to you are so relevant and so appropriate that every single one adds value. If one of them doesn’t add value, we want to know so the system can understand what it was that was not valuable and then adjust its algorithms internally. We believe that the right action and insight should be in front of you, and then search can be the second thing you do prompted by the insight we sent to you.

What tools will be part of the ThoughtSpot analytics platform to deliver these kinds of insights?

Nair: There are two features we are delivering around it. One is called Feed, which is inspired by social media, curating insights, conversations and opinions around facts. Right now social media is all opinion, but imagine a fact-driven social media experience where someone says they had a bad quarter and someone else says it was great, and then data shows up so it doesn’t become an opinion based on another opinion. It’s important that it should be tethered to facts. The second one is Monitor, which is the primary feature where the thing you were looking for shows up even before you ask, in the format that you like — could be mobile, could be notifications, could be an image.

Those two features are critical innovations for our growth, and we are very focused on delivering them this year.

The last time we spoke we talked about the possibility of ThoughtSpot going public, and you were pretty open in saying that’s something you foresee. It’s about seven months later, where do plans for going public currently stand?

Nair: If you had asked me before COVID-19 I would have had a bit of a different answer, but the big picture hasn’t changed. I still firmly believe that a company like ThoughtSpot will tremendously benefit from going public because our customers are massive customers, and those customers like to spend more with a public company and the trust that comes with it.

Having said that, I talked last time about building a team and predictability, and I feel seven months later that we have built the executive team that can be the best in class when it comes to public companies. But going public also requires being predictable, and we’re getting in that right spot. I think that the next two quarters will be somewhat fluid, which will maybe set us back when it comes to building a plan to take the company public. But that is basically it. I think taken one by one, we have a good product market, we have good business momentum, we have a good team, and we just need to put together the history that is necessary so that the business is predictable and an investor can appreciate it. That’s what we’re focused on. There might be a short-term setback because of what the coronavirus might throw at us, but it’s going to definitely be a couple of more quarters of work.

Does the decline in the stock market related to COVID-19 play into your plans at all?

Nair: It’s absolutely an important event that’s going on and no one knows how it will play out, but when I think about a company’s future I never think about an IPO as a few quarters event. It’s something we want to do, and a couple of quarters here or there is not going to make a major difference. Over the last couple of weeks, we haven’t seen any softness in the demand for ThoughtSpot, but we know that a lot of our customers’ pipelines are in danger from supply impacts from China, so we will wait and see. We need to be very close to our customers right now, helping them through the process, and in that process we will learn and make the necessary course corrections.

Editor’s note: This interview has been edited for clarity and conciseness.


Delivering information and eliminating bottlenecks with CDC’s COVID-19 assessment bot – The Official Microsoft Blog

In a crisis like the COVID-19 pandemic, it’s not only important to deliver medical care but to also provide information to help people make decisions and prevent health systems from being overwhelmed.

Microsoft is helping with this challenge by offering its Healthcare Bot service powered by Microsoft Azure to organizations on the frontlines of the COVID-19 response to help screen patients for potential infection and care.

For example, the U.S. Centers for Disease Control and Prevention (CDC) just released a COVID-19 assessment bot that can quickly assess the symptoms and risk factors for people worried about infection, provide information and suggest a next course of action such as contacting a medical provider or, for those who do not need in-person medical care, managing the illness safely at home.

The bot, which utilizes Microsoft’s Healthcare Bot service, will initially be available on the CDC website.

Public health organizations, hospitals and others on the frontlines of the COVID-19 response need to be able to respond to inquiries, provide the public with up-to-date outbreak information, track exposure, quickly triage new cases and guide next steps.  Many have expressed great concern about the overwhelming demand COVID-19 is creating on resources such as urgent, emergency and nursing care.

In particular, the need to screen patients with any number of cold or flu-like symptoms — to determine who has high enough risk factors to need access to limited medical resources and which people may more safely care for themselves at home — is a bottleneck that threatens to overwhelm health systems coping with the crisis.

Microsoft’s Healthcare Bot service is one solution that uses artificial intelligence (AI) to help the CDC and other frontline organizations respond to these inquiries, freeing up doctors, nurses, administrators and other healthcare professionals to provide critical care to those who need it.

The Healthcare Bot service is a scalable Azure-based public cloud service that allows organizations to quickly build and deploy an AI-powered bot for websites or applications that can offer patients or the general public personalized access to health-related information through a natural conversation experience. It can be easily customized to suit an organization’s own scenarios and protocols.

To assist customers in the rapid deployment of their COVID-19 bots, Microsoft is making available a set of COVID-19 response templates that customers can use and modify:

  • COVID-19 risk assessment based on CDC guidelines
  • COVID-19 clinical triage based on CDC protocols
  • COVID-19 up-to-date answers to frequently asked questions
  • COVID-19 worldwide metrics
COVID-19 assessment bot screenshots
Screenshots from the U.S. Centers for Disease Control and Prevention COVID-19 assessment bot.

Providence, one of the largest health systems in the U.S., headquartered near Seattle and serving seven Western states, had previously used Microsoft’s Healthcare Bot service running on Azure to create a healthcare chatbot named Grace that could help answer patients’ questions online. Using CDC guidelines and its own clinical protocols, Providence was able to build a similar Coronavirus Assessment Tool in just three days to help people in the communities it serves know whether they should seek medical attention for their respiratory symptoms.

The tool, which launched in early March, can bring a prospective patient directly into a telehealth session with a clinician to get immediate care.  It also aims to prevent healthy people or those with mild symptoms from showing up at clinics and emergency departments, which helps to limit community infection and save hospital beds and equipment for those who need it.

Other providers who are now using Microsoft’s Healthcare Bot service to respond to COVID-19 inquiries include:

Virginia Mason Health System, based in Seattle and serving the Pacific Northwest region, has created a patient assessment Healthcare Bot to help its patients understand whether care is needed. The instance is live and has thousands of daily users.

Novant Health, a healthcare provider in four states in the Southeast with one of the largest medical groups in the country, has created a Healthcare bot for COVID-19 information that went live on its website within a few days, with thousands of daily users since its launch.

Across all users, customized instances of Microsoft’s Healthcare Bot service are now fielding more than 1 million messages per day from members of the public who are concerned about COVID-19 infections — a number we expect to escalate quickly to meet growing needs. We hope the answers it can provide will curb anxiety that the “worried well” may experience without clear guidance and save lives by speeding the path to care for those who need it most.


Author: Microsoft News Center

For Sale – 8GB HyperX Fury DDR4 RAM

Only used for one week, as I upgraded my RAM shortly after buying my new computer.
2x4GB sticks 2400mhz
Comes in box

Looking for £30 including 1st class recorded delivery Royal mail

Accept PayPal and bank transfer

West Midlands, UK
Price and currency
Delivery cost included
Delivery Is Included
Prefer goods collected?
I have no preference
Advertised elsewhere?
Not advertised elsewhere
Payment method
Bank transfer or paypal



For Sale – Dell 24 Inch Monitor & AOC 23 inch Monitor (NI only) / 120W Pico PSU & AC Adapter

AOC 23 inch I2369VM Monitor – £65 *Collection only*
– Boxed with all cables, in excellent condition bar a couple of scratches around the VESA mount holes on the back. Also has the normal ‘spiderweb’ scratches on the high-gloss black plastic back and sides of the monitor, but these are only noticeable under direct light.

Dell 24 inch P2414H (Rev 01) Monitor – £65 *Collection only*
-Unboxed with power cable. Note that it doesn’t come with a stand, so you’ll either need to provide your own or buy my Duronic monitor arm below. Also worth noting is that the screen is a Rev 01, which is known to have much less backlight bleed than other revisions. In excellent condition with a few scuffs to the surround that, to be fair, were there when I bought it from the NRG website.

Duronic Single Monitor Arm Stand DM251X2 – £12 *Collection only*
Unboxed but provided with 4 screws (fit the AOC monitor above fine!)

120W Pico PSU – PICOPSU-120-WI-25 – mini-itx.com: picoPSU-120 WI 12-25V psu
FSP 19V 6.32A 120W AC/DC Adapter – 2.5mm x 5.5mm Jack
Selling the Pico and power adapter as a bundle for £35 collected or £40 delivered, if able to collect from Moira area will knock some money off


For Sale – Custom loop water cooled pc – i9 9900k, 2080ti, 32gb 3200mhz ram, 2tb nvme

Selling as this only seems to be used as my work machine rather than for playing games and creating content as intended.

Built by myself in November 2019; the machine is only a few months old.

Only the best components were chosen when this was built.

Machine runs at 5ghz on all cores and gpu never sees above 50c.

Motherboard – Asus Maximus Code

Cpu – intel i9 9900k with ek water block

Gpu – msi ventus oc 2080ti with ek water block and nickel backplate

Ram- 32gb g skill royal silver 3200mhz

Nvme – 1tb wd black

Nvme – 1tb sabrent

Psu – Corsair 750 modular

Ek nickel fittings

Ek d5 stand alone pump

Phanteks reservoir

6 Thermaltake ring plus fans with controllers

2 360mm x 45mm alphacool radiators

Thermaltake acrylic tubes and liquid

Custom cables

I am based in Tadworth Surrey and the machine can be seen and inspected in person.
