
For Sale – DDR4 Vengeance pro RGB 3600, Corsair K65 lux, Corsair MP510 480gb, G502 Hero, RTX 2080 Super FTW3 Ultra

EVERYTHING HAS BEEN CLEANED WITH ANTIBACTERIAL WIPES (obviously not the GPU, RAM and SSD, lol)

Due to the current situation I’m having a clearout and downgrade, so I have the below for sale. Cash on collection or bank transfer; all ID details etc. can be provided.

EVGA RTX 2080 Super FTW3 Ultra Gaming £680ono (I also have the matching Hydro Copper block for £60). This was a step up from my 2070 Super, has full warranty with EVGA and comes with all boxes and bits – see the warranty page screenshot below for details.

Corsair vengeance pro RGB DDR4 16gb 3600 C18 boxed £70ono (CURRENTLY UNDER OFFER AWAITING PAYMENT)

Corsair k65 lux rgb keyboard with glorious gaming wrist rest £65ono (comes boxed with all key caps and bits and original unused wrist rest)

Corsair MP510 480gb nvme ssd £45ono (CURRENTLY UNDER OFFER AWAITING PAYMENT)

Logitech G502 Hero boxed with all bits SOLD

Pictures can be provided with username and date etc., and I will happily speak over the phone, WhatsApp or whatever you like to confirm any details.

I WILL HAPPILY CONFIRM ITEMS WORKING VIA VIDEO CALL!


Wanted – ATX PSU 450W+, modular if possible – South Wales area.

I’ve been let down by an eBay seller and a retailer so my gaming build has been on hold for a week. I’d like to get it finished before we all go into lockdown.

Does anybody have a PSU for sale? 450w or more, 80+ efficiency, modular if possible.

Local collection would be great (Swansea/South Wales area).

Thanks


For Sale – Intel i7-6700K, EK CPU waterblock and Gigabyte G1.Sniper Z170 motherboard

For Sale:

Intel i7-6700K CPU
Never overclocked and has been watercooled all its life, so temperatures kept very low.
£185.00

EK-Supremacy EVO CPU watercooling block
Used on the above CPU, all in perfect condition, complete with all fittings, instructions etc.
£30.00

Gigabyte G1.Sniper Z170 Motherboard
Again, perfect working order, boxed with all fittings, cables, instructions etc.
£100.00

Thermaltake Toughpower DPSG RGB 750W PSU
Perfect working order, only for sale as I swapped to an 850w which I didn’t actually need in the end.

SOLD

EVGA Nvidia GTX 1070 GPU with EK FC-1070 Nickel Plexi waterblock fitted
GPU has been watercooled its whole life, so kept very cool. Ideally I’d like to sell this with the waterblock in place.

SOLD

Prices have been based on completed listings on eBay, so feel free to make an offer.

Thanks!


New AI tools in the works for ThoughtSpot analytics platform

The ThoughtSpot analytics platform has only been available since 2014, but in those six years the vendor has quickly gained a reputation as an innovator in the field of business intelligence software.

ThoughtSpot, founded in 2012 and based in Sunnyvale, Calif., was an early adopter of augmented intelligence and machine learning capabilities, and even as other BI vendors have begun to infuse their products with AI and machine learning, the ThoughtSpot analytics platform has continued to push the pace of innovation.

With its rapid rise, ThoughtSpot attracted plenty of funding, and an initial public offering seemed like the next logical step.

Now, however, ThoughtSpot is facing the same uncertainty as most enterprises as COVID-19 threatens not only people’s health around the world, but also organizations’ ability to effectively go about their business.

In a recent interview, ThoughtSpot CEO Sudheesh Nair discussed all things ThoughtSpot, from the way the coronavirus is affecting the company to the status of an IPO.

In part one of a two-part Q&A, Nair talked about how COVID-19 has changed the firm’s corporate culture in a short time. Here in part two, he discusses upcoming plans for the ThoughtSpot analytics platform and when the vendor might be ready to go public.

One of the main reasons the ThoughtSpot analytics platform has been able to garner respect in a short time is its innovation, particularly with respect to augmented intelligence and machine learning. Along those lines, what is a recent feature ThoughtSpot developed that stands out to you?

ThoughtSpot CEO Sudheesh Nair

Sudheesh Nair: One of the main changes that is happening in the world of data right now is that the source of data is moving to the cloud. To deliver the AI-based, high-speed innovation on data, ThoughtSpot was really counting on running the data in a high-speed memory database, which is why ThoughtSpot was mostly focused on on-premises customers. One of the major changes that happened in the last year is that we delivered what we call Embrace. With Embrace we are able to move to the cloud and leave the data in place. This is critical because as data is moving, the cost of running computations will get higher because computing is very expensive in the cloud.

With ThoughtSpot, what we have done is we are able to deliver this on platforms like Snowflake, Amazon Redshift, Google BigQuery and Microsoft Synapse. So now with all four major cloud vendors fully supported, we have the capability to serve all of our customers and leave all of their data in place. This reduces the cost to operate ThoughtSpot — the value we deliver — and the return on investment will be higher. That’s one major change.

Looking ahead, what are some additions to the ThoughtSpot analytics platform customers can expect?

Nair: If you ask people who know ThoughtSpot — and I know there are a lot of people who don’t know ThoughtSpot, and that’s OK — … if you ask them what we do they will say, ‘search and AI.’ It’s important that we continue to augment on that; however, one thing that we’ve found is that in the modern world we don’t want search to be the first thing that you do. What if search became the second thing you do, and the first thing is that what you’ve been looking for comes to you even before you ask?


Let’s say you’re responsible for sales in Boston, and you told the system you’re interested in figuring out sales in Boston — that’s all you did. Now the system understands what it means to you, and then runs multiple models and comes back to you with questions you’ll be interested in, and most importantly with insights it thinks you need to know — it doesn’t send a bunch of notifications that you never read. We want to make sure that the insights we’re sending to you are so relevant and so appropriate that every single one adds value. If one of them doesn’t add value, we want to know so the system can understand what it was that was not valuable and then adjust its algorithms internally. We believe that the right action and insight should be in front of you, and then search can be the second thing you do prompted by the insight we sent to you.

What tools will be part of the ThoughtSpot analytics platform to deliver these kinds of insights?

Nair: There are two features we are delivering around it. One is called Feed, which is inspired by social media: curating insights, conversations and opinions around facts. Right now social media is all opinion, but imagine a fact-driven social media experience where someone says they had a bad quarter and someone else says it was great, and then data shows up so it doesn’t become an opinion based on another opinion. It’s important that it should be tethered to facts. The second one is Monitor, which is the primary feature where the thing you were looking for shows up even before you ask, in the format that you like — could be mobile, could be notifications, could be an image.

Those two features are critical innovations for our growth, and we are very focused on delivering them this year.

The last time we spoke, we talked about the possibility of ThoughtSpot going public, and you were pretty open in saying that’s something you foresee. It’s about seven months later; where do plans for going public currently stand?

Nair: If you had asked me before COVID-19 I would have had a bit of a different answer, but the big picture hasn’t changed. I still firmly believe that a company like ThoughtSpot will tremendously benefit from going public because our customers are massive customers, and those customers like to spend more with a public company and the trust that comes with it.

Having said that, I talked last time about building a team and predictability, and I feel seven months later that we have built the executive team that can be the best in class when it comes to public companies. But going public also requires being predictable, and we’re getting to that right spot. I think that the next two quarters will be somewhat fluid, which will maybe set us back when it comes to building a plan to take the company public. But that is basically it. I think, taken one by one, we have a good product market, we have good business momentum, we have a good team, and we just need to put together the history that is necessary so that the business is predictable and an investor can appreciate it. That’s what we’re focused on. There might be a short-term setback because of what the coronavirus might throw at us, but it’s definitely going to be a couple more quarters of work.

Does the decline in the stock market related to COVID-19 play into your plans at all?

Nair: It’s absolutely an important event that’s going on and no one knows how it will play out, but when I think about a company’s future I never think about an IPO as a few quarters event. It’s something we want to do, and a couple of quarters here or there is not going to make a major difference. Over the last couple of weeks, we haven’t seen any softness in the demand for ThoughtSpot, but we know that a lot of our customers’ pipelines are in danger from supply impacts from China, so we will wait and see. We need to be very close to our customers right now, helping them through the process, and in that process we will learn and make the necessary course corrections.

Editor’s note: This interview has been edited for clarity and conciseness.


Sea of Thieves – Celebrate Our Second Anniversary with a Bonanza Weekend!

It’s been two years since Sea of Thieves arrived on Xbox One and Windows 10, and what years they’ve been. Since launch, we’ve seen over 10 million pirates plundering the seas, and during the last 24 months we’ve forged on with the wind in our sails to deliver an abundance of additions to the game – most notably in 2019’s Anniversary Update and with the introduction of monthly content updates from last July.

We’re incredibly proud of how far we’ve journeyed from launch, and we’re excited to continue making waves with future content updates. We’re humbled by how many player stories we’ve seen shared, and our community continually inspires us. So we can’t wait to show you all what’s on the horizon – but for now, we want to celebrate everything that’s come before!

As a thank you to our players and a celebration of all things Sea of Thieves, we’ve planned a programme of challenges and goodies kicking off this weekend and running throughout March’s content update. There’s a lot of in-game swag to be bagged for making all the right moves, so let’s take a look at the line-up!

Play Sea of Thieves free this weekend

To start with, an incentive if you’re not one of the millions of pirates who’ve joined us on the seas already: Sea of Thieves is part of the Xbox Free Play Days this weekend, and will be free for all Xbox Live subscribers to play until March 23rd! Don’t worry about being a late starter as all new pirates are eased into the game via the Maiden Voyage, a narrative-driven tutorial experience that provides guidance and information to fledgling sailors.

Enter the Heart of Fire

Let’s not forget this month’s free content update, Heart of Fire. Live since March 12th, this update brings the next fiery Tall Tale to Sea of Thieves, Athena’s Run Voyages for Pirate Legends and some brand new missiles in the form of chainshot for your cannons and throwable Blunderbombs.

Heart of Fire: Official Sea of Thieves Content Update


Bag the Anniversary Eye of Reach

What would a birthday be without a present? If you play Sea of Thieves between Thursday, March 19th and Friday, March 27th, you’ll get the very special, very golden ‘X Marks the Spot’ Eye of Reach! For those of you who will want to equip it straight away, don’t worry – the weapon will appear in your armoury immediately upon entering the game.

Snap up the skeletal Spinal Figurehead

As made famous in Rare’s ’90s fighting game Killer Instinct (and resurrected for the modern version in 2013), Spinal can be claimed for the front of your ship just by watching Sea of Thieves’ anniversary stream at mixer.com/seaofthieves on Friday, March 20th. Make sure your Microsoft and Sea of Thieves accounts are linked so that you qualify for this MixPot item, sign in and join us there from 5pm-7pm GMT!

Set sail with Ori and the Will of the Wisps

If you’re joining Sea of Thieves via Game Pass Ultimate, don’t forget you can also claim the wonderful Ori-inspired ship set to carry you into adventure. This gorgeous new livery is available exclusively to Game Pass Ultimate subscribers from March 18th, and you can see it in all its glowing glory right here:

Ancestral Ship Set Reveal Trailer – Official Sea of Thieves


Relive some of Sea of Thieves’ greatest moments

From Friday, March 20th, pirates will also be able to relive some of the greatest Sea of Thieves moments from the last two years. Take a truncated tour through The Hungering Deep, Cursed Sails and Forsaken Shores to bag cosmetics previously available only when these updates first launched. For example, if you hadn’t taken to the seas or missed your chance to bag Merrick’s drum the first time around, you’ll have the opportunity to earn it now – allowing everyone to get a taste of some of the events they might have missed from year one!

Turn the seas red with Bleeding Edge

The fun doesn’t stop there. From March 30th, you’ll also be able to unlock some awesome Bleeding Edge ship cosmetics. Pirates will be challenged throughout the week with three objectives, and motivated to complete them with stunning Bleeding Edge-inspired sail, flag and hull designs. As with the Hunter’s Haul event last month, you’ll also be able to track your progress through these objectives here on the Sea of Thieves website – so stay tuned for more info on this event which is set to see the seas turn red…

Want to find out more about Sea of Thieves? Follow us at any of our social channels below, then take the plunge and embark on an epic journey with one of gaming’s most welcoming communities!


ONC, CMS information blocking, interoperability rules finalized

The 2020 HIMSS Global Health Conference & Exhibition may have been canceled Thursday due to coronavirus concerns, but federal regulators wasted no time in announcing that two long-awaited health IT rules have finally been released.

The finalized interoperability and information blocking rules from the Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare and Medicaid Services (CMS) will require healthcare organizations to give patients access to data through standardized APIs within the next two years, said Don Rucker, national coordinator for ONC, during a media briefing Monday. The rules also focus on data sharing between health insurers, as well as exceptions to information blocking, or situations that do not constitute healthcare organizations keeping data from patients.

Both ONC’s information blocking and interoperability rule, and CMS’ patient data access rule, were finalized amid concerns about patient privacy. Organizations, including EHR vendor Epic, voiced concerns that there weren’t enough privacy protections in place to keep patient data safe.

Proposals for the two rules were unveiled at last year’s event, and it was rumored they would drop in conjunction with President Trump’s last-minute addition to the speaker lineup at this year’s HIMSS conference, which was slated to start today.

ONC’s interoperability rule

ONC’s interoperability rule mandates that healthcare organizations use FHIR-based APIs to connect patient-facing and consumer-grade apps to patient EHRs. It’s part of the Trump administration’s push to consumerize healthcare.
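At the application level, the mechanics the rule points to are simple: a patient-authorized app calls a standards-based FHIR endpoint over HTTPS and receives JSON resources back. The Python sketch below is a hypothetical illustration of that flow only; the base URL, patient ID and access token are placeholders, not part of the rule or any real EHR endpoint, and a real app would obtain its token through the OAuth 2.0 consent flow the patient approves.

```python
# Hypothetical sketch of a patient-facing app calling a FHIR-based API.
# The base URL, patient ID and token below are placeholders for illustration.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"            # placeholder FHIR endpoint
ACCESS_TOKEN = "token-granted-via-patient-consent"    # obtained via OAuth 2.0

response = requests.get(
    f"{FHIR_BASE}/Patient/12345",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/fhir+json",
    },
    timeout=10,
)
response.raise_for_status()
patient = response.json()   # a FHIR Patient resource as JSON
print(patient.get("name", []))
```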

At the start of the year, one of the biggest EHR vendors, Epic, publicly expressed concerns about sharing patient data with third-party apps because of the lack of outlined privacy protections. During the media briefing, Rucker addressed those concerns head on, saying that the apps will use the same secure API technology used in banking apps. Additionally, Rucker said providers will be able to let patients know in a “deliberate, straight-forward way” what information they’re consenting to sharing through a patient authentication process.

“That is not snuck in on the side,” Rucker said. “It’s central to the way that patients allow an app to get access to their information. We’ve empowered providers to communicate the privacy issues in that process.” 

Rucker said a second part of the finalized ONC rule identifies activities that do not constitute information blocking, which is the interference of a healthcare organization with the sharing of health data, and establishes new rules to prevent information blocking practices by healthcare providers, developers of certified health IT and health information exchange networks, as required by the 21st Century Cures Act.

The rule also requires health IT developers to meet certification requirements to ensure interoperability. Health IT developers must comply with requirements such as ensuring that they are not restricting communication about a product’s usability or security, so that nurses and doctors are able to discuss safety and usability issues without being bound by what Rucker said has historically been called a “gag clause.”

The finalized ONC rule also replaces the Common Clinical Data Set (CCDS) data elements standard with the U.S. Core Data for Interoperability (USCDI) data set for the exchange of data within APIs. The USCDI is a defined set of data classes and elements that includes clinical notes as well as data such as allergies and medications. The data set will support data exchange, Rucker said.

“These are standardized sets of data classes and data elements … to help improve this flow of information,” he said.

CMS patient access rule

The ONC rule goes hand in hand with the CMS rule, which aims to open data sharing between the health insurance system and patients.

Starting in 2021, the CMS patient data access rule will require all health plans that do business with the federal government to share data with patients through a standards-based API. The push to make it easier for patients to access health data follows a model CMS implemented with Blue Button 2.0, an API which gives Medicare beneficiaries the ability to connect their claims data to apps of their choosing, such as research apps.

The rule also requires health plans to make their provider directory available through an API, so patients know if their physician is in their insurance network.

“This will allow innovative third parties to design apps that will help patients evaluate which plan networks are right for them and potentially avoid surprise billing by having a clear picture of which clinicians are in network,” CMS administrator Seema Verma said during Monday’s media briefing.

Starting in 2022, Verma said insurance plans will also be required to share patient information with each other, which will enable patients to take data with them as they move between plans.

Additionally, effective six months from today, CMS is changing the participation conditions for Medicare- and Medicaid-participating hospitals as part of the rule. To ensure they are supporting care coordination for patients, Verma said the rule requires the hospitals to send admission, discharge and transfer notifications so patients receive a “timelier follow-up supporting better care and better health outcomes.”

“The Trump administration is pushing the healthcare system forward,” Verma said. “We are breaking down barriers to a seamless, data-driven healthcare system. The result of these two rules will be a more intuitive and convenient experience for American patients.”  


Splice Machine 3.0 integrates machine learning capabilities, database

Databases have long been used for transactional and analytics use cases, but they also have practical utility to help enable machine learning capabilities. After all, machine learning is all about deriving insights from data, which is often stored inside a database.

San Francisco-based database vendor Splice Machine is taking an integrated approach to enabling machine learning with its eponymous database. Splice Machine is a distributed SQL relational database management system that includes machine learning capabilities as part of the overall platform.

Splice Machine 3.0 became generally available on March 3, bringing with it updated machine learning capabilities. It also has a new cloud-native, Kubernetes-based model for cloud deployment and enhanced replication features.

In this Q&A, Monte Zweben, co-founder and CEO of Splice Machine, discusses the intersection of machine learning and databases and provides insight into the big changes that have occurred in the data landscape in recent years.

How do you integrate machine learning capabilities with a database?

Monte Zweben

Monte Zweben: The data platform itself has tables, rows and schema. The machine learning manager that we have native to the database has notebooks for developing models, Python for manipulating the data, algorithms that allow you to model, and model workflow management that allows you to track the metadata on models as they go through their experimentation process. And finally we have in-database deployment.

So as an example, imagine a data scientist working in Splice Machine working in the insurance industry. They have an application for claims processing and they are building out models inside Splice Machine to predict claims fraud. There’s a function in Splice Machine called deploy, and what it will do is take a table and a model to generate database code. The deploy function builds a trigger on the database table that tells the table to call a stored procedure that has the model in it for every new record that comes in the table.

So what does this mean in plain English? Let’s say in the claims table, every time new claims would come in, the system would automatically trigger, grab those claims, run the model that predicts claim cause and outputs those predictions in another table. And now all of a sudden, you have real-time, in-the-moment machine learning that is detecting claim fraud on first notice of loss.
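As a rough illustration of that flow (Splice Machine generates the real logic as in-database triggers and stored procedures, not application code), the Python sketch below uses hypothetical table and model stand-ins: inserting a new claim fires a trigger-like callback, the model scores it, and the prediction lands in a separate table.

```python
# Hypothetical sketch of the trigger -> stored procedure -> prediction flow
# described above. The Claim fields, score_claim() model stand-in and the
# "tables" here are invented for illustration; Splice Machine does this
# in-database rather than in application code.
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: int
    amount: float

def score_claim(claim: Claim) -> float:
    """Stand-in for the trained fraud model the stored procedure would run."""
    return 0.9 if claim.amount > 50_000 else 0.1

claims_table = []        # stand-in for the CLAIMS table
fraud_predictions = {}   # stand-in for the predictions table

def insert_claim(claim: Claim) -> None:
    """Insert a claim; the 'trigger' scores it in the same moment."""
    claims_table.append(claim)
    fraud_predictions[claim.claim_id] = score_claim(claim)

insert_claim(Claim(claim_id=1, amount=75_000.0))
print(fraud_predictions)   # {1: 0.9} -- fraud flagged on first notice of loss
```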

What does distributed SQL mean to you?

Zweben: So at its heart, it’s about sharing data across multiple nodes. That provides you the ability to parallelize computation and gain elastic scalability. That is the most important distributed attribute of Splice Machine.

In our new 3.0 release, we just added distributed replication. It’s another element of distribution where you have secondary Splice Machine instances in geo-replicated areas, to handle failover for disaster recovery.

What’s new in Splice Machine 3.0?

Zweben: We moved our cloud stack for Splice Machine from an old Mesos architecture to Kubernetes. Now our container-based architecture is all Kubernetes, and that has given us the opportunity to enable the separation of storage and compute. You literally can pause Splice Machine clusters and turn them back on. This is a great utility for consumption-based usage of databases.

Along with our upgrade to Kubernetes, we also upgraded our machine learning manager from an older notebook technology called Zeppelin to a newer notebook technology that has really gained momentum in the marketplace, as much as Kubernetes has in the DevOps world. Jupyter notebooks have taken off in the data science space.

We’ve also enhanced our workflow management tool, MLflow, which is an open source tool that originated with Databricks; we’re part of that community. MLflow allows data scientists to track their experiments and has that record of metadata available for governance.
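For readers who haven’t used it, the snippet below is a minimal, generic MLflow tracking example of the kind of experiment metadata Zweben describes; the experiment name, parameters and metric values are invented and are not specific to Splice Machine’s integration.

```python
# Minimal, generic MLflow tracking example (not Splice Machine specific).
# Experiment name, parameters and metric values are made up for illustration.
import mlflow

mlflow.set_experiment("claims-fraud-models")

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("regularization", 0.1)
    mlflow.log_metric("auc", 0.87)
    # The run's parameters, metrics and metadata are now recorded for
    # later comparison and governance.
```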

What’s your view on open source and the risk of a big cloud vendor cannibalizing open source database technology?

Zweben: We do compose many different open source projects into a seamless and highly performant integration. Our secret sauce is how we put these things together at a very low level, with transactional integrity, to enable a single integrated system. This composition that we put together is open source, so that all of the pieces of our data platform are available in our open source repository, and people can see the source code right now.

I’m intensely worried about cloud cannibalization. I switched to an AGPL license specifically to protect against cannibalization by cloud vendors.

On the other hand, we believe we’re moving up the stack. If you look at our machine learning package, and how it’s so inextricably linked with the database, and the reference applications that we have in different segments, we’re going to be delivering more and more higher-level application functionality.

What are some of the biggest changes you’ve seen in the data landscape over the seven years you’ve been running Splice Machine?

Zweben: With the first generation of big data, it was all about data lakes, and let’s just get all the data the company has into one repository. Unfortunately, that has proven time and time again, at company after company, to just be data swamps.

Data repositories work and they’re scalable, but they don’t have anyone using the data, and this was a mistake for several reasons.


Instead of thinking about storing the data, companies should think about how to use the data. Start with the application and how you are going to make the application leverage new data sources.

The second reason why this was a mistake was organizationally, because the data scientists who know AI were all centralized in one data science group, away from the application. They are not the subject matter experts for the application.

When you focus on the application and retrofit the application to make it smart and inject AI, you can get a multidisciplinary team. You have app developers, architects, subject-matter experts, data engineers and data scientists, all working together on one purpose. That is a radically more effective and productive organizational structure for modernizing applications with AI.


For Sale – Mac Mini (Late 2012), Magic Keyboard and Trackpad, Monitor

Hi ModestN….

1. Model info…. (I’ve previously been advised against sharing serial numbers publicly so hope this is sufficient)
Model Name: Mac mini
Model Identifier: Macmini6,2
Processor Name: Intel Core i7
Processor Speed: 2.6 GHz

2. Yes, that’s fine. Without the monitor, I’d like £440 delivered or £420 collected from London.

3. SSD is ‘Crucial MX500 CT500MX500SSD1(Z) 500 GB (3D NAND, SATA, 2.5 Inch, Internal SSD)’ – taken from the Amazon page where I bought it (I installed it myself)

4. I’m not massively keen on doing this, but aware that I have limited feedback on this forum. You can check out my eBay feedback – my username is mrcjbush – or let me know where you are as collection could be possible.

Let me know if you have any more questions.

Thanks,

Chris.


For Sale – or Trade (eGPU enclosure) : Dell 3020M USFF (4150T, 8GB, SSD, WiFi)

Very compact PC which has been used mainly as an HTPC. Intel 4150T, 8GB, wifi, 128GB SSD, Win10.
Will consider trade with a graphics card enclosure as long as it’s TB3 compatible. Cash your way depending on model.

Location: Bristol
Price and currency: £120
Delivery cost included: Delivery is included
Prefer goods collected? I have no preference
Advertised elsewhere? Advertised elsewhere
Payment method: BT



For Sale – Various Components (X99, Z170, GTX 1070s, 1000W PSUs)

Hi all,

Having a clear out of a bunch of old computer components that have been sat doing nothing other than gathering dust for the last couple of years. No warranty as all the bits are at least 2 years old, but I’ve tested them today and they all appear to be working as expected – a lot of the components are top tier, so I wasn’t expecting to find anything dead.

Gigabyte G1 1070 8GB – Box and card only (no cables or manual) – £175

Palit 1070 Dual fan 8GB – Box and card only (no cables or manual) – £160

Bundle –
Xeon 2603v3
Asus X99 Deluxe motherboard – Seems to have all the bits (WiFi, break out cards, back plate, box, manual etc…)
– £120

Bundle –
Intel G3900
8GB DDR4 (2x4GB)
Asrock Z170 Fatal1ty i7 – Box, backplate and a few other little bits
– £100

Corsair RM1000i – No box but comes with most cables (missing a couple of peripheral power cables for Molex/SATA – will update if I find them) – £90

Corsair RM1000x – Same as above – £90

Delivery is not included, collection from Sheffield welcome.

Pictures to come shortly.

If someone by some amazing coincidence wants all of it, then I’m happy to knock a nice amount off!

Cheers!
