
Companies bolster endpoint data protection for remote work

With more people working from home due to the coronavirus, some companies have had to adjust how they handle backup and business continuity.

The spread of COVID-19, which is the disease caused by the new coronavirus, created a unique challenge for data protection experts. Instead of threatening data or applications, this disaster directly affects personnel. Because of social distancing and shelter-in-place orders, many employees must work remotely. Not all businesses’ IT infrastructure can easily accommodate this shift.

In recent months, MDL AutoMation, based in Roswell, Ga., has been testing a business continuity plan for when its employees can no longer come into the office. The plan includes Carbonite backup software installed on all laptops, Dell DDPE encryption and Absolute DDS for asset tracking and security. This level of endpoint data protection is largely unnecessary when everyone works in the office, but Eric Gutmann, MDL AutoMation's manager of infrastructure, said employees may not have that option for long.

“We will be able to continue functioning as a company with all our employees working remotely as if they were in the office,” Gutmann said.

MDL is a software company that sells car tracking capabilities to car dealerships. It has a client base of about 250 dealerships and manages 1.4 TB of data gathered from IoT devices.

Gutmann said he has VPN and remote desktop protocol (RDP) ready, and the switch to remote working and enhanced endpoint data protection is meant to be temporary. He is prepared to implement it for two months.

No going back

Marc Staimer, president of Dragon Slayer Consulting, said it’s highly unlikely that any business that implements endpoint data protection will want to go back. Endpoint data protection is a separate investment from workstation data protection and involves extra security measures such as geolocation and remote wiping. Businesses that do not already have this will need to invest time and money into such a system, and will likely want to keep it after making that investment.

Many businesses may already be in a good position to support remote work. Staimer said organizations that use virtual desktop infrastructure (VDI) do not have to worry about backing up laptops, and less data-intensive businesses can have everyone work off of the cloud. Bandwidth is also much more abundant now, eliminating what used to be a roadblock to remote work.

With SaaS-based applications such as Microsoft Office 365 and Google Docs and cloud-based storage such as OneDrive and Dropbox, teleworking isn’t complicated to implement. The difficulty, according to Steven Hill, senior analyst at 451 Research, part of S&P Global Market Intelligence, comes from making sure everything on the cloud is just as protected as anything on premises.

Unlike endpoint data protection, using the cloud is more about locking down storage being used than protecting multiple devices. Whether it’s Dropbox, OneDrive or a private cloud NAS, an administrator only has to worry about protecting and securing that one management point. Aside from native tools, third-party vendors such as Backblaze and CloudAlly can provide data protection for these storage environments.

“Rather than storing business information locally, you could dictate that everything goes to and comes from the cloud,” Hill said.

Staimer said the pandemic will make many businesses realize they don’t need all of their workers in a single location. While some organizations won’t treat the coronavirus seriously enough to implement any of these systems, Staimer expects that for many, it will be the impetus to do what they should’ve been doing.

“Coronavirus is going to change the way we work — permanently,” Staimer said.

For some businesses, the biggest challenge will be accommodating workers who cannot perform their jobs from home. They may include partners or customers, as well as a company’s employees.

KCF Technologies, based in State College, Pa., which manufactures industrial diagnostic equipment, has already invested in endpoint data protection. Myron Semack, chief infrastructure architect at KCF, said the company is cloud-centric and many of its workers can work from anywhere.

However, the business would still be impacted if it or its customers go into lockdown because of the coronavirus. Not only would KCF be unable to produce its sensor products, but any installation or project work in the field would have to be suspended. This isn’t anything IT can fix.

“Our manufacturing line employees cannot work from home, unfortunately. If they were forced to stay home, our ability to build or ship product would be impacted,” Semack said.

Go to Original Article

Databricks bolsters security for data analytics tool

One of the biggest challenges with data management and analytics efforts is security.

Databricks, based in San Francisco, is well aware of the data security challenge and recently updated its Unified Analytics Platform with enhanced security controls to help organizations minimize their data analytics attack surface and reduce risk. Alongside the security enhancements, new administration and automation capabilities make the platform easier to deploy and use, according to the company.

Organizations are embracing cloud-based analytics for the promise of elastic scalability, support for more end users and improved data availability, said Mike Leone, a senior analyst at Enterprise Strategy Group. That said, greater scale, more end users and different cloud environments create myriad challenges, security among them, Leone said.

“Our research shows that security is the top disadvantage or drawback to cloud-based analytics today. This is cited by 40% of organizations,” Leone said. “It’s not only smart of Databricks to focus on security, but it’s warranted.”

He added that Databricks is extending foundational security in each environment while keeping it consistent across environments, and is making it easy to proactively simplify administration.

“As organizations turn to the cloud to enable more end users to access more data, they’re finding that security is fundamentally different across cloud providers,” Leone said. “That means it’s more important than ever to ensure security consistency, maintain compliance and provide transparency and control across environments.”

Additionally, Leone said that with its new update, Databricks provides intelligent automation to enable faster ramp-up times and improve productivity across the machine learning lifecycle for all involved personas, including IT, developers, data engineers and data scientists.

Gartner said in its February 2020 Magic Quadrant for Data Science and Machine Learning Platforms that Databricks Unified Analytics Platform has had a relatively low barrier to entry for users with coding backgrounds, but cautioned that “adoption is harder for business analysts and emerging citizen data scientists.”

Bringing Active Directory policies to cloud data management

Data access security is handled differently on-premises compared with how it needs to be handled at scale in the cloud, according to David Meyer, senior vice president of product management at Databricks.

Meyer said the new updates to Databricks enable organizations to more efficiently use their on-premises access control systems, like Microsoft Active Directory, with Databricks in the cloud. A member of an Active Directory group becomes a member of the same policy group with the Databricks platform. Databricks then maps the right policies into the cloud provider as a native cloud identity.
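As a rough illustration of this kind of identity federation, the sketch below maps directory groups to cloud policy roles so that membership changes in the directory flow through automatically. All group names, role strings and function names here are hypothetical assumptions for illustration, not Databricks' actual API.

```python
# Illustrative sketch of directory-to-cloud policy mapping.
# All names are hypothetical; this is not Databricks' implementation.

# Stand-in for Active Directory group membership.
AD_GROUPS = {
    "data-engineers": {"alice", "bob"},
    "analysts": {"carol"},
}

# Each directory group maps to one platform policy group, which
# resolves to a native cloud identity/role in the cloud provider.
GROUP_TO_POLICY = {
    "data-engineers": "cloud-role:lake-readwrite",
    "analysts": "cloud-role:lake-readonly",
}

def effective_policies(user):
    """Resolve a user's cloud policies purely from directory membership,
    so access control stays centralized in the directory."""
    return {
        GROUP_TO_POLICY[group]
        for group, members in AD_GROUPS.items()
        if user in members and group in GROUP_TO_POLICY
    }

assert effective_policies("alice") == {"cloud-role:lake-readwrite"}
assert effective_policies("carol") == {"cloud-role:lake-readonly"}
assert effective_policies("unknown-user") == set()
```

The point of the design is that no per-user grants live in the cloud layer: removing a user from a directory group is enough to revoke the corresponding cloud role.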

Databricks uses the open source Apache Spark project as a foundational component and provides more capabilities, said Vinay Wagh, director of product at Databricks.

“The idea is, you, as the user, get into our platform, we know who you are, what you can do and what data you’re allowed to touch,” Wagh said. “Then we combine that with our orchestration around how Spark should scale, based on the code you’ve written, and put that into a simple construct.”

Protecting personally identifiable information

Beyond just securing access to data, there is also a need for many organizations to comply with privacy and regulatory compliance policies to protect personally identifiable information (PII).

“In a lot of cases, what we see is customers ingesting terabytes and petabytes of data into the data lake,” Wagh said. “As part of that ingestion, they remove all of the PII data that they can, which is not necessary for analyzing, by either anonymizing or tokenizing data before it lands in the data lake.”

In some cases, though, there is still PII that can get into a data lake. For those cases, Databricks enables administrators to perform queries to selectively identify potential PII data records.
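The tokenization step Wagh describes can be sketched in a few lines of Python. This is an illustrative assumption about how such a pipeline might look, not Databricks' implementation: the keyed-hash approach, the regex-based PII detection and the field names are all hypothetical.

```python
import hashlib
import hmac
import re

# Hypothetical secret; in practice this would come from a secrets
# manager, never from source code.
SECRET_KEY = b"example-tokenization-key"

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def tokenize(value: str) -> str:
    """Replace a PII value with a stable, non-reversible token.

    A keyed HMAC (rather than a bare hash) means the same input always
    yields the same token, so joins and aggregations still work, while
    an attacker without the key cannot reverse tokens via lookup tables.
    """
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"tok_{digest[:16]}"

def scrub_record(record: dict) -> dict:
    """Tokenize email-shaped fields before the record lands in the data lake."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str) and EMAIL_RE.fullmatch(value):
            cleaned[key] = tokenize(value)
        else:
            cleaned[key] = value
    return cleaned

record = {"user": "alice@example.com", "visits": 12}
scrubbed = scrub_record(record)
assert scrubbed["user"].startswith("tok_")
assert scrubbed["visits"] == 12
```

Anonymization, by contrast, would discard the value entirely; tokenization is chosen when analysts still need to correlate records belonging to the same (unidentified) person.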

Improving automation and data management at scale

Another key set of enhancements in the Databricks platform update are for automation and data management.

Meyer explained that, historically, each Databricks customer had essentially one workspace in which they put all their users. That model doesn't let organizations isolate different users, however, or maintain different settings and environments for various groups.

To that end, Databricks now enables customers to have multiple workspaces to better manage and provide capabilities to different groups within the same organization. Going a step further, Databricks now also provides automation for the configuration and management of workspaces.

Delta Lake momentum grows

Looking forward, the most active area within Databricks is with the company’s Delta Lake and data lake efforts.

Delta Lake is an open source project started by Databricks and now hosted at the Linux Foundation. The core goal of the project is to enable an open standard around data lake connectivity.

“Almost every big data platform now has a connector to Delta Lake, and just like Spark is a standard, we’re seeing Delta Lake become a standard and we’re putting a lot of energy into making that happen,” Meyer said.

Other data analytics platforms ranked similarly by Gartner include Alteryx, SAS, Tibco Software, Dataiku and IBM. Databricks’ security features appear to be a differentiator.

Coronavirus: Surge in remote work strains Zoom services

Zoom has struggled to keep some of its services online this week amid a spike in remote work because of the global coronavirus pandemic.

Users have had to wait significantly longer than usual to access recordings of Zoom meetings in the cloud. The company said its engineering team was working to resolve the issue, attributing the backlog to “excessive demand.”

Zoom’s dial-in numbers have also faltered several times this month. Elevated traffic has so far clogged audio lines in Japan, New York and Hong Kong, forcing users to connect to a meeting’s audio using the internet. A dial-in number in Australia was also inaccessible at times this week. 

Meanwhile, some users were intermittently unable to make and receive calls through Zoom Phone, the vendor’s cloud telephony service, for extended periods of time this week.  

Users have now dealt with 18 unscheduled Zoom service disruptions in March. There were no such incidents in January and just one in February (an issue that affected only subscribers in Brazil).

In a statement, Zoom said it was working to find a “long-term, sustainable solution” to the issues affecting Zoom Phone. The company thanked customers for their “patience and understanding” during an “unprecedented and challenging time for everyone.”

Zoom is not the only collaboration vendor struggling to cope with a sudden surge in usage. Many users of Microsoft Teams were unable to send messages and perform other tasks on Monday. Some Teams users in Europe were affected by another chat outage on Tuesday.

Last week, experts said they didn’t expect any of the major collaboration vendors to suffer outages that forced their services completely offline for multiple days. So far, that prediction has held. Nevertheless, the influx of remote workers is having some impact.

Zoom has not said how many new users it has gained in recent weeks, but its mobile client is now the most popular free download on Apple’s App Store. Notably, countless schools and universities worldwide have begun to hold virtual classes on Zoom.

Statistics shared by other vendors provide clues to the surge in traffic Zoom is likely dealing with. Microsoft Teams gained 12 million daily active users between March 11 and March 18, a 37% increase. Slack added paid customers at nearly three times its typical rate between Feb. 1 and March 18.

Zoom’s support team is also likely fielding complaints related to factors outside of the vendor’s control, such as the quality of a user’s home Wi-Fi. Residential connections are often less reliable than corporate networks.

For Sale – Intel Nuc, cpu’s, SODIMM ram

Hi all

time to have a clean up of some bits

First up is an Intel NUC7I7BNH, all firmware fully updated and HDMI 2.0 working from both HDMI and USB-C. Comes with 16GB of RAM and a 256GB SSD. Was £410 delivered, now £350 + postage of your choice. Works great with the HDR Kodi build.

CPUs are all chip only except the 9100F

one Intel i3 9100F used for about 2 months retail boxed £75 delivered now £65 + postage of your choice

1 x intel I5-9500 £100 delivered (sold chunksinspace)

Intel i5-8500T(35w) good for a low power HTPC £90 now £75 + postage of your choice (sold elsewhere)

and 4 x Samsung 8GB DDR4 PC4-21300 2666MHz 260-pin SODIMM, £30 each, now £25 each

many thanks for looking

NetApp digs into Talon Storage for Cloud ONTAP

The NetApp cloud ecosystem gained some muscle for shared file services with the acquisition of software-defined storage vendor Talon Storage.

NetApp said the Talon Storage FAST software suite is already integrated with NetApp Cloud ONTAP as part of an ongoing partnership. Talon provides global file caching and syncs data locally from public clouds to a company’s remote branch offices. The NetApp integration does not include local ONTAP clusters.

The NetApp cloud file storage is available as two products. NetApp Cloud Volumes ONTAP allows a storage administrator to run NetApp storage in the public cloud in the same manner as on premises. Customers also may run NetApp file storage as a service in Amazon Web Services, Google Cloud Platform and Microsoft Azure.

Those NetApp cloud offerings have direct connectivity and integration with Talon, said Anthony Lye, a NetApp senior vice president and GM of its cloud business unit.

Talon Storage is designed to mitigate the performance penalties that occur when shared cloud is used for local processing. The Talon technology runs as a virtual machine. Ctera Networks, Nasuni and Panzura are Talon’s chief competitors. NetApp did not disclose the purchase price, or how many Talon employees will join NetApp.

Cloud storage works well with cloud-native applications, but performance breaks down as more and more data is consumed locally. Major cloud providers are addressing the issue with products that place storage closer to on-premises users, such as AWS Outposts and Google Anthos.

“Talon is a nice acquisition for NetApp. Small player, but good technology,” said Steve McDowell, a senior analyst for storage and data center technologies at Moor Insights & Strategy, based in Austin, Texas.

Support for Talon Storage customers

Lye said Talon gives NetApp new tools to help customers shift more data workloads to the cloud. He added that it lowers the cost for organizations that need to sync data between multiple locations.

 “The only way you could get low response times historically was to sync the files between all the different remote offices and branch offices. Talon uses memory on the remote machine and has an intelligence to place frequently accessed files into the memory. The files are cached remotely, even though the source of truth still exists on the back-end file server,” Lye said.
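The caching behavior Lye describes, hot files served from local memory while the back-end file server remains the source of truth, can be sketched as a small LRU cache. This is an illustrative assumption, not Talon's actual design; the class, capacity and eviction policy are made up for the example.

```python
from collections import OrderedDict

class EdgeFileCache:
    """Toy LRU cache for the edge-caching idea: frequently accessed
    files stay in local memory, while the central file server remains
    the single source of truth. Hypothetical sketch only."""

    def __init__(self, backend: dict, capacity: int = 2):
        self.backend = backend          # stands in for the back-end file server
        self.capacity = capacity
        self.cache = OrderedDict()      # filename -> bytes, in LRU order
        self.hits = 0
        self.misses = 0

    def read(self, name: str) -> bytes:
        if name in self.cache:
            self.cache.move_to_end(name)    # mark as recently used
            self.hits += 1
            return self.cache[name]         # served from local memory
        self.misses += 1
        data = self.backend[name]           # fetch over the WAN
        self.cache[name] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return data

server = {"a.docx": b"A", "b.docx": b"B", "c.docx": b"C"}
cache = EdgeFileCache(server, capacity=2)
cache.read("a.docx"); cache.read("a.docx")   # second read is a local hit
assert (cache.hits, cache.misses) == (1, 1)
cache.read("b.docx"); cache.read("c.docx")   # capacity 2, so "a.docx" is evicted
assert "a.docx" not in cache.cache
```

Because only reads are cached and the server copy stays authoritative, a stale or evicted entry costs one WAN round trip rather than a sync of every file to every branch office.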

NetApp has been exposing functionality to third-party vendors via RESTful APIs, which Lye said helped speed up the Talon integration. He said NetApp has gained more than 1,000 Talon Storage customers. He said NetApp will support Talon customers regardless of their back-end storage.

“We are promoting aggressively to new prospects a bundle that would include Talon Storage and NetApp Cloud Volumes, at a significantly lower TCO,” Lye said.

Talon Storage is the fifth NetApp cloud software acquisition since 2017. The company added compliance software vendor Cognigo last year and Kubernetes specialist Stackpoint.io in 2018, one year after landing memory startup Plexistor and cloud orchestration player GreencloudIQ.

Women of Microsoft Quantum (Part 2)

In honor of International Women’s Day, Microsoft is proud to recognize some of the amazing women of Microsoft Quantum. These engineers, scientists, program managers and business leaders are working toward realizing Microsoft’s mission of building a scalable quantum computer and global quantum community to help solve some of the world’s most challenging problems.

Last year, we introduced you to some of the women working on quantum software; this year we’re profiling more women delivering impact in the Microsoft Quantum program, across quantum hardware, software, partnerships, and business development.

This is the second of a two-part series. In case you missed it, meet The Women of Microsoft Quantum in Part 1.

Sydney Schreppler – Quantum Hardware Engineer

Q: Tell us more about your role in the Microsoft Quantum group. What exciting things are you working on right now?

I am a Quantum Engineer working as part of a global hardware team that characterizes quantum materials for the development of our topological qubit technology. Our team measures electrical transport properties in cryogenic environments, probing the quantum nature of the materials. Right now I’m excited to be working in Redmond, where, together with the Quantum Systems team, we span Microsoft’s full quantum stack, from our topological qubit layer at the very bottom all the way up to the algorithms offered in Azure Quantum.

Q: What was it that attracted you to the technology field? How and why did you decide to join the domain of quantum computing?

Long before I knew I wanted to study physics, people around me seemed to know it. I think it was because I was always asking for simple explanations for how the world worked and because I liked to understand those answers through a mathematical lens. Once I started studying physics, the more I learned, the simpler and more elegant the explanations got.

What attracted me to quantum measurements first, and later to quantum computing, was the idea that something that seemed so non-intuitive and mysterious was nonetheless observable, and even useful! I wanted to see quantum effects for myself, so as a college student, I sought out opportunities in labs measuring quantum things. And once I had “seen” quantum, I was hooked. I measured the quantum mechanical motion of tiny membranes, the interaction of ultracold atoms with laser light (obtaining my Ph.D. in physics along the way), and the quantum entanglement of superconducting circuits. Now at Microsoft, I get to harness these same kinds of measurements to develop our quantum hardware.

Amrita Singh – Quantum Hardware Engineer

Q: Tell us more about your role in the Microsoft Quantum group. What exciting things are you working on right now?

I am a hardware engineer and coordinate the substrate fabrication activities with a small team of nanofabrication engineers at Microsoft Quantum Labs – Delft. We engineer the substrates and create a platform for selective area growth of a high-quality III-V semiconductor/superconductor hybrid network, which is a building block of the topological qubit.

Q: What was it that attracted you to the technology field? How and why did you decide to join the domain of quantum computing?

I was born and raised in a remote rural village in northern India, where a girl’s education wasn’t important and the only expectation of a girl was to get married at an early age, raise children and, at the most, become a primary school teacher in the village. Mathematics and science were considered to be boys’ subjects and weren’t available as an option at my all-girls school when I started. I was fortunate, though, in that they were introduced a year before I reached my final year.

I studied science in school to prove I was as worthy as the boys in the neighborhood, but I didn’t fully believe in it because it conflicted with my belief in God and other superstitions. But I always loved mathematics because of its precision, as no belief could justify 2+2≠4.

My exposure to technology was very limited, and I had my first encounter with computers during my Masters (Physics) degree at IIT Delhi. During my Ph.D. in Experimental Condensed Matter Physics, I started appreciating the power of the scientific attitude when I would verify a hypothesis with experimental data. As an experimental physicist, I grew restless about my blind faith, and that is when I started to question my deep-rooted superstitions and religious beliefs, shedding them over the course of four to five years. This was only possible because of my career choice in science and technology, and it has shaped me into who I am today.

I did my Ph.D. on quantum devices for spintronic application and I extended my knowledge to superconducting spintronics during my postdoc work at Leiden University, where I gained expertise in interface engineering for hybrid quantum devices. I believed that, with my diverse background in quantum physics and device engineering, I would be able to contribute toward the realization of an ambitious topological quantum computer at Microsoft, as well as be able to learn and grow without limit by working with great minds.

Science for me is not just a profession but a way of living. I strongly believe that we could change the lives of millions of underprivileged, deserving children in the world by giving them a quality education and bringing them into the mainstream through technology.

Aarthi Meenakshi Sundaram – Researcher

Q: Tell us more about your role in the Microsoft Quantum group. What exciting things are you working on right now?

I am a postdoctoral researcher in the Research and Applications team at Microsoft Quantum, where my overarching goal is to understand both the power and limitations of using quantum computers to solve some of our most challenging problems. Sometimes, this means defining efficient quantum algorithms for various problems. Other times, this means defining a mathematically rigorous computational model and analyzing which problems are “easy” or “difficult” in this model à la complexity theory.

Currently, I am looking forward to tackling both aspects in the context of quantum machine learning. It’s a nascent but rapidly evolving area with new algorithms being discovered, and it comes with its own set of challenges in understanding precisely what kinds of learning problems can be sped up with quantum resources, and to what extent. In classical computational learning theory, there are many well-established models of learning. Inevitably, we find that there may be various ways to “quantize” these models (i.e., add some “quantum magic” to them), and each way could be useful in vastly different scenarios – some abstract and mathematical, some very real and even implementable in the near term on quantum computers! Investigating these in all their variations is what excites me right now.

On a slightly different track, I also care about building tools that could help to efficiently verify quantum programs – through type checking or other methods. One of the main challenges is that any quantum program debugger that observes or measures how a quantum state is manipulated in the program could destroy the quantum nature of the state itself. Another challenge is that certain techniques that work well on small quantum programs will scale badly with the size of our program and could take too long to verify realistically. So, along with my collaborators here, we are investigating ways to build efficient type checkers that could provide us with the ability to verify some, if not all of the properties of interest in a quantum program.

Q: What was it that attracted you to the technology field? How and why did you decide to join the domain of quantum computing?

I have been reliably informed by my mother that, as a 4- or 5-year-old, I took great joy in sitting on her lap and helping her with her programming work by entering the programs into our computer at home and marveling at this new object that knew how to follow my orders (or throw error messages!). So, while I don’t remember ever having to make a conscious choice to work in the world of computing, it has always seemed like a foregone conclusion in my mind, leading to my Math and Computer Science majors during undergrad.

For the first time at my university, one of my professors offered a course in quantum information and computing. I had just started getting interested in cryptography then and being introduced to this new computing model that could break state-of-the-art cryptosystems was a revelation! I was intrigued by this field that almost sounded like something out of science fiction and seemed so counterintuitive, at first.

Encouraged by my professor to pursue it beyond that one course, it was a natural progression for me to eventually pursue a Ph.D. in quantum complexity theory, which allowed me to blend the skills from both of my undergrad majors seamlessly. Being interested in the more abstract and theoretical aspects of computer science, I spent my Ph.D. analyzing the power of quantum analogs of various computational models. A continuous inspiration since I’ve delved more into quantum computing is that, by living at the intersection and cutting edge of many different fields, one gets to work with and learn from people whose expertise is vastly different from one’s own. With Microsoft Quantum’s aim of delivering a full stack of quantum services, that means I am thrilled for the opportunity to interact with everyone from materials scientists to mathematicians within the team.

Judith Suter – Senior Researcher

Q: Tell us more about your role in the Microsoft Quantum group. What exciting things are you working on right now?

In my work as a Senior Researcher in the Microsoft Quantum Hardware Program, I focus mainly on electrical characterization of different device types, materials, and fabrication processes. My days revolve around planning and designing experiments, running and optimizing low-temperature measurements, and exploring the resulting aggregated data. As part of a global team, another element of my job is cross-site collaboration where we leverage the diverse expertise of the whole team to collectively tackle challenging projects.

Recently I also became part of the Azure Hardware Systems and Infrastructure Diversity and Inclusion Council, where I represent the Quantum Hardware Program. I am excited to help drive the efforts towards the ambitious goals of Microsoft to fuel systemic change, widen our pipelines to reach and engage a diverse group of people, and transform our culture to ensure that everyone feels welcome and valued.

Q: What was it that attracted you to the technology field? How and why did you decide to join the domain of quantum computing?

My path to working on quantum computing was not without detours. As a high school student, I was fascinated by surrealist painters and the strange but self-consistent worlds they portrayed, so I commenced my studies at an arts and graphic design academy. Eventually, I left, longing to do something completely different, something I knew nothing about. I signed up for an undergraduate degree in Nanoscience, where I felt I could get a taste of different scientific fields. There, quantum physics intrigued me from the start: counterintuitive concepts born out of creative boldness – surprisingly, some lectures ended up reminding me of my art classes studying surrealism. I was hooked. I bought a one-way ticket to the epicenter of quantum physics, the Niels Bohr Institute in Copenhagen, joined Prof. Charles Marcus’ lab there at the Center for Quantum Devices and started my training to become a quantum physicist.

Vicky Svidenko – Partner Quantum Data Sciences

Q: Tell us more about your role in the Microsoft Quantum group. What exciting things are you working on right now?

I am leading the Quantum Systems Integration team – helping to accelerate quantum research and development. The Microsoft Quantum group is exploring ways to build a full-stack quantum computer and has become the world’s center of expertise on topological quantum computing. I am incredibly humbled by the opportunity to support this development effort and contribute to the new breakthroughs, together with an amazing team of talented researchers and engineers.

Q: What was it that attracted you to the technology field? How and why did you decide to join the domain of quantum computing?

I came to Quantum because I enjoy the loosely orchestrated chaos of early product development and the frenzy of excitement for every new learning and every new benchmark. I like that incredible sensation of being part of something futuristically amazing, now evolving and materializing.

Another reason: This was my first opportunity to work for an amazing female manager – Krysta Svore – and I wasn’t going to miss it.

Meet more of The Women of Microsoft Quantum in Part 1 of this series.

This is just a small sample of the amazing people on the Microsoft Quantum team. If you want to join us as we build the quantum future, we’re hiring!

Author: Microsoft News Center

For Sale – Various Hard Drives & WTB Lian Li cases

I have decommissioned some workstations and am moving these drives on. None of them has any warranty remaining; however, all have passed a Passmark disk check with no errors. I'm looking to sell in bundles and have priced them accordingly. My preference is for these to be collected, or I can meet part way. Delivery will be £10 on top of any agreed sale.

3 x 3TB Seagate Constellation (7200rpm high performance drives) – £100

2 x 3TB Western Digital Black (7200rpm high performance drives) – £80

5 x 2TB Western Digital Black (7200rpm high performance drives) – £130

Will listen to offers on the whole lot

also looking for used Lian li cases. Preferably with lots of HDD capacity.

North Kent
Delivery cost included: Delivery is NOT included
Prefer goods collected?: I prefer the goods to be collected
Advertised elsewhere?: Not advertised elsewhere

For Sale – Or trade PCSpecialists Recoil II 17 inch gaming laptop. i7, RTX2060

Sorry for the late reply, no notification for some reason.

Yes I am happy to send. I just put collection preferred to enable any potential buyers to come and look too.

The power supply doesn’t actually state a wattage on it, but its model number is FSP180, so I am presuming 180 watts.

I have played Destiny 2 (I know it’s not the most intensive game) for hours and it’s absolutely perfect and stable, more so than the bloody Alienware that I have just got.
