New to Microsoft 365 in June—streamlining teamwork and security – Microsoft 365 Blog

This month, we introduced several new capabilities that improve user experience, streamline the management of common tasks, and enhance identity-driven security measures. We also want to hear your feedback, so that we can make sure these updates are relevant and useful to you.

Streamlining the way you work

Updates to the Office 365 user experience—We announced updates for Word, Excel, PowerPoint, OneNote, and Outlook that are designed to embrace the breadth and depth of Office 365 features, while simplifying the user interface and improving accessibility. These updates include a simplified ribbon to encourage focus and collaboration, modern colors and icons to improve rendering and accessibility, and AI-powered search to quickly surface relevant information. These changes will start to roll out to Microsoft 365 and Office 365 subscribers over the next few months.

Connect Office 365 Groups to SharePoint sites—Office 365 Groups can now connect to existing SharePoint sites, allowing newly created Office 365 groups to integrate with your existing SharePoint infrastructure. Connecting a group to a site provides a single starting point to find content, team news, and communications with modern pages, libraries, and lists—without losing any previous content or permissions.

A screenshot displays a SharePoint page. A dropdown from Settings in the upper right shows the user is about to connect a new Office 365 group.

Reduce distractions with Outlook for Android—We introduced “Do Not Disturb” in Outlook for Android to help you reduce distractions and get more done. Now, subscribers can set timed or scheduled periods when email and calendar notifications will be paused. For those with multiple Outlook accounts, Do Not Disturb settings can be customized for each email address—enabling granular control over how you spend your focus-hours.

An animated screenshot highlights the steps a user takes to set Outlook to Do Not Disturb.

Manage progress in Microsoft To-Do—This month, we introduced “Steps” in Microsoft To-Do—a new feature that allows you to break down tasks into smaller, incremental steps—making large projects more manageable. Now, when you create a To-Do item, you can add a range of detailed steps that are tracked through to completion. We also introduced the ability to share your To-Do lists, enabling you to work together on tasks and complete projects with colleagues and friends.

An animated screenshot highlights a user sending a 1:1 invitation link to a teammate.

Dictation in OneNote—Office 365 subscribers with Windows 10 can now take advantage of hands-free dictation using nine languages in OneNote. Dictation provides a simple, yet transformational, way to express ideas and capture notes using only your voice. You can also make edits using your keyboard without having to pause the recording. Simply click or tap the Dictate icon and start speaking.

Adobe PDF integration in Office 365—Last September, we expanded our strategic partnership with Adobe to focus on integrations between Adobe Sign and Office 365 products, like Microsoft Teams, SharePoint, and Outlook. This month, the Adobe Document Cloud team announced new capabilities for OneDrive and SharePoint that provide improved fidelity when working with PDF documents. Once integrated by your administrator, PDF services provide rich previews of PDF documents in OneDrive and your SharePoint sites, and allow you to combine several files into a single PDF in your document library.

A screenshot displays documents in SharePoint. A Word document, an Excel workbook, and a PowerPoint presentation have been selected and are ready to be combined.

Securing the modern workplace

We introduced several important new capabilities that strengthen your organization’s identity-driven security and help ensure important data is kept safe.

Secure your organization with baseline security policy in Azure Active Directory—We introduced the preview of a baseline security policy in Azure AD that enforces multi-factor authentication for privileged accounts. This new policy will apply to all organizations that have Azure Active Directory and help secure the most important accounts in your tenant. Customers can opt in to the baseline protection policy in preview, and at general availability will be opted in by default with the ability to opt out at any time.

Block legacy authentication using Azure Active Directory conditional access—This month, we introduced the preview of conditional access support for blocking legacy authentication, which enables organizations to stop users from authenticating to legacy apps. Identity attacks such as password spray almost exclusively target these older client apps. This feature improves the overall security of your IT environment by getting users to move to more modern clients that support modern authentication mechanisms.
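For context, here is a hedged sketch of what such a policy can look like when expressed through the Microsoft Graph conditional access API; the tenant, access token, and policy values are placeholders, and the preview described above is configured through the Azure AD portal rather than code.

    import requests

    # Hypothetical sketch: create a report-only conditional access policy that
    # targets legacy (basic-auth) clients. Requires an access token with the
    # Policy.ReadWrite.ConditionalAccess permission.
    GRAPH_URL = "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies"

    policy = {
        "displayName": "Block legacy authentication",
        "state": "enabledForReportingButNotEnforced",  # report-only while assessing impact
        "conditions": {
            "clientAppTypes": ["exchangeActiveSync", "other"],  # legacy auth clients
            "applications": {"includeApplications": ["All"]},
            "users": {"includeUsers": ["All"]},
        },
        "grantControls": {"operator": "OR", "builtInControls": ["block"]},
    }

    resp = requests.post(GRAPH_URL, json=policy,
                         headers={"Authorization": "Bearer <access-token>"})
    resp.raise_for_status()
    print(resp.json()["id"])

Running a policy in report-only mode first lets administrators see which users and apps would be blocked before enforcement is switched on.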

Enhance data classification across your organization—The new Label Activity Explorer in Office 365 provides a quick overview of how the data in your organization has been labeled—allowing you to investigate risky or abnormal activity. To help you manage labeling across the lifecycle of your organization’s content, we enhanced the Data Governance dashboard with new features like the Data Governance toolbox, added links and tools for common data governance tasks, and provided a single resource for guidance.

A screenshot of the Security & Compliance Center in Office 365. The user is exploring the Label Activity Explorer in the Data governance dashboard.

Other updates

  • Microsoft Teams has reached FedRAMP Moderate Compliance and will start rolling out to U.S. Government Community Cloud (GCC) customers on July 17, 2018.
  • Visio Online is now available in Microsoft Teams. Coworkers can now collaborate on Visio Online diagrams from within their team or channel without toggling between apps.
  • SharePoint Swoop—our new enterprise reality show—features a team of MVP experts with just three days to help a Microsoft 365 customer modernize their intranet.
  • At Computex 2018, we outlined our vision for how partners can build intelligent edge devices and solutions.

What is Active Directory? – Definition from WhatIs.com

Active Directory (AD) is a Microsoft product that consists of several services that run on Windows Server to manage permissions and access to networked resources.

Active Directory stores data as objects. An object is a single element, such as a user, group, application or device like a printer. Objects are normally defined as either resources — such as printers or computers — or security principals — such as users or groups.

Active Directory categorizes objects by name and attributes. For example, a user object includes the user’s name along with attributes associated with that user, such as passwords and Secure Shell (SSH) keys.

The main service in Active Directory is Domain Services (AD DS), which stores directory information and handles the interaction of the user with the domain. AD DS verifies access when a user signs into a device or attempts to connect to a server over a network. AD DS controls which users have access to each resource. For example, an administrator typically has a different level of access to data than an end user.

Other Microsoft products, such as Exchange Server and SharePoint Server, rely on AD DS to provide resource access. The server that hosts AD DS is the domain controller.

Active Directory services

Several other services make up Active Directory: Lightweight Directory Services, Certificate Services, Federation Services and Rights Management Services. Each service expands the product’s directory management capabilities.

Lightweight Directory Services (AD LDS) has the same codebase as AD DS, sharing similar functionalities, such as the API. AD LDS, however, can run in multiple instances on one server and holds directory data in a data store using Lightweight Directory Access Protocol (LDAP).

[Embedded video: How to use the identity and access tool, from Microsoft]

LDAP is an application protocol used to access and maintain directory services over a network. LDAP stores objects — such as usernames and passwords — in directory services — such as Active Directory — and shares that object data across the network.
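To make the idea concrete, here is a minimal, hedged sketch of an LDAP search against an Active Directory domain controller using the Python ldap3 library; the server name, bind account, and base DN are illustrative placeholders.

    from ldap3 import Server, Connection, ALL

    # Connect and bind to a (hypothetical) domain controller.
    server = Server("dc01.example.com", get_info=ALL)
    conn = Connection(server,
                      user="EXAMPLE\\svc_ldap",   # placeholder service account
                      password="********",
                      auto_bind=True)

    # Look up one user object and read a few of its attributes.
    conn.search(search_base="dc=example,dc=com",
                search_filter="(sAMAccountName=jdoe)",
                attributes=["cn", "mail", "memberOf"])

    for entry in conn.entries:
        print(entry)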

Certificate Services (AD CS) generates, manages and shares certificates. A certificate uses encryption to enable a user to exchange information over the internet securely with a public key.
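As a rough illustration of that idea, the sketch below uses Python’s cryptography package to encrypt a message with the public key taken from an X.509 certificate; the file name user_cert.pem is a made-up example, and AD CS itself is managed through Windows tooling rather than this library.

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding

    # Load a (hypothetical) PEM-encoded certificate and extract its public key.
    # This sketch assumes the certificate carries an RSA key.
    with open("user_cert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    public_key = cert.public_key()

    # Encrypt a message that only the holder of the matching private key can read.
    ciphertext = public_key.encrypt(
        b"message for the certificate's subject",
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(),
                     label=None),
    )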

Active Directory Federation Services (AD FS) authenticates user access to multiple applications — even on different networks — using single sign-on (SSO). As the name indicates, SSO only requires the user to sign on once rather than use multiple dedicated authentication keys for each service.

Rights Management (AD RMS) controls information rights and management. AD RMS encrypts content, such as email or Word documents, on a server to limit access.

Major features in Active Directory Domain Services

Active Directory Domain Services uses a tiered layout consisting of domains, trees and forests to coordinate networked elements.

A domain is a group of objects, such as users or devices, that share the same AD database. Domains have a domain name system (DNS) structure.

A screenshot of Active Directory’s Group Policy Management console, which gives admins a tool to customize user and computer settings in their organization.

A tree is one or more domains grouped together. The tree structure uses a contiguous namespace to gather the collection of domains in a logical hierarchy. Trees can be viewed as trust relationships, where a secure connection, or trust, is shared between two domains. These trusts are transitive: one domain can trust a second, and the second domain can trust a third. Because of the hierarchical nature of this setup, the first domain implicitly trusts the third domain without needing an explicit trust.

A forest is a group of multiple trees. A forest consists of shared catalogs, directory schemas, application information and domain configurations. The schema defines an object’s class and attributes in a forest. In addition, global catalog servers provide a listing of all the objects in a forest.

Organizational Units (OUs) organize users, groups and devices. Each domain can contain its own OUs. However, OUs cannot have separate namespaces, as each user or object in a domain must be unique. For example, two user accounts with the same username cannot be created in the same domain.

History and development of Active Directory   

Microsoft offered a preview of Active Directory in 1999 and released it a year later with Windows 2000 Server. Microsoft continued to develop new features with each successive Windows Server release.

Windows Server 2003 included a notable update to add forests and the ability to edit and change the position of domains within forests. Domains on Windows Server 2000 could not support newer AD updates running in Server 2003.

Windows Server 2008 introduced AD FS. Additionally, Microsoft rebranded the directory for domain management as AD DS, and AD became an umbrella term for the directory-based services it supported.

Windows Server 2016 updated AD DS to improve AD security and migrate AD environments to cloud or hybrid cloud environments. Security updates included the addition of privileged access management (PAM).

PAM monitors access to an object, the type of access granted and what actions the user takes. PAM also added bastion AD forests to provide an additional secure and isolated forest environment. Windows Server 2016 ended support for devices running Windows Server 2003.

In December 2016, Microsoft released Azure AD Connect to join an on-premises Active Directory system with Azure Active Directory (Azure AD) to enable SSO for Microsoft’s cloud services, such as Office 365. Azure AD Connect works with systems running Windows Server 2008, Windows Server 2008 R2, Windows Server 2012, Windows Server 2012 R2 and Windows Server 2016.

Active Directory versus Workgroup

A workgroup is Microsoft’s peer-to-peer arrangement for connecting Windows machines over a network without a domain. A workgroup allows these machines to share files, internet access, printers and other resources over the network. Peer-to-peer networking removes the need for a server to handle authentication.

Main competitors to Active Directory

Other directory services on the market that provide similar functionality to AD include Red Hat Directory Server, Apache Directory and OpenLDAP.

Red Hat Directory Server manages user access to multiple systems in Unix environments. Similar to AD, Red Hat Directory Server includes user ID and certificate-based authentication to restrict access to data in the directory.

Apache Directory is an open source project that runs on Java and operates on any LDAP server, including systems on Windows, macOS and Linux. Apache Directory includes a schema browser and an LDAP editor/browser. Apache Directory supports Eclipse plug-ins.

OpenLDAP is a cross-platform, open source LDAP directory. OpenLDAP enables users to browse, search and edit objects in an LDAP server. OpenLDAP also features copying, moving and deleting of trees in the directory, as well as schema browsing, password management, LDAP SSL support, and more.

MongoDB 4.0, Stitch aim to broaden use of NoSQL database

MongoDB Inc. is releasing several technologies designed to make its namesake NoSQL database a viable option for more enterprise applications, led by a MongoDB 4.0 update with expanded support for the ACID transactions that are a hallmark of mainstream relational databases.

Beyond MongoDB 4.0, the company, at its MongoDB World user conference in New York, also launched a serverless platform called Stitch that’s meant to streamline application development, initially for use with the MongoDB Atlas hosted database service in the cloud.

In addition, MongoDB made a mobile version of the database available for beta testing and enabled Atlas users to distribute data to different geographic areas globally for faster performance and regulatory compliance.

While MongoDB is one of the most widely used NoSQL technologies, the open source document database still has a tiny presence compared to relational behemoths like Oracle Database and Microsoft SQL Server. MongoDB, which went public in October 2017, reported total revenue of just $154.5 million for its fiscal year that ended in January — amounting to a small piece of the overall database market.

But MongoDB 4.0’s support for ACID transactions across multiple JSON documents could make it a stronger alternative to relational databases, according to Stephen O’Grady, an analyst at technology research and consulting firm RedMonk in Portland, Maine.

The ACID properties — atomicity, consistency, isolation and durability — ensure that database transactions are processed accurately and reliably. Previously, MongoDB only offered a form of such guarantees at the individual document level. MongoDB 4.0, which has been in beta testing since February, supports multi-document ACID transactions — a must-have requirement for many enterprise users with transactional workloads to run, O’Grady said.

“Particularly in financial shops, if you can’t give me an ACID guarantee, that’s just a non-starter,” he said.

O’Grady said he doesn’t expect companies to replace the back-end relational databases that run their ERP systems with MongoDB, but he added that the document database is now a more feasible option for users who are looking to take advantage of the increased data flexibility and lower costs offered by NoSQL software in other types of transactional applications.
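As a rough sketch of what a multi-document transaction looks like from application code, the snippet below uses PyMongo against a replica set; the connection string, database, and collection names are illustrative only.

    from pymongo import MongoClient

    # Transactions require MongoDB 4.0+ running as a replica set.
    client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
    db = client.shop

    with client.start_session() as session:
        with session.start_transaction():
            # Both writes commit together, or neither does.
            db.orders.insert_one({"order_id": 1001, "sku": "A-42", "qty": 2},
                                 session=session)
            db.inventory.update_one({"sku": "A-42"},
                                    {"$inc": {"qty_on_hand": -2}},
                                    session=session)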

New technologies from MongoDB
MongoDB’s new product offerings, at a glance.

Moving from Oracle to MongoDB

That’s the case at Acxiom Corp., which collects and analyzes customer data to help companies target their online marketing efforts to web users.

Acxiom already converted two Oracle-based systems to MongoDB: a metadata repository three years ago, and a real-time operational data store (ODS) that was switched over in January. And the Conway, Ark., company wants to move more data processing work to MongoDB in the future, said Chris Lanaux, vice president of its product and engineering group.

Oracle and other relational databases are much more expensive to run and aren’t as cloud-friendly as MongoDB is, Lanaux said.

When you’re moving 90 miles per hour, it’s helpful to have guaranteed consistency. Now we don’t have to worry about that anymore.
John Riewerts, senior director of engineering, Acxiom Corp.

John Riewerts, senior director of engineering on Lanaux’s team, added that Amazon Web Services and other cloud platform providers each offer their own flavors of relational databases. With MongoDB, “it’s just a flip of a switch for us to decide which cloud platform to put it on,” he said.

The ACID transactions support in MongoDB 4.0 is a big step forward for the NoSQL database, Riewerts said. Acxiom writes transactions to multiple documents in both the metadata system and the ODS; currently, it does workarounds to make sure that all of the data gets updated properly, but that isn’t optimal, according to Riewerts.

“When you’re moving 90 miles per hour, it’s helpful to have guaranteed consistency,” he said. “Now we don’t have to worry about that anymore.”

Acxiom also was an early user of the MongoDB Stitch backend-as-a-service platform, which was released for beta testing a year ago. Stitch gives developers an API that connects to MongoDB at the back end, plus built-in capabilities for creating JavaScript functions, integrating with other cloud services and setting triggers to automatically invoke real-time actions when data is updated.

Scott Jones, a principal architect at Acxiom, said the serverless technology enabled two developers in the product and engineering group to deploy the ODS on the MongoDB Atlas cloud service without having to wait for the company’s IT department to set up the system.

“We’re not dealing with anything really but the business logic of what we’re trying to build,” he noted.

More still needed from MongoDB

Lanaux said MongoDB still has to deliver some additional functionality before Acxiom can move other applications to the NoSQL database. For example, improvements to a connector that links MongoDB to SQL-based BI and analytics tools could pave the way for some data analytics jobs to be shifted.

“But we’re betting on [MongoDB],” he said. “Thus far, they’ve checked every box that they’ve promised us.”

Ovum analyst Tony Baer said MongoDB also needs to stay focused on competing against its primary document database rivals, including DataStax Enterprise and Amazon DynamoDB, as well as Microsoft’s Azure Cosmos DB multimodel database.

Particularly in the cloud, DynamoDB and Azure Cosmos DB “are going to challenge them,” Baer said, noting that Amazon and Microsoft can bill their products as the default NoSQL offerings for their cloud platforms. Stitch may help counter that, though, by keeping MongoDB “true to its roots as a developer-friendly database,” he added.

MongoDB 4.0 lists for $14,990 per server. MongoDB Stitch users will be charged 50 cents for each GB of data transferred between Stitch and their front-end applications, as well as back-end services other than Atlas. They’ll also pay for using compute resources at a rate of $0.000025 per GB-second, which is calculated by multiplying the execution time of each processing request by the amount of memory that’s consumed.
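As a back-of-the-envelope illustration of that pricing model, the figures below (request count, execution time, memory, and data volume) are invented purely for the sake of the arithmetic.

    GB_SECOND_RATE = 0.000025   # dollars per GB-second of compute
    TRANSFER_RATE = 0.50        # dollars per GB transferred

    requests_per_month = 1_000_000
    avg_execution_seconds = 0.2   # 200 ms per request (hypothetical)
    avg_memory_gb = 0.25          # 256 MB per request (hypothetical)
    data_transferred_gb = 40      # hypothetical monthly transfer

    compute_cost = requests_per_month * avg_execution_seconds * avg_memory_gb * GB_SECOND_RATE
    transfer_cost = data_transferred_gb * TRANSFER_RATE

    print(f"Compute: ${compute_cost:.2f}, transfer: ${transfer_cost:.2f}")
    # Compute: $1.25, transfer: $20.00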

A data replication strategy for all your disaster recovery needs

Meeting an organization’s disaster recovery challenges requires addressing problems from several angles based on specific recovery point and recovery time objectives. Today’s tight RTO and RPO expectations mean losing almost no data and incurring almost no downtime.

To meet those expectations, businesses must move beyond backup and consider a data replication strategy. Modern replication products offer more than just a rapid disaster recovery copy of data, though. They can help with cloud migration, using the cloud as a DR site and even solving copy data challenges.

Replication software comes in two forms. One is integrated into a storage system, and the other is bought separately. Both have their strengths and weaknesses.

An integrated data replication strategy

The integrated form of replication has a few advantages. It’s often bundled at no charge or is relatively inexpensive. Of course, nothing in life is really free. The customer pays extra for the storage hardware in order to get the “free” software. In addition, at-scale, storage-based replication is relatively easy to manage. Most storage system replication works at a volume level, so one job replicates the entire volume, even if there are a thousand virtual machines on it. And finally, storage system-based replication is often backup-controlled, meaning the replication job can be integrated and managed by backup software.

There are, however, problems with a storage system-based data replication strategy. First, it’s specific to that storage system. Consequently, since most data centers use multiple storage systems from different vendors, they must also manage multiple replication products. Second, the advantage of replicating entire volumes can be a disadvantage, because some data centers may not want to replicate every application on a volume. Third, most storage system replication inadequately supports the cloud.

Stand-alone replication

IT typically installs stand-alone replication software on each host it’s protecting or implements it into the cluster in a hypervisor environment. Flexibility is among software-based replication’s advantages. The same software can replicate from any hardware platform to any other hardware platform, letting IT mix and match source and target storage devices. The second advantage is that software-based replication can be more granular about what’s replicated and how frequently replication occurs. And the third advantage is that most software-based replication offers excellent cloud support.

While backup software has improved significantly, tight RPOs and RTOs mean most organizations will need replication as well.

At a minimum, the cloud is used as a DR target for data, but it can also serve as an entire disaster recovery site, not just a copy. This means virtual machines can be instantiated in the cloud, using cloud compute in addition to cloud storage. Some approaches go further with cloud support, allowing replication across multiple clouds or from the cloud back to the original data center.

The primary downside of a stand-alone data replication strategy is that it must be purchased, because it isn’t bundled with storage hardware. Its granularity also means dozens, if not hundreds, of jobs must be managed, although several stand-alone data replication products have added the ability to group jobs by type. Finally, there isn’t wide support from backup software vendors for these products, so any integration is a manual process requiring custom scripts.

Modern replication features

Modern replication software should support the cloud and support it well. This requirement draws a line of suspicion around storage systems with built-in replication, because cloud support is generally so weak. Replication software should have the ability to replicate data to any cloud and use that cloud to keep a DR copy of that data. It should also let IT start up application instances in the cloud, potentially completely replacing an organization’s DR site. Last, the software should support multi-cloud replication to ensure both on-premises and cloud-based applications are protected.

Another feature to look for in modern replication is integration into data protection software. This capability can take two forms: The software can manage the replication process on the storage system, or the data protection software could provide replication. Several leading data protection products can manage snapshots and replication functions on other vendors’ storage systems. Doing so eliminates some of the concern around running several different storage system replication products.

Data protection software that integrates replication can either be traditional backup software with an added replication function or traditional replication software with a file history capability, potentially eliminating the need for backup software. It’s important for IT to make sure the capabilities of any combined product meet all backup and replication needs.

How to make the replication decision

The increased expectation of rapid recovery with almost no data loss is something everyone in IT will have to address. While backup software has improved significantly, tight RPOs and RTOs mean most organizations will need replication as well. The pros and cons of both an integrated and stand-alone data replication strategy hinge on the environment in which they’re deployed.

Each IT shop must decide which type of replication best meets its current needs. At the same time, IT planners must figure out how that new data replication product will integrate with existing storage hardware and future initiatives like the cloud.

Pretend You Have More RAM With Aorus RGB Dummy Sticks



TAIPEI—So you’ve just spent several hours and several thousand dollars on a high-end PC build with RGB lights galore to illuminate your intricate liquid cooling pipes and your SLI graphics cards. But perhaps you didn’t fill up all the memory DIMM slots, so there’s a glaring, dark hole on your motherboard.

Talk about a first-world problem. But like many first-world problems, someone is bound to find a solution, and this time the solution comes in the form of a unique DDR4 memory kit from Gigabyte, which includes two 8GB modules and two dummy modules, all of which have customizable RGB lighting.

Announced at Computex on Tuesday, the $229 Aorus memory kit lets you fill up all four memory slots on your motherboard with glorious light without having to shell out a ton of money for unneeded RAM. The lights are controlled via Gigabyte’s RGB Fusion software and include patterns that you may be familiar with from your gaming keyboard, such as ripples, waves, and pulsing.

The two actual memory modules are dual-channel and run at 3200MHz. They’re compatible with most Gigabyte motherboards, including the X299, 300 series, and 400 series Intel boards as well as the X399 and AM4 AMD boards.

With a 16GB RGB-equipped RAM kit retailing for around $200 currently, you’ll pay a slight premium for Gigabyte’s new kit, but the extra money might be worth it if you’re hesitant to spend a lot more money on memory you don’t need. As a bonus, you can always buy a second kit later on when your memory needs grow to 32GB, throw away the dummies, and your motherboard will look exactly the same.

Gigabyte also announced several other Aorus PC components at Computex, including a new AP850GM PSU with up to 90 percent efficiency, the Aorus M5 gaming mouse with removable weights, and the Aorus AC300W mid-tower case with (you guessed it) customizable RGB lights.

New 80mm Laser Cut Fan Guards

As part of a declutter exercise, I have several new 80mm laser-cut fan guards to sell.

£2.30 for the first, then £1.80 for each additional guard bought and posted together (50p off on multiples), including 2nd class P&P.

Varied amounts of each, and advertised elsewhere.

View attachment 1016231

View attachment 1016232

View attachment 1016233

View attachment 1016234

View attachment 1016235

View attachment 1016236

View attachment 1016237

View attachment 1016238

More in next post.

Price and…


Hyper engine aims to give enterprise Tableau analytics a boost

Tableau is continuing its focus on enterprise functionality, rolling out several new features that the company hopes will make its data visualization and analytics software more attractive as an enterprise tool to help broaden its appeal beyond an existing base of line-of-business users.

In particular, the new Tableau 10.5 release, launched last week, includes the long-awaited Hyper in-memory compute engine. Company officials said Hyper will bring vastly improved speeds to the software and support new Tableau analytics use cases, like internet of things (IoT) analytics applications.

The faster speeds will be particularly noticeable, they said, when users refresh Tableau data extracts, which are in-memory snapshots of data from a source file. Extracts can reach large sizes, and refreshing larger files took time in previous releases.

“We extract every piece of data that we work with going to production, so we’re really looking forward to [Hyper],” Jordan East, a BI data analyst at General Motors, said in a presentation at Tableau Conference 2017, held in Las Vegas last October.

East works in GM’s global telecom organization, which supports the company’s communications needs. His team builds BI reports on the overall health of the communications system. The amount of data coming in has grown substantially over the past year, and keeping up with the increasing volume of data has been a challenge, he said.

Extracting the data, rather than connecting Tableau to live data, helped improve report performance. East said he hopes the extra speed of Hyper will enable dashboards to be used in more situations, like live meetings.

Faster extracts mean fresher analytics

The Tableau 10.5 update also includes support for running Tableau Server on Linux, new governance features and other additions. But Hyper is getting most of the attention. Potentially, faster extract refreshes mean customers will refresh extracts more frequently and be able to do their Tableau analytics on fresher data.

“If Hyper lives up to demonstrations and all that has been promised, it will be an incredible enhancement for customers that are struggling with large complex data,” said Rita Sallam, a Gartner analyst.

Sallam’s one caveat was that customers who are doing Tableau analytics on smaller data sets will see less of a performance upgrade, because their extracts likely already refresh and load quickly. She said she believes the addition of Hyper will make it easier to analyze data stored in a Hadoop data lake, which was typically too big to efficiently load into Tableau before Hyper. This will give analysts access to larger, more complex data sets and enable deeper analytics, Sallam said.

Focus on enterprise functionality risky

Looking at the bigger picture, though, Sallam said there is some risk for Tableau in pursuing an enterprise focus. She said moving beyond line-of-business deployments and doubling down on enterprise functionality was a necessary move to attract and retain customers. But, at the same time, the company risks falling behind on analytics functionality.

Sallam said the features in analytics software that will be most important in the years ahead will be things like automated machine learning and natural language querying and generation. By prioritizing the nuts and bolts of enterprise functionality, Tableau hasn’t invested as much in these types of features, Sallam said.

“If they don’t [focus on enterprise features], they’re not going to be able to respond to customers that want to deploy Tableau at scale,” Sallam said. “But that does come with a cost, because now they can’t fully invest in next-generation features, which are going to be the defining features of user experience two or three years from now.”

For Sale – HP Microserver N36L + Hard Disks

I am doing a clear-out of my server – HP Microserver N36L + several Hard disks as per below –

HP Microserver N36L (generic pic added) – SOLD
Upgrades – Nvidia GeForce 210
RAM – upgraded to 4GB
OS – Windows 10 on a 60GB Samsung SSD, hence extremely fast to boot.

Looking for – £70 (happy to take reasonable offers) – Can collect from Ilford, Moorgate or London Liverpool street

Also looking to sell the following hard disks – all in perfect working condition with no bad sectors. All out of warranty.

Samsung Spinpoint F3 1.5TB – £30 (Internal) – SOLD

Samsung Spinpoint F2 1.5TB (internal) – £28 delivered
WD My Book Essential Edition 1 TB 7200RPM (external) – £35 delivered

I have no idea what these are selling for second-hand, so I’m happy to take offers based on the going rate.

Price and currency: £70
Delivery: Delivery cost is not included
Payment method: PPG
Location: London
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected
