Blue Hexagon bets on deep learning AI in cybersecurity

With corporate networks becoming a prime target for threat actors, software vendors are beginning to use deep learning and other types of AI in cybersecurity. While deep learning does show promise, industry experts are skeptical.

The threat landscape is evolving, and existing network security measures like signature-based detection techniques, firewalls and sandboxing fail to keep up, said John Petersen, CIO at Heffernan Insurance Brokers, based in Walnut Creek, Calif. He sought out a deep learning application with the intelligence built in to monitor network traffic and detect threats in real time.

“Endpoint security is not secure enough anymore,” Petersen said. “You can’t secure every device on the network; you need something watching the network. So, we started as a company looking at what options we had out there that could be monitoring the network that could learn and identify zero-day attacks as they come in.”

That led him to cybersecurity startup Blue Hexagon’s deep-learning-powered network security platform, which was able to detect an Emotet infection as soon as it hit one of Heffernan Insurance Brokers’ servers.  

“Blue Hexagon was able to find it right away and alert us, so we were able to take that server offline,” he said. “Now, we have a lot more [network] visibility than we ever did.”

Nayeem Islam, chief executive and co-founder of Blue Hexagon and the former head of Qualcomm research and development, said he believes automated threat defense is the future of security. Deep learning and neural network technology are some of the most advanced techniques that can be used to help defend an enterprise from the velocity and volume of modern-day threats, Islam said.

“What we were recognizing was that deep learning was having a significant impact on image and speech recognition. And, at the same time, we were also recognizing that these techniques were not being used in computer security,” Islam said.

The Sunnyvale, Calif., network security provider emerged from stealth mode earlier this year.

Other companies use deep learning and related forms of AI in cybersecurity software, including IBM Watson for Cybersecurity, Deep Instinct and Darktrace.

Deep learning is unique because it determines what’s good and bad by looking at network flows, Islam said.

“The automation that deep learning provides reduces the amount of human intervention needed to detect threats,” he said. “People have networking infrastructure, and we sit behind the traditional defenses and provide an additional layer of defense; that’s how you would deploy us.”

The company’s deep learning platform focuses on threats that pass through the network, Islam explained. It looks at packets as they flow through the network and applies deep learning. The Blue Hexagon deep learning models inspect the complete network flow — payloads, headers, malicious URLs and C2 communications — and are able to deliver threat inference in less than a second, according to the company. Threat prevention can then be enabled on firewalls, endpoint devices and network proxies.

“We train our deep learning models with a very diverse set of threat data,” he said. “We actually do this in the cloud — on the AWS infrastructure — and have been working with them since inception to ensure the infrastructure is optimized for security.”

[Screenshot: The Blue Hexagon dashboard’s threat view]

Experts urge caution on AI in cybersecurity

Deep learning is indeed an interesting machine learning technique and can be used for many security use cases, said Gartner analyst Augusto Barros. But more important than understanding what it can do is to understand what deep learning in cybersecurity cannot do, Barros added.

“New threat types … won’t be magically identified by machine learning.”
Augusto Barros, analyst, Gartner

“Many machine learning implementations, including those using deep learning, can find threats, such as new malware, for example, that has common characteristics with what we already know as malware,” Barros said in an email interview. “They can be very effective in identifying parameters that can be used to identify malware, but first we need to feed them with what we know as malware and also with what we know as not malware so they can learn. New threat types … won’t be magically identified by machine learning.”
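
To make that point concrete, here is a minimal sketch of the supervised-learning workflow Barros describes, using Python and scikit-learn; the features, labels and model choice are illustrative placeholders, not anything a particular vendor ships.

```python
# Minimal sketch of supervised malware classification (hypothetical data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# X: feature vectors extracted from samples we already know about
# (e.g. header fields, byte histograms); y: human-supplied labels
# (1 = known malware, 0 = known benign). Both are random placeholders here.
X = np.random.rand(1000, 32)
y = np.random.randint(0, 2, size=1000)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# For a new sample the model returns a probability, not a verdict --
# the "percentage of certainty" Barros refers to later in the article.
new_sample = np.random.rand(1, 32)
print("P(malicious) =", clf.predict_proba(new_sample)[0, 1])
```

Because the model only generalizes from what those human-supplied labels cover, a genuinely new threat class falls outside anything it has learned, which is the limitation Barros goes on to describe.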

Until a couple of years ago, malware detection technology was being developed — or trained, in the case of machine learning — with file samples, Barros said. Deep learning can be very useful in identifying which characteristics of the files are most likely to determine if something is malware or not, he added.

“But when what we call fileless attacks started to appear, all those machine-learning-based tools analyzing files were not able to detect those attacks,” Barros said. “They were just looking at the wrong place. And who does tell them where they should be looking? Humans.”

Barros said he doubts any machine-learning-based system would be faster than simple signature matching. When it comes to prevention, he said, it is important to be sure of what is detected before deciding to intervene.

“Although signatures will miss unknown threats, they are very certain about what we know; antiviruses do that quite well,” Barros said. “With machine learning, you’ll only get a percentage of certainty — the algorithms tell you the chances of something being bad is xx% — so using that for intervention can be really problematic, with a risk of disrupting systems.”

With enterprise network complexity increasing over time, teaching the algorithm to tell good from bad is actually much harder than in classic deep learning success stories like face recognition, said Gartner analyst Anton Chuvakin.

“The variety of what is normal, what is legitimate, what is actually acceptable to business is so wide that the training of the algorithm is going to be really difficult,” Chuvakin said.

When it comes to domains of security like malware detection, deep learning is working because there is a pretty large pool of data about legitimate software and malware that can be used to train the algorithm, he said.

“But, to me, for [network] traffic, it has a much lower chance of working,” Chuvakin said.

To really succeed with deep learning in cybersecurity, there must be a very large volume of labeled data, he said.

“It took some of the malware analysis vendors years to accumulate [data on] malware,” he said. “But, where is that data for traffic? Nobody has been collecting malicious traffic at wide scale over many years, so there is no way to point at a repository and say, ‘OK, I’m going to train my algorithm on this traffic,’ because that doesn’t really exist. Moreover, in many cases, they don’t know whether the traffic is malicious or not.”


Start developing on Windows 10 May 2019 Update today – Windows Developer Blog

Windows 10 May 2019 Update (build 18362) is now in the Release Preview Windows Insider ring; you may also know it as Windows 10, version 1903. The Windows 10 SDK for the May 2019 Update is now available as well, with a go-live license!

Every update of Windows 10 is loaded with new APIs, but don’t worry: Windows Dev Center has a full list of what’s new for developers. Here are a few of my favorites.

XAML Islands v1: This first version comes with a lot of improvements over the preview release in 1809. Some highlights: airspace issues in popups are resolved, the XAML content matches the host’s DPI awareness, Narrator works with the XAML content, islands are allowed in multiple top-level windows on one thread, MRT localization and resource loading are supported, and keyboard accelerators work across frameworks. Windows Community Toolkit v6, being released in June, will include wrappers for WPF and WinForms.
Windows Subsystem for Linux: The team did a great 1903 recap blog, but in short: you can now access Linux files from within Windows, and there are better command line options!

You can now use wslconfig.exe commands in wsl.exe
There are some new commands in wsl.exe:

--user, -u : Run a distro as a specified user
--import : Import a distro into WSL from a tarball
--export : Export a distro from WSL to a tarball
--terminate, -t : Terminate a running distribution

Windows UI Library 2.1: WinUI is open source and everyone can check out the WinUI GitHub repo to file issues, discuss new features, and even contribute code. Inside WinUI 2.1, we’ve added new controls such as an animated visual player, menu bar enhancements, a teaching tip, an item repeater and much more. We’ve also addressed many accessibility, visual and functional issues reported by developers. We encourage everyone to use WinUI in their UWP apps – it’s the best way to get the latest Fluent design and controls, and it’s backward-compatible down to the Windows 10 Anniversary Update.

The first step is to update your system to the Windows 10 May 2019 Update by using the Release Preview Ring. The Insider team has a great blog post that will walk you through how to get on the Release Preview Ring. Once you do that, just go into Visual Studio 2017 or 2019, grab the new SDK and you’re good to go. Once 1903 reaches general availability in late May, the SDK will become the default SDK inside Visual Studio.
Today, with the Windows Insider Release Preview for the Windows 10 1903 Update:

Run the installer or go to https://www.visualstudio.com/downloads/ and download it
Go to “Individual Components”
Go to “SDKs, libraries, and frameworks” section
Check “Windows 10 SDK (10.0.18362)”
Click “Install”

When Windows 10 May 2019 is fully released:

Run the Visual Studio installer or go to https://www.visualstudio.com/downloads/ and download it
Select “Universal Windows Platform development” under Workloads, Windows 10 SDK (10.0.18362) will be included by default
Click “Install”

More useful tips
Do you want tools for C++ desktop or game development for UWP? Be sure one of these two is selected:

C++ Universal Windows Platform tools in the UWP Workload section
Desktop development with C++ Workload and the Windows SDK 10 (10.0.18362)
If you want the Universal Windows Platform tools, select the Universal Windows Platform tools workload

Once your system is updated and your app is recompiled and tested, submit your app to Dev Center.
Your take on the Windows 10 May 2019 Update
Tell us what crazy things you’ve been working on with the new update by tweeting @ClintRutkas or @WindowsDev.
Known issue that could affect you
There is a known issue that affects the following scenario when upgrading to or performing a clean installation of Windows 10, version 1903. When an input/output memory management unit (IOMMU) is running on the VMware hypervisor, any guest client or server virtual machine (VM) that uses the IOMMU may stop working. Typical scenarios include when Virtualization-based Security (VBS) and Windows Credential Guard are enabled for a guest VM.
We are working to provide a solution as soon as possible. You must integrate the solution into the customer image before deployment.
Updated April 18, 2019 10:40 am

Wanted – Cheap 1150 CPU and DDR3 laptop memory

Discussion in ‘Desktop Computer Classifieds‘ started by geordieboy25, Mar 13, 2019.

  1. geordieboy25

    Hi guys.

    Just bought a 2nd user laptop and a Lenovo M93P sff.

    Neither of which came with memory, and I’d like a few 4GB DDR3 sticks.

    Also, the M93P was a barebones kit, so I’d like a cheap CPU. Just want to get it up and running, nothing fancy please.

    Thanks

    Location: Newcastle


  2. maddy

    CEX are cheap for 4GB DDR3 – £8, or £9.50 posted. They’re the cheapest I’ve found.

    Is your Lenovo one of the tiny range? If it is, make sure you buy an Intel from their “T” series, as the tiny ones aren’t designed for the thermal load of a non-T CPU.

    Great little machines.

  3. Krooner

    I have 2 sticks of 4GB DDR3 at home. Do you need low-voltage RAM in the SFF?

  4. geordieboy25

    Not too sure. Is it SODIMMs that you have?

  5. Krooner

    Yes. I know the ultra small form factor units require PC3L is all; not sure about the SFF. It was something I ran into on the M92p.

    I have low-voltage sticks, but if it will take standard PC3 then CEX will be 50p per stick cheaper than me.

  6. GIBSrUS

    Hiya. I have a Pentium G3258 if that’s of interest?

  7. geordieboy25

    Thanks, but I think I need a “T” series CPU.

  8. Ozzyh

    @geordieboy25 Are you still after DDR3 laptop RAM? I have 2x 4GB matching sticks if that helps.

  9. geordieboy25

    How much please?

  10. Ozzyh

    £18 delivered OK? I can send via 1st class recorded Monday morning, and you should get it Tuesday or Wednesday at the latest, as I’m at work till 6pm today.

    I’d like to avoid sending it normal 1st class, just in case it gets lost in the post.

  11. Ozzyh

    Sorry, just to add that the modules are a matching pair of the below. They were inside an HP laptop which my 6yr old daughter was using for playing games to help her read. Moved her onto a PC now.

    Hynix 4GB PC3L-12800S (part number: HMT351S6EFR8A)

    These are going for about £13 for 4GB on eBay, so £18 including delivery for 8GB is a steal.

    Thanks.

  12. Ozzyh

    Not heard back, so I’ll create a For Sale advert as it looks like you’re not interested.

    Thanks.


How to Set up Azure Cloud Storage

Cloud-based storage has been around for decades, first becoming commercially available in the 1990s with the launch of web-based email. We interact with data we store in the cloud on a daily basis through popular applications like OneDrive, Google Drive, Dropbox and iCloud, and this familiarity has helped accelerate the adoption of cloud-based storage in the enterprise.

Organizations are realizing that storing their files, data and databases in the cloud usually brings significant cost savings, simplified management, higher availability, greater security, increased accessibility and virtually unlimited scalability. Storage management in traditional on-premises datacenters is expensive and time-consuming, as it requires procuring costly storage hardware, setting up racks, electricity and cooling, and paying for on-site technical support whenever an issue occurs.

The main scenarios where it still makes sense to retain legacy storage are when on-site servers need real-time data access, or when you operate within an industry that requires it for compliance, privacy or security. For the rest of us, cloud storage should be the default option, and this blog will introduce you to the different types of cloud storage provided by Microsoft Azure.

Azure File Storage

This is the most basic and common type of storage; it is similar to using a file server in the cloud. It is easy to operate, inexpensive, and it allows you to migrate files from your existing datacenter to Azure with little effort. Behind the scenes, a highly available network file share is created, which is accessed using SMB, a RESTful interface, or storage client libraries. Hyper-V virtual machines (VMs) can also store their virtual hard disks here, with the VMs running anywhere in Azure. Since this storage interface has been designed to mimic traditional file servers, it has a high level of compatibility with legacy applications, so IT departments will often make it the default storage for log files, analytics data, or crash dumps. The downside is that Active Directory-based authentication and ACLs are currently only supported in public preview, so anyone with access to the file share may be able to open every file.
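
As a rough illustration of how an application might use such a share programmatically, here is a minimal sketch with the azure-storage-file-share Python package; the connection string, share name and file name are placeholders.

```python
# Minimal sketch: create an Azure file share and upload a log file.
# Requires: pip install azure-storage-file-share
from azure.storage.fileshare import ShareClient

conn_str = "<storage-account-connection-string>"    # placeholder
share = ShareClient.from_connection_string(conn_str, share_name="app-logs")
share.create_share()                                # one-time setup

file_client = share.get_file_client("crash-dump.txt")
with open("crash-dump.txt", "rb") as source:
    file_client.upload_file(source)                 # also reachable over SMB
```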

Azure Disk Storage

If you are looking for a dedicated (virtual) disk for a cloud-based application, the best solution is Azure Disk Storage. This provides both HDD and SSD disks with low latency and high throughput for almost any workload. There are many different sizing and performance options, so you can find the best fit and price point for your application’s needs. It is even easy to upgrade from standard HDD to premium SSD as your business grows, and using VM Scale Sets lets you dynamically change capacity as needed. These disks also support role-based access control (RBAC), and all the data is encrypted and highly available.

Azure Blob Storage

Some organizations, particularly those supporting consumer applications, need massive amounts of storage to support unstructured data, like photographs, videos, messages, and documents. Azure Blob Storage provides a cost-effective solution with tiered storage options that basically allows its end users to ‘dump’ files and forget about them. This is designed for businesses whose users are not internal staff, but consumers who may be accessing these files through a web browser from anywhere in the world.
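
For example, a consumer-facing app could push uploads straight into a container. The sketch below is a minimal example using the azure-storage-blob Python package, with a placeholder connection string and container name.

```python
# Minimal sketch: upload an unstructured file (e.g. a photo) to Blob Storage.
# Requires: pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

conn_str = "<storage-account-connection-string>"    # placeholder
service = BlobServiceClient.from_connection_string(conn_str)

container = service.get_container_client("user-uploads")
container.create_container()                        # one-time setup

with open("photo.jpg", "rb") as data:
    container.upload_blob(name="photos/photo.jpg", data=data, overwrite=True)
```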

Azure Archive Storage

Sometimes data needs to be retained for compliance reasons for long periods, even if it is unlikely to be accessed. This is a great scenario for Azure Archive Storage, which could be compared to “tape archive in the cloud”. Any rarely accessed data can be stored here, where it will be encrypted and can even be automatically moved between hot and cool storage tiers, all at very low cost.
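
Individual blobs can also be pushed down to the archive tier in code. A minimal sketch with the azure-storage-blob Python package follows; the connection string, container and blob names are placeholders.

```python
# Minimal sketch: move a rarely accessed blob to the Archive tier.
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    "<storage-account-connection-string>",          # placeholder
    container_name="compliance",
    blob_name="2018-audit-logs.zip",
)
blob.set_standard_blob_tier("Archive")  # rehydrate later by setting "Hot" or "Cool"
```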

Azure Data Lake Storage

If your business is working with big data, then consider using Azure Data Lake Storage. This service provides endless storage for analytics data, allowing organizations to aggregate logs, metrics, performance information and other structured data into a single location. The data is groomed and optimized for high-performance analytics tools like Apache Spark and Hadoop. Azure Data Lake Storage also supports the latest in enterprise security, with firewalls, encryption, Active Directory integration, and granular ACL support.

Azure Queues

If you are running a large distributed cloud application, then you have likely developed some type of messaging infrastructure to allow for communication between components. Azure Queues provides a store to save and retrieve messages. These messages and tasks can be processed asynchronously, so users or applications can retrieve them independently from where they are stored in the queue. This data can also be made accessible via the public Internet over HTTP(S).
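
A minimal producer/consumer sketch with the azure-storage-queue Python package looks like the following; the connection string and queue name are placeholders, and a real worker would poll asynchronously.

```python
# Minimal sketch: send and receive messages with Azure Queue Storage.
# Requires: pip install azure-storage-queue
from azure.storage.queue import QueueClient

conn_str = "<storage-account-connection-string>"    # placeholder
queue = QueueClient.from_connection_string(conn_str, queue_name="orders")
queue.create_queue()                                # one-time setup

queue.send_message("process-order-1234")            # producer side

for msg in queue.receive_messages():                # consumer side
    print("handling:", msg.content)
    queue.delete_message(msg)                       # remove once processed
```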

Avere vFXT for Azure

Applications with high-performance needs often require short-term access to storage at very high speeds. In this case, the storage data can be cached and function like extra CPU memory. If your application needs to process large datasets at high speed, consider using Avere vFXT to quickly scale up and support fast access to data that can be stored either on premises (using NFS or SMB) or in Azure Blob Storage. This solution can become expensive, so it is recommended for short-term use during intense periods when the storage cannot be allowed to become a bottleneck and slow down the rest of the application.

Azure Databases

If you are storing a database in the cloud, it is recommended to use a native platform as a service (PaaS) solution rather than deploying the database inside a virtual machine running in Azure (an IaaS solution). When a database is built directly into the platform, it will get better performance, reliability and monitoring, usually at a lower cost. These databases include Azure SQL Database, SQL Data Warehouse, Cosmos DB, Data Factory, PostgreSQL, MySQL, Azure Cache for Redis, Table Storage and MariaDB. Reviewing the Azure database options is beyond the scope of this blog, but you can find more information at https://azure.microsoft.com/en-us/services/#databases.

Creating an Azure Storage Account

Once you understand what type of storage you plan to use, the next step is to create an Azure storage account. This section will provide you with the basic steps to prepare for your cloud storage.

  • Create an Azure account and add a subscription.
  • Log into the Azure Portal.
  • From the Azure Portal, select Storage Accounts.
  • From the Storage Accounts view, select Add.
  • Select the subscription which you wish to use.
  • An Azure Resource Group is required for the storage account as a management group. Either select an existing Resource Group or click Create new and give the new Resource Group a name, as shown in the following screenshot.

Figure 1 – Creating an Azure Storage Account

  • Enter a unique name for the storage account.
  • Select a location for the storage account.
  • Select the deployment model. Leaving the default (Resource Manager) is recommended.
  • Choose the performance type. Leaving the default (Standard) is recommended.
  • Choose the account kind. Leaving the default (StorageV2 [general-purpose v2]) is recommended.
  • Choose the replication type. Leaving the default (Locally Redundant Storage [LRS]) is recommended.
  • Choose the access tier. Leaving the default (Hot) is recommended.
  • Select Review + Create to verify the settings.
  • Click Create.
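
If you prefer to script this instead of clicking through the portal, the same account can be created programmatically. The following is a rough sketch using the azure-identity and azure-mgmt-storage Python packages, mirroring the defaults chosen above; the subscription ID, resource group and account names are placeholders.

```python
# Rough sketch: create a StorageV2 / Standard_LRS / Hot storage account.
# Requires: pip install azure-identity azure-mgmt-storage
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"               # placeholder
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.storage_accounts.begin_create(
    "my-resource-group",                            # placeholder resource group
    "mystorageacct12345",                           # must be globally unique
    {
        "location": "westeurope",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
        "access_tier": "Hot",
    },
)
account = poller.result()
print(account.primary_endpoints.blob)
```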

Your new Azure Storage Account is now ready for use, and from here you can deploy different types of Azure storage based on the needs of your application. One of the best things about Azure storage is its compatibility with partners in the Microsoft ecosystem. For example, if you are backing up your on-premises storage to the cloud using Altaro, the backup will be stored in this newly created Azure Storage Account. Now you can take advantage of cloud-based storage for all of your enterprise’s needs.

Further reading: How to set the optimal size for your Azure Virtual Machine

Author: Symon Perriman

How should you handle Exchange Server updates?

Microsoft releases Exchange Server updates every quarter. These updates include many fixes, but they can also break things.

Microsoft calls these quarterly releases cumulative updates. Each version lists the set of bugs the company fixed. Each cumulative update includes all the previous updates, so you only need the latest cumulative update with a new Exchange Server deployment to get caught up. A good rule of thumb is to keep Exchange one or two versions behind and test a rollup update for legacy versions or a cumulative update for newer versions of Exchange.

A cumulative update is a complete server build with functionality enhancements, a model Microsoft started with Exchange 2013. A rollup update, also released each quarter for Exchange 2010, consists of new security updates as well as security fixes from earlier rollups.

With each cumulative update or rollup release, there might be a new prerequisite, such as a newer .NET Framework version. Some admins don’t update immediately when Microsoft releases a patch because they prefer to stay on a stable version.

This is where the challenges come in. Some admins think they need to install every rollup or cumulative update. If, for example, an Exchange 2013 system is on cumulative update 15, but cumulative update 21 is available, there is no need to install each cumulative update one by one until you reach the current version. You can go to the latest or the previous version. Microsoft only keeps two versions of rollups and cumulative updates available to download, so you can’t download the old ones anyway.

Another challenge occurs when the update instructions say that the Exchange system needs to be on a certain cumulative update with a specific .NET Framework version before you can install the latest cumulative update. It might sound like a Catch-22 if Microsoft has already pulled that specific cumulative update from its download pages. Don’t let that hold you back. Install the required version of the .NET Framework, and, after the reboot, install the new cumulative update. You must also be on the correct version of an Exchange rollup or cumulative update to get support from Microsoft.


Newly released programs aim to improve Google Cloud security

Google has announced a slew of new security and identity features, aimed at bringing more transparency and visibility to Google Cloud security.

New security features include the following:

Access Transparency and Access Approval: Access Transparency provides near-real-time logs when Google administrators access customer content, which Google said is done only for contractually obligated reasons. Access Approval (beta) enables admins to approve or deny requests for access from Google employees.

Data Loss Prevention and Virtual Private Cloud (VPC): Google added a beta version of a user interface that supports creating, listing, deleting, viewing the details of, and updating inspection templates, inspection jobs and job triggers; a usage sketch of the underlying DLP API follows this list of features. VPC Service Controls allow users to set virtual perimeters around Google Cloud Platform resources to prevent data exfiltration.

Cloud Security Command Center: This gives users the ability to centralize security management across platforms to identify and mitigate suspicious activity. Users can also integrate third-party security tools, such as McAfee and Capsule8.

Apigee (beta): This security reporting program gives users visibility into the security status of APIs by using built-in governance and cryptography to prevent digital transaction risks. It also allows users to require client authentication so only valid users have access to APIs.

Policy Intelligence: This provides tools that use machine learning to make suggestions and provide insights on security. The Recommender uses machine learning to automatically detect overly permissive access and adjust it based on similar users and their access patterns. Troubleshooter allows security admins to see why a resource request may have been denied and makes suggestions on the best way to fix it. The Validator allows admins to create preventative measures so that users are not granted overly permissive access.

Phishing Protection: This program protects against cyberattacks, the majority of which begin with phishing emails and websites, by identifying various signals of malicious content and preventing users from accessing phishing sites. Users can also use reCAPTCHA Enterprise to ensure only legitimate customers can gain access.
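
As a usage illustration for the Data Loss Prevention service mentioned above, here is a minimal sketch against the underlying Cloud DLP API (not the new beta UI) using the google-cloud-dlp Python client; the project ID and sample text are placeholders.

```python
# Minimal sketch: scan a piece of text for sensitive data with Cloud DLP.
# Requires: pip install google-cloud-dlp
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/<your-project-id>"               # placeholder project

response = client.inspect_content(
    request={
        "parent": parent,
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "CREDIT_CARD_NUMBER"}],
        },
        "item": {"value": "Contact me at jane.doe@example.com"},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```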

There are also a handful of new programs to help users secure the software supply chain, including Binary Authorization and GKE Sandbox.

  • Binary Authorization ensures that only trusted container images are published on Google Kubernetes Engine (GKE) by requiring images be signed by trusted authorities during development, and then enforcing signature validation when deploying.
  • GKE Sandbox provides additional defense between containerized workloads on GKE by reducing the need for a container to interact directly with the host. It also adds a layer of isolation between tenants in multi-tenant environments.

Identity features include the following:

  • Context-aware access now uses the BeyondCorp security model. Google claims the upgrades will provide granular control for Google Cloud Platform workloads and web applications. Users will be able to access web applications and infrastructure resources from any device without using a VPN. However, security admins will still be able to enforce application-level access controls by using a zero-trust approach to verify users’ identities before allowing access.
  • Cloud Identity is a platform for security teams to increase end-user efficiency, protect company data and transition to a digital workspace. Using the BeyondCorp security model, admins can control access to SaaS apps, enforce multifactor authentication, manage devices and investigate threats. Additionally, admins can enable single sign-on, allowing access to thousands of apps.


For Sale – Mini Water Cooled i7 4770K 256SSD GTX970

Coolermaster case, I think it’s the Elite 110, so 210mm x 260mm x 280mm (H x W x D)

Water-cooled i7 4770K

Asus Z971 Plus MB WiFi

Coolermaster 800W Silent Pro Gold

Zotac GTX 970 GPU

Team Group Xtreem DDR3 1333 8GB

Samsung 256GB M.2
WD Black 500GB Sata

W10 Pro

PC is pretty quiet unless gaming – never had a problem with it. Plays games at 1080p on high settings. Will throw in a Dell S2309WB 1080p monitor and KB/Mouse with all cables.

Price and currency: £350
Delivery: Delivery cost is not included
Payment method: BT cash on collection
Location: Leicester LE7
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

