For Sale – High End Gaming Rig – Ryzen 3800X – GTX 1080 Ti – 16GB DDR4

I’m looking to move on this gaming rig that I built up over lockdown. Fortunately I am no longer furloughed; unfortunately, this beast isn’t getting the use it deserves now that I don’t have unlimited spare time.

Specs as follows –

CPU – Ryzen 3800X – Amazon (boxed)
Motherboard – MSI B450 Tomahawk Max – Purchased locally (boxed)
RAM – 16GB Ripjaws DDR4 2x8GB 3200MHz – Amazon (boxed)
GPU – MSI GTX 1080 Ti Founders Edition – eBay (boxed, need to double check)
PSU – EVGA SuperNOVA B2 750W – Purchased here (boxed)
Case – EVGA DG75 – Ebuyer (boxed)
SSD – 256GB Integral NVMe – Amazon (boxed)

This rig is currently watercooled and, for reference, scores just shy of 11,000 in 3DMark Time Spy. Absolute flying machine at 1440p.

I will be removing the loop and going back to air cooling for the sale; there’s too much money invested in the custom loop to sell it on. The CPU cooler will be the Wraith RGB one that came with the CPU, and the graphics card can be supplied with the waterblock and stock cooler if the buyer wants. Everything has its original packaging, so postage won’t be an issue. I would prefer not to split at this time.

Price is £1000; delivery to be figured out on top, but assume it will be around £50 including insurance. Collection is also welcome.

Photos to follow.

NetApp Data Fabric aids manufacturer’s gene-sequencing gear

To keep up with data growth that outpaces its storage budget, Pacific Biosciences mixes cloud and on-premises storage from NetApp and data movement software from startup Komprise.

Pacific Biosciences, or PacBio, deploys NetApp Data Fabric technologies to scale capacity and performance between science teams in a shared storage environment. It turns to software from NetApp partner Komprise to streamline data migration.

Flexible data movement helps PacBio dynamically assess the power and performance of its genomic sequencing equipment. The company, headquartered in Menlo Park, Calif., makes sequencers used in a variety of fields, including wildlife conservation, improving food supplies and aiding COVID-19 vaccine research.

NetApp Data Fabric enables enterprises to implement hybrid cloud data management. The Data Fabric technology is a software component inside the NetApp Cloud Volumes ONTAP operating system. Customers use Data Fabric to create a virtual NetApp platform in the public cloud that mirrors how data is stored on premises.


Genomics is a data-intensive discipline that requires dense, high-performance storage. Adam Knight, PacBio’s senior IT manager, said the NetApp-Komprise combination better enables him to keep pace with data growth, currently at nearly 5 petabytes (PB) and growing.

“We use NetApp Data Fabric to shift performance around as needed to support different areas. Our storage budget isn’t increasing at the rate of our data growth. We had to get creative and manage as many items we could under one umbrella. If we had storage islands, we wouldn’t be able to leverage the combined effort of our scientists,” Knight said.

Storage management to boost manufacturing

The combined scientific effort is key to PacBio’s product development. Knight said much of the data generated internally stems from PacBio engineering teams striving to build product improvements into the equipment.

PacBio manufactures long-read sequencers, which allow DNA strands to be compiled in a manner that generates an effective reference of potential gene variations. The large machines have tight manufacturing tolerances and cost hundreds of thousands of dollars.

“The goal for manufacturing our instrument is to support more bandwidth and high-quality data throughput in less time for less cost. We generate a lot of new data each year,” Knight said.

PacBio standardized its storage environment on NetApp FAS hybrid systems, using NetApp E-Series arrays as a multi-petabyte archive. Using predefined user policies, Komprise transparently moves data between the NetApp systems, keeping active data on the primary FAS arrays and shifting colder data to the E-Series archive.
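
Komprise’s actual policy engine isn’t detailed here, but the general shape of policy-driven tiering can be sketched: scan a file tree and flag anything whose last access time exceeds a threshold as a candidate for the archive tier. The mount point, threshold and reporting below are hypothetical placeholders, not Komprise or NetApp interfaces.

```python
import os
import time
from pathlib import Path

# Hypothetical policy: anything untouched for 180 days becomes a candidate
# for migration from the primary (FAS) tier to the archive (E-Series) tier.
ARCHIVE_AFTER_DAYS = 180
PRIMARY_ROOT = Path("/mnt/fas_primary")  # placeholder mount point


def find_archive_candidates(root: Path, max_age_days: int):
    """Yield files whose last access time is older than max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = Path(dirpath) / name
            try:
                if path.stat().st_atime < cutoff:
                    yield path
            except OSError:
                continue  # file vanished or is unreadable; skip it


if __name__ == "__main__":
    for candidate in find_archive_candidates(PRIMARY_ROOT, ARCHIVE_AFTER_DAYS):
        # A real data mover would relocate the file transparently and leave a
        # link behind; here we only report candidates for the archive tier.
        print(f"candidate for archive tier: {candidate}")
```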

In addition, the genomics equipment maker added NetApp StorageGRID Webscale software-defined object storage to streamline the tagging and searching of metadata.

Microsoft plugs 2 zero-days on August Patch Tuesday

Microsoft shut down two zero-days, including one that had been publicly disclosed, as part of its security update releases for August Patch Tuesday.

In total, Microsoft addressed 120 unique vulnerabilities, with 17 of those rated critical. Technologies and products with security updates this month include both the HTML- and Chromium-based Microsoft Edge browsers, Microsoft’s ChakraCore JavaScript engine, Internet Explorer, Microsoft Scripting Engine, SQL Server, Microsoft JET Database Engine, .NET Framework, ASP.NET Core, Microsoft Office and Microsoft Office Services and Web Apps, Microsoft Windows Codecs Library and Microsoft Dynamics. This month marks the sixth in a row in which Microsoft addressed more than 100 unique vulnerabilities in its monthly security update package.

Microsoft terminates two zero-days

One zero-day (CVE-2020-1464) fixed by the August Patch Tuesday releases is a Windows spoofing vulnerability rated important that would allow an attacker to sidestep OS security features and load an improperly signed file. This bug affects all supported versions of Windows, as well as Windows 7 and Windows Server 2008/2008 R2 for customers who paid for Extended Security Update (ESU) licenses for continued support of those systems, which reached end of life in January. A bug that allows a malicious actor to bypass this security feature could open the door to placing malicious files on a Windows system.

“Typically, files get signed by a trusted vendor, and that signature validation is critically important to a lot of security mechanisms,” said Chris Goettl, director of product management and security at Ivanti, a security and IT management vendor based in South Jordan, Utah. “The fact that an attacker can bypass that means that they can introduce improperly validated malicious files to the operating system, and technologies that should be able to validate based on signature might be able to be tricked because of this.”


Microsoft’s notes on this CVE lacked the usual details about potential attack scenarios, which seems to indicate an attacker would have some additional hurdles to clear to take advantage of the flaw. This might be why this Windows zero-day got a relatively low CVSS base score of 5.3.

“The attacker would need to execute an asset that is improperly signed, so it’s not something they can just send to somebody. Microsoft doesn’t really get into details about how some attacker might be able to take advantage of that,” Goettl said.

The second zero-day (CVE-2020-1380) is a remote code execution vulnerability in the Microsoft Scripting Engine used in Internet Explorer 11, rated critical on Windows desktop systems and moderate on Windows Server 2008 R2, Windows Server 2012 and 2012 R2. Because the Microsoft Scripting Engine is also used in Microsoft Office, the attack vector for this vulnerability is wider.

“The vulnerability could be exploited a couple of different ways: by setting up a specially crafted website via advertisements that may be compromised, or it could be loaded up using an application or an Office document that uses the IE rendering,” Goettl said.

Windows Server hit by domain controller bug

Microsoft provided a lengthy description for handling CVE-2020-1472, a critical Netlogon elevation-of-privilege flaw affecting supported Windows Server OSes, including Windows Server 2008 and 2008 R2 for ESU customers. On an unpatched domain controller — the Active Directory component tasked with managing security authentication requests — an attacker could acquire domain administrator access without needing system credentials. 

Microsoft said it is using a “phased two-part rollout” to patch the bug, with the first part of the deployment delivered in the August Patch Tuesday security update.

“The updates will enable the [domain controllers] to protect Windows devices by default, log events for non-compliant device discovery, and have the option to enable protection for all domain-joined devices with explicit exceptions,” according to the CVE instructions.

Microsoft plans the second phase for February Patch Tuesday in 2021, which it calls “the transition into the enforcement phase.”

“The [domain controllers] will be placed in enforcement mode, which requires all Windows and non-Windows devices to use secure Remote Procedure Call (RPC) with Netlogon secure channel or to explicitly allow the account by adding an exception for any non-compliant device,” the company wrote.

Goettl said administrators should begin testing the patch in a lab before the hard enforcement occurs, which will require all domain controllers — even those in read-only mode — to be updated. Microsoft provided further guidance in its support documentation for the update.
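
Microsoft’s guidance for this flaw describes enabling enforcement mode ahead of the February deadline through a Netlogon registry setting on domain controllers. As a minimal sketch, the check below reads that value with Python’s winreg module; the key and value names follow Microsoft’s published guidance for CVE-2020-1472, but verify them against the current support article before relying on this.

```python
import winreg  # standard library on Windows; run this on a domain controller

# Per Microsoft's published guidance for CVE-2020-1472, enforcement mode can be
# enabled ahead of the February 2021 phase by setting this value to 1. Treat the
# key and value names as an assumption to verify against the support article.
KEY_PATH = r"SYSTEM\CurrentControlSet\Services\Netlogon\Parameters"
VALUE_NAME = "FullSecureChannelProtection"


def netlogon_enforcement_enabled() -> bool:
    """Return True if the DC is already configured for enforcement mode."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _value_type = winreg.QueryValueEx(key, VALUE_NAME)
            return value == 1
    except FileNotFoundError:
        # Value (or key) not present: the DC is running the default,
        # pre-enforcement behavior from the initial deployment phase.
        return False


if __name__ == "__main__":
    mode = "enforcement" if netlogon_enforcement_enabled() else "default (pre-enforcement)"
    print(f"Netlogon secure channel protection mode: {mode}")
```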

Other notable corrections from August Patch Tuesday

  • Microsoft Outlook has two CVEs this month. CVE-2020-1483 is a memory-corruption vulnerability rated critical that could let an attacker run arbitrary code in the context of the current user using several different attack vectors, including the preview pane. CVE-2020-1493 is an information-disclosure vulnerability rated important that could let an attacker view a restricted file from the preview pane by sending it as a file attachment.
  • CVE-2020-1455 is a Microsoft SQL Server Management Studio denial-of-service vulnerability rated important that, if exploited, could let an attacker disrupt the use of the application.
  • The .NET Framework has two CVEs. CVE-2020-1046 is a critical remote code execution vulnerability that an attacker could use to control the unpatched system using a specially crafted file. CVE-2020-1476 is an important elevation-of-privilege vulnerability in ASP.NET or .NET web applications running on IIS that could let an attacker access restricted files.
  • Microsoft resolved an elevation-of-privilege vulnerability (CVE-2020-1337) rated important for supported Windows systems on both the client and server side. The patch resolved a lingering printer spooler issue that had been patched multiple times — most recently in May — but security researchers found a way to bypass the patch and gave a recent Black Hat USA presentation on the flaw, which has its origins in the Stuxnet worm from 2010. Despite public knowledge of the bug, Microsoft’s CVE did not report it as publicly disclosed.

Value stream management tames DevOps chaos at Eli Lilly

Pharma giant Eli Lilly discovered that simply using software delivery automation tools doesn’t bring about IT efficiency — achieving that goal required a top-down view and detailed measurements of IT and business processes, a practice known as value stream management.

Eli Lilly began searching for a way to sort out its multiple DevOps pipeline tools and processes five years ago. The company’s use of tools such as the Heroku PaaS, Atlassian Jira application lifecycle management and CloudBees Jenkins continuous integration increased, but so did software testing and delivery delays.

Despite the use of IT automation tools such as Chef, procuring a software testing environment could take up to 30 days, and deployment processes often became bogged down in compliance-related approval processes.

“Our environments were managed via Excel, and even the Excel [spreadsheets] didn’t make sense,” said Marvin Stigter, head of platform services in the test management office at Eli Lilly. “And then we had project managers sitting up in the middle of the night, coordinating [jobs] between onshore and offshore [teams].”

That year, Eli Lilly execs became aware of Plutora, and the company eventually chose to deploy it over a competing value stream management product from its incumbent IT service management vendor, ServiceNow.

Value stream management is an IT product category created by Forrester Research analysts in 2002. The term stems from value stream mapping, a practice with a long history in Lean manufacturing environments such as Toyota.


Value stream mapping documents the repeatable steps required to deliver a product or service to a customer, and then analyzes them to make improvements. Value stream management tools apply these principles to software delivery, measure the performance of DevOps teams against improvement goals, and in some cases, orchestrate DevOps workflow automation.

Value stream management is a growing product category. In its third-quarter 2020 Wave report on value stream management tools, Forrester assessed 11 products, from vendors that also included Digital.ai, Tasktop, Targetprocess, IBM, ConnectAll, CloudBees, Atlassian, GitLab and Blueprint.

Eli Lilly evolves from DevOps efficiency to COVID-19 data management

It took IT pros at Eli Lilly 18 months to master Plutora’s value stream management products, Stigter said. The process included creating some custom webhooks to integrate earlier versions of the Plutora product with third-party tools such as Jira. However, once that was finished, the Plutora tool had a transformative effect on Lilly’s software delivery, Stigter said.

“Since that time, we have not had one bad release go out,” he said. “It was way clearer to everybody how to make the right decision at the right time than before, [where] our release schedule was nobody knew about it until something went wrong.”


Value stream management measured the time it took to complete certain tasks, such as standing up a testing environment or signing an approval request. It also pinpointed the cause of lags. The Plutora tool sent automated email reminder messages to people about 15 minutes before they were expected to contribute to a process, in order to keep pipelines running on time. Plutora also added automated schedule adjustments to optimize these processes, rescheduling certain tasks and updating the right people with notifications to make delivery as fast as possible.
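
The measurement side of this is straightforward to picture. A minimal sketch, assuming a hypothetical log of timestamped pipeline steps rather than Plutora’s actual data model, computes each step’s duration, flags the laggards and works out when a 15-minute-ahead reminder would be sent:

```python
from datetime import datetime, timedelta

# Illustrative log of release pipeline steps: (step name, started, finished).
steps = [
    ("provision test environment", datetime(2020, 8, 3, 9, 0), datetime(2020, 8, 3, 14, 0)),
    ("compliance approval", datetime(2020, 8, 3, 14, 0), datetime(2020, 8, 4, 16, 30)),
    ("deploy to staging", datetime(2020, 8, 4, 16, 30), datetime(2020, 8, 4, 17, 15)),
]

LAG_THRESHOLD = timedelta(hours=8)     # flag anything slower than this
REMINDER_LEAD = timedelta(minutes=15)  # nudge the contributor shortly before their step

for name, started, finished in steps:
    duration = finished - started
    flag = "  <-- investigate this lag" if duration > LAG_THRESHOLD else ""
    print(f"{name:30s} took {duration}{flag}")
    # A scheduler would email the responsible person REMINDER_LEAD before the
    # step is due to start; here we only compute when that reminder would go out.
    print(f"{'':30s} reminder due at {started - REMINDER_LEAD}")
```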

“You do have to be careful with the emails … you can use it the wrong way as well and get overwhelmed,” Stigter said. “But if you implement [them] correctly, it’s a huge timesaver.”

In fact, value stream management also allowed Stigter to calculate precisely how much of his team’s time was saved. Building a new test environment now takes five hours at most, compared to the previous maximum of 30 days. Blackout periods for software updates shrank from as much as two full days down to between two and four hours. Between optimized team workflows and automation bots deployed via Plutora, Stigter estimates the company has saved about $16 million per year since 2017.

This year, Eli Lilly has begun to expand its use of the Plutora tool beyond the software delivery lifecycle to workflows such as clinical trials of a potential treatment for COVID-19. Plutora helped streamline the process of loading report data from clinical trial sites into a data warehouse, tracking how many reports have been loaded and visualizing the inflow of data for business and IT stakeholders.

“Our customers in our senior management [now] get a higher level of detail with what’s going on, so we kill all the manual communications saying, ‘Hey, where are you? What’s going on?'” Stigter said. “When COVID came down, we had [study data] uploaded within two days, which had never happened before. Normally, at a minimum, it’s five weeks.”

Plutora reveals value stream management roadmap

Stigter and his team have begun to beta test new Plutora product features set for release later this year, including enhancements to how the tool orchestrates multistep automation tasks.

“If you have code in Chef that you want to kick off, now somebody doesn’t have to do it manually,” Stigter said. “If you have two or three different automations, one after another, [Plutora] will now also go automatically to the next one.”

Those multistep automation triggers have been inconsistent at times during tests, Stigter said, but continue to improve.

The upcoming Plutora release will also revamp how Plutora integrates with third-party data sources, replacing a RESTful API architecture with an event-driven system for faster data ingestion, with less custom integration required. This is similar to plans for IT automation vendor Puppet’s Relay product, which also aims to streamline IT automation workflows.
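
The difference between the two integration styles can be illustrated generically. The sketch below, with hypothetical data and handlers that stand in for no particular vendor API, contrasts pulling records on a schedule with ingesting events the moment a source pushes them:

```python
import queue


def ingest(record: str) -> None:
    """Stand-in for loading a record into the reporting dashboards."""
    print("ingested:", record)


# --- Polling style: ask the source for new records on a fixed schedule. ---
def poll_once(fetch) -> None:
    """One polling cycle; data freshness is bounded by how often this runs."""
    for record in fetch():
        ingest(record)


# --- Event-driven style: the source pushes events as they happen. ---
events: "queue.Queue[str]" = queue.Queue()


def on_event(record: str) -> None:
    """Called by a webhook receiver or stream consumer the moment data arrives."""
    events.put(record)


def drain_events() -> None:
    while not events.empty():
        ingest(events.get())


if __name__ == "__main__":
    # Simulate both styles with hypothetical release data.
    poll_once(lambda: ["release-101 approved", "release-102 deployed"])
    on_event("release-103 test environment ready")  # pushed immediately
    drain_events()
```

The polling loop’s freshness is bounded by its interval, which is why an event-driven feed gets data onto dashboards sooner.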

Stigter said he looks forward to faster, even real-time, data flows into Plutora’s dashboards as a possible result of the event-driven overhaul.

“[The] reports are just not real-time enough, and that’s really the lifeline of the tool,” he said. “[Without it,] if I say, ‘We saved a lot of time during this release over the weekend,’ nobody really understands what that means if they were not involved with it.”

Go to Original Article
Author:

TigerGraph adds cloud support, helps standardize graph coding

TigerGraph wants to bring graph database technology to the masses.

To that end, the vendor, founded in 2012 and based in Redwood City, Calif., recently unveiled TigerGraph on Microsoft Azure. It also took steps to make its proprietary coding language, designed to resemble SQL, part of a standard language for developing and querying graph databases, so that anyone with coding experience can learn and use it quickly.

Graph databases differ from more traditional relational databases by modeling the connections between data points directly, allowing each data point to connect to many other data points at once. In theory, this speeds up the query process and enables users to easily extract data from disparate sources.

“Democratization of graph is our mission statement for the foreseeable future,” said Gaurav Deshpande, TigerGraph’s vice president of marketing. “We don’t want to see customers using it like it’s a fancy thing and use it just for one little project. We want them to use it as an everyday part of their lives. Our objective is to make graph accessible and available.”

Expansion in the cloud

TigerGraph emerged from stealth in 2016, and in September 2019 launched TigerGraph Cloud, the database-as-a-service version of its graph database, which has both a free tier and a pay-as-you-go tier. Up until just a few weeks ago, however, TigerGraph Cloud only offered support for Amazon Web Services. TigerGraph Cloud users that deployed on other clouds couldn’t use TigerGraph natively, and that proved to be a barrier as TigerGraph tried to attract more paying customers.

Meanwhile, according to Deshpande, users asked for native support for their cloud and said they’d upgrade if TigerGraph could provide it. So on July 16, TigerGraph added support for Microsoft Azure, and support for the Google Cloud Platform (GCP) will be added within the next few months.


“We saw a lot of people coming on board, and they were all Azure people,” he said. “They told us, ‘We like TigerGraph Cloud, but even to start in the public cloud I cannot start with AWS — I need to have my workload inside an Azure Cloud.’ That was the impetus for us adding Azure as an option. Azure support was specifically in response to a market force.”

He added that TigerGraph surveyed 300 customers, who said they would use native GCP support if TigerGraph offered it.

Support for multiple clouds, meanwhile, is becoming commonplace among analytics vendors, and according to Mike Leone, senior analyst at Enterprise Strategy Group, is important for a relatively new vendor like TigerGraph as it attempts to expand its customer base.

“It’s about bringing graph to where the data is,” he said. “And for 80-plus percent of organizations, it’s in multiple locations, on premises and in multiple clouds.”

A sample screenshot from TigerGraph shows a no-code visual query under development.

Language arts

Beyond adding more cloud options to attract more paying customers, TigerGraph is trying to expand the use of graph databases by making its graph query language as simple to use as possible.

The vendor’s query language is called GSQL, and from the start it was designed to be as close to SQL as possible so that data scientists and application designers familiar with SQL could easily adapt to TigerGraph and turn their data into actionable data models.

GSQL essentially starts with SQL but adds what TigerGraph calls accumulators, which are designed to enable users to perform complex computations on connected data sets within graph databases in ways that aren’t possible within relational databases.
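
GSQL itself isn’t reproduced here, but the accumulator idea can be approximated in plain code: keep a running value that every traversal step adds to, so an aggregate is built up while the graph is walked rather than in a separate pass. A minimal Python sketch over a toy payment graph follows; it illustrates the concept only, not TigerGraph’s syntax or engine.

```python
from collections import deque

# Toy graph: each account lists the accounts it sent payments to, with the
# payment amount on the edge (purely illustrative data).
graph = {
    "acct_a": [("acct_b", 120.0), ("acct_c", 75.5)],
    "acct_b": [("acct_d", 40.0)],
    "acct_c": [("acct_d", 10.0)],
    "acct_d": [],
}


def total_outflow_from(start: str) -> float:
    """Breadth-first traversal that keeps a running sum as it visits edges,
    similar in spirit to a global accumulator updated at each traversal step."""
    total = 0.0  # the "accumulator"
    seen = {start}
    frontier = deque([start])
    while frontier:
        node = frontier.popleft()
        for neighbor, amount in graph[node]:
            total += amount  # accumulate while traversing, not in a second pass
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(neighbor)
    return total


print(total_outflow_from("acct_a"))  # 245.5
```

In GSQL, roughly the same pattern is written declaratively, with the accumulator updated as the query traverses matching edges.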

In addition to writing GSQL itself, however, TigerGraph is attempting to help standardize the coding language behind graph databases, and recently submitted a paper to SIGMOD — a special interest group on data management that’s part of the Association for Computing Machinery — and presented it at SIGMOD 2020 in June.

The paper made the machinery behind GSQL public so that aspects of GSQL — in particular the accumulators — can be integrated into GQL, the graph query language being developed by the International Organization for Standardization, which already oversees the development of SQL.

“It will take a few years because it’s a fairly involved functionality, but we’re excited that what we have learned from customers we are donating back to the graph query language and the movement for having a standardized graph query language,” Deshpande said.

Leone, meanwhile, said that easing the transition to graph databases is critical for TigerGraph and its continued growth.

“Simplicity, simplicity, simplicity,” he said. “Simplifying adoption of graph databases is the greatest barrier to broader adoption across the industry.

“Organizations have the data, they have the need, and TigerGraph is focused on simplifying ramp up of using graph technology. And it’s not just about experts. Enabling generalists to use the technology is critical to success.”

Regarding the move to make GSQL as easy for programmers to learn as possible and help create a standardized graph query language, Leone added that too will only help TigerGraph in the long run.

“Enabling a seamless and easy transition from SQL to GSQL matters for many potential customers,” he said.

Development roadmap

Just as TigerGraph’s addition of support for Microsoft Azure and the publication of its coding language reflect the vendor’s goal of making graph technology accessible to the masses, its roadmap is similarly shaped by that aim.

Future development, Deshpande said, will center around adding more in-database machine learning capabilities and more no-code capabilities so that business users can do increasingly complex queries without needing to learn GSQL.

TigerGraph sees itself becoming the “de facto standard” for relationship analysis and as a vendor that simplifies complex relationships for users, Deshpande said.

“Expect us to do more on that front, both driven by customer demand, as well as our own mission of making graph available to everybody in the world,” he said.

Go to Original Article
Author:

Office 365 Advanced eDiscovery goes beyond search basics

Advanced eDiscovery in Microsoft 365 — or Office 365, depending on your subscription — is a powerful tool that can index data sets and make them easily searchable.

This tool also supports the import and analysis of external data. This is a useful feature for legal and human resources positions or any other job in which you need to search through data for keywords and use the AI features of the platform to weed out undesirable information. Before we look at Office 365 Advanced eDiscovery, let’s start with some basics.

The standard eDiscovery feature is available to any Business or Enterprise licensed customer of the Microsoft 365 or Office 365 suite. It provides several core functions, such as creating cases, placing holds on content locations, running searches and exporting the results.

Once you configure the data set, the data is indexed for review. Further searches, known as queries, can be run and the results exported for use outside of eDiscovery.

What is Office 365 Advanced eDiscovery?

Office 365 Advanced eDiscovery is more of an end-to-end product for eDiscovery requirements. Advanced eDiscovery follows the Electronic Discovery Reference Model framework, which gives you more granular control over the settings for a case.

Advanced eDiscovery requires licensing above the common Microsoft/Office 365 E3 license such as the E5 compliance add-on, the eDiscovery and Audit add-on or just a switch to the Microsoft/Office 365 E5 tier.

Some of the Advanced eDiscovery highlights include:

  • search and analytics functionality that threads emails, rather than dealing with each email separately;
  • optical character recognition to convert images to text-searchable documents;
  • built-in ability to analyze, query, review, tag and export content as a group, rather than as individual files;
  • better visibility of long-running tasks to check progress;
  • predictive coding that lets you train the system to determine which data is relevant; and
  • detection of files that are identical or nearly identical, to avoid reviewing the same content twice.

Organizations can benefit from the import feature

Another key feature for Advanced eDiscovery is the ability to import non-Microsoft 365 content.

To start, create a review set, which is only available in Advanced eDiscovery. A review set is a defined group of data sets that can be used with the group functionality listed above.

You can add external data to a review set with the Manage review set option. Clicking the View uploads link in the Non-Office 365 data section takes you to a page with an Upload files button, which starts a wizard when clicked.

The wizard asks you to prepare the files you want to upload. You’ll need to create a top-level folder with a subfolder for each custodian you have data for, with each subfolder named in User Principal Name (UPN) format for the account. Upload the data into those user-named folders.

Microsoft’s documentation lists the file types you can import into Advanced eDiscovery. Microsoft supports several archive and container formats, such as PST mailbox files, that you can upload in a batch and run searches against.

To upload the files, you will need to install the AzCopy utility. When you have all the files ready, enter the path to the user folders in the wizard, which generates a command to paste into a command prompt; running it triggers the upload using AzCopy. The utility shows statistics, such as the progress, time elapsed and upload speed, during the transfer process.
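
The wizard generates the exact command for your case, so the following is only a sketch of what an AzCopy v10 invocation looks like, wrapped in Python for repeatable batch uploads. The local path and SAS URL are hypothetical placeholders; use the values the wizard produces.

```python
import subprocess

# Hypothetical top-level folder containing one subfolder per custodian,
# each named with that custodian's UPN, as the wizard requires.
local_root = r"C:\ediscovery\case-upload"

# Placeholder Azure Blob container SAS URL; the Advanced eDiscovery wizard
# supplies the real destination and token for your case.
sas_url = "https://example.blob.core.windows.net/upload-container?<SAS-token>"

# AzCopy v10's 'copy' command with --recursive uploads the folder tree as-is.
result = subprocess.run(
    ["azcopy", "copy", local_root, sas_url, "--recursive"],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    print("AzCopy reported an error:", result.stderr)
```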

Once finished, you can go back to the webpage and click the Next: Process files button. Do not click this button until the upload completes or you’ll start a one-time processing of the incomplete data that you can’t cancel or run again.

The time it takes to process depends on the amount of data uploaded. When the transfer completes, the Manage review sets page will show a summary report of the data, allow you to train the system for relevant information, and manage tags to help identify and discover data based on your search criteria.

You need to apply the license to the account(s) you are searching; administrators and staff reviewing the data do not need this extra license. For occasional ad hoc Advanced eDiscovery requirements, your organization might only need a few licenses that you can move as required, assuming you don’t want the other benefits that come with each license.

Advanced eDiscovery takes some time to understand, both from the administrator’s point of view and from the perspective of the user with access to a case. But it’s a powerful — and relatively inexpensive — tool for an organization that already has access as part of the E5 license, and it’s still very cost-effective compared with other eDiscovery products on the market if you only need a few add-on licenses.

Go to Original Article
Author:

Dell confirms a possible VMware spinoff

After weeks of speculation, Dell Technologies confirmed reports that it is exploring a potential spinoff of VMware next year. Company officials said if it does so, it will continue to have a strategic relationship with VMware.

Hoping to assure VMware users, Dell chairman and CEO Michael Dell said in a prepared statement that “the strategic relationship between Dell Technologies and VMware has never been stronger.” He added that despite the options being explored, the two companies will accelerate their current strategies, including jointly creating a number of integrated products.

Dell gained an 80% stake in VMware when it acquired EMC in 2015. EMC acquired VMware in 2004.

Possible Dell-VMware spinoff at least a year off

Dell also confirmed what analysts have said over the past few weeks: that any spinoff would not happen until September of 2021 so the company can sidestep the heavy federal taxes associated with the deal.

Dell said if it decides to follow through on a spinoff, it will agree to specific terms and conditions under the auspices of a special committee made up of the Board of Directors of VMware and Dell. Dell said it would also negotiate the payment of a special cash dividend by VMware to be paid on a pro rata basis to all VMware shareholders.

With its commanding position in virtualization and concentrated focus on hybrid clouds and containers, most analysts believe a spinoff provides VMware with more competitive advantage than it has with Dell, which is traditionally a hardware-oriented company. But some believe the economic advantages Dell would gain through a spinoff outweigh any strategic technology disadvantages. Wall Street doesn’t like the capitalization structure between Dell and VMware, which investors believe has held down the market value of Dell.

“The crazy thing is, Dell has a market worth of around $30-something billion, but they own 81% of VMware, which should make their value about $50 billion,” said Patrick Moorhead, president and principal analyst at Moor Insights and Strategy. “It almost forces [Dell’s] hand to sell VMware,” he said.

While many IT market analysts saw the move as favoring VMware, investors were encouraged by Dell’s confirmation of a potential spinoff, which sent the company’s stock up 17% on the day of the announcement on Thursday.

Even with the financial upside to a spinoff, Dell would still need to find another software partner, or multiple software partners, to replace VMware, Moorhead said, which could prove difficult.

“They certainly aren’t going to go out and buy another company because the capitalization doesn’t work out there,” Moorhead said. “The other route is to do what HP did and pull together various pieces and build their own stack, but I don’t see them pulling that off.”

Another analyst believes Dell is exploring a spinoff in part to gain more freedom to cozy up to other top-tier cloud providers.

“The only possible reason [Dell] would float this out there is because of two possible suitors; namely Google and Amazon,” said Dana Gardner, principal analyst at Interarbor Solutions, LLC. “Those two would like the chance to take that installed VMware base across, but neither one of them would see Dell as that strategic.”

Go to Original Article
Author:

Microsoft Teams users seek better channel management tools

Users of Microsoft Teams are increasingly frustrated that the software vendor has kept two highly demanded channel management features on the backlog for years.

Thousands of customers have asked Microsoft to introduce the ability to archive channels and move them between teams. But Microsoft has kept both feature requests on the back burner.

The features’ absence causes headaches for IT admins as they try to keep their organization’s Teams account organized.

Sometimes users create channels under the wrong team. But the only way to fix the mistake is to delete the channel and all the user work inside.

Other times, a project moves from one team to another. In those cases, users want to move the channel associated with the project to the new group.

After only six months of using Teams, a data management consulting firm in London had heavily active channels that it needed to move but couldn’t.

“It is frankly ridiculous,” said a senior manager. The admin requested anonymity because he wasn’t authorized to talk for the firm.

The lack of such an essential feature as moving channels demonstrates how Teams is still relatively immature as a collaboration app, said Eric Prosser, who oversees IT and facilities for the Santa Clara County Housing Authority.

“It just doesn’t have a lot of the bells and whistles that other tools have,” Prosser said.

Channel archiving would be another important feature for keeping Teams organized. Users can already hide channels from view. But they also want to archive them so that they no longer count towards the 200-channel limit within a team.

With more than 21,000 votes, the ability to move a channel between teams is the third most popular request on Microsoft’s user feedback site for Teams. Channel archiving, at more than 14,500 votes, is the seventh most popular.

The features are two examples of necessary enhancements to Teams that users have been waiting on for years. For instance, users are also seeking improvements to the calendar and the ability to use multiple Teams accounts simultaneously on the desktop version of the app. 

Instead of fixing these problems, Microsoft has made it a priority to improve its video conferencing capabilities to catch up to Zoom. The vendor recently expanded its group video display and planned to launch a new virtual reality-style video mode.

“It’s Microsoft,” Prosser said. “Microsoft thinks that they know everything for you.”

Go to Original Article
Author:

Distributed SQL database capabilities come to MariaDB X5

Database vendor MariaDB updated its flagship platform with a new release out Thursday that brings a series of enhancements, including distributed SQL capabilities.

The new MariaDB Platform X5 is the second major release from the open source database vendor in 2020, following the MariaDB X4 release on Jan. 14. Based in Redwood City, Calif., the company has also been busy in 2020 launching its SkySQL cloud database as a service on March 31 and announcing a $25 million round of funding on July 9.

Matt Aslett, research director at S&P Global Market Intelligence, said the vendor’s move to enable distributed SQL capabilities, which allow a database to scale out, will be of interest to a portion of MariaDB’s customers.

“Its [MariaDB’s] combination of multi-version concurrency control and consensus to support strong consistency will certainly be of interest to customers that require support for globally distributed transactions,” Aslett said.

Michael Howard, CEO of MariaDB, said certain customers will be interested in distributed SQL capabilities. MariaDB is now engaging with customers for use cases that previously would not have been easily possible, he said.

While he declined to name specific companies, Howard said that some of the new use cases include large SaaS vendors that are now able to use MariaDB, due to the distributed capabilities in the X5 update. Howard said there is interest in the technology from customers in the financial services industry as well.


MariaDB X5 Xpand smart engine powers distributed SQL

The vendor has branded as Xpand the technology that enables MariaDB X5’s distributed SQL capabilities.

The Xpand smart engine distributes table data, as well as indexes, across database instances to enable users to run distributed queries across all data. Howard said the Xpand approach is not an application-layer sharding mechanism, in which only parts of the data are distributed. Rather, Xpand provides a data structure that manages the data in a more scalable way, he noted.

“This is a full distributed SQL parallel query capability to partition tables any which way up and down across clusters and have five nines and above of reliability for global and far-reaching applications,” Howard said.
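
The article doesn’t describe Xpand’s internals, but the basic idea of spreading a table’s rows and index entries across instances can be shown with a simple hash-based placement function. The node names, table and keys below are illustrative; this is a conceptual sketch, not MariaDB’s actual distribution algorithm.

```python
import hashlib

# Hypothetical cluster of database instances.
NODES = ["db-node-1", "db-node-2", "db-node-3", "db-node-4"]


def node_for_key(primary_key: str, nodes=NODES) -> str:
    """Pick the instance that owns a row by hashing its primary key.
    A real distributed SQL engine also replicates each slice to other nodes
    so the cluster can survive an instance failure."""
    digest = hashlib.sha256(primary_key.encode("utf-8")).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]


# Place a few illustrative rows of an 'orders' table.
for order_id in ["order-1001", "order-1002", "order-1003", "order-1004"]:
    print(order_id, "->", node_for_key(order_id))
```

A distributed query layer can then push each piece of a query to the nodes that own the relevant slices and merge the results, which is what allows work to be parallelized across the cluster.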

Columnstore and InnoDB enhanced in MariaDB X5

MariaDB X5 also benefits from ColumnStore improvements that boost scalability, Howard said, adding that in X5 ColumnStore is faster and more coherent and integrates a full high-availability capability.

The InnoDB storage engine, a core element of MariaDB, has also been updated to accelerate database operations.

“In this version of InnoDB we really wanted to increase the throughput of high velocity write applications,” Howard said.

Better Apache Kafka integration

Like other databases, MariaDB is often used alongside Apache Kafka, the open source event streaming technology.

While MariaDB has long enabled applications to bring Kafka data into the database, Howard said that with X5 the whole process has been simplified. There is no revolutionary approach to Kafka in X5, but rather a series of incremental improvements.

“So, we just improved the interface and the real-time capabilities,” he said.
