Deepfakes: Security experts undecided on the threat level

Deepfake technology has advanced at a rapid pace, but the infosec community is still undecided about how much of a threat deepfakes represent.

Many are familiar with deepfakes in their video and image form, where machine learning technology generates a celebrity saying something they didn’t say or putting a different celebrity in their place. However, deepfakes can also appear in audio and even text-based forms. Several sessions at RSA Conference 2020 examined how convincing these fakes can be, as well as technical approaches to refute them. But so far, threat researchers are unsure if deepfakes have been used for cyberattacks in the wild.

To explore the potential risk, SearchSecurity asked a number of experts about the threat deepfakes pose to society. In other words, should we be worried about deepfakes?

There was a clear divide in the responses between those who see deepfakes as a real threat and those who were more lukewarm on the idea.

Concern about deepfakes

Some security experts at RSA Conference 2020 feared that deepfakes would be used as part of disinformation campaigns in U.S. elections. McAfee senior principal engineer and chief data scientist Celeste Fralick said that with the political climate being the way it is around the world, deepfakes are “absolutely something that we should be worried about.”

Fralick cited a demonstration of deepfake technology during an RSAC session presented by Sherin Mathews, senior data scientist at McAfee, and Amanda House, data scientist at McAfee.

“We have a number of examples, like Bill Hader morphing into Tom Cruise and morphing back. I never realized they looked alike, but when you see the video you can see them morph. So certainly in this political climate I think that it’s something to be worried about. Are we looking at the real thing?”

Jake Olcott, BitSight’s vice president of communications and government affairs, agreed, saying that deepfakes are “a huge threat to democracy.” He noted that the platforms that own the distribution of content, such as social media sites, are doing very little to stop the spread of misinformation.

“I’m concerned that because the fakes are so good, people are either not interested in distinguishing between what’s true and what’s not, but also that the malicious actors, they recognize that there’s sort of just like a weak spot and they want to just continue to pump this stuff out.”

CrowdStrike CTO Mike Sentonas made the point that deepfakes are getting harder to spot and easier to create.

“I think it’s something we’ll more and more have to deal with as a community.”

Deepfake threats aren’t pressing

Other security experts, such as Patrick Sullivan, CTO of security strategy at Akamai, weren’t as concerned about the potential use of deepfakes in cyberattacks.

“I don’t know if we should be worrying. I think people should be educated. We live in a democracy, and part of that is you have to educate yourself on things that can influence you as someone who lives in a democracy,” Sullivan said. “I think people are much smarter about the ways someone may try to divide online, how bots are able to amplify a message, and I think the next thing people need to get their arms around is video, which has always been an unquestionable point of data, which you may have to be more skeptical about.”

Malwarebytes Labs director Adam Kujawa said that while he’s not so worried about the much-publicized deepfake videos, he is concerned about deepfake text and systems that automatically predict or create text based on a user’s input.

“[That] I see as being pretty dangerous, because if you utilize that with limited input derived from social media accounts, [you can create] anything you want: a pretty convincing spear phishing email, almost on the fly.”

That said, he echoed Sullivan’s point that people are generally able to spot when something is obviously not real.

“They are getting better [however], and we need to develop technology that can identify these things you and I won’t be able to, because eventually that’s going to happen,” Kujawa said.

Greg Young, Trend Micro’s vice president of cybersecurity, went as far as to call deepfakes “not a big deal.”

However, he added, “I think where it’s going to be used is business email compromise where you try to get a CEO or CFO to send you a Western Union payment. So if I can imitate that person’s voice, deepfake for voice alone would be very useful because I can tell the CFO to do this thing if I’m the person pretending to be the CEO, and they’re going to do it. We don’t leave video messages today, so the video side I’m less concerned about. I think deepfakes will be used more in disinformation campaigns. We’ve already seen some of that today.”


For Sale – Nvidia MSI RTX 2070 ARMOR 8G

Nvidia MSI RTX 2070 ARMOR 8G, bought from Scan in October 2018 and still under warranty until October 2020.

Not seen too much use, as I only game 4/5 times a year on a LAN weekend, so it’s in good condition and boxed.

Location: Hornchurch
Price and currency: £300
Delivery cost included: Delivery is NOT included
Prefer goods collected? I have no preference
Advertised elsewhere? Advertised elsewhere
Payment method: PPG/BT


For Trade – EVGA RTX 2060 XC Black Gaming GPU Brand New – Trade for a 2-Slot Card with Warranty

Just received this RMA replacement, which is still brand new and factory sealed. However, it’s a 3-slot card and the case I want to put it in only supports 2-slot cards, so I’m looking to trade it for a 2-slot card. I don’t mind if it’s a faster or slower card (within reason), and I’m happy to adjust either way with cash on top. Pretty much anything considered, but in an ideal world it should have warranty and come from a smoke-free home.



For Sale – Dell Latitude E7250 – i5 – 250GB SSD – 8GB RAM

Got an older, yet cool laptop for sale.
It still performs really well for work and home use, including video streaming, video calls, etc.
I have attached a speccy pic, but here’s a breakdown:

  • i5 5300U 2.30GHz
  • 250GB SSD
  • 8GB RAM 1600MHz
  • Intel 5500 Graphics

The screen is a 12.5″ 1920×1080 touchscreen. Really beautiful: Gorilla Glass, great colours and brightness. It’s in pretty great shape other than the top (check pictures); the scratches are cosmetic. The laptop still feels strong and sturdy.
It has a dedicated SIM card slot under the battery, if that’s ya ting.
Keyboard is lovely and backlit.

It includes the Dell charger. Has Windows 10 Pro.

Check pics!


Quantum computing in business applications is coming

Quantum computing may still be in its infancy, but it’s advancing rapidly and already showing significant potential for business applications across several industries.

At MIT Technology Review’s Future Compute conference in December 2019, Alan Baratz, CEO of D-Wave Systems Inc., a Canadian quantum computing company, discussed the benefits of quantum computing in business applications and the new capabilities it can offer.

Editor’s note: The following has been edited for clarity and brevity.

Why should CIOs be thinking about quantum computing in business applications?

Alan Baratz: Quantum computing can accelerate the time to solve the hard problems. If you are in a logistics business and you need to worry about pack-and-ship or vehicle scheduling; or you’re in aerospace and you need to worry about crew or flight scheduling; or a drug company and you need to worry about molecular discovery and computational chemistry, the compute time to solve those problems at scale can be very large.

Typically, what companies do is come up with heuristics — they try to simplify the problem. Well, quantum computing has the potential to allow you to solve the complete problem much faster to get better solutions and optimize your business.

Would you say speed is the most significant factor of quantum computing?

Baratz: Well, sometimes it’s speed, sometimes it’s quality of the solution [or] better solution within the same amount of time. Sometimes, it’s diversity of solution. One of the interesting things about the quantum computer is that maybe you don’t necessarily want the optimal solution but [rather] a set of good solutions that you can then use to optimize other things that weren’t originally a part of the problem. The quantum computer is good at giving you a set of solutions that are close to optimal in addition to the optimal solution.

What’s limiting quantum computing in terms of hardware?

Baratz: Well, up until now, in [D-Wave’s] case, it’s been the size and topology of the system, because in order to solve your problem, you have to be able to map it onto the quantum processor. Remember, we aren’t gate-based, so it’s not like you specify the sequence of instructions or the gates. In order to solve your problem in our system, you have to take the whole problem and be able to map it into our hardware. That means with 2,000 qubits and each qubit connected to six others, there are only certain sizes of problems that you can actually map into our system. When we deliver the Advantage system next year, we double that — over 5,000 qubits [and] each qubit connected to 15 others — so that will allow us to solve significantly larger problems.
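
To make this mapping step concrete, here is a minimal, hypothetical sketch using D-Wave's open source Ocean SDK (an assumption on our part; it requires the `dwave-ocean-sdk` package and configured Leap API credentials). The toy QUBO coefficients are purely illustrative; `EmbeddingComposite` performs the minor-embedding that maps logical problem variables onto the physical qubit graph Baratz describes.

```python
# Minimal sketch, not D-Wave production code: a two-variable toy QUBO
# submitted through Ocean's embedding layer. Requires dwave-ocean-sdk
# and a configured Leap API token (assumptions for this example).
import dimod
from dwave.system import DWaveSampler, EmbeddingComposite

# Toy objective: minimize x0 + x1 - 2*x0*x1 (illustrative coefficients)
bqm = dimod.BinaryQuadraticModel(
    {'x0': 1.0, 'x1': 1.0},        # linear terms
    {('x0', 'x1'): -2.0},          # quadratic (coupling) term
    0.0,                           # constant offset
    dimod.BINARY,
)

# EmbeddingComposite maps logical variables onto physical qubits,
# which is the topology constraint discussed above.
sampler = EmbeddingComposite(DWaveSampler())
sampleset = sampler.sample(bqm, num_reads=100)
print(sampleset.first.sample, sampleset.first.energy)
```

Note that the sampler returns a whole set of low-energy samples rather than a single answer, which is the "diversity of solutions" Baratz mentions above.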

In fact, for a doubling of the processor size, you get about a tripling of the problem size. But we’ve done one other thing: We’ve developed brand new hybrid algorithms [and] these are not types of hybrid algorithms that people typically think about. It’s not like you try to take the problem and divide it into chunks and solve [them]. This is a hybrid approach where we use a classical technique to find the core of the problem, and then we send that off to the quantum processor. With that, we can get to the point where we can solve large problems even with our 5,000-qubit system. We think once we deliver the 5,000-qubit system Advantage [and] the new hybrid solver, we’ll see more and more companies being able to solve real production problems.
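
The hybrid workflow Baratz describes is exposed in Ocean as a managed service. The sketch below (again an assumption, using the documented `LeapHybridSampler`) submits a problem far larger than the QPU's native graph and lets the service handle the classical/quantum split.

```python
# Minimal sketch of the hybrid path: a randomly generated 5,000-variable
# chain BQM, too large to embed directly, sent to the Leap hybrid solver.
# Requires dwave-ocean-sdk and Leap credentials (assumptions).
import random
import dimod
from dwave.system import LeapHybridSampler

n = 5000
linear = {i: random.uniform(-1, 1) for i in range(n)}
quadratic = {(i, i + 1): random.uniform(-1, 1) for i in range(n - 1)}
bqm = dimod.BinaryQuadraticModel(linear, quadratic, 0.0, dimod.BINARY)

# The service performs the classical preprocessing and QPU calls internally.
result = LeapHybridSampler().sample(bqm)
print(result.first.energy)
```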

Do you think companies are in a race toward quantum computing?

Baratz: No, they’re not because in some cases they’re not aware, and in some cases they don’t understand what’s possible. The problem is there are very large companies that make a lot of noise in the quantum space and their approach to quantum computing is one where it will take many, many years before companies can actually do something useful with the system. And because they’re the loudest out there, many companies think that’s all there is. We’ve been doing a better job of getting the D-Wave message out, but we still have a ways to go.

What excites you most about quantum computing in business settings?

Baratz: First, just the ability to solve problems that can’t otherwise be solved to me is exciting. When I was at MIT, my doctorate was in theory of computation. I was kind of always of the mindset that there are just some problems that you’re not going to be able to solve with one computer. I wouldn’t say quantum computing removes, but [it] reduces that limitation — that restriction — and it makes it possible to solve problems that couldn’t otherwise be solved.

But more than that, the collection of technologies that we have to use to build our system, even that is exciting. As I mentioned during the panel, we develop new superconducting circuit fabrication recipes, and we’re kind of always advancing the state of the art there. We’re advancing the state of the art in refrigeration and how to remove contaminants from refrigerators, because it can take six weeks to calibrate a quantum computer once you cool it down, once you cool the chip down. If you get contaminants in the refrigerator and you have to warm up the system to remove those contaminants, you’re going to lose two months of compute time. We have systems that run for two and a half, three years, and nobody else really has the technology to do that.

It’s been said that one of the main concerns with quantum computers is increased costs. Can you talk a little more about that?

Baratz: Well, there was that power question [from the panel] about when the power demand is going to stop growing. So, the truth of the matter is our system runs on about 20 kilowatts. The reason is, the only thing we really need power for is the refrigerator, and we can put many chips in one refrigerator. So, as the size of the chip grows, as the number of chips grows, the power doesn’t grow. The power is the refrigerator.

Second, the systems are pricey if you want to buy one, [but through] cloud access, anybody can get it. I mean, we even give free time, but we do sell it, currently at $2,000 an hour, and you can run many, many problems within that amount of time.

Will this be the first technology that CIOs will never have on premises?

Baratz: Oh, never say never. We continue to work on shrinking the size of the system as well so, who knows?


For Sale – TP-Link Archer MR200 AC750 733Mbps Dual Band 4G LTE Mobile Wi-Fi Router – Version 4

Hi,

For sale is a TP-Link Archer MR200 4G router. It’s still in as-new condition (it still has the plastic film covering the top) and hardly used (it was only to cover me whilst I had no broadband at home).

Will come fully boxed. Please note this is the latest version 4.

£75 INC

Location: Crewe
Price and currency: £75
Delivery cost included: Delivery is included
Prefer goods collected? I have no preference
Advertised elsewhere? Advertised elsewhere
Payment method: BT/PP


On-premises server monitoring tools meet business needs, budget

Although the market has shifted and more vendors are providing cloud-based monitoring, there is still a wide range of feature-rich server monitoring tools for organizations that must keep their workloads on site for security and compliance reasons.

Here we examine open source and commercial on-premises server monitoring tools from eight vendors. Although these products broadly achieve the same IT goals, they differ in their approach, complexity of setup — including the ongoing aspects of maintenance and licensing — and cost. 

Cacti

Cacti is an open source network monitoring and graphing front-end application for RRDtool, an industry-standard open source data logging tool. RRDtool is the data collection portion of the product, while Cacti handles network graphing for the data that’s collected. Since both Cacti and RRDtool are open source, they may be practical options for organizations that are on a budget. Cacti support is community-driven.

Cacti can be ideal for organizations that already have RRDtool in place and want to expand on what it can display graphically. For organizations that don’t have RRDtool installed, or aren’t familiar with Linux commands or tools, both Cacti and RRDtool could be a bit of a challenge to install, as they don’t include a simple wizard or agents. This should be familiar territory for Linux administrators, but may require additional effort for Windows admins. Note that Cacti is a graphing product and isn’t really an alerting or remediation product. 
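
As a rough illustration of the RRDtool half of the stack, here is a minimal sketch using RRDtool's Python bindings (the `rrdtool` package; the file name, data source and archive definitions are illustrative, not a recommended layout). Cacti's job is then to poll devices on a schedule and render graphs from round-robin databases like this one.

```python
# Minimal sketch: create a round-robin database and feed it one sample.
# Assumes the rrdtool Python bindings and librrd are installed.
import time
import rrdtool

rrdtool.create(
    'cpu_load.rrd',
    '--step', '300',                 # expect one sample every 5 minutes
    'DS:load:GAUGE:600:0:100',       # data source: name:type:heartbeat:min:max
    'RRA:AVERAGE:0.5:1:288',         # archive: 288 five-minute averages (1 day)
)

# A poller (Cacti's role) would run updates like this on a schedule.
rrdtool.update('cpu_load.rrd', f'{int(time.time())}:0.42')
```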

ManageEngine Applications Manager

The ManageEngine system is part of an extensive line of server monitoring tools that includes application-specific tools as well as cloud and mobile device management. The application monitoring framework enables organizations to purchase agents for various vendors' products, such as Oracle and SAP, as well as tools for custom applications. These server monitoring tools enable admins to perform cradle-to-grave monitoring, which can help them troubleshoot and resolve application server issues before they impact end-user performance. ManageEngine platform strengths include its licensing model and the large number of agents available. Although the monitoring license per device is all-inclusive for the interfaces or sensors needed per device, the agents are sold individually.

Thirty-day trials are available for many of the more than 100 agents. Licensing costs range from less than $1,000 for 25 monitors and one user to more than $7,000 for 250 monitors and one user, with additional users at $245 each. Support costs are often rolled into the cost of the monitors. This can be ideal for organizations that want to make a smaller initial investment and grow over time.

Microsoft System Center Operations Manager

The product monitors servers, enterprise infrastructure and applications, such as Exchange and SQL, and works with both Windows and Linux clients. Microsoft System Center features include configuration management, orchestration, VM management and data protection. System Center isn’t as expansive on third-party applications as it is with native Microsoft applications. System Center is based on core licensing to match Server 2016 and later licensing models.

The base price for Microsoft System Center Operations Manager starts at $3,600, assuming two CPUs and 16 cores total, and can be expanded with core pack licenses. With Microsoft licensing, the larger the environment in terms of CPU cores, the more a customer site can expect to pay. While Microsoft offers a 180-day trial of System Center, this version is designed for larger Hyper-V environments. Support depends on the contract the organization selects.

Nagios Core

Nagios Core is free open source software that provides metrics to monitor server and network performance. Nagios can help organizations provide increased server, services, process and application availability. While Nagios Core comes with a graphical front end, the scope of what it can monitor is somewhat limited. But admins can deploy additional community-provided front ends that offer more views and additional functionality. Nagios Core natively installs and operates on Linux systems and Unix variants.

For additional features and functionality, the commercial Nagios XI product offers true dashboards, reporting, GUI configuration and enhanced notifications. Pricing for this commercial version starts at less than $7,000 for 500 nodes, with an additional $1,500 per enterprise for reporting and capacity planning tools. In addition to agents for OSes, users can also add network monitoring for a single point of service. Free 60-day trials and community support are available for the products that work with the free Nagios Core download.
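
Part of Nagios Core's appeal is how simple its plugin contract is: a check is any executable that prints a status line (optionally with performance data after a pipe) and exits 0, 1, 2 or 3 for OK, WARNING, CRITICAL or UNKNOWN. The sketch below is a hypothetical Python check with illustrative thresholds, not an official Nagios plugin.

```python
#!/usr/bin/env python3
# Hypothetical Nagios-style disk check. The exit-code convention
# (0=OK, 1=WARNING, 2=CRITICAL, 3=UNKNOWN) and the 'text | perfdata'
# output format follow the standard plugin contract; thresholds are
# illustrative.
import shutil
import sys

WARN_PCT, CRIT_PCT = 80, 90

usage = shutil.disk_usage('/')
pct = usage.used / usage.total * 100

if pct >= CRIT_PCT:
    status, code = 'CRITICAL', 2
elif pct >= WARN_PCT:
    status, code = 'WARNING', 1
else:
    status, code = 'OK', 0

print(f'DISK {status} - {pct:.1f}% used | disk_used_pct={pct:.1f}%;{WARN_PCT};{CRIT_PCT}')
sys.exit(code)
```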

Opsview

Opsview system monitoring software includes on-premises agents as well as agents for all the major cloud vendors. While the free version monitors up to 25 hosts, the product's main benefit is that it can support both SMBs and the enterprise. Pricing for a comprehensive offering that includes 300 hosts, reporting, multiple collectors and a network analyzer is less than $20,000 a year, depending on the agents selected.

Enterprise packages are available via custom quote. The vendor offers both on-premises and cloud variations. The list of agents Opsview can monitor is one of the most expansive of any of the products, bridging cloud, application, web and infrastructure. Opsview also offers a dedicated mobile application. Support for most packages is 24/7 and includes customer portals and a knowledgebase.

Paessler PRTG Network Monitor

PRTG can monitor everything from the infrastructure to the application stack. The licensing model for PRTG Network Monitor follows a sensor format rather than a node, core or host model. This means a traditional host might have more than 20 sensors monitoring anything from CPU to bandwidth. Sensors range from networking and bandwidth monitoring to more application-specific checks, such as low Microsoft OneDrive or Dropbox drive space. A fully functional 30-day demo is available, and pricing ranges from less than $6,000 for 2,500 sensors to less than $15,000 for an unlimited number of sensors. Support is email-based.
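
For a feel of the sensor-centric model, here is a hedged sketch that lists sensors over PRTG's documented JSON table API using the `requests` library; the host and credentials are placeholders.

```python
# Hedged sketch: read sensor states from PRTG's /api/table.json endpoint.
# Host, user and passhash are placeholders; PRTG authenticates API calls
# with a username/passhash pair generated in its web UI.
import requests

PRTG = 'https://prtg.example.com'   # placeholder host
params = {
    'content': 'sensors',
    'columns': 'objid,device,sensor,status',
    'username': 'apiuser',          # placeholder credentials
    'passhash': '0000000000',
}
resp = requests.get(f'{PRTG}/api/table.json', params=params, timeout=10)
resp.raise_for_status()
for sensor in resp.json().get('sensors', []):
    print(sensor['device'], sensor['sensor'], sensor['status'])
```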

SolarWinds Server and Application Monitor

SolarWinds offers more than 1,000 monitoring templates for various applications and systems, such as Active Directory, as well as several virtualization platforms and cloud-based applications. It also provides dedicated virtualization, networking, database and security monitoring products. In addition to standard performance metrics, SolarWinds provides application response templates to help admins with troubleshooting. A free 30-day trial is available. Pricing for 500 nodes is $73,995 and includes a year of maintenance.
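
Programmatic access is also available through the SolarWinds Information Service; the sketch below uses the open source `orionsdk` Python client with a placeholder host and credentials to pull a few monitored nodes via SWQL, SolarWinds' SQL-like query language.

```python
# Hedged sketch using the orionsdk client to query the SolarWinds
# Information Service (SWIS). Server name and credentials are placeholders;
# Orion.Nodes is one of the documented SWIS entities.
from orionsdk import SwisClient

swis = SwisClient('orion.example.com', 'apiuser', 'secret')  # placeholders
results = swis.query('SELECT TOP 10 NodeID, Caption, Status FROM Orion.Nodes')
for node in results['results']:
    print(node['NodeID'], node['Caption'], node['Status'])
```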

Zabbix

This free, open source, enterprise-scale monitoring product includes an impressive number of downloadable agents. Although most features aren't point and click, the dashboards are similar to other open source platforms and are more than adequate. Given the zero cost of entry and the sheer number of agents, this could be an ideal product for organizations that have the time and Linux experience to bring it online. Support is community-based, and additional support can be purchased from a reseller.
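
Zabbix's automation story runs through its JSON-RPC API. The sketch below (placeholder URL, with Zabbix's well-known default demo credentials) authenticates and then lists monitored hosts; note that the login parameter is named `user` in older releases and `username` from Zabbix 6.0 onward.

```python
# Hedged sketch of the Zabbix JSON-RPC API: log in, then list hosts.
# URL and credentials are placeholders; 'user' becomes 'username' in
# Zabbix 6.0+, so adjust for the server version in use.
import requests

URL = 'https://zabbix.example.com/api_jsonrpc.php'  # placeholder

def call(method, params, auth=None):
    payload = {'jsonrpc': '2.0', 'method': method,
               'params': params, 'id': 1, 'auth': auth}
    resp = requests.post(URL, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()['result']

token = call('user.login', {'user': 'Admin', 'password': 'zabbix'})
for host in call('host.get', {'output': ['hostid', 'host']}, auth=token):
    print(host['hostid'], host['host'])
```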

The bottom line on server monitoring tools

The products examined here differ slightly in size, scope and licensing model. Outside of the open source products, many commercial server monitoring tools are licensed by node or agent type. It's important that IT buyers understand all the possible options when getting quotes, as the quotes can be difficult to compare.

Pricing varies widely, as do the features of the various server monitoring tools' dashboards. Ensure the staff is comfortable with each system's dashboard and alerting functionality, as well as its mobile capabilities and notifications. If an organization chooses an open source platform, keep in mind that the installation could require more effort if the staff isn't Linux savvy.

The dashboards for the open source monitors typically aren't as graphical as those of the paid products, but that's part of the tradeoff with open source. Many of the commercial products are cloud-ready or offer that option, so even if an organization doesn't plan to monitor its servers in the cloud today, it can take advantage of this technology in the future.



Tableau 2020.1 unveiled for beta testing

Though a couple of weeks still remain in 2019, Tableau is turning its attention to next year, and on Wednesday rolled out Tableau 2020.1 for beta testing.

Though not yet available to the general public, the beta version of Tableau 2020.1 includes 21 features.

Among them is an update to Explain Data, an augmented intelligence feature built directly into Tableau that uses statistical algorithms to analyze data and then explain what is driving specific data points. The update aims to improve the performance of Explain Data, first unveiled in Tableau 2019.3, for wide data sets, and includes refined models to help customers derive deeper insight from their data.

In addition, Tableau 2020.1 includes Dynamic Parameters, which spares users the cumbersome task of republishing a workbook every time the underlying data changes by updating parameters automatically. It also includes improved map-building capabilities, an add-on to Tableau Data Management that will speed up the process of getting to the right data, and improved connectors to Salesforce and Snowflake.

Despite the array of updates and new offerings, the 21 features included in the beta version of Tableau 2020.1 are modest improvements rather than major new capabilities, analysts said.

“It’s all organic growth, incremental improvements,” said Boris Evelson, principal analyst at Forrester. “[Tools like] Explain Data have been a core feature of leading enterprise BI platforms for a while now.”

Similarly, Wayne Eckerson, founder and principal consultant of Eckerson Group, noted that while the platform contains upgrades, they are not innovative new features that will force other vendors to react.

Tableau 2020.1, just released for beta testing, includes an update to Explain Data, an AI tool from the vendor that attempts to explain the reasons behind data points.

“There are a lot of incremental improvements,” he said, “and there’s more movement to Tableau Server and [Tableau Online] to achieve parity with Tableau Desktop.”


One feature not included in Tableau 2020.1 is a low-code data modeling tool.

Tableau, which is based in Seattle, revealed on its website that it plans to provide new data modeling capabilities that will allow customers to analyze data without having to learn advanced database concepts or write custom SQL code.

The capability, however, is only in the alpha testing stage at this point.

“That could be interesting,” Eckerson said. “I suspect it’s a semantic layer, which Tableau has never really had. That would be big news. They need that to keep up with Power BI, which is one of its key differences with Tableau.”

Though not a specific feature, something else not evident in Tableau 2020.1 — at least not in any obvious way — is the influence of Tableau’s acquisition by Salesforce.

Tableau 2020.1 marks Tableau’s second platform update since Nov. 5, when Salesforce and Tableau finally received regulatory approval to proceed with their merger and were allowed to begin working together. But the first update, Tableau 2019.4, came just a day after the companies were freed from their regulatory holdup, so they never had a chance to join forces on it.

And only five weeks passed between the lifting of the regulatory restrictions and the beta release of Tableau 2020.1, still not enough time for Salesforce and Tableau to collaborate significantly on technology.

The only mention of Salesforce among the 21 features in Tableau 2020.1 is the improved connector.

“I’ve seen no indications of Tableau and Salesforce doing any integrations as of yet,” Evelson said, “so this is all business as usual for Tableau.”
