Tag Archives: wants

Xiaomi Air 13

Looking to spend around £500. I already own one, but the other half now wants one too.

What have you got?

Thanks.

Location: Suffolk

1080p Graphics card

Helping a friend build a gaming PC. He’s not fussed about running on ultra, but wants something decent, so I’m looking for a good mid-range graphics card – something like a 960, 1050 or 770, that kind of ballpark.

Lemme know if you’ve got one going. Would be much appreciated.

Location: Barton-upon-Humber

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check that the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

Google got faster at pulling bad Android apps from Play Store

Google wants to reinforce that the Play Store is the safest place for Android users to get apps, releasing a new set of stats showing how its efforts to block bad Android apps have improved.

Andrew Ahn, product manager for Google Play, said the company has “halved the probability” of users installing bad Android apps and also made the Play Store “a more challenging place for those who seek to abuse the app ecosystem for their own gain.”

“In 2017, we took down more than 700,000 apps that violated the Google Play policies, 70% more than the apps taken down in 2016. Not only did we remove more bad apps, we were able to identify and action against them earlier,” Ahn wrote in a blog post. “In fact, 99% of apps with abusive contents were identified and rejected before anyone could install them. This was possible through significant improvements in our ability to detect abuse — such as impersonation, inappropriate content, or malware — through new machine learning models and techniques.”

Liviu Arsene, senior e-threat analyst at Romania-based antimalware firm Bitdefender, said it is “commendable that Google is going through great lengths to optimize the malicious app bouncing process,” considering the more than 3.5 million apps in the Play Store.

“However, malware developers don’t necessarily have to submit ‘bad Android apps’ when they can simply create something that’s barely functional with the sole purpose of getting past the vetting process. Some apps may offer deceptive descriptions and functionalities just to get installed on devices, from which they can request all sorts of permissions for tracking users or for bombarding them with ads,” Arsene told SearchSecurity. “There have been instances where apps walk a very fine line between complying with Google’s advertising policy and spamming users with nag screens, browser redirects, and unsolicited pop-ups just for the sole purpose of generating revenue for the developer. While, granted, they don’t install malware or pilfer personal data, some of them can still be borderline legitimate.”

Will the Play Store catch all the bad apps?

A Google spokesperson told SearchSecurity that there will always be a chance for bad Android apps to slip through because “they evade detection in a sneaky way, or seem to be very borderline cases,” and in those cases Google relies on analyzing how apps are being distributed, monitoring user community flagging and reviewing data from post-install Google Play Protect scans in order to take action on a potentially harmful app.
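Purely as an illustration of the kind of triage described above (and not Google's actual pipeline), the decision can be thought of as weighing several post-publication signals against a threshold. Every name, weight and threshold in the sketch below is hypothetical:

```python
# Hypothetical illustration of combining post-publication signals into a
# takedown/review decision. Signal names, weights and the threshold are
# invented for this sketch and do not reflect Google's actual systems.

def triage_app(distribution_anomaly: float,
               user_flag_rate: float,
               play_protect_detections: int) -> str:
    """Return an action for a published app based on three signals.

    distribution_anomaly and user_flag_rate are assumed to be normalized
    to a 0..1 range; detection counts are capped at 10.
    """
    score = (0.4 * distribution_anomaly                        # unusual install/distribution pattern
             + 0.3 * user_flag_rate                            # share of users flagging the app
             + 0.3 * min(play_protect_detections, 10) / 10)    # post-install scan hits
    if score >= 0.7:
        return "remove"          # strong evidence: pull the app
    if score >= 0.4:
        return "manual_review"   # borderline case: escalate to a human reviewer
    return "monitor"             # keep collecting signals


if __name__ == "__main__":
    # Strong signals on all three fronts push the score to 0.75 -> "remove".
    print(triage_app(distribution_anomaly=0.9, user_flag_rate=0.8,
                     play_protect_detections=5))
```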

“Apps submitted to Google Play are automatically scanned for potentially malicious code as well as spammy developer accounts before they are published on the Google Play Store. To complement that effort, we recently introduced a proactive app review process to catch policy offenders earlier in the process, while still ensuring that developers can get their apps to market as soon as possible — in a matter of hours, not days or weeks,” the spokesperson said. “During that process, apps are specifically reviewed for compliance against our Google Play Developer Content Policy and Developer Distribution Agreement, which prevents things like apps that are impersonating legitimate companies or deceptive behavior.”

Arsene applauded the work done by Google to block bad Android apps “because Android is one of the most popular operating systems.”

“Some built in app scanning features even let users know if they’ve downloaded something malicious from a third-party marketplace, which acts as an additional line of defense,” Arsene said. “However, it’s recommended that everyone owning an Android device, regardless if they install apps from official marketplaces or not, install a mobile security solution as it will have the ability to protect them from much more than just malicious apps, but also against web-based attacks and other online threats.”

Wanted – 1080p Graphics card

Helping a friend build a gaming PC. He’s not fussed about running on ultra, but wants something decent. So, I’m looking for a good mid-range graphics card – something like a 960/1050/770 that kinda ballpark.

Lemme know if you’ve got one going. Would be much appreciated.

Location: Barton-upon-Humber

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

New VEP Charter promises vulnerability transparency

The White House wants more transparency in how federal agencies determine whether or not to disclose software vulnerabilities, but there are early questions regarding how it might work.

The Vulnerabilities Equities Process (VEP) was designed to organize how federal agencies would review vulnerabilities and decide if a flaw should be kept secret for use in intelligence or law enforcement operations or disclosed to vendors. The new VEP Charter announced by Rob Joyce, special assistant to the President and cybersecurity coordinator for the National Security Council, aims to ensure the government conducts “the VEP in a manner that can withstand a high degree of scrutiny and oversight from the citizens it serves.”

“I believe that conducting this risk/benefit analysis is a vital responsibility of the Federal Government,” Joyce wrote in a public statement. “Although I don’t believe withholding all vulnerabilities for operations is a responsible position, we see many nations choose it. I also know of no nation that has chosen to disclose every vulnerability it discovers.”

Joyce laid out the “key tenets” of the new VEP Charter, including increased transparency and an annual report, improved standardization of the process regarding the interests of various stakeholders and increased accountability.

“We make it clear that departments and agencies with protective missions participate in VEP discussions, as well as other departments and agencies that have broader equities, like the Department of State and the Department of Commerce. We also clarify what categories of vulnerabilities are submitted to the process and ensure that any decision not to disclose a vulnerability will be reevaluated regularly,” Joyce wrote. “There are still important reasons to keep many of the specific vulnerabilities evaluated in the process classified, but we will release an annual report that provides metrics about the process to further inform the public about the VEP and its outcomes.”

Questions about the VEP Charter

The VEP has previously been criticized by experts for being optional rather than codified into law, but the VEP Charter does not include language making the process a requirement, nor does it acknowledge the PATCH Act, a bill proposed in Congress that would codify a framework for using the VEP.

Heather West, senior policy manager and Americas principal at Mozilla, noted in a blog post that “many of the goals of the PATCH Act [are] covered in this process release, [but] our overarching goal in codifying the VEP in law to ensure compliance and permanence cannot be met by unilateral executive action.”

Early readings of the VEP Charter have revealed what some consider a conflict of interest, in that the NSA is designated as the VEP Executive Secretariat with the responsibility to “facilitate information flow, discussions, determinations, documentation, and recordkeeping for the process.”

However, the VEP Charter also states that any flaw found in NSA-certified equipment or systems should be “reported to NSA as soon as practical. NSA will assume responsibility for this vulnerability and submit it formally through the VEP Executive Secretariat.”

Additionally, some have taken issue with the following clause in the VEP Charter: “The [U.S. government’s] decision to disclose or restrict vulnerability information could be subject to restrictions by foreign or private sector partners of the USG, such as Non-Disclosure Agreements, Memoranda of Understanding, or other agreements that constrain USG options for disclosing vulnerability information.”

Edward Snowden said on Twitter that this could be considered an “enormous loophole permitting digital arms brokers to exempt critical flaws in U.S. infrastructure from disclosure” by using an NDA.

SAP boosts data integration with SAP Data Hub and Vora

SAP wants to cover as many data integration bases as possible with the recent release of the new SAP Data Hub and updates to SAP Vora.

SAP Data Hub and Vora both attempt to solve similar data integration challenges and provide businesses a means to extract value out of the reams of data they are collecting, but the specific goals for each product are quite different, according to Ken Tsai, SAP’s global vice president and head of cloud platform and data management product marketing.

The recently released Data Hub has a much broader mission, because it is intended to help organizations manage complex data landscapes by building pipelines between a variety of data sources, Tsai said. SAP Vora, which has been on the scene for two years, provides a way to get at data stored in Hadoop data lakes via an Apache Spark framework. SAP Data Hub uses Vora under the covers, but the products are not the same.
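For readers unfamiliar with the pattern, the Spark-based access to Hadoop that Vora builds on can be sketched with plain PySpark. This is generic Spark code against a hypothetical Parquet path and schema, not Vora's own API:

```python
# Minimal PySpark sketch of querying data that lives in a Hadoop data lake.
# The HDFS path and column names are hypothetical; Vora's own Spark
# extensions layer further capabilities on top of this generic pattern.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("datalake-query").getOrCreate()

# Read Parquet files directly from HDFS without moving them anywhere.
orders = spark.read.parquet("hdfs://namenode:8020/datalake/orders")
orders.createOrReplaceTempView("orders")

# Standard SQL over the data lake; the computation runs where the data lives.
top_regions = spark.sql("""
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
    ORDER BY total DESC
    LIMIT 10
""")
top_regions.show()
```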

Similar products, different aims

“SAP Data Hub has a much bigger purpose in terms of building up data flow in order to ensure a more efficient data operation, rather than just doing computing, which SAP Vora aims to do,” Tsai said. “Both are very complementary, and we’re seeing good results so far from SAP Vora. And the idea for SAP Data Hub wouldn’t have come about if we hadn’t been investing in building computing solutions in the Hadoop big data space and seeing what customers needed beyond the vast computing engine directly into Hadoop.”

SAP Data Hub is important now because locating all data centrally is no longer feasible, Tsai said; instead, the product centralizes data management while keeping the data in its source repositories.

“We’re targeting not only the developers, but also the enterprise architects, data scientists [and] business analysts,” he said. “The IT department today is evolving into multiple zones — the application zone, the data warehouse zone, the data lake zone — and each one of them needs to participate in a kind of data flow, as data has to flow from one zone to the other.”

Pharmaceutical distributor McKesson Corp. is one SAP customer that has deployed SAP Data Hub to consolidate data across multiple systems and derive a single source of truth from that data.

“Our work is about helping our customers improve patient care and driving efficiencies across the healthcare value chain,” Adam Fecadu, chief information architect at McKesson, based in St. Paul, Minn., said in a press release. “It starts with relentless focus on helping our customers and partners solve their toughest challenges. With numerous data sources, types and IT landscapes, we need a unified data solution across departments and business units to produce actionable insights and continuous innovation. SAP Data Hub is aligned with this vision.”

SAP Vora dives into the big data lake

Vora is designed to allow businesses to process big data from Hadoop data lakes and derive business value from the data, Tsai said. SAP Vora 2.0 has been rearchitected using Kubernetes containers to improve scalability and reduce deployment complexity.

CenterPoint Energy, an electricity and natural gas utility based in Houston, is using SAP Vora along with SAP HANA to manage and analyze data that it gets from smart meters. Its application uses HANA to track and analyze the health of its infrastructure and grid in real time and moves older data into Hadoop. Vora is used to process and analyze the historical data in Hadoop to determine usage patterns and trends, and this data can be combined with the current HANA data, allowing insights that result in more proactive energy delivery and pricing, Tsai said.
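A rough sketch of that hot/cold split, written in generic PySpark rather than CenterPoint's actual implementation: the JDBC connection details, table and column names below are all hypothetical.

```python
# Hypothetical sketch of combining recent smart-meter readings (served from a
# relational store such as HANA, reached here over a generic JDBC connection)
# with historical readings archived in Hadoop, to compute usage trends.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("meter-trends").getOrCreate()

# Recent, "hot" readings from the operational database (connection details are made up).
recent = (spark.read.format("jdbc")
          .option("url", "jdbc:sap://hana-host:30015")      # hypothetical endpoint
          .option("dbtable", "METER_READINGS_CURRENT")
          .option("user", "analyst").option("password", "***")
          .load())

# Older, "cold" readings archived in the Hadoop data lake.
historical = spark.read.parquet("hdfs://namenode:8020/archive/meter_readings")

# Combine both eras (assuming matching schemas) and compute average daily usage per meter.
readings = recent.unionByName(historical)
trends = (readings
          .withColumn("day", F.to_date("reading_ts"))
          .groupBy("meter_id", "day")
          .agg(F.avg("kwh").alias("avg_kwh")))
trends.show()
```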

Processing data where it lives

Data Hub is a good direction for SAP, because it allows users to work on data where it lives without having to move it, according to Stewart Bond, research director of data integration software at IDC.

“It’s kind of a departure from where SAP’s been in the past, where you have to pull the data into the SAP environment to be able to work with it. But it’s also similar to what we’re seeing in the rest of the market,” Bond said. “Data is getting too big to move around anymore, and people don’t want to move the data around. And the data that is getting moved is a subset. Organizations that use the big data repositories like Hadoop preprocess data before it ends up going into an enterprise data warehouse. And in the preprocessing, things get filtered out, things get cleansed, things get put into smaller shapes — data sets that are smaller than what we have in big data.”
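Bond's point about shrinking data before it reaches the warehouse amounts, in practice, to a small pipeline of filters, cleansing steps and aggregations. A hedged PySpark sketch follows; the table names, columns and rules are invented for illustration:

```python
# Illustrative preprocessing of raw data-lake events before loading a much
# smaller, cleaner extract into an enterprise data warehouse. All paths,
# names and rules here are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pre-warehouse").getOrCreate()

raw = spark.read.parquet("hdfs://namenode:8020/datalake/clickstream")

cleaned = (raw
           .filter(F.col("event_type").isin("purchase", "signup"))  # filter out noise
           .na.drop(subset=["customer_id", "event_ts"])             # cleanse bad records
           .withColumn("country", F.upper(F.trim("country")))       # normalize values
           .dropDuplicates(["event_id"]))                           # de-duplicate

# Put the data "into a smaller shape": one row per customer per day.
daily = (cleaned
         .withColumn("day", F.to_date("event_ts"))
         .groupBy("customer_id", "day")
         .agg(F.count("*").alias("events"),
              F.sum("order_value").alias("revenue")))

# Only this reduced data set would be moved into the warehouse.
daily.write.mode("overwrite").parquet("hdfs://namenode:8020/staging/daily_activity")
```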

SAP Vora is similar, but tries to solve a different problem, Bond explained.

“Vora has been about plugging into the Hadoop big data ecosystem, whereas Data Hub is more of a broader data play, with more of a variety of data sources that they want to connect to and capabilities for working with data in motion or data pipelining,” Bond said. “They’re leveraging the investment that they have in Vora by making that technology and those capabilities available in Data Hub for those times when Data Hub solutions need to tap into a Hadoop ecosystem to do something with the data. But I think they are slightly different problem spaces, and they might be going after a slightly different audience.”

These tools are important because they are creating more opportunities for businesses to solve problems with technology, said Ezra Gottheil, senior analyst with Technology Business Research, a research and analysis firm based in Hampton, N.H. This is a confluence of the core technologies becoming more manageable, the convenience of the cloud as a platform and the next-generation technologies like big data and internet of things.

“There are more different-shaped Lego blocks that are available for those who are creating applications to build with, so everybody is extremely eager to get those tools in the hands of not just developers, but business people,” Gottheil said. “SAP makes applications, and they make some pretty specialized ones as well, but they can only begin to address all the applications that are needed. So, they’ll have to come from customers and third parties, too, but putting out the tools and promoting them is the way they get at that market.”

Need to keep up with the competition

SAP faces a steep competition curve in the market, Bond said, and needs to have a fully developed product to keep up. Oracle, for example, has the Data Integration Platform Cloud, which brings data integration, data quality and data governance to a cloud platform.

“Data Hub is doing something very similar, so the challenge that they’re up against is that they’re talking about going to market with data governance, data pipeline and data integration, but there are still parts of that three-pronged story that need to be developed,” Bond said. “The data governance piece was demonstrated in the launch, but what was demonstrated was more technical-level data governance and really wasn’t that business metadata. So, it’s going to be critical that when they go to market that they have a truly competitive product, because their competitors are going to be there as quickly as they are.”

Huawei wants to grow public cloud market share

Huawei wants to gain public cloud market share and become a dominant public cloud provider, according to Brad Shimmin, an analyst at Current Analysis in Sterling, Va. At its annual Huawei Connect event, the Chinese vendor laid out its plans to grow public cloud market share to compete directly with Google, Microsoft, Amazon and IBM. However, as Shimmin noted, Huawei’s plan is not to dominate in the same way as its competitors — instead the vendor aims to create an open platform that interoperates with other clouds.

Huawei will initially focus its attention on growing public cloud market share among telcos and in its home market, with clients such as China Telecom and China Central TV. Shimmin doubts that Huawei can match other hyperscale cloud providers in scope and scale. Although the vendor recently launched Huawei Enterprise Intelligence AI, Shimmin still sees Huawei’s machine learning ranking far behind the AI capabilities of its competitors. “In my opinion, where Huawei is most likely to succeed with its cloud play is in helping partners and customers apply Huawei’s significant hardware expertise to trenchant problems like cross-cloud security, AI-scale hardware and IoT edge computing,” Shimmin said.

Read more of Shimmin’s thoughts on Huawei.

Achieving container workload performance

Dan Conde, an analyst at Enterprise Strategy Group in Milford, Mass., said that instead of fretting over competition between containers, virtual machines (VMs) and serverless computing, professionals need to focus on the architecture of new applications. Many emerging applications are geared for microservices and depend on new infrastructure to scale and interoperate.

In Conde’s view, what really matters with containers and similar technologies is the performance of the workload, not how the workload is actually run. Having choices is vital — even if it means mixing and matching containers and VMs. The traditional system of underlying platforms and operating systems has been displaced by a much more complicated set of services such as Cassandra, NATS or Apache Spark; cloud platforms; and lower-level offerings such as Apache Mesos or Red Hat OpenShift. “The old notion of a highly curated platform as a service (PaaS) is effectively dead because developers demand choices and the world is changing too rapidly to choose a narrow path. …The analogy would be the five-year plans of the old planned economies. The current world is too dynamic to go down such a narrow path,” Conde said.

Dig deeper into Conde’s thoughts on container workload performance.

Cisco emphasizes LISP for enterprise campuses

Ivan Pepelnjak, writing in ipSpace, responded to questions he received from readers asking why Cisco was pushing LISP instead of EVPN for VXLAN-based enterprise systems. While Pepelnjak admitted that he wasn’t certain of the exact reasons, he suggested that Cisco and a few other large vendors still see a need for large Layer 2 domains. “It looks like the networking industry is in another lemming rush. Everyone is rolling out VXLAN to solve large VLAN challenges, or even replacing MPLS with VXLAN for L3VPN deployments,” Pepelnjak said.

He added that every large vendor is deploying EVPN as a control plane for VXLAN, including Cumulus Networks, Juniper Networks, Cisco and Arista Networks. According to Pepelnjak, LISP is a solution in search of a problem, one that Cisco has chosen to deploy as an additional control plane without any technical factors driving the decision.

Read more of Pepelnjak’s thoughts on LISP.