Tibco analytics capabilities get upgrade in Spotfire X

Spotfire X, the latest iteration of the Tibco analytics and data visualization platform, aims to give users a more streamlined experience by incorporating more AI and machine learning capabilities when the upgraded platform is released this fall.

Notably, the platform update, characterized by what Tibco has dubbed a new “A(X) Experience,” will let users type requests to navigate and visualize their data through natural language processing (NLP), automatically record dataflows that can later be explored and edited, and natively stream data in real time from dozens of sources.

The new Spotfire X features are designed to create a faster and simpler user experience, according to Brad Hopper, vice president of product strategy, analytics and streaming at the integration and analytics software vendor. “This will allow us to take a complete novice off the street, put them in front of the tool, and no matter what they will get something back,” he said.

Search for simple

With the rise of citizen data scientists, self-service analytics vendors have been designing platforms that are easier to use and more automated, turning to AI and machine learning algorithms to do so.

Earlier this year, a Tibco competitor, Tableau, acquired MIT AI startup Empirical Systems, whose technology is expected to provide Tableau platforms with more advanced predictive analytics capabilities and better automated models. Also this year, Qlik, another big-name self-service analytics vendor, acquired startup Podium Data in a bid to better automate parts of its platforms and make them simpler to use.

“There is a trend in the market … for AI and machine learning to be used to explore all the possible data, all the possible variables,” said Rita Sallam, a Gartner analyst.

With the new Spotfire X features, Tibco analytics is looking forward, even if the features aren’t necessarily innovative on their own, she said.

“They’re leveraging natural language as a way to initiate a question and they are, based on that question, generating all the statistically meaningful insight on that data so the user can see all the possible insights on that data,” Sallam said.

The A(X) Experience in Tibco’s Spotfire X enables faster and easier analytics with NLP tools and improved AI.

AI advice

With the A(X) Experience, Spotfire X also will deliver AI-driven recommendations for users.

“We’ve built in a fairly sophisticated machine learning model behind the scenes,” Hopper said.

The Tibco analytics platform can already use AI to automatically index different pieces of data and suggest relationships between them.

Now from the Spotfire X’s NLP-powered search box, users will be able to receive a list of visualization recommendations, starting first with “classical recommendations” before getting to “a ranked list of interesting structural variations,” Hopper explained.

Forrester analyst Boris Evelson said the Tibco analytics and Spotfire X moves are “yet another confirmation of a trend that leading BI products need a dose of AI to remain effective.”

“While AI is not replacing BI, BI tools that infuse AI functionality will displace the tools that don’t,” Evelson said.

Tibco made the Spotfire X announcements during the Tibco Now conference in Las Vegas in early September 2018. 

The enhancements to Tibco analytics capabilities were among several product developments unveiled at the event. Others included a user-partner collaboration program called Tibco Labs, more tools for Tibco Cloud, and a new collaboration between Tibco and manufacturing services company Jabil.

Zoomdata unveils data visualization and analytics channel program

Zoomdata, a data visualization and analytics vendor, has launched a global partner program as it looks to expand its roster of systems integrators.

Unveiled this week, the Zoomdata Application Partner program offers access to support representatives and integrated support systems, training, and sales and marketing resources. Zoomdata’s SI partners can also tap deal registration and tracking through a partner portal, the vendor said.

Zoomdata said in the last year it experienced “3X growth” in channel sales of its data visualization and analytics technology. Many of those sales came from SI partners in European and Asia-Pacific markets. About 30% of Zoomdata’s business is international, said Russ Cosentino, co-founder and vice president of channel sales at Zoomdata, based in Reston, Va.

Cosentino said Zoomdata aims to have a global base of about 100 SI partners within the next year. About 30 partners, including global systems integrators Deloitte, Atos, Hitachi INS and Infosys, currently provide the vendor’s data visualization and analytics tools. Zoomdata also has alliances with regional SIs focused on specific geographic and vertical markets such as pharmaceuticals and life sciences, telecom, and financial services.

“For us, a good partner is a partner that brings to the table skilled resources [from] across the big data ecosystem,” Cosentino said.

Quisitive inks blockchain pact with SaaS provider

Quisitive Technology Solutions Inc., a Microsoft national solution provider, is under contract with Jumptuit, a SaaS company, to build a blockchain solution that tracks subscriptions and entitlements.

Jumptuit, based in New York, offers a search assistant service that uses AI to find users’ documents, photos, audio and video files. The SaaS offering supports Microsoft Office 365, Microsoft OneDrive and other cloud services.

Scotty Perkins, senior vice president of product innovation at Quisitive, said Microsoft referred Jumptuit to Quisitive after the SaaS provider approached Microsoft about a blockchain solution. Quisitive met with Jumptuit, held a requirements workshop and is now in the process of completing a proof of concept, he said. 

Quisitive’s proprietary blockchain solution, which uses Microsoft Azure, will let Jumptuit track active subscriptions, who they belong to and when they are up for renewal. On the entitlement side, the blockchain offering will determine the level of information access users have based on the terms of their subscriptions.

Quisitive, a portfolio company of Fusion Agiletech Partners Inc., focuses on Microsoft technology and has made private blockchain deployment one of its areas of emphasis. Quisitive has offices in Dallas, Denver and Toronto.

Zebra Technologies targets healthcare partners

Zebra Technologies Corp. is offering preferred product pricing to participants in its recently launched healthcare specialization program.

The Lincolnshire, Ill., company targets a number of verticals, but has created a series of purpose-built products for the healthcare industry. Those include mobile computers, barcode scanners, printers, patient wristbands and barcode label supplies.

Bill Cate, vice president of global channel strategy, programs and operations at Zebra Technologies, said healthcare has emerged as the company’s strongest growth opportunity, with the North American market experiencing particularly rapid expansion.

Zebra Technologies’ healthcare specialization, part of the company’s PartnerConnect Partner Program, has four segments:

  • Healthcare solutions specialists — Companies in this category dedicate their entire business model to healthcare. Cate pointed to Cerner and McKesson as examples.
  • Group purchasing organization (GPO) provider specialists — GPOs aggregate purchasing to negotiate vendor discounts for healthcare organizations. This specialization segment is designed for partners that have built healthcare sales groups. Cate cited CDW and Insight Enterprises as examples.
  • Advanced specialists — This segment is for hardware and services integrators that have devoted a large percentage of their businesses to healthcare.
  • Specialists — Partners in this segment include systems integrators that maintain a focus on healthcare, but dedicate a smaller percentage of their business to that vertical compared with advanced specialists.

Benefits for partners obtaining Zebra Technologies’ healthcare specialization include preferred pricing on the company’s purpose-built healthcare products. Other features include performance rebates, deal registration and specialized channel account managers.

Battery tech promises longer flights for drone service providers

Impossible Aerospace has unveiled a drone capable of flying two hours on a single charge, a development the company said dramatically increases the battery-powered flight time available to drone services providers.

The company manufactures the quad-rotor US-1 drones in Sunnyvale, Calif., and plans to begin deliveries in the fourth quarter of 2018. Impossible Aerospace also announced a $9.4 million round of funding, in which Bessemer Venture Partners is taking the lead. The company has raised more than $11 million in funding overall.

Spencer Gore, founder and CEO at Impossible Aerospace, said the two-hour unladen flight time can make an important difference for drone service providers accustomed to flight times in the 20-minute range — or less. Drone missions may be worth hundreds of dollars per hour for drone service providers, but the amount of time such companies spend changing or charging batteries proves an economic drain.

The US-1 drone’s flight time can “help drone service providers counting on more endurance in order to do their work,” Gore said.

Impossible Aerospace is positioning its initial drone as a surveillance product, geared toward law enforcement, public safety and private security organizations. Gore said the longer flight time stems from the company’s engineering approach, which starts with the battery as opposed to the airframe. The company’s design principle is to make sure “everything in the aircraft serves at least two roles,” one of which should be storing, using or transporting electricity, Gore said.

Other news

  • Bomgar’s pending acquisition of BeyondTrust in the privileged access management space could expand sales opportunities for channel partners working with those companies. The combined company will create an integrated channel program. “Based on the sizable scale and mass of BeyondTrust’s channel presence, we will continue to build upon what has already been successful,” said Matt Dircks, CEO at Bomgar. “We expect that the integration process will enable legacy partners to cross-sell both BeyondTrust and Bomgar products by 2019.” The deal is expected to close in October 2018.
  • Adtran, a networking solutions provider, unveiled an enterprise Wi-Fi solution for service providers. The company said the offering features machine learning technology and a cloud-managed IT model.
  • Identity and access management vendor OneLogin has bolstered its partner program. The OneLogin Accelerate program will now feature expanded training programs, new sales tools and incentives, and a partner portal featuring deal management and marketing campaigns, the company said. OneLogin noted that for a limited time it will offer extra margins for enterprise deals.
  • InterVision, a solution provider headquartered in Santa Clara, Calif., and St. Louis, expanded its cloud services capabilities with the acquisition of Infiniti Consulting Group. Infiniti, based in Folsom, Calif., will provide InterVision with expertise in AWS and Azure, hybrid cloud, on-premises cloud services, and software design and development, according to InterVision.
  • Green House Data, a company that provides managed services, cloud hosting and Microsoft advisory services, announced an international expansion. The company said it has open positions in Costa Rica and Sri Lanka, noting that it now provides IT services and consulting from six countries.
  • Kofax has entered an alliance with PricewaterhouseCoopers to provide intelligent automation solutions. Kofax offers robotic process automation and digital transformation products.
  • HyperGrid, a hybrid cloud management vendor, announced the closing of a $25 million Series C funding round. The move follows a year of 300% revenue growth across the company’s enterprise and MSP customer base.
  • Software analytics company New Relic revealed a developer program. The program helps customers and partners take advantage of application and infrastructure data, enhance their New Relic data capabilities, and automate New Relic into their workflows, New Relic said.
  • Nonprofit IT trade association CompTIA introduced a member community focused on emerging technologies. The CompTIA Emerging Technology Community will explore opportunities involving a number of developing technologies, such as IoT, 5G wireless, 3D printing and quantum computing, CompTIA said.
  • NetEnrich, a managed cloud services provider, appointed David Dragonetti to vice president of global sales.
  • Convey Services said it added Mango Voice and ComTec Cloud to its roster of vertical market solutions available through its Channel Accelerator program. Mango Voice, a hosted voice provider, specializes in the hospitality and dental industries. ComTec, meanwhile, provides a unified communications voice platform in the healthcare and education markets.

Market Share is a news roundup published every Friday.

British Airways data breach may be the work of Magecart

The British Airways data breach may have been the handiwork of the threat actor group known as Magecart.

Security researchers at the threat intelligence company RiskIQ Inc. reported that they suspect Magecart was behind the late August British Airways data breach, based on their analysis of the evidence. The Magecart group focuses on online credit card skimming attacks and is believed to be behind the Ticketmaster data breach discovered in June 2018.

On Sept. 6, British Airways reported it had suffered a breach affecting around 380,000 customers. The company said personal and payment information was compromised in transactions made on the website and the mobile app between Aug. 21 and Sept. 5.

In a blog post published a week later, RiskIQ researcher Yonathan Klijnsma noted that the British Airways announcement said the breach affected the website and mobile app, but made no mention of compromised databases or servers, a profile that reminded him of the Ticketmaster breach.

The Ticketmaster breach was caused by a web-based credit card skimming scheme that targeted e-commerce sites worldwide. The RiskIQ team said that the Ticketmaster breach was the work of the hacking group Magecart, and was likely not an isolated incident, but part of a broader campaign run by the group.

The similarities between the Ticketmaster breach and the reports of the British Airways data breach led Klijnsma and the RiskIQ team to look at Magecart’s activity.

“Because these reports only cover customer data stolen directly from payment forms, we immediately suspected one group: Magecart,” Klijnsma wrote. “The same type of attack happened recently when Ticketmaster UK reported a breach, after which RiskIQ found the entire trail of the incident.”

Klijnsma said they were able to expand the timeline of the Ticketmaster activity and discover more websites affected by online credit card skimming.

“Our first step in linking Magecart to the attack on British Airways was simply going through our Magecart detection hits,” Klijnsma explained. “Seeing instances of Magecart is so common for us that we get at least hourly alerts for websites getting compromised with their skimmer-code.”

He noted that in the instance of the British Airways data breach, the research team had no notifications of Magecart’s activity because the hacking group customized their skimmer. However, they examined British Airways’ web and mobile apps specifically and noticed the similarities — and the differences.

“This attack is a simple but highly targeted approach compared to what we’ve seen in the past with the Magecart skimmer which grabbed forms indiscriminately,” Klijnsma wrote. “This particular skimmer is very much attuned to how British Airway’s (sic) payment page is set up, which tells us that the attackers carefully considered how to target this site instead of blindly injecting the regular Magecart skimmer.”

Klijnsma also said it was likely Magecart had access to the British Airways website and mobile app before the attack reportedly started.

“While we can never know how much reach the attackers had on the British Airways servers, the fact that they were able to modify a resource for the site tells us the access was substantial, and the fact they likely had access long before the attack even started is a stark reminder about the vulnerability of web-facing assets,” he wrote.

Magecart, RiskIQ noted, has been active since 2015 and has been growing progressively more threatening as it customizes its skimming schemes for particular brands and companies.

In other news

  • President Donald Trump signed an executive order this week that imposes sanctions on anyone who attempts to interfere with U.S. elections. After Russian interference in the 2016 U.S. presidential election, there are fears of further interference in the upcoming 2018 midterm elections. In response, the order provides for sanctions against foreign companies, organizations or individuals that interfere with U.S. elections. It directs government agencies to report any suspected malicious activity to the director of national intelligence, who will investigate the report and determine its validity. If the director of national intelligence finds that the suspect group or individual has interfered, a 45-day review and assessment period follows, during which the departments of Justice and Homeland Security will decide whether sanctions are warranted. If they are, the foreign group or individual could have their U.S. assets frozen or be banned from the country.
  • A vulnerability in Apple’s Safari web browser enables attackers to launch phishing attacks. Security researcher Rafay Baloch discovered the vulnerability and was also able to replicate it in the Microsoft Edge browser. Baloch published the proof of concept for both browser vulnerabilities early this week, and while Microsoft had addressed the issue in its August Patch Tuesday release — citing an issue with properly parsing HTTP content as the cause — Apple has yet to issue any patches for it. The vulnerability in Safari iOS 11.3.1 could thus still be used to spoof address bars and trick users into thinking they are visiting a legitimate site that is actually malicious.
  • The hacker known as “Guccifer” will be extradited to the U.S. to serve a 52-month prison sentence. A Romanian court ruled that the hacker, who is known for exposing the misuse of Hillary Clinton’s private email server before the 2016 U.S. presidential election and whose real name is Marcel Lehel Lazar, will be extradited to America to serve his 52-month sentence after finishing his seven-year sentence in Romania — his home country. Lazar pleaded guilty in May 2016 to charges of unauthorized access to a protected computer and aggravated identity theft. Lazar is believed to have hacked into the accounts of around 100 people between 2012 and 2014, including former Secretary of State Colin Powell, CBS Sports’ Jim Nantz and Sidney Blumenthal, a former political aide to Bill Clinton and adviser to Hillary Clinton.

The new business imperative: A unified cloud security strategy

As more businesses begin to explore the benefits of moving on-premises data and applications to the cloud, they’re having to rethink their traditional approaches to data security. Not only are cybercriminals developing more sophisticated attacks, but the number of employees and users who can access, edit, and share data has increased the risk of breaches. In fact, Gartner indicates* that “through 2022, 95 percent of cloud security failures will be the customer’s fault. CIOs can combat this by implementing and enforcing policies on cloud ownership, responsibility and risk acceptance. They should also be sure to follow a life cycle approach to cloud governance and put in place central management and monitoring planes to cover the inherent complexity of multicloud use.”

Instead of relying on a patchwork of third-party security solutions that don’t always speak to each other, potentially leaving systems vulnerable to attack, companies are now adopting a unified, end-to-end cloud security defense. This typically involves choosing a cloud provider that can integrate security controls right into existing corporate systems and processes. When these controls span the entire IT infrastructure, they make it easier to protect data and maintain user trust by offering increased compatibility, better performance, and more flexibility.

Protection that’s always compatible

A holistic, cloud-supported threat warning and detection system can be designed to work seamlessly across every asset of an IT environment. For instance, built-in security management solutions can give IT teams the ability to constantly monitor the entire system from a centralized location, rather than manually evaluating different machines. This allows them to sense threats early, provide identity monitoring, and more—all without any compatibility issues.

Container shipping company Mediterranean Shipping Company (MSC) has gone this route. As in many businesses, MSC’s IT environment is spread across a variety of locations, networks, and technologies, such as container ships, trucking networks, and offices. Its previous security strategy employed a mixture of third-party solutions that often ran into compatibility issues between different components, giving attackers a large surface area to probe. This made MSC vulnerable to threats such as fileless attacks, phishing, and ransomware. However, after transitioning to a unified cloud security solution, it has been able to guard against attacks using protection that integrates effortlessly into its existing environment.

Reliable performance, more efficiently

The more complex an IT environment gets, the more time employees spend testing, maintaining, and repairing third-party security solutions. A unified cloud security approach improves performance by not only providing a consistent, layered defense strategy, but by also automating it across the entire IT infrastructure. At MSC, software and security updates are now done automatically and deployed without delay across the cloud. Information about possible threats and breaches can quickly be shared across devices and identities, speeding up response and recovery times so that employees can focus on other issues.

Security with flexibility to grow

Scalability is another factor driving adoption. A cloud environment can easily scale to accommodate spikes in traffic, additional users, or data-intensive applications. A patchwork of third-party security solutions tends not to be so nimble. At MSC, security controls are integrated into multiple levels of the existing IT infrastructure—from the operating system to the application layer—and can be dynamically sized to meet new business needs. For example, continuous compliance controls can be established to monitor regulatory activities and detect vulnerabilities as they grow.

A unified security approach: becoming the standard

The best security solutions perform quietly in the background, protecting users without them noticing. Unified cloud security does this while also reducing the resources required to keep things running smoothly. “Once you have true defense in depth, there’s less chance of having to single out a user and impact their productivity because you have to reimage an infected machine,” said Aaron Shvarts, chief security officer at MSC Technology North America.

After moving its workloads to Azure and upgrading its previous third-party security solutions to the native protection of Windows Defender, MSC now has a defense strategy that suits the complexity of its business. Learn more about Azure security solutions and how Microsoft can help you implement unified security across your cloud.

To stay up to date on the latest news about Microsoft’s work in the cloud, bookmark this blog and follow us on Twitter, Facebook, and LinkedIn.

*Gartner, Smarter with Gartner, Is the Cloud Secure?, 27 March 2018, https://www.gartner.com/smarterwithgartner/is-the-cloud-secure/

Trend Micro apps on Mac accused of stealing data

Researchers charged that multiple apps in the Mac App Store were stealing data, and Apple removed the offending apps from the store, but Trend Micro is now disputing the claims against its apps.

At least eight apps — six Trend Micro apps and two published by a developer who goes by the name “Yongming Zhang” — were found to be gathering data, including web browsing history, App Store browsing history and a list of installed apps, from user systems. Reports about the apps potentially stealing data first appeared on the Malwarebytes forum in late 2017, but the issues were confirmed recently by at least three individuals: Patrick Wardle, CEO and founder of Digita Security; a security researcher based in Germany who goes by the Twitter handle @privacyis1st; and Thomas Reed, director of Mac and mobile at Malwarebytes Labs.

Wardle dug into claims by @privacyis1st that Adware Doctor, the fourth-ranked paid app in the Mac App Store, published by “Yongming Zhang,” was stealing data. At first, the app appeared to Wardle to behave normally, but when it came time to “clean” the user system, he observed it stealing browser history data and a list of installed apps.

“From a security and privacy point of view, one of the main benefits of installing applications from the official Mac App Store is that such applications are sandboxed. (The other benefit is that Apple supposedly vets all submitted applications – but as we’ve clearly shown here, they (sometimes?) do a miserable job),” Wardle wrote in a blog post. “When an application runs inside a sandbox it is constrained by what files or user information it can access. For example, a sandboxed application from the Mac App Store should not be able to access a user’s sensitive browser history. But Adware Doctor clearly found [a way].”

Trend Micro apps and company response

Adware Doctor and another app — Open Any Files: RAR Support — were developed by an unknown developer whose identity is based on the name of a notorious Chinese serial killer, Zhang Yongming, who was executed in 2013 after being convicted of killing 11 boys and young men. In addition to these apps stealing data, Reed noted in his analysis that at least two Trend Micro apps appeared to be acting improperly.

Reed said he “saw the same data being collected and also uploaded in a file named file.zip to the same URL used by Open Any Files” in the app Dr. Antivirus. Reed said Open Any Files and the Trend Micro apps were uploading the zip file to Trend Micro servers.

“Unfortunately, other apps by the same developer are also collecting this data. We observed the same data being collected by Dr. Cleaner, minus the list of installed applications,” Reed wrote in his analysis. “There is really no good reason for a ‘cleaning’ app to be collecting this kind of user data, even if the users were informed, which was not the case.”

Trend Micro confirmed that its apps — Dr. Cleaner, Dr. Cleaner Pro, Dr. Antivirus, Dr. Unarchiver, Dr. Battery and Duplicate Finder — were removed from the Mac App Store, but it denied that the apps were “stealing” data and sending that data to Chinese servers.

The company said in its response that the Trend Micro apps were collecting and uploading “a small snapshot of the browser history on a one-time basis, covering the 24 hours prior to installation,” but claimed this functionality was “for security purposes” and that the actions were permitted by users as part of the EULA agreed to on installation.

Trend Micro linked to a support page for Dr. Cleaner that showed browser history as one of the types of data collected with user permission, but Reed said on Twitter that he kept archived copies of the apps and he did not find any in-app notifications about data collection.

Despite denying any wrongdoing, Trend Micro said it was taking steps to “reassure” users that their data was safe.

“First, we have completed the removal of browser collection features across our consumer products in question. Second, we have permanently dumped all legacy logs, which were stored on US-based AWS servers. This includes the one-time 24 hour log of browser history held for three months and permitted by users upon install,” Trend Micro wrote. “Third, we believe we identified a core issue which is humbly the result of the use of common code libraries. We have learned that browser collection functionality was designed in common across a few of our applications and then deployed the same way for both security-oriented as well as the non-security oriented apps such as the ones in discussion. This has been corrected.”

It is unclear why Open Any Files was uploading data to Trend Micro servers or if Trend Micro was the only company with access to the data uploaded by any of the Trend Micro apps.

Trend Micro did not respond to questions at the time of this post.

Apple’s responsibility in the Mac App Store

Despite being a central figure in the story of the Trend Micro apps being removed from the Mac App Store, the one company that has kept quiet has been Apple. Apple has not made a public statement and did not respond to requests for comment at the time of this post.

Apple claims, “The safest place to download apps for your Mac is the Mac App Store. Apple reviews each app before it’s accepted by the store, and if there’s ever a problem with an app, Apple can quickly remove it from the store.” But, Wardle said “it’s questionable whether these statements actually hold true,” given the number of apps found to be stealing data and Wardle pointed out that the Mac App Store has known issues with fake reviews propping up bad apps.

Stefan Esser, CEO of Antid0te UG, a security audit firm based in Cologne, Germany, also criticized Apple’s response to the claims apps in its store were stealing data.

“The fact that Apple was informed about this weeks ago and [chose] to ignore and that they finally reacted after bad press like two days before their announcement of new products for you to buy is for sure just coincidence,” Esser wrote on Twitter.

And Reed said it’s best to not trust certain apps in the Mac App Store.

Create and configure a shielded VM in Hyper-V

Creating a shielded VM to protect your data is a relatively straightforward process that consists of a few simple steps and PowerShell commands.

A shielded VM depends on the Host Guardian Service (HGS), which runs on a dedicated server separate from the Hyper-V host. The HGS server must not be domain-joined because it is going to take on the role of a special-purpose domain controller. To install HGS, open an administrative PowerShell window and run this command:

Install-WindowsFeature -Name HostGuardianServiceRole -Restart

Once the server reboots, create the required domain. Here, the password is P@ssw0rd and the domain name is PoseyHGS.net. Create the domain by entering these commands:

$AdminPassword = ConvertTo-SecureString -AsPlainText 'P@ssw0rd' -Force

Install-HgsServer -HgsDomainName 'PoseyHGS.net' -SafeModeAdministratorPassword $AdminPassword -Restart

Figure A. Installing the Host Guardian Service server.

The next step in the process of creating and configuring a shielded VM is to create two certificates: an encryption certificate and a signing certificate. In production, you must use certificates from a trusted certificate authority. In a lab environment, you can use self-signed certificates, such as those used in the example below. To create these certificates, use the following commands:

$CertificatePassword = ConvertTo-SecureString -AsPlainText 'P@ssw0rd' -Force
$SigningCert = New-SelfSignedCertificate -DNSName "signing.poseyhgs.net"
Export-PfxCertificate -Cert $SigningCert -Password $CertificatePassword -FilePath 'C:\Certs\SigningCert.pfx'
$EncryptionCert = New-SelfSignedCertificate -DNSName "encryption.poseyhgs.net"
Export-PfxCertificate -Cert $EncryptionCert -Password $CertificatePassword -FilePath 'C:\Certs\EncryptionCert.pfx'

Figure B. Creating the required certificates.

Now, it’s time to initialize the HGS server. To perform the initialization process, use the following command:

Initialize-HgsServer -HgsServiceName 'hgs' -SigningCertificatePath 'C:\Certs\SigningCert.pfx' -SigningCertificatePassword $CertificatePassword -EncryptionCertificatePath 'C:\Certs\EncryptionCert.pfx' -EncryptionCertificatePassword $CertificatePassword -TrustTpm

Figure C. The initialization process.

The last thing you need to do when provisioning the HGS server is to set up conditional domain name service (DNS) forwarding. To do so, use the following commands:

Add-DnsServerConditionalForwarderZone -Name "PoseyHGS.net" -ReplicationScope "Forest" -MasterServers <DNS server IP>

Netdom trust PoseyHGS.net /domain:PoseyHGS.net /userD:PoseyHGS.net\Administrator /passwordD:<password> /add

In the process of creating and configuring a shielded VM, the next step is to add the guarded Hyper-V host to the Active Directory (AD) domain that you just created. You must create a global AD security group called GuardedHosts. You must also set up conditional DNS forwarding on the host so the host can find the domain controller.
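
The article doesn’t show commands for these preparatory steps; a minimal sketch of what they might look like follows, with the DNS server address purely a hypothetical placeholder:

# On the domain controller: create the global security group for guarded hosts.
New-ADGroup -Name 'GuardedHosts' -GroupScope Global

# On the Hyper-V host: resolve the HGS domain via the HGS domain controller
# (192.168.0.10 is a hypothetical address), then join the domain.
Add-DnsClientNrptRule -Namespace '.poseyhgs.net' -NameServers '192.168.0.10'
Add-Computer -DomainName 'PoseyHGS.net' -Restart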

Once all of that is complete, retrieve the security identifier (SID) for the GuardedHosts group, and then add that SID to the HGS attestation host group. From the domain controller, enter the following command to retrieve the group’s SID:

Get-ADGroup “GuardedHosts” | Select-Object SID

Once you know the SID, run this command on the HGS server:

Add-HgsAttestationHostGroup -Name "GuardedHosts" -Identifier "<SID>"
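
For illustration, a minimal sketch of the full round trip, assuming the SID string is carried from the domain controller over to the HGS server:

# On the domain controller: capture the group's SID as a string.
$sid = (Get-ADGroup 'GuardedHosts').SID.Value

# On the HGS server: register the group by that SID (pasted in as text).
Add-HgsAttestationHostGroup -Name 'GuardedHosts' -Identifier $sid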

Now, it’s time to create a code integrity policy on the Hyper-V server. To do so, enter the following commands:

New-CIPolicy -Level FilePublisher -Fallback Hash -FilePath 'C:\Policy\HWLCodeIntegrity.xml'

ConvertFrom-CIPolicy -XmlFilePath 'C:\Policy\HWLCodeIntegrity.xml' -BinaryFilePath 'C:\Policy\HWLCodeIntegrity.p7b'

Now, copy the P7B file you just created to the HGS server. From there, run these commands:

Add-HgsAttestationCIPolicy -Path 'C:\HWLCodeIntegrity.p7b' -Name 'StdGuardHost'

Get-HGSServer

At this point, the server should display an attestation URL and a key protection URL. Be sure to make note of both of these URLs. Now, go back to the Hyper-V host and enter this command:

Set-HgsClientConfiguration -KeyProtectionServerUrl "<key protection URL>" -AttestationServerUrl "<attestation URL>"
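
HGS endpoint URLs conventionally take the form http://<HGS FQDN>/Attestation and http://<HGS FQDN>/KeyProtection, so a filled-in example, with a hypothetical hostname, might read:

# The hostname is illustrative; use the URLs reported by Get-HgsServer.
Set-HgsClientConfiguration -KeyProtectionServerUrl 'http://hgs.poseyhgs.net/KeyProtection' -AttestationServerUrl 'http://hgs.poseyhgs.net/Attestation'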

To wrap things up on the Hyper-V server, retrieve an XML file from the HGS server and import it. You must also define the host’s HGS guardian. Here are the commands to do so:

Invoke-WebRequest "http://<HGS server FQDN>/KeyProtection/service/metadata/2014-07/metadata.xml" -OutFile 'C:\Certs\metadata.xml'

Import-HgsGuardian -Path 'C:\Certs\metadata.xml' -Name 'PoseyHGS' -AllowUntrustedRoot

Figure D. Shield a Hyper-V VM by selecting a single checkbox.

Once you import the host guardian into the Hyper-V server, you can use PowerShell to configure a shielded VM. However, you can also enable shielding directly through the Hyper-V Manager by selecting the Enable Shielding checkbox on the VM’s Settings screen, as shown in Figure D above.
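
For reference, a minimal PowerShell sketch of that same shielding step, assuming a VM named 'VM01' (a hypothetical name) and the 'PoseyHGS' guardian imported above:

# Build a key protector that names the owner and the HGS guardian.
$owner = New-HgsGuardian -Name 'Owner' -GenerateCertificates
$guardian = Get-HgsGuardian -Name 'PoseyHGS'
$kp = New-HgsKeyProtector -Owner $owner -Guardian $guardian -AllowUntrustedRoot

# Apply the key protector, enable the virtual TPM and shield the VM.
Set-VMKeyProtector -VMName 'VM01' -KeyProtector $kp.RawData
Enable-VMTPM -VMName 'VM01'
Set-VMSecurityPolicy -VMName 'VM01' -Shielded $true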

For Sale – Netgear GS724T 24-port Gigabit Smart Switch

Had this from new and never thought I’d be selling it, so didn’t keep the box. It’s been in a data cabinet its whole life. I’ve lost track of when I bought it, but it has been faultless.

Really is like new, but I seem to have lost one of the plastic plugs for one of the SFP ports. Obviously has the rack ears to install in a rack.

Had my whole house wired for cat6, so went for a rack mount switch and it has been excellent.

These have a lifetime warranty I believe. This is the v3. I see there is a v4 out now, but these things don’t change that much.

Pick up most welcome from N10 (North London) or plus £8 post.

I can include a cat6 network cable and have more available.

Need it gone as I’ve moved into a smaller place, so may try elsewhere soon.

Thanks for looking.
[Attached images: IMG_6617.JPG, IMG_6618.JPG, IMG_6619.JPG, IMG_6620.JPG, IMG_6621.JPG]

Price and currency: £50
Delivery: Delivery cost is not included
Payment method: Bank transfer or PayPal F&F
Location: N10
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected


Samsung adds Z-NAND data center SSD

The lineup of data center solid-state drives Samsung introduced this week — including a Z-NAND model — targets smaller organizations facing demanding workloads such as in-memory databases, artificial intelligence and IoT.

The fastest option in the Samsung data center SSD family — the 983 ZET NVMe-based PCIe add-in card — uses the company’s latency-lowering Z-NAND flash chips. Earlier this year, Samsung announced its first Z-NAND-based enterprise SSD, the SZ985, designed for the OEM market. The new 983 ZET SSD targets SMBs, including system builders and integrators, that buy storage drives through channel partners.

The Samsung data center SSD lineup also adds the first NVMe-based PCIe SSDs designed for channel sales in 2.5-inch U.2 and 22-mm-by-110-mm M.2 form factors. At the other end of the performance spectrum, the new entry-level 2.5-inch 860 DCT 6 Gbps SATA SSD targets customers who want an alternative to client SSDs for data center applications, according to Richard Leonarz, director of product marketing for Samsung SSDs.

Rounding out the Samsung data center SSD product family is a 2.5-inch 883 DCT SATA SSD that uses denser 3D NAND technology (which Samsung calls V-NAND) than comparable predecessor models. Samsung’s PM863 and PM863a SSDs use 32-layer and 48-layer V-NAND, respectively, but the new 883 DCT SSD is equipped with triple-level cell (TLC) 64-layer V-NAND chips, as are the 860 DCT and 983 DCT models, Leonarz said.

Noticeably absent from the Samsung data center SSD product line is 12 Gbps SAS. Leonarz said research showed SAS SSDs trending flat to downward in terms of units sold. He said Samsung doesn’t see a growth opportunity for SAS on the channel side of the business that sells to SMBs such as system builders and integrators. Samsung will continue to sell dual-ported enterprise SAS SSDs to OEMs.

The Samsung 983 ZET NVMe SSD uses latency-lowering Z-NAND flash chips.

Z-NAND-based SSD uses SLC flash

The Z-NAND technology in the new 983 ZET SSD uses high-performance single-level cell (SLC) V-NAND 3D flash technology and builds in logic to drive latency down to lower levels than standard NVMe-based PCIe SSDs that store two or three bits of data per cell.

Samsung positions the Z-NAND flash technology it unveiled at the 2016 Flash Memory Summit as a lower-cost, high-performance alternative to new 3D XPoint nonvolatile memory that Intel and Micron co-developed. Intel launched 3D XPoint-based SSDs under the brand name Optane in March 2017, and later added Optane dual inline memory modules (DIMMs). Toshiba last month disclosed its plans for XL-Flash to compete against Optane SSDs.

Use cases for Samsung’s Z-NAND NVMe-based PCIe SSDs include cache memory, database servers, real-time analytics, artificial intelligence and IoT applications that require high throughput and low latency.

“I don’t expect to see millions of customers out there buying this. It’s still going to be a niche type of solution,” Leonarz said.

Samsung claimed its SZ985 NVMe-based PCIe add-in card delivers 5.5 times lower latency than top NVMe-based PCIe SSDs. Product data sheets list the SZ985’s maximum performance at 750,000 IOPS for random reads and 170,000 IOPS for random writes, with data transfer rates of 3.2 gigabytes per second (GBps) for sequential reads and 3 GBps for sequential writes.

The new Z-NAND based 983 ZET NVMe-based PCIe add-in card is also capable of 750,000 IOPS for random reads, but the random write performance is lower at 75,000 IOPS. The data transfer rate for the 983 ZET is 3.4 GBps for sequential reads and 3 GBps for sequential writes. The 983 ZET’s latency for sequential reads and writes is 15 microseconds, according to Samsung.
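
As a rough sanity check on how those random-read numbers relate to throughput, assuming the common 4 KB transfer size for random I/O (the transfer size isn’t stated here):

# 750,000 random reads/sec x 4 KB per read, expressed in GiB/sec.
$iops = 750000
$throughputGiB = ($iops * 4KB) / 1GB   # roughly 2.9 GiB/sec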

Both the SZ985 and new 983 ZET are half-height, half-length PCIe Gen 3 add-in cards. Capacity options for the 983 ZET will be 960 GB and 480 GB when the SSD ships later this month. SZ985 SSDs are currently available at 800 GB and 240 GB, although a recent product data sheet indicates 1.6 TB and 3.2 TB options will be available at an undetermined future date.

Samsung’s SZ985 and 983 ZET SSDs offer significantly different endurance levels over the five-year warranty period. The SZ985 is rated at 30 drive writes per day (DWPD), whereas the new 983 ZET supports 10 DWPD with the 960 GB SSD and 8.5 DWPD with the 480 GB SSD.

Samsung data center SSD endurance

The rest of the new Samsung data center SSD lineup is rated at less than 1 DWPD. The entry-level 860 DCT SATA SSD supports 0.20 DWPD over five years or 0.34 DWPD over three years. The 883 DCT SATA SSD and 983 DCT NVMe-based PCIe SSD are officially rated at 0.78 DWPD over five years, with a three-year option of 1.30 DWPD.
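
To put DWPD ratings in concrete terms, drive writes per day multiply out to total bytes written over the warranty: DWPD x capacity x 365 x years. A quick check using the 883 DCT’s five-year rating and (as an assumption) its 1.92 TB capacity point:

# 0.78 drive writes/day on a 1.92 TB drive for five years.
$tbWritten = 0.78 * 1.92 * 365 * 5   # roughly 2,733 TB written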

Samsung initially targeted content delivery networks with its 860 DCT SATA SSD, which is designed for read-intensive workloads. Sequential read/write performance is 550 megabytes per second (MBps) and 520 MBps, and random read/write performance is 98,000 IOPS and 19,000 IOPS, respectively, according to Samsung. Capacity options range from 960 GB to 3.84 TB.

“One of the biggest challenges we face whenever we talk to customers is that folks are using client drives and putting those into data center applications. That’s been our biggest headache for a while, in that the drives were not designed for it. The idea of the 860 DCT came from meeting with various customers who were looking at a low-cost SSD solution in the data center,” Leonarz said.

He said the 860 DCT SSDs provide consistent performance for round-the-clock operation with potentially thousands of users pinging the drives, unlike client SSDs that are meant for lighter use. The cost per GB for the 860 DCT is about 25 cents, according to Leonarz.

The 883 DCT SATA SSD is a step up, at about 30 cents per GB, with additional features such as power loss protection. The performance metrics are identical to the 860 DCT, with the exception of its higher random writes of 28,000 IOPS. The 883 DCT is better suited to mixed read/write workloads for applications in cloud data centers, file and web servers and streaming media, according to Samsung. Capacity options range from 240 GB to 3.84 TB.

The 983 DCT NVMe-PCIe SSD is geared for I/O-intensive workloads requiring low latency, such as database management systems, online transaction processing, data analytics and high performance computing applications. The 2.5-inch 983 DCT in the U.2 form factor is hot swappable, unlike the M.2 option. Capacity options are 960 GB and 1.92 TB for both form factors. Pricing for the 983 DCT is about 34 cents per GB, according to Samsung.

The 983 DCT’s sequential read performance is 3,000 MBps for each of the U.2 and M.2 983 DCT options. The sequential write performance is 1,900 MBps for the 1.92 TB U.2 SSD, 1,050 MBps for the 960 GB U.2 SSD, 1,400 MBps for the 1.92 TB M.2 SSD, and 1,100 MBps for the 960 GB M.2 SSD. Random read/write performance for the 1.92 TB U.2 SSD is 540,000 IOPS and 50,000 IOPS, respectively. The read/write latency is 85 microseconds and 80 microseconds, respectively.

The 860 DCT, 883 DCT and 983 DCT SSDs are available now through the channel, and the 983 ZET is due later this month.


Graph processing gives credit analysis firms an edge

Graph databases have emerged as yet another way to connect data points — but graph processing requirements for big data have sometimes kept them out of the reach of real-time, operational analytics.

Startup TigerGraph has come along recently with its take on graph processing, one the company said will prove to be a fit in fintech and other applications that seek to disrupt business as usual.

Led by former Teradata Hadoop engineer Yu Xu, TigerGraph came out of stealth in 2017, claiming such notable users as Visa and the online payment platform Alipay for its graph database technology. The software has been used by these players, for example, to speed up credit checks and other traditionally time-consuming financial processes. 

Credit worthy

According to the company, TigerGraph supports a massively parallel processing architecture in which graph nodes — the company uses the less common term “vertices” — exhibit both compute and storage features; employs a parallel loader to speed data ingestion; and has fashioned a GSQL analytics language to produce parallel graph queries.

IceKredit has found those features useful in its efforts to expand the availability of credit ratings and risk assessments, according to Minqi Xie, vice president and director of modeling and business intelligence at the financial technology company.

“We have very large data sets with hundreds of millions of [nodes], and we need to mine the relationships at depth,” said Xie, who works to ensure IceKredit provides useful online credit ratings and risk monitoring services for companies and individuals.

Xie indicated IceKredit uses graph analytics to uncover connections within data sets that can identify patterns of risk. Such patterns must be uncovered more and more quickly in the fast-moving fintech space, where credit approvals that once took a week are now accomplished in minutes.

“TigerGraph made it feasible for us to leverage the features from relationship networks for real-time scoring,” he said.

TigerGraph development centers on graph nodes, or, in the company’s parlance, vertices. Users can define vertex types and edge types to model a data schema, which can be loaded into working graphs.

Graph processing cuts through data molasses

Like others, Gaurav Deshpande, a longtime IBM data hand who earlier this year became vice president of marketing at TigerGraph, points to graph processing as superior to relational database schemes when it comes to representing interconnected relationships between data points.

But, Deshpande noted, how quickly those connections can be analyzed can be an obstacle.

Traditionally, he said, graph databases “weren’t ‘operational’ when it came to data volumes beyond 50 to 100 GBs. Performance slowed down. When you got to more than 500 GBs, performance was like molasses.”

Such systems were useful for visualizing data, but “they were nothing that could be used operationally,” Deshpande said. The parallelizing tactics TigerGraph applies to its graph database, he indicated, are intended to bring graph processing technology closer to operations.

Complex analytics at scale

TigerGraph is looking to join the still fairly small ranks of graph NoSQL database makers, which take a number of diverse approaches to implementing the technology. These include vendors such as Neo4j, IBM and Cambridge Semantics, as well as Amazon and Microsoft, which have moved to bring graph processing more mainstream with their respective AWS Neptune and Azure Cosmos DB cloud offerings.

“Historically, almost all the graph database vendors have focused on operational capabilities. They support a certain level of analytics and query processing but not complex analytics at scale,” said Philip Howard, analyst at Bloor Research.

“TigerGraph, on the other hand, is specifically targeted at precisely those sorts of use cases, so you can’t really compare it with most of the other suppliers,” he said.

There are some precedents, however, and signs of change. Howard noted that supercomputer maker Cray offers graph analytics as part of its portfolio, and he sees others ready to follow suit.