IBM Cloud Pak for Security aims to unify hybrid environments

IBM this week launched Cloud Pak for Security, which experts say represents a major strategy shift for Big Blue’s security business

IBM’s Cloud Pak for Security aims to create a platform, built on open-source technology, that can connect security tools from multiple vendors and cloud platforms to help reduce vendor lock-in. IBM Cloud Paks are pre-integrated, containerized software running on Red Hat OpenShift; IBM previously offered five Cloud Paks — Applications, Data, Integration, Automation and Multicloud Management — which can be mixed and matched to meet enterprise needs.

Chris Meenan, director of offering management and strategy at IBM Security, told SearchSecurity that Cloud Pak for Security was designed to tackle two “big rock problems” for infosec teams. The first aim was to help customers get data insights through federated search of their existing data without having to move it to one place. Second was to help “orchestrate and take action across all of those systems” via built-in case management and automation. 

Meenan said IT staff will be able to take actions across a multi-cloud environment, including “quarantining users, blocking IP addresses, reimaging machines, restarting containers and forcing password resets.”

“Cloud Pak for Security is the first platform to take advantage of STIX-Shifter, an open-source technology pioneered by IBM that allows for unified search for threat data within and across various types of security tools, datasets and environments,” Meenan said. “Rather than running separate, manual searches for the same security data within each tool and environment you’re using, you can run a single query with Cloud Pak for Security to search across all security tools and data sources that are connected to the platform.” 
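
Meenan’s description implies a fan-out model: one STIX pattern is translated into each connected tool’s native query language, executed at the source, and the results are merged, so the data never has to move. Below is a minimal Python sketch of that idea; the Connector interface and the stubbed QRadar lambdas are illustrative assumptions, not IBM’s actual STIX-Shifter API.

    # Hypothetical federated search in the STIX-Shifter style: translate one
    # STIX pattern per tool, run it at the source, merge the results.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Connector:
        name: str                             # e.g. "qradar", "splunk"
        translate: Callable[[str], str]       # STIX pattern -> native query
        execute: Callable[[str], List[Dict]]  # native query -> result rows

    def federated_search(pattern: str, connectors: List[Connector]) -> List[Dict]:
        """Run one STIX pattern across every tool; data stays at the source."""
        results: List[Dict] = []
        for c in connectors:
            rows = c.execute(c.translate(pattern))  # tool's own query dialect
            results.extend({"source": c.name, **row} for row in rows)
        return results

    # Stubbed example: the same indicator searched across one fake tool.
    pattern = "[ipv4-addr:value = '198.51.100.7']"
    qradar = Connector("qradar",
                       lambda p: "SELECT * FROM events WHERE sourceip='198.51.100.7'",
                       lambda q: [{"event": "firewall_deny"}])
    print(federated_search(pattern, [qradar]))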

Meenan added that Cloud Pak for Security represented a shift in IBM Security strategy because of its focus on delivering “security solutions and outcomes without needing to own the data.”

“That’s probably the biggest shift — being able to deliver that to any cloud or on-premise the customer needs,” Meenan said. “Being able to deliver that without owning the data means organizations can deploy any different technology and it’s not a headwind. Now they don’t need to duplicate the data. That’s just additional overhead and introduces friction.”

One platform to connect them all

Meenan said IBM was “very deliberate” to keep data transfers minimal, so at first Cloud Pak for Security will only take in alerts from connected vendor tools and search results.

“As our Cloud Pak develops, we plan to introduce some capability to create alerts and potentially store data as well, but as with other Cloud Paks, the features will be optional,” Meenan said. “What’s really fundamental is we’ve designed a Cloud Pak to deliver applications and outcomes but you don’t have to bring the data and you don’t have to generate the alerts. Organizations have a SIEM in place, they’ve got an EDR in place, they’ve got all the right alerts and insights, what they’re really struggling with is connecting all that in a way that’s easily consumable.”

In order to create the connections to popular tools and platforms, IBM worked with clients and service providers. Meenan said some connectors were built by IBM and some vendors built their own connectors. At launch, Cloud Pak for Security will include integration for security tools from IBM, Carbon Black, Tenable, Elastic, McAfee, BigFix and Splunk, with integration for Amazon Web Services and Microsoft Azure clouds coming later in Q4 2019, according to IBM’s press release.

Ray Komar, vice president of technical alliances at Tenable, said that from an integration standpoint, Cloud Pak for Security “eliminates the need to build a unique connector to various tools, which means we can build a connector once and reuse it everywhere.”

“Organizations everywhere are reaping the benefits of cloud-first strategies but often struggle to ensure their dynamic environments are secure,” Komar told SearchSecurity. “With our IBM Cloud Pak integration, joint customers can now leverage vulnerability data from Tenable.io for holistic visibility into their cloud security posture.”

Jon Oltsik, senior principal analyst and fellow at Enterprise Strategy Group, based in Milford, Mass., told SearchSecurity that he likes this new strategy for IBM and called it “the right move.”

“IBM has a few strong products but other vendors have much greater market share in many areas. Just about every large security vendor offers something similar, but IBM can pivot off QRadar and Resilient and extend its footprint in its base. IBM gets this and wants to establish Cloud Pak for Security as the ‘brains’ behind security. To do so, it has to be able to fit nicely in a heterogeneous security architecture,” Oltsik said. “IBM can also access on-premises data, which is a bit of unique implementation. I think IBM had to do this as the industry is going this way.”

Martin Kuppinger, founder and principal analyst at KuppingerCole Analysts AG, based in Wiesbaden, Germany, said Cloud Pak for Security should be valuable for customers, specifically “larger organizations and MSSPs that have a variety of different security tools from different vendors in place.”

“This allows for better incident response processes and better analytics. Complex attacks today might span many systems, and analysis requires access to various types of security information. This is simplified, without adding yet another big data lake,” Kuppinger told SearchSecurity. “Obviously, Security Cloud Pak might be perceived competitive by incident response management vendors, but it is open to them and provides opportunities by building on the federated data. Furthermore, a challenge with federation is that the data sources must be up and running for accessing the data — but that can be handled well, specifically when it is only about analysis; it is not about real-time transactions here.”

The current and future IBM Security products

Meenan told SearchSecurity that Cloud Pak for Security would not have any special integration with IBM Security products, which would “have to stand on their own merits” in order to be chosen by customers. However, Meenan said new products in the future will leverage the connections enabled by the Cloud Pak.

“Now what this platform allows us to do is to deliver new security solutions that are naturally cross-cutting, that require solutions that can sit across an EDR, a SIEM, multiple clouds, and enable those,” Meenan said. “When we think about solutions for insider threat, business risk, fraud, they’re very cross-cutting use cases so anything that we create that cuts across and provides that end-to-end security, absolutely the Cloud Pak is laying the foundation for us — and our partners and our customers — to deliver that.”

Oltsik said IBM’s Cloud Pak for Security has a “somewhat unique hybrid cloud architecture” but noted that it is “a bit late to market and early versions won’t have full functionality.”

“I believe that IBM delayed its release to align it with what it’s doing with Red Hat,” Oltsik said. “All that said, IBM has not missed the market, but it does need to be more aggressive to compete with the likes of Cisco, Check Point, FireEye, Fortinet, McAfee, Palo Alto, Symantec, Trend Micro and others with similar offerings.”

Kuppinger said that from an overall IBM Security perspective, this platform “is rather consequent.”

“IBM, with its combination of software, software services, and implementation/consultancy services, is targeted on such a strategy of integration,” Kuppinger wrote via email. “Not owning data definitely is a smart move. Good architecture should segregate data, identity, and applications/apps/services. This allows for reuse in modern, service-oriented architectures. Locking-in data always limits that reusability.”

Microsoft challenges Amazon with Dynamics 365 Commerce

Microsoft filled a major gap in its customer experience stack with the Dynamics 365 Commerce online sales platform, giving customers that own physical stores more technology to drive bottom-line revenues. The e-commerce platform is joined by another new app, the Dynamics 365 Connected Store, which combines data collected online with data collected at brick-and-mortar stores.

The idea is not only to enable online sales for traditional retailers, but also to help customers continue their online shopping experiences when they set foot inside a store location, said Alysa Taylor, corporate vice president for business applications and global industry at Microsoft, in a blog post.

Together with other new AI features and data tools added to existing Dynamics 365 applications, Microsoft is giving retailers a strong alternative to Amazon’s platform — but more importantly, it’s challenging integrated CX stacks from Salesforce and Oracle, said Forrester analyst Kate Leggett.

“You can’t support the customer through their end-to-end journey without an e-commerce pillar,” said Leggett, who added that Dynamics 365 Commerce might not be a great leap forward as an e-commerce platform, but it catches Microsoft up to the pack. “It was a real hole in Microsoft’s portfolio.”

Microsoft is focusing its e-commerce platform on B2C retailers for now, Leggett said. Technology vendors sometimes have separate e-commerce platforms for B2B and B2C customers, but Microsoft said it plans to build out the B2C side and add B2B-centric features later.

Dynamics 365 Connected Store adds data insights

Dynamics 365 Commerce paired with Connected Store creates a platform for AI and machine learning for behavioral data analysis that can trace customer journeys from online research to their movements through a physical store as they shop. Moreover, Dynamics 365 Connected Store helps store employees personalize their interactions with individual customers by showing them, for example, what the customer was looking at online before they came in.

Connected Store’s data tools can help optimize store operations on a day-to-day basis by, for example, summoning clerks via phone notifications to help check out customers during busy times. It also analyzes video and inventory data to report on longer-term buying patterns to promote inventory and merchandising efficiencies within a store or region.

“It’s about real-time insights, connected data and analytics — having that data available to deliver outcomes you need,” Leggett said.

Also previewed by Microsoft were related new features for existing applications, including Dynamics 365 Customer Insights, which aggregates IoT data from goods such as connected kitchen appliances that contain sensors transmitting data back to the manufacturer. Another was a set of tools within Dynamics 365 Virtual Agent for Customer Service to make Microsoft chatbots easier to customize and deploy.

Dynamics 365 Connected Store currently is in private preview, while Dynamics 365 Commerce is in public preview. A Microsoft spokesperson said general availability dates and pricing would be revealed in the “coming months.”

For Sale – Synology DS413 4 bay NAS – £135

Discussion in ‘Desktop Computer Classifieds‘ started by DougAP, Sep 7, 2019 at 8:12 PM.

  1. DougAP (Well-known Member)

    Bought from the below thread but never used (or even opened!):

    For Sale – Synology DS413 4 bay NAS

    Price and currency: £135 (reduced from £150)
    Delivery: Delivery cost is included within my country
    Payment method: Ppg/bacs
    Location: Hampshire
    Advertised elsewhere?: Not advertised elsewhere
    Prefer goods collected?: I have no preference

    ______________________________________________________
    This message is automatically inserted in all classifieds forum threads.
    By replying to this thread you agree to abide by the trading rules detailed here.
    Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

    • Landline telephone number. Make a call to check out the area code and number are correct, too
    • Name and address including postcode
    • Valid e-mail address

    DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

    Last edited: Sep 9, 2019 at 10:11 AM
  2. tmknight (Active Member)

    £120 delivered any good to you?

  3. DougAP (Well-known Member)

    That’s a bit low I’m afraid, especially as I’m in no rush and it’s just been listed. If you could meet in the middle at £135 I’d do that (my very lowest).

  4. tmknight (Active Member)

    Thank you for the counter offer, if you change your mind my offer is on the table.

  5. DougAP (Well-known Member)
  6. tmknight (Active Member)

    Having thought about it, I’ll take it for £135 delivered. If you can pm me your details, I’ll BT the funds to you.

    Cheers

  7. DougAP (Well-known Member)

    Great! Will PM now.

  8. tmknight (Active Member)

    Payment made.

  9. DougAP (Well-known Member)

    And received.

Experts say there’s still a long road ahead for the FHIR standard

A major issue hindering interoperability in healthcare is a lack of data standardization, something federal regulators are trying to change by pushing adoption of the Fast Healthcare Interoperability Resources standard.

FHIR is an interoperability standard developed by Health Level Seven International (HL7) for the electronic exchange of health data. The FHIR standard has gone through multiple iterations and taken five years to develop. It sets a consistent description for healthcare data formats and application programming interfaces that healthcare organizations can use to exchange electronic health records.

In a set of proposed rules for interoperability from the Office of the National Coordinator (ONC) for Health IT and the Centers for Medicare and Medicaid Services (CMS), the agencies would require healthcare organizations to use FHIR-enabled healthcare APIs that would allow patients to download their standardized electronic health information into a healthcare app on their smartphones.
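
In practice, the FHIR-enabled APIs the rules describe are plain REST endpoints that return JSON resources, so a patient-facing app needs little more than a standard HTTP call. Here is a minimal, hedged sketch in Python; the public HAPI FHIR sandbox (synthetic data) stands in for a provider’s endpoint, and a real deployment would add SMART on FHIR authorization.

    # Query a FHIR server for Patient resources via the standard REST API.
    import requests

    BASE = "http://hapi.fhir.org/baseR4"  # public test server, synthetic data

    resp = requests.get(f"{BASE}/Patient",
                        params={"name": "Smith", "_count": 3},
                        headers={"Accept": "application/fhir+json"})
    bundle = resp.json()                  # a FHIR Bundle of search results

    for entry in bundle.get("entry", []):
        patient = entry["resource"]
        print(patient["resourceType"], patient["id"], patient.get("birthDate"))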

During a panel discussion on the future of interoperability at ONC’s 3rd Interoperability Forum in Washington, D.C., Thursday, panelists including Kisha Hawthorne, CIO of Children’s Hospital of Philadelphia, focused on the reality of using the FHIR standard, and whether the standard will help achieve interoperability in healthcare.

The reality of FHIR standard use today

Will the FHIR standard be a key facilitator of interoperability in healthcare? Panelists agreed that it will — in time. Right now, though, the standard still needs work in the implementation department.

Hawthorne said her team at Children’s Hospital of Philadelphia is looking to use the FHIR standard in the provider space to bridge the gaps between the different software vendors with which the organization works.

The hospital uses an Epic EHR, and Hawthorne said that while vendors like Epic are beginning to implement and use the FHIR standard, she hopes to see that work “fast forward” with Epic and other vendors to make it easier to gather, share and use data in the provider space. FHIR standard use is not quite there yet, she said.

“In the provider space, there’s a ways to go,” Hawthorne said. “But we’re excited and we think it will take hold.”

The potential of the FHIR standard is exciting and it will “open a lot of doors,” but the reality is that the standard is immature, said Kristen Valdes, CEO of personal health app b.well Connected Health.

Valdes said that although she thinks the FHIR standard will create a push toward interoperability in healthcare, challenges associated with implementation of the FHIR standard are hindering progress.

A significant number of providers and organizations aren’t “using a fraction” of the implementation guidelines that have been made available for the FHIR standard, she said. And while organizations are thinking about the operational impacts of using FHIR on behalf of users, ongoing debate about the proper HIPAA rules for giving consumers access to their own data further hinders implementation.

“We really have to think about the operational workflows and how it’s going to affect the people who are expected to implement and deploy FHIR,” she said.

The problem with the FHIR standard isn’t the technical aspects of the standard, but the process and people implementing it, said Vik Kheterpal, principal of interoperability product vendor CareEvolution.

As a technology standard, Kheterpal said it makes sense and has already seen relative success in the launch of programs such as CMS’ Blue Button 2.0 program. Blue Button 2.0 uses the FHIR standard for beneficiary data, such as drug prescriptions, primary care cost and treatment. Yet, the problem with the rest of healthcare often lies in misinterpretation of policy when it comes to sharing patient data.
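
Blue Button 2.0 illustrates what the standard looks like in use: beneficiary claims are served as FHIR ExplanationOfBenefit resources over plain REST. A hedged Python sketch follows; the sandbox URL reflects the API’s v1 layout, and the token is a placeholder for the access token that CMS’s OAuth2 consent flow issues to the patient’s chosen app.

    # Fetch Medicare claims as FHIR ExplanationOfBenefit resources.
    import requests

    TOKEN = "..."  # placeholder; issued by Blue Button 2.0's OAuth2 flow
    URL = "https://sandbox.bluebutton.cms.gov/v1/fhir/ExplanationOfBenefit/"

    resp = requests.get(URL, headers={"Authorization": f"Bearer {TOKEN}"})
    for entry in resp.json().get("entry", []):
        eob = entry["resource"]  # one claim: prescriptions, visits, costs
        print(eob["resourceType"], eob["id"])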

Anil Jain, chief health informatics officer at IBM Watson Health, said he thinks the value of the FHIR standard is real, and organizations already need to think about what’s next once the standard matures.

As use of the FHIR standard grows among healthcare organizations, Jain said it’s important to create business cases and models for sharing data that will work using the standard. Otherwise, providers and patients will continue to lack trust in the data, something a standard like FHIR alone won’t give healthcare.

At HR Technology Conference, Walmart says virtual reality works

LAS VEGAS — Learning technology appears to be heading for a major upgrade. Walmart is using virtual reality, or VR, to train its employees, and many other companies may soon do the same.

VR adoption is part of a larger tech shift in employee learning. For example, companies such as Wendy’s are using simulation or gamification to help employees learn about food preparation.

Deploying VR technology is expensive, with cost estimates ranging from tens of thousands of dollars to millions, attendees at the HR Technology Conference learned. But headset prices are declining rapidly, and libraries of VR training tools for dealing with common HR situations — such as how to fire an employee — may make this tool affordable to firms of all sizes.

For Walmart, a payoff of using virtual reality comes from higher job certification test scores. Meanwhile, Wendy’s has been using computer simulations to help employees learn their jobs. It is also adapting its training to the expectations of its workers, and its efforts have led to a turnover reduction. Based on presentations and interviews at the HR Technology Conference, users deploying these technologies are enthusiastic about them.

Walmart employees experience VR’s 3D

“It truly becomes an experience,” said Andy Trainor, senior director of Walmart Academies, in an interview about the impact of VR and augmented reality on training. It’s unlike a typical classroom lesson. “Employees actually feel like they experience it,” he said.

Walmart’s training and virtual reality team: Brock McKeel, senior director of digital operations at Walmart, and Andy Trainor, senior director of Walmart Academies.

Walmart employees go to “academies” for training, testing and certification on certain processes, such as taking care of the store’s produce section, interacting with customers or preparing for Black Friday. As one person in a class wears the VR headset or goggles, what that person sees and experiences displays on a monitor for the class to follow.

Walmart has been using VR training technology from startup STRIVR for just over a year. In classes using VR, Trainor said the company is seeing an increase in test scores as high as 15% over traditional methods of instruction. Trainor said his team members are convinced VR, with its ability to create 3D simulations, is here to stay as a training tool.

“Life isn’t 2D,” said Brock McKeel, senior director of digital operations at Walmart. For problems ranging from customer service issues to emergency weather planning, “we want our associates to be the best prepared that we can get them to be.”

Walmart has also created a simulation-type game that helps employees understand store management. The company plans to soon release its simulation as an app for anyone to experience, Trainor said.

The old ways of training are broken

The need to do things differently in learning was a theme at the HR Technology Conference.

The idea that employees will take time out of their day to watch a training video or read material that may not be connected to their task at hand is not effective, said David Mallon, a vice president and chief analyst at Bersin, Deloitte Consulting, based in Oakland, Calif.

The traditional methods of learning “have fallen apart,” Mallon said. Employees “want to engage with content on their terms, when they need it, where they need it and in ways that make more sense.”

Mallon’s point is something Wendy’s realized about its restaurant workers, who understand technology and have expectations about content, said Coley O’Brien, chief people officer at the restaurant chain. Employees want the content to be quick, they want the ability to swipe, and videos should be 30 seconds or less, he said.

“We really had to think about how we evolve our training approach and our content to really meet their expectations,” said O’Brien, who presented at the conference.

Wendy’s also created simulations that reproduce some of the time pressures faced with certain food-preparation processes. Employees must make choices in simulations, and mistakes are tracked. The company uses Cornerstone OnDemand’s platform.

Restaurants in which employees received a certain level of certification see 1% to 2% higher sales, increases in customer satisfaction and turnover reductions as high as 20%, O’Brien said.

Tech giants support FHIR standard. Will that make a difference?

During a White House meeting about the new Blue Button 2.0 API for Medicare, six major technology players signed a joint statement pledging to work toward healthcare interoperability with a particular focus on the cloud and artificial intelligence.

The companies — Amazon, Microsoft, Google, IBM, Oracle and Salesforce — promised to support the goal of “frictionless” interoperability using established industry standards, including the HL7 FHIR standard API. They offered a vision of a robust ongoing dialogue that would include every healthcare entity from payers to patients and application developers, according to a statement released by the Information Technology Industry Council.

Pushing the FHIR standard forward

The statement comes at a time when patient demand for easy access to healthcare data has never been greater. Large hospitals have responded with nascent efforts to improve data exchange based on the FHIR standard API, but there is widespread acknowledgement that healthcare lags far behind other industries when it comes to tech innovation and particularly interoperability. The idea of what could effectively be a consortium of mainstream technology companies working on this tricky problem and promoting the FHIR standard was received warmly by some this week and with a healthy dose of skepticism by others.

That the statement called out cloud usage specifically is telling: for reasons ranging from security to cost, a significant portion of healthcare organizations continue to avoid the cloud. A 2017 report from KLAS Research found 31% of hospitals either won’t expand their cloud efforts or won’t move to the cloud.

“The cloud really is a double-edged sword,” said Kathy Downing, vice president of information governance, informatics, standards, privacy and security at the American Health Information Management Association (AHIMA), in an interview. While the cloud might offer a more secure environment than some smaller health organizations could achieve, Downing isn’t convinced the cloud itself is pivotal to interoperability. “I don’t know that the cloud really has a dog in this interoperability hunt,” she said. “You want to think through the safeguards and do all the assessments. That’s more important than whether you’re using a server or the cloud.”

It’s a positive sign for the healthcare industry that it’s attracted the attention of these major players, said Coray Tate, vice president of clinical research at KLAS, in an email. But the market has to be there for this to work. “We’re at the base of the mountain and early steps are the easiest,” he said. “It remains to be seen if the market will provide a business case that will sustain the long climb.”

And the business case may not be there because this group of tech companies isn’t in most hospitals in any significant way today, said John Moore, founder and managing partner of Chilmark Research, in an email. “As big and influential as these companies are their collective presence in healthcare is quite disparate and at the end of the day it is what a clinician is using in their workflow that matters,” he explained. “These companies are simply not there. I’m not sure how any of these entities will solve the issue of semantic interoperability.” To further complicate matters, most hospitals don’t want to share patient data with competitors, he said. “They have instead opted to let patients themselves take direct responsibility.”

Tech support potentially a good thing

Attention from tech giants, however, should be seen as a good thing as long as everyone is thoughtful about how to proceed, said Stan Huff, M.D., chief medical informatics officer at Intermountain Healthcare and co-chair of the Health Level Seven (HL7) Clinical Information Modeling Initiative, which developed the FHIR standard API. “This is significant because it creates faith in HL7 FHIR and will encourage investment in FHIR development,” he said. “The thing I would want to encourage is that this group work with existing organizations like HL7, ONC, HSPC and CIIC to ensure they all implement the FHIR standard the same way so we get to true semantic interoperability at some point.”

The joint statement offered few details on future plans but stressed the need to get everyone involved, including the open source community. “I think we will need to wait a few weeks to hear specific projects to know what additional impact they will have,” Huff said.

2018 Pwnie Awards cast light and shade on infosec winners

The Meltdown and Spectre side-channel attacks that exploit weaknesses in major processors scored the top spot in two of three Pwnie Award categories — Best Privilege Escalation Bug and Most Innovative Research — but missed on the prize for the most overhyped vulnerability.

The Pwnie Awards, a longtime staple of the Black Hat security conference, are often compared to the Academy Awards, but with spray-painted pony statues, fewer movie stars and more questionable prizes for things like Lamest Vendor Response and Most Overhyped Bug.

This year, the Pwnie Award for Most Innovative Research went to the researchers who discovered the Meltdown and Spectre design flaws. That prize goes to “the most interesting and innovative research in the form of a paper, presentation, tool or even a mailing list post,” according to the Pwnie Awards website. The Pwnie Awards website described Meltdown and Spectre in its nomination for most overhyped bug:

Meltdown and Spectre were vulnerabilities in the way branch prediction worked which would allow attackers the ability to read memory. It was pretty awesome and affected most systems. But at some point, they [sic] hype train jumped the tracks a bit. The normally extremely accurate Fox News called it the worst computer bug in history. One of the researchers who discovered it agreed, calling it ‘probably one of the worst CPU bugs ever found.’ Bloomberg agreed, the Verge said it was a catastrophe.

Meltdown and Spectre also got the Pwnie Award for Best Privilege Escalation Bug — a nod toward the seriousness of the flaws, given how unusual it is for a research team to win in more than one category.

Also worthy of honor

Other Pwnie Awards honored more of the best of security research from the past year, including the following:

  • The Pwnie for Best Server-Side Bug went to the Intel Advanced Management Technology remote vulnerability, a flaw which enabled an exploit that could bypass endpoint protections, including the Windows firewall.
  • The Pwnie for Best Client-Side Bug went to researchers Georgi Geshev and Rob Miller, who built an exploit chain against Android that used 11 bugs in six different applications and was referred to by the Pwnie Awards as “The 12 Logic Bug Gifts of Christmas.”
  • The Pwnie for Best Cryptographic Attack went to researchers Hanno Böck, Juraj Somorovsky and Craig Young for their work on the Return Of Bleichenbacher’s Oracle Threat, also known as the ROBOT attack.

The Pwnie Awards initially solicited nominations in 16 categories, but awarded prizes only in the eight categories that received the most nominations, including a Lifetime Achievement Award given to Michal Zalewski, also known as lcamtuf, former director of information security engineering at Google and author of the classic hacker field guide, Silence on the Wire.

Lamest Vendor Response and Most Overhyped Bug

Some of the stiffest competition may have been for the booby prizes.

The competition for overhyped bugs has been fierce recently, as contenders continue to commission websites, logos and social media handles for bugs that might be less than compelling. The nominees for this Pwnie Award honor this year included the Meltdown and Spectre vulnerabilities in microprocessors reported in January, as well as the apparent EFAIL vulnerability in end-to-end encryption technology that turned out to be an issue in email clients.

The winner was a not-quite-tongue-in-cheek parody, Holey Beep, complete with website, logo and tracking assignment as CVE-2018-0492. Beep, a Unix command, “does what you’d expect: it beeps,” according to the description from the Holey Beep website. “Beep allows you to control pitch, duration, and repetitions” of the tone.

But it also can give an attacker root on the target system. “Its job is to live inside shell/perl scripts and allow more granularity than one has otherwise. It is controlled completely through command line options. It’s not supposed to be complex, and it isn’t — but it makes system monitoring (or whatever else it gets hacked into) much more informative. Also it gives you root.”

Meanwhile, Bitfi, maker of the Bitfi Wallet, was the late-entry surprise winner of the Pwnie Award for Lamest Vendor Response. Although the Bitfi situation played out just days before Black Hat, The Register reported it received thousands of nominations after hackers comprehensively cracked the devices and demonstrated numerous security failures in the design. Bitfi backed off its offer of a six-figure bounty to any hacker who could manage to hack it by standing behind a very narrow definition of what constituted a hack — namely, pulling the private key off of a device that doesn’t store the key.

The well-documented hacks came after Bitfi’s executive chairman, John McAfee, extolled the device as “the world’s first unhackable storage for cryptocurrency and digital assets.”

Web cache poisoning attacks demonstrated on major websites, platforms

Major websites and platforms may be vulnerable to simple yet devastating web cache poisoning attacks, which could put millions of users in jeopardy.

James Kettle, head of research at PortSwigger Web Security, Ltd., a cybersecurity tool publisher headquartered near Manchester, U.K., demonstrated several such attacks during his Black Hat 2018 session titled “Practical Web Cache Poisoning: Redefining ‘Unexploitable.’” Kettle first unveiled his web cache poisoning hacks in May, but in the Black Hat session he detailed his techniques and showed how unkeyed HTTP request headers allowed him to compromise popular websites and manipulate platforms such as Drupal and Mozilla’s Firefox browser.

“Web cache poisoning is about using caches to save malicious payloads so those payloads get served up to other users,” he said. “Practical web cache poisoning is not theoretical. Every example I use in this entire presentation is based on a real system that I’ve proven can be exploited using this technique.”

As an example, Kettle showed how he was able to use a simple technique to compromise the home page of Linux distributor Red Hat. He created an open source extension for PortSwigger’s Burp Suite Scanner called Param Miner, which detected unkeyed inputs in the home page. From there, Kettle was able to change the X-Forwarded-Host header and load a cross-site scripting payload to the site’s cache and then craft responses that would deliver the malicious payload to whoever visited the site. “We just got full control over the home page of RedHat.com, and it wasn’t very difficult,” he said.
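
The probe Kettle describes reduces to a two-request check: inject a value through a header the cache doesn’t include in its key, then see whether a header-free request gets the injected value back from the cache. A minimal sketch, assuming a placeholder target you are authorized to test (the URL and canary below are illustrative):

    # Two-request probe for an unkeyed X-Forwarded-Host header.
    import requests, uuid

    target = "https://example.com/"    # placeholder, not a real target
    buster = {"cb": uuid.uuid4().hex}  # unique query string = private cache entry
    canary = "evil.example.net"

    # 1. Request with the candidate unkeyed header set.
    requests.get(target, params=buster, headers={"X-Forwarded-Host": canary})

    # 2. Plain request with the same cache key: if the canary comes back, the
    #    cache stored a response shaped by a header that isn't in its key.
    plain = requests.get(target, params=buster)
    print("poisoned" if canary in plain.text else "not reflected / not cached")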

In another test case, Kettle used web cache poisoning on the infrastructure for Mozilla’s Firefox Shield, which gives users the ability to push application and plug-in updates. When the Firefox browser initially loads, it contacts Shield for updates and other information such as “recipes” for installing extensions. During a different test case on a Data.gov site, he found an “origin: null” header from Mozilla and discovered he could manipulate the “X-Forwarded-Host” header to trick the system so that instead of going to Firefox Shield to fetch recipes, Firefox would instead be directed to a domain Kettle controlled.

Kettle found that Mozilla signed the recipes, so he couldn’t simply make a malicious extension and install it on 50 million computers. But he discovered he could replay old recipes, specifically one for an extension with a known vulnerability; he could then compromise that extension and forcibly inflict that vulnerable extension on every Firefox browser in the world.

“The end effect was I could make every Firefox browser on the planet connect to my system to fetch this recipe, which specified what extensions to install,” he said. “So that’s pretty cool because that’s 50 million browsers or something like that.”

Kettle noted in his research that when he informed Mozilla of the technique, they patched it within 24 hours; but, he wrote, “there was some disagreement about the severity so it was only rewarded with a $1,000 bounty.”

Kettle also demonstrated techniques that allowed him to compromise GoodHire.com, blog.Cloudflare.com and several sites that use Drupal’s content management platform. While the web cache poisoning attacks he demonstrated were potentially devastating, Kettle said they could be mitigated with a few simple steps. First, he said, organizations should “cache with caution” and, if possible, disable caching completely.

However, Kettle acknowledged that may not be realistic for larger enterprises, so in those cases he recommended diligently scanning for unkeyed inputs. “Avoid taking input from HTTP headers and cookies as much as possible,” he said, “and also audit your applications with Param Miner to see if you can find any unkeyed inputs that your framework has snuck in support for.”

Microsoft Azure Dev Spaces, Google Jib target Kubernetes woes

To entice developers to create more apps on their environments, major cloud platform companies will meet them where they live.

Microsoft and Google both released tools to help ease app development on their respective platforms, Microsoft Azure and the Google Cloud Platform. Microsoft’s Azure Dev Spaces and Google Jib help developers build applications for the Kubernetes container orchestrator and Java environments and represent a means to deliver simpler, developer-friendly technology.

Microsoft’s Azure Dev Spaces, now in public preview, is a cloud-native development environment for the company’s Azure Kubernetes Service (AKS), where developers can work on applications while connected with the cloud and their team. These users can build cloud applications with containers and microservices on AKS and do not deal with any infrastructure management or orchestration, according to Microsoft.

As Kubernetes further commoditizes deployment and orchestration, cloud platform vendors and public cloud providers must focus on how to simplify customers’ implementation of cloud-native development methods — namely DevOps, CI/CD and microservices, said Rhett Dillingham, an analyst at Moor Insights & Strategy in Austin, Texas.

“Azure Dev Spaces has the potential to be one of Microsoft’s most valuable recent developer tooling innovations, because it addresses the complexity of integration testing and debugging in microservices environments,” he said.

With the correct supporting services, developers can fully test and deploy in Microsoft Azure, added Edwin Yuen, an analyst at Enterprise Strategy Group in Milford, Mass.

“This would benefit the developer, as it eases the process of container development by allowing them to see the results of their app without having to set up a Docker or Kubernetes environment,” he said.

Meanwhile, Google’s Jib containerizer tool enables developers to package a Java application into a container image with the Java tools they already know to create container-based advanced applications. And like Azure Dev Spaces, it handles a lot of the underlying infrastructure and orchestration tasks.

Integration with Java development tools Maven and Gradle means Java developers can skip the step to create JAR, or Java ARchive, files and then containerize them, Yuen said.

“Like Azure Dev Spaces, it’s about simplifying the experience — this time, not the laptop jump, but the jump from JAR to container,” he said. “But, again, the developer is eased into the process by using existing tools and eliminating the need to set up Docker or Kubernetes.”

Jib also extends Google’s association with the open source community to provide Java developers an easy path to containerize their apps while using the Google Cloud Platform, Yuen added.

Microsoft bills Azure network as the hub for remote offices

Microsoft’s foray into the rapidly growing SD-WAN market could solve a major customer hurdle and open Azure to even more workloads.

All the major public cloud platforms have increased their networking functionality in recent months, and Microsoft’s latest service, Azure Virtual WAN, pushes the boundaries of those capabilities. The software-defined network acts as a hub that links with third-party tools to improve application performance and reduce latency for companies with multiple offices that access Azure.

IDC estimates the software-defined wide area network (SD-WAN) market will hit $8 billion by 2021, as cloud computing continues to proliferate and employees must access cloud-hosted workloads from various locations. So far, the major cloud providers have left that work to partners.

But this Azure network service solves a big problem for customers that make decisions about network transports and integration with existing routers, as they consume more cloud resources from more locations, said Brad Casemore, an IDC analyst.

“Now what you’ve got is more policy-based, tighter integration within the SD-WAN,” he said.

Azure Virtual WAN uses a distributed model to link Microsoft’s global network with traditional on-premises routers and SD-WAN systems provided by Citrix and Riverbed. Microsoft’s decision to rely on partners, rather than provide its own gateway services inside customers’ offices, suggests it doesn’t plan to compete across the totality of the SD-WAN market, but rather provide an on-ramp to integrate with third-party products.

Customers can already use various SD-WAN providers to easily link to a public cloud, but Microsoft has taken the level of integration a step further, said Bob Laliberte, an analyst at Enterprise Strategy Group in Milford, Mass. Most SD-WAN vendors are building out security ecosystems, but Microsoft already has that in Azure, for example.

This could also simplify the purchasing process, and it would make sense for Microsoft to eventually integrate this virtual WAN with Azure Stack to help facilitate hybrid deployments, Laliberte said.

The Azure Virtual WAN service is billed as a way to connect remote offices to the cloud, and also to each other, with improved reliability and availability of applications. But that interoffice linkage also could lure more companies to use Azure for a whole host of other services, particularly customers just starting to embrace the public cloud.

There are still questions about the Azure network service, particularly around multi-cloud deployments. It’s unclear if customers trust Microsoft — or any single hyperscale cloud vendor — at the core of their SD-WAN implementation, as their architectures spread across multiple clouds, Casemore said.

Azure updates boost network security, data analytics tools

Microsoft also introduced an Azure network security feature this week, Azure Firewall, with which users can create and enforce network policies across multiple endpoints. A stateful firewall protects Azure Virtual Network resources and maintains high availability without any restrictions on scale.

Several other updates include an expanded Azure Data Box service, still in preview, which provides customers with an appliance onto which they can upload data and ship directly to an Azure data center. These types of devices have become a popular means to speed massive migrations to public clouds. Another option for Azure users, Azure Data Box Disk, uses SSD disks to transfer up to 40 TB of data spread across five drives. That’s smaller than the original box’s 100 TB capacity, and better suited to collect data from multiple branches or offices, the company said.

Microsoft also doubled the query performance of Azure SQL Data Warehouse to support up to 128 concurrent queries, and waived the transfer fee for migrations to Azure of legacy applications that run on Windows Server and SQL Server 2008/2008 R2, for which Microsoft will end support in July 2019. Microsoft also plans to add data ingestion and integration features across BI models to Power BI, similar to Microsoft customers’ experience with Power Query for Excel.