An update on disabling VBScript in Internet Explorer 11 – Microsoft Edge Blog

In early 2017, we began the process of disabling VBScript in Internet Explorer 11 to give the world the opportunity to prepare for it to be disabled by default. The change to disable VBScript will take effect in the upcoming cumulative updates for Windows 7, 8, and 8.1 on August 13th, 2019. VBScript will be disabled by default for Internet Explorer 11 and WebOCs for the Internet and Untrusted zones on all platforms running Internet Explorer 11. This change has already been in effect for Internet Explorer 11 on Windows 10 since the July 9th, 2019 cumulative updates.
The settings to enable or disable VBScript execution in Internet Explorer 11 will remain configurable per site security zone, via the Registry or via Group Policy, should you still need to use this legacy scripting language.
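As an example, Microsoft's documentation for this feature describes a per-zone DWORD value named 140C under the Internet Settings\Zones registry key, where 0 allows VBScript and 3 disables it. The fragment below, which would disable VBScript for the Internet zone (zone 3) for the current user, is a sketch based on that documentation; verify the key path and values against Microsoft's current guidance before deploying.

```reg
Windows Registry Editor Version 5.00

; Disable VBScript execution in IE11 for the Internet zone (zone 3).
; 0 = enable, 3 = disable, per Microsoft's documented 140C URL action.
[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3]
"140C"=dword:00000003
```

The same value can be set machine-wide under HKEY_LOCAL_MACHINE, or managed centrally via the corresponding Group Policy setting.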
To provide feedback on this change, or to report any issues resulting from this change, you can use the Feedback Hub app on any Windows 10 device. Your feedback goes directly to our engineers to help make Windows even better.
– Brent Mills, Senior Program Manager

Colorado builds API integration platform to share data, speed services

Integrating data across corporate departments can be challenging. There are invariably technical and cultural hurdles that must be cleared.

The state of Colorado’s ambitious plan to integrate data across dozens of agencies by using APIs had to contend with another issue: It was being launched under intense media scrutiny.

“We typically think of IT as a back-office function,” said Jon Gottsegen, chief data officer for the state of Colorado. Not so in Colorado. A deeply troubled benefits eligibility system — more than a decade in development and blamed for improper payments — had put IT in the limelight, and not in a good way, he said. The newspaper term “above the fold” became part of his vocabulary, Gottsegen grimly joked.

Work on the state’s new API integration platform began in 2017 with a major transformation of the infamous benefits system. Partnering with Deloitte, IT rewrote the system’s code, migrated services to Salesforce and AWS, and used APIs to drive integration into various databases, Gottsegen said. This helped reduce the amount of time to determine eligibility for benefits from days to minutes — a major boon for state employees.

Today, the API integration platform — while still a work in progress — has dramatically sped up a number of state processes and is paving the way for better data sharing down the road, Gottsegen said.

Speaking at the recent MuleSoft Connect conference in San Francisco, Gottsegen shared the objectives of Colorado’s API integration strategy, the major challenges his team has encountered and the lessons learned.

People, of course, should come first in projects of this scope: Delivering better services to the people of Colorado was aim No. 1 of his team’s API integration platform, Gottsegen said. Security, data governance and organizational culture also demand attention.

Becoming the ‘Amazon of state services’

The task before Gottsegen and his group was to create a process for rolling out APIs that work seamlessly across dozens of different agencies. “Ideally, we want to be the Amazon of state services,” he said of IT’s grand mission.

Developers had to learn how to connect systems to databases that were regulated in different ways. Gottsegen’s team spent a lot of time putting together a comprehensive platform, which was important for integration, he said. It was also important to deliver the APIs in a way that they could be easily consumed by the various state agencies. One goal was to ensure that new APIs were reusable.

Part of the work also involved looking at how services relate to each other. For example, if someone is getting Medicaid, there is a good chance they are also eligible for housing services. The API platform had to support the data integration that helps automate these kinds of cross-agency processes, he said.
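The article doesn’t describe Colorado’s actual APIs, so purely to illustrate the pattern, here is a minimal Python sketch in which a reusable agency API is composed into a cross-agency rule. All service names, IDs and eligibility rules are hypothetical.

```python
# Hypothetical sketch of cross-agency eligibility checks driven by reusable
# APIs. None of these services or rules are Colorado's actual systems.

def medicaid_status(person_id, medicaid_api):
    """Call a (hypothetical) Medicaid agency API for enrollment status."""
    return medicaid_api.get(person_id, {}).get("enrolled", False)

def suggest_related_programs(person_id, medicaid_api):
    """If someone is enrolled in Medicaid, flag other programs they may
    qualify for, so agencies don't have to re-collect the same data."""
    suggestions = []
    if medicaid_status(person_id, medicaid_api):
        # Example cross-agency rule: Medicaid enrollment often correlates
        # with eligibility for housing services.
        suggestions.append("housing-services")
    return suggestions

# Stand-in for a live agency API: a dict keyed by person ID.
fake_medicaid_api = {"p-100": {"enrolled": True}, "p-200": {"enrolled": False}}

print(suggest_related_programs("p-100", fake_medicaid_api))  # ['housing-services']
print(suggest_related_programs("p-200", fake_medicaid_api))  # []
```

The point of the sketch is reuse: `medicaid_status` is written once and any agency’s process can call it, which is the property Gottsegen’s team wanted from its platform APIs.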

Getting the API program off the ground was not just about solving the technical problems. When communicating with technology personnel across agencies, Gottsegen said it was important to convey that the API integration platform is about better serving the residents of Colorado.

Learning from contractors

IT did not go it alone. Gottsegen said the state worked with a variety of contractors to speed up its API development process. This included working with MuleSoft to roll out a more expansive API management tier. IT also hired some contractors with integration expertise to kick-start the project. But he added that it was important to ensure the knowledge involved in building the APIs was retained after a contract ends.

“We want our teams to sit next to those contractors to ensure the knowledge of those contractors gets internalized. There have been many cases where state workers did not know how to maintain something after the contractor has left,” he said.

Good metrics, communication critical to API integration success

Before Gottsegen’s team launched a formal API integration program, no one was tracking how long it took agencies to set up a working data integration process. Anecdotal examples of problems would emerge, including stories of agencies that spent over a year negotiating how to set up a data exchange.

The team now has formal metrics to track time to implementation, but the lack of historical baselines precludes a precise measurement of how much the new API integration platform has sped up data exchange.

In any case, expediting the data exchange process is not just about having a more all-encompassing integration tier, Gottsegen stressed. Better communication between departments is also needed.

As it rolls out the API integration platform, IT is working with the agencies to identify any compliance issues and find a framework to address them.

Centralizing security

Each agency oversees its own data collection and determines where it can be used, Gottsegen said. There are also various privacy regulations to conform to, including HIPAA and IRS 1075.

“One of the reasons we pursued MuleSoft was so we could demonstrate auditable and consistent security and governance of the data,” he said.

Navigating the distinctions between privacy and security is a big challenge, he said. Each agency is responsible for ensuring restrictions on how its data is used; it is not a task assigned to a centralized government group, because the agency is the expert on its privacy regulations. At the same time, Gottsegen’s group can build better security into the API integration mechanisms used to exchange data between agencies.

To provide API integration security, Gottsegen created a DevOps toolchain run by a statewide center for enablement. This included a set of vetted tools and practices that agencies could adopt to speed the development of new integrations (dev) and the process of pushing them into production (ops) safely.

Gottsegen said the group is developing practices to build capabilities that can be adopted across the state, but progress is uneven. He said the group has seen mixed results in getting buy-in from agencies.

Improving data quality across agencies

Gottsegen’s team has also launched a joint agency interoperability project for the integration of over 45 different data systems across the state. The aim is to build a sturdy data governance process across groups. The first question being addressed is data quality, in particular to ensure a consistent digital ID of citizens. “To be honest, I’m not sure we have a quality measure across the state,” Gottsegen said.

Gottsegen believes that data quality is not about being good or bad, but about fitness for use. It’s not easy to articulate which particular data set is appropriate for use across agencies.

“Data quality should be a partnership between agencies and IT,” he said. His team often gets requests to integrate data across agencies. The challenge is how to provide the tools to do that. The agencies need to be able to describe the idiosyncrasies of how they collect data in order to come up with a standard. Down the road, Gottsegen hopes machine learning will help improve this process.

Building trust with state IT leaders

A lot of state initiatives are driven from the top down. But, if workers don’t like a directive, they can often wait things out until a new government is elected. Gottsegen found that building trust among IT leaders across state agencies was key in growing the API program. “Trust is important — not just in technology changes, but in data sharing as well,” he said.

And face-to-face connections matter. In launching its API integration platform, he said, it was important for IT leaders across organizations to learn each other’s names and to meet in person, even when phone calls or video conferences might be more convenient.

As for the future, Gottsegen has a vision that all data sharing will eventually happen through API integrations. But getting there is a long process. “That might be 10 years out — if it happens. We keep that goal in mind while working with our collaborators to build things out.”


Four steps to get involved in the Microsoft Office Specialist World Championship | | Microsoft EDU

Students around the world are using their Microsoft Office Specialist (MOS) certification to show colleges and future employers that they have a true mastery of the Microsoft Office suite. In fact, some talented students even go on to compete in a world competition for Microsoft Office.

Each year, Certiport, a Pearson VUE business and a leading provider of learning curricula, practice tests and performance-based IT certification exams, hosts the MOS World Championship. This event is a global competition, testing top students’ skills in Microsoft Word, Excel and PowerPoint.

Are you a hard-working student, looking to show the world your Microsoft Office skills? Check out these four easy steps below to find out how you can get involved in the Microsoft Office Specialist Championship.

  1. Learn. Before you’re ready to compete, make sure you’re a master of the Microsoft Office Suite. Certiport has collaborated with multiple learning partners to make preparation easy. You can learn the skills you need to earn a top score on your MOS exam.
  2. Practice. Now that you’ve expanded your knowledge, it’s time to apply it. You can hone your Microsoft Office skills using various practice exams. These performance-based assessment and test preparation tools will prepare you to earn your MOS certification by creating a true “live in the app” experience. You’ll be a master in no time, because you’ll be practicing skills as if in the Microsoft Office application.
  3. Certify. You’re ready to show your skills! Microsoft Office Specialist exams are only delivered in Certiport Authorized Testing Centers. However, with more than 14,000 testing centers worldwide, there’s bound to be one close by. Find a testing center near you, and don’t forget to reach out to the testing center to schedule an appointment. Make sure to push for a score over 700 to be eligible for the MOS World Championship!
  4. Compete. If you’ve earned a top score, then the MOS Championship is your next step. Qualification is simple. When you take a Microsoft Office Specialist exam in Word, Excel or PowerPoint, you’ll automatically enter the MOS Championship and could be chosen to represent your country.

To represent your country at the MOS World Championship, you’ll first need to be named your nation’s champion by competing in a regional competition hosted by Certiport’s network of Authorized Partners; Certiport publishes the full list of partners and competing nations. In addition, each country has its own selection process, so make sure to connect with your local Partner to find out how you can prepare to compete in the MOS World Championship in 2020.

Interested in learning more about the MOS World Championship? Connect with us at [email protected].


Author: Microsoft News Center

Bluescape releases newest version of its mobile app

Bluescape has launched its newest mobile app to enable users to access their content on the go.

The app, available in the Apple App Store and Google Play store, connects to Bluescape workspaces from mobile devices, such as cellphones or tablets. According to the vendor, it enables users to give presentations without a laptop by launching a Bluescape session from the app onto larger touchscreens.

Users can also access their content and workspace anytime and from anywhere and search and view content. According to Bluescape, the app provides a visual collaboration workspace that integrates day-to-day applications, content and tools.

The Bluescape platform is cloud-based software designed for collaboration in the workplace. It can be deployed across mobile devices and personal workstations, huddle rooms, innovation centers, collaboration suites, conference rooms, training rooms, executive briefing centers, command centers and control centers. Search, messaging and file sharing are also built into the platform.

Bluescape lists professionals in fields such as architecture, consulting, design, filmmaking, marketing and product development as ideal users for its product, as these groups often work collaboratively and visually.

Bluescape is among the vendors offering visual collaboration software, which works hand in hand with digital collaborative whiteboards. Vendor Mural provides separate workspaces for teams and enables scaling for companywide processes, with frameworks for Agile, Lean and Design Thinking methods. Custom frameworks are also available.

Competitor Miro touts its product development, user experience research and design, and Lean and Agile capabilities, as well as its enterprise-grade security. Available integrations include Google Drive, Box, Dropbox, Slack, OneDrive and Microsoft Teams.


For Sale – HTPC 8gb quad core Reduced

For sale all new…

CPU: AMD Athlon 5150 1.6GHz Socket AM1
CPU COOLER: Gelid Solutions Slim Silence AM1 CPU cooler
MOTHERBOARD: MSI AM1I ITX motherboard
MEMORY: 8GB Corsair Vengeance DDR3 (2 x 4GB, 1600MHz)
PSU: Antec Basiq BP350 80+ PSU
CASE: CiT all-black slim micro ATX case

No hard drives but will include a set of screws

Cost me £150 in parts…

Looking to clear £130 inc. tracked delivery…

Rather sell as one complete system…

Reason for sale: I’ve bought a Ryzen system…

Price and currency: £130
Delivery: Delivery cost is included within my country
Payment method: PayPal Bank Transfer
Location: Shropshire
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference



IBM Cloud Paks open new business for channel partners

IBM said its latest push into the hybrid and multi-cloud market is setting the stage for new channel opportunities. 

The company this week revealed IBM Cloud Paks, a new set of IBM software offerings containerized on Red Hat OpenShift. According to IBM, Cloud Paks aim to help customers migrate, integrate and modernize applications in multiple-cloud environments.

Those environments include public clouds such as AWS, Azure, Google Cloud Platform, Alibaba and IBM Cloud, as well as private clouds. The Cloud Paks launch follows on the heels of IBM closing its $34 billion Red Hat acquisition in July and is part of a broader strategy to make its software portfolio cloud-native and OpenShift-enabled.

“The strategy has been to containerize the middleware on a common Kubernetes platform. That common Kubernetes platform is Red Hat OpenShift,” said Brian Fallon, director of worldwide digital and partner ecosystems, IBM Cloud and cognitive software.

For IBM business partners, Cloud Paks offer a modular approach to solving common problems faced by customers in their journeys to cloud, he said. The company released five Cloud Pak products, addressing data virtualization; application development; integration of applications, data, cloud services and APIs; process automation; and multi-cloud management.

IBM Cloud Paks can be mixed and matched to address different customer scenarios. For example, the Cloud Pak for Applications and Cloud Pak for Integration “together represent a great opportunity for partners to help their clients move and modernize workloads to a cloud environment,” Fallon said.

Dorothy Copeland, global vice president of programs and business development for the IBM partner ecosystem, said IBM’s push into hybrid and multi-cloud products is creating new opportunities for IBM and Red Hat partners, respectively.

“We are enabling the market to drive hybrid, multi-cloud solutions and, in that, enabling our business partners to be able to do that, as well. … There is a huge opportunity for partners, especially around areas where partners can add specific knowledge, services and management of the deployment,” she said.

Cloud Paks may also serve as an entry point for Red Hat partners to build IBM practices. Copeland noted that Red Hat partners have shown increasing interest in joining the IBM partner ecosystem since the acquisition was announced. IBM has stated it intends Red Hat to operate independently under its ownership.

Logically acquires New York-area IT services company

Christopher Claudio, CEO at Logically

Logically, a managed IT services provider based in Portland, Maine, has acquired Sullivan Data Management, a 10-employee outsourced IT services firm located north of New York City.

Launched earlier this year, Logically formed from the merger of Winxnet Inc., an IT consulting and outsourcing firm in Portland, and K&R Network Solutions, a San Diego-based managed service provider (MSP). 

Christopher Claudio, Logically’s CEO, said in an April 2019 interview that the company was looking for additional acquisitions. The Sullivan Data Management deal is the first of those transactions. Claudio said two or three more acquisitions may follow by the end of the year.

Sullivan Data Management fits Logically’s strategy of focusing on locations outside of the top-tier metropolitan areas where MSP competition is thickest. The company is based in Yorktown Heights in Westchester County, an hour’s drive from New York City. Sullivan Data Management serves customers in Westchester and neighboring counties.


“This is the type of acquisition we are looking for,” Claudio said.

He also pointed to an alignment between Logically’s vertical market focus and Sullivan Data Management’s local government customer base.

“We do a lot of work within the public sector,” Claudio said, noting the bulk of Sullivan Data Management’s clients are in the public safety and municipal management sectors.

Barracuda reports growth in email security market

Barracuda Networks said its email security business is booming, with a $200 million annual revenue run rate for fiscal year 2019, which ended Feb. 28.

For the first quarter of its fiscal year 2020, Barracuda said its email security product, Barracuda Sentinel, saw 440% year-over-year growth in sales bookings. In the same time frame, its email security, backup and archiving package, Barracuda Essentials, saw 46% growth in sales bookings, the vendor said.

Meanwhile, Barracuda’s business unit dedicated to MSPs reported the annual recurring revenue for its email protection business increased 122% year over year for the first quarter of fiscal year 2020.

Ezra Hookano, vice president of channels at Barracuda, based in Campbell, Calif., cited conditions of the email security market as one driver behind the company’s growth. Phishing attacks have become more sophisticated in their social-engineering tactics. Email security threats are “very specific and targeted to you and your industry, and [no one is] immune — big or small,” he said.

Hookano also pointed to Barracuda’s free threat scan tool as an important business driver. He said many Barracuda resellers are using the threat scans to drive the sales process.

“We, to this point, have never run a threat scan that didn’t come back with at least some things that were wrong in the [email] network. … About 30% to 40% of the time, something is so bad that [customers] have to purchase immediately.”

Barracuda is looking to differentiate itself from its pure-play email security competitors by tapping into its portfolio, he noted. The company’s portfolio includes web filtering and firewall products, which feed threat data into Barracuda email security.

“If I’m a pure-play email security vendor now, I no longer have the ability to be as accurate as a portfolio company,” Hookano said.

Data from Barracuda’s remote monitoring and management product, Managed Workplace, which the vendor acquired from Avast earlier this year, also bolster email security capabilities.

“Our goal in the email market … is to use all of our other products and the footprint from our other products to make our email security signatures better and to continue to build on our lead there,” he said.

Equinix cites channel in Q2 bookings, customer wins

Equinix Inc., an interconnection and data center company based in Redwood City, Calif., cited channel sales as a key driver behind second-quarter bookings.

The company said channel partners contributed more than 25% of the bookings for the quarter ended June 30. And those bookings accounted for 60% of Equinix’s new customer wins during the quarter, according to the company. Equinix’s second-quarter revenue grew 10% year over year to $1.385 billion.

In a statement, Equinix said it has “deepened its engagement with high-priority partners to drive increased productivity and joint offer creation across its reseller and alliance partners.”

Those partners include Amazon, AT&T, Microsoft, Oracle, Orange, Telstra, Verizon and World Wide Technology, according to a company spokeswoman. New channel wins in the second quarter include a deal in which Equinix is partnering with Telstra to provide cloud connectivity at Genomics England.

Equinix has also partnered with Assured DP, a Rubrik-as-a-service provider, in a disaster recovery offering.

Other news

  • Microsoft partners could see a further boost in momentum for the company’s Teams calling and collaboration platform. Microsoft said it will retire Skype for Business Online on July 31, 2021, a move that should pave the way for Skype-to-Teams migrations. Microsoft officials speaking last month at the Inspire conference cited Teams as among the top channel opportunities for Microsoft’s 2020 fiscal year. Partners expect to find business in Teams training and governance. The Skype for Business Online retirement does not affect Skype Consumer services or Skype for Business Server, according to a Microsoft blog post.
  • Google said more than 90 partners have obtained Google Cloud Partner specializations in the first half of 2019. Partners can earn specializations in areas such as applications development, cloud migration, data analytics, education, work transformation, infrastructure, IoT, location-based services, machine learning, marketing analytics and security. Partners that acquired specializations during the first half of the year include Accenture, Agosto, Deloitte Consulting and Maven Wave Partners. The specialization announcement follows the launch of the Google Cloud Partner Advantage Program, which went live July 1.
  • Mimecast Ltd., an email and data security company based in Lexington, Mass., unveiled its Cyber Alliance Program, which aims to bring together security vendors into what it terms a “cyber-resilience ecosystem.” The program includes cybersecurity technology categories such as security information and event management; security orchestration, automation and response; firewall, threat intelligence and endpoint security. Mimecast said the program offers customers and partners purpose-built, ready-to-use integrations; out-of-the-box APIs; documented guides with sample code; and tutorials that explain how to use the integrations.
  • Two-thirds of channel companies said they have changed their customer experience tactics, with 10% reporting a move to an omnichannel approach for customer interaction in the past year. Those are among the results of a CompTIA survey of more than 400 channel companies. The survey cited customer recruitment and customer retention as areas where the most respondents reported deficiencies.
  • CompTIA also revealed this week that it will acquire Metacog, an assessment, certification and training software vendor based in Worcester, Mass. CompTIA said Metacog’s technology will be incorporated into the upcoming release of its online testing platform. Metacog’s technology uses AI, big data and IoT APIs.
  • Naveego unveiled Accelerator, a tool for partners to analyze data accuracy across a variety of sources. Accelerator “is a great starting point for our partners to get a feel for just how ready a customer’s data is to participate in [data-based projects]. In general, that makes the projects have a higher degree of success, as well as go much more smoothly,” said Derek Smith, CTO and co-founder of the company. Naveego works with about 10 partners, recently expanding its roster with Frontblade Systems, H2 Integrated Solutions, Mondelio and Narwal.
  • US Signal, a data center services provider based in Grand Rapids, Mich., said its DRaaS for VMware offering is generally available. The offering, based on VMware vCloud Availability, provides disaster recovery services for multi-tenant VMware clouds.
  • Synchronoss Technologies Inc., a cloud and IoT product provider based in Bridgewater, N.J., is working with distributor Arrow Electronics to develop and market an IoT smart building offering. The IoT offering is intended for telecom operators, service providers and system integrators.
  • Distributor Synnex Corp. inked a deal with Arista Networks to provide its data center and campus networking products.
  • Peerless Networks, a telecom services provider, named Ryan Patterson as vice president of channel sales. Patterson will oversee the recruitment of new master and direct agents for Peerless’ national channel program, the company said.
  • Nintex, a process management and workflow automation vendor, hired Michael Schultz as vice president of product, channel and field marketing, and Florian Haarhaus as vice president of sales for EMEA.

Market Share is a news roundup published every Friday.


Merger and acquisition activity is altering the BI landscape

From Qlik’s acquisition of Podium Data in July 2018 through Salesforce’s purchase of Tableau Software in June 2019, the last year in BI has been characterized by a wave of consolidation.


Capped by Salesforce’s $15.7 billion acquisition of Tableau Software on June 10, 2019, and Google’s $2.6 billion purchase of Looker just four days earlier, the BI market over the last year has been marked by a wave of merger and acquisition activity.

Qlik kicked off the surge with its acquisition of Podium Data in July 2018, and, subsequently, made two more purchases.

“To survive, you’re going to have to reach some kind of scale,” said Rick Sherman, founder and managing partner of Athena IT Solutions, in a SearchBusinessAnalytics story in July 2019. “Small vendors are going to be bought or merge with more focused niche companies to build a more complete product.”

It was a little more than a decade ago that a similar wave of merger and acquisition activity reshaped the BI landscape, highlighted by IBM buying Cognos and SAP acquiring Business Objects.

After the spring’s flurry of deals, ending with the premium Salesforce paid for Tableau, the pace of merger and acquisition activity has slowed since the start of the summer. More deals could be coming soon, though, as vendors with a specialized purpose seek partners with complementary capabilities in an attempt to keep pace with competitors that have already filled out their analytics stacks.


Cohesity CyberScan scans backup copies for security risks

The latest application added to Cohesity Marketplace is designed to trawl through backups to look for security vulnerabilities.

Cohesity CyberScan is a free application available on the Cohesity Marketplace. It compares applications in a backup environment against the public Common Vulnerabilities and Exposures (CVE) database to detect possible security flaws. The admin can then address these flaws, whether that’s an out-of-date patch, a software bug or a vulnerability around open ports.

Cohesity CyberScan doesn’t affect the production environment when it performs its scan. The application boots up a backup snapshot and performs an API call in order to find vulnerabilities. This process has the added benefit of confirming whether a particular snapshot is recoverable in the first place.
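Cohesity hasn’t published CyberScan’s internals, but the general technique, comparing a software inventory against CVE records, can be sketched in a few lines of Python. Everything here is illustrative: the inventory, the tiny stand-in CVE feed and the exact-match rule are all simplifications, not Cohesity’s implementation.

```python
# Illustrative sketch of CVE matching, not Cohesity's actual implementation.
# A real scanner would read the inventory from a booted backup snapshot and
# query a live CVE source; here both sides are hard-coded sample data.

# Software inventory as it might be read from a backup snapshot.
inventory = {"openssl": "1.0.2", "nginx": "1.17.2"}

# Tiny stand-in for a CVE feed: (product, affected version, CVE ID).
cve_feed = [
    ("openssl", "1.0.2", "CVE-2016-0800"),
    ("nginx", "1.16.0", "CVE-2019-9511"),
]

def scan(inventory, cve_feed):
    """Report CVE IDs whose affected version exactly matches the inventory.
    Real matching uses version ranges (CPE); exact match keeps this short."""
    findings = []
    for product, affected_version, cve_id in cve_feed:
        if inventory.get(product) == affected_version:
            findings.append(cve_id)
    return findings

print(scan(inventory, cve_feed))  # ['CVE-2016-0800']
```

Because the inventory comes from a snapshot rather than the live machine, a scan like this can run as often as you like without touching production, which is the property Dutt highlights.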

Raj Dutt, director of product marketing at Cohesity, said because the vulnerability scan happens in Cohesity’s runtime environment and not the live environment, many customers may be prompted to perform these scans. Citing a recent study performed by independent IT research firm Ponemon Institute, Dutt said 37% of respondents who suffered a security breach did not scan their environments for vulnerabilities.

“They work in an environment where the organization is expected to run 24/7/365, so essentially, there is no downtime to run these scans or make the patches,” Dutt said.

Dutt said even organizations that do perform vulnerability scans don’t do them often enough. Vulnerabilities and exposures published on the CVE database are known to bad actors, so weekly or monthly scans still leave organizations with a wide window in which they can be attacked. Dutt said since Cohesity CyberScan doesn’t interfere with the production environment, customers are free to run scans more frequently.

Screenshot: The Cohesity CyberScan Security Dashboard houses the vulnerability scan and anti-ransomware capabilities.

Phil Goodwin, a research director at IT analyst firm IDC, said there are applications that scan backup copies or secondary data, but they mostly detect out-of-date drivers or other roadblocks to a successful restore or failover. Running a scan against a CVE database to look for potential security problems is unique.


Goodwin said Cohesity CyberScan is the latest example of backup vendors adding security capabilities. Security and data protection are different IT disciplines that call for different technology, but Goodwin said he has encountered customers conflating the two.

“End users are talking about data protection and security in the same sentence,” Goodwin said. “It really is two sides of the same coin.”

Security is the proactive discipline of preventing data loss, while data protection and backup are reactive. Goodwin said organizations should ideally have both, and backup vendors are starting to provide proactive features such as data masking, air gapping and immutability. But Goodwin said he has noticed many vendors stop short of malware detection.

Indeed, Cohesity CyberScan does not have malware detection. Dutt stressed that the application’s use cases are focused on detecting cyberattack exposures and ensuring recoverability of snapshots. He pointed out that Cohesity DataPlatform does have anti-ransomware capabilities, and they can be accessed from the same dashboard as CyberScan’s vulnerability scan.

Cohesity CyberScan is generally available now to customers who have upgraded to the latest Cohesity Pegasus 6.4 software. The application itself is free, but customers are required to have their own Tenable license.

Go to Original Article

What is Azure Bastion?

In this post, you’ll get a short introduction to Azure Bastion Host. To be honest, I still don’t know if I should pronounce it as [basˈti̯oːn] (German), /bæstʃən/ (US English) or [basˈt̪jõn] (French), but that shouldn’t stop us from learning more about Azure Bastion Host: what it is and when it’s useful.

So let’s start.

What is Azure Bastion Host?

Azure Bastion Host is a Jump-server as a Service within an Azure vNet (note that this service is currently in preview). What does that mean exactly? Well, a jump server is a fixed point on a network that is the sole place for you to remote in, reach other servers and services, and manage the environment. Now some will say, “But I can build my own jump server VM myself!” While you’re certainly free to do that, there are some key differences between the self-built VM option and a Bastion Host.

A regular Jump-server VM must either be reachable via VPN or needs to have a public IP with RDP and/or SSH open to the Internet. Option one, in some environments, is rather complex. Option two is a security nightmare. With Azure Bastion Host, you can solve this access issue. Azure Bastion enables you to use RDP and SSH via the Internet or (if available) via a VPN using the Azure Portal. The VM does not need a public IP, which GREATLY increases security for the target machine.

NOTE: Looking for more great content on security? Watch our webinar on Azure Security Center On-Demand.

After the deployment (which we’ll talk about in a second), Bastion becomes the 3rd option when connecting to a VM through the Azure Portal, as shown below.


Screenshot: the Bastion option when connecting to a virtual machine.

After you hit connect, a new browser window opens and your session runs inside it over an SSL-encrypted HTTPS connection.

Screenshot: a Bastion session running in the browser.

Azure Bastion Use Cases

Now let’s list some possible use cases. Azure Bastion can be very useful in (but is not limited to) these scenarios:

  1. Your Azure-based VMs are running in a subscription where you’re unable to connect via VPN, and for security reasons, you cannot set up a dedicated Jump-host within that vNet.
  2. The usage of a Jump-host or Terminal Server in Azure would be more cost-intensive than using a Bastion Host within the VNet (e.g. when you have more than one admin or user working on the host at the same time.)
  3. You want to give developers access to a single VM without giving them access to additional services like a VPN or other things running within the VNet.
  4. You want to implement Just in Time (JIT) administration in Azure. You can deploy and enable a Bastion Host on the fly, as you need it. This allows you to implement it as part of your operating system runbook when you need to maintain the OS of an Azure-based VM, without setting up permanent access to the VM.

The way you deploy Azure Bastion Host within a VNet is pretty straightforward. Let’s go through the steps together.

  1. Open the Azure Preview Portal.
  2. Search for the feature in the Azure Marketplace and walk through the deployment wizard by filling out the fields shown below.

Screenshot: the Create a Bastion deployment wizard.

Again, the deployment is quite simple and most options are fairly well explained within the UI. However, if you want further details, you can find them in the official feature documentation here.

Also, be aware that a Bastion Host must be implemented in every vNet where you want to connect to a VM. Currently, Bastion does not support vNet Peering.
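For admins who prefer scripting, the same deployment can be sketched with the Azure CLI. Treat this as an illustrative sketch, not official guidance: the resource group (`my-rg`), vNet (`my-vnet`), resource names and address prefix are placeholder assumptions, and a recent Azure CLI version is assumed.

```shell
# Bastion requires a dedicated subnet named exactly "AzureBastionSubnet"
az network vnet subnet create --resource-group my-rg --vnet-name my-vnet \
  --name AzureBastionSubnet --address-prefixes 10.0.1.0/27

# Bastion also needs its own Standard-SKU public IP
az network public-ip create --resource-group my-rg --name bastion-pip --sku Standard

# Create the Bastion host itself
az network bastion create --resource-group my-rg --name my-bastion \
  --public-ip-address bastion-pip --vnet-name my-vnet --location westeurope
```

Note that the subnet name is fixed by the service; the wizard in the portal creates the same resources behind the scenes.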

How Much Does Azure Bastion Cost?

Pricing for Bastion is pretty easy to understand. As with other Azure VM services, you pay for the time each Bastion host is deployed. You can easily calculate the costs for the Bastion hosts you need via the Azure Price Calculator.

I made my example for one Bastion Host in West Europe, with the assumption it would be needed all month long.

Screenshot: an Azure Bastion estimate in the Azure Price Calculator.
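As a back-of-the-envelope check of such an estimate, the billing model is simple enough to compute by hand. The hourly rate below is an assumption for illustration only, not an official price; check the Azure Price Calculator for current figures.

```python
HOURLY_RATE_USD = 0.19   # ASSUMED per-hour rate for one Bastion host
HOURS_PER_MONTH = 730    # Azure's usual monthly-hours convention

def monthly_cost(hosts, hours=HOURS_PER_MONTH):
    """You pay for every hour a host is deployed, per host."""
    return round(hosts * hours * HOURLY_RATE_USD, 2)

print(monthly_cost(1))  # one host deployed all month
```

Since billing stops when the host is deleted, JIT-style deployments (spin up Bastion only while maintaining a VM) can cut this number substantially.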

Bastion Roadmap Items

Since Bastion is still in preview, there are a number of things that Microsoft is adding to its feature set. These include:

  1. Single-Sign-On with Azure AD
  2. Multi-Factor Auth
  3. vNet Peering (Not confirmed, but being HEAVILY requested by the community right now)

vNet Peering support would make it so that only a single Bastion Host in a Hub or Security vNet is needed.

You can see additional feature requests or submit your own via the Microsoft Feedback Forum.

If you like a feature request or want to push your own, keep an eye on the votes. The more votes a piece of feedback has, the more likely Microsoft is to work on the feature.

Additional Documentation and Wrap-Up

Additional documentation can be found on the Azure Bastion Sales Page.

Finally, I’d like to wrap up by finding out what you think of Azure Bastion. Do you think this is a worthy feature? Is this something that you’ll be putting into production once the feature is out of preview? Any issues you currently see with it today? Let us know in the comments section below!

Finally, if you’re interested in learning more about Azure security issues, why not watch our webinar session on Azure Security Center? Presented by Thomas Maurer from the Azure Engineering Team, it will teach you all about this important security package and how you should be using it to ensure your Azure ecosystem is fully protected!


Thanks for reading!

Author: Florian Klaffenbach

Capital One breach suspect may have hit other companies

A new report looking into the attacker accused in the Capital One breach discovered references to other potential victims, but no corroborating evidence has been found yet.

The FBI accused Paige Thompson of the attack; she allegedly went by the name “Erratic” on various online platforms, including an invite-only Slack channel. The Slack channel was first reported on by investigative cybersecurity journalist Brian Krebs, who pointed out that file names referenced in the channel suggested other organizations were potentially victims of similar attacks.

A new report on the Capital One breach by London-based cybersecurity firm CyberInt built on the information Krebs discovered. Jason Hill, lead cybersecurity researcher at CyberInt, said the company was able to gain access to the Slack channel via an open invitation link.

“This link was obtained from the now-offline ‘Seattle Warez Kiddies’ Meetup group (Listed as ‘Organized by Paige Thomson’),” Hill wrote via email. “Based on the publicly available information at the time of report completion, such as Capital One’s statement and the [FBI’s] Criminal Complaint, we were able to conduct open source intelligence gathering to fill in some of the missing detail and follow social media leads to gain an understanding of the alleged threat actor and their activity over the past months.”

According to Hill, CyberInt researchers followed the trail through a GitHub account, GitLab page and a screenshot of a file archival process shared in the Slack channel.

“The right-hand side of the screen appears to show the output of the Linux command ‘htop’ that lists current processes being executed. In this case, under the ‘Command’ heading, we can see a number of ‘tar –remove-files -cvf – ‘ processes, which are compressing data (and then removing the uncompressed source),” Hill wrote. “These files correlate with the directory listing, and potential other victims, as seen later within the Slack channel.”
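For readers unfamiliar with the flags, the `tar --remove-files -cvf -` pattern in that output archives data and deletes the uncompressed source as it goes, streaming the archive to stdout for further processing. A minimal, harmless demonstration on throwaway files (GNU tar; the file names here are invented):

```shell
# Create a throwaway directory with a sample file
mkdir -p demo_dir
echo "sample data" > demo_dir/file.txt

# Archive the directory; --remove-files deletes each source file once added.
# (The command in the screenshot used "-cvf -" to stream to stdout instead.)
tar --remove-files -cvf demo.tar demo_dir

ls demo.tar                                    # the archive remains
ls demo_dir/file.txt 2>/dev/null || echo "source file removed"
```

This explains why the researchers could correlate the running processes with the directory listing: the compressed archives were all that remained of the collected data.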

Between the files named in the screenshot and the corresponding messages in the Slack channel, it appeared as though in addition to the Capital One breach, the threat actor may have stolen 485 GB of data from various other organizations. Some organizations were implied by only file names, such as Ford, but others were named directly by Erratic in messages, including the Ohio Department of Transportation, Michigan State University, Infoblox and Vodafone.

Hill acknowledged that CyberInt did not directly contact any of the organizations named, because the company policy is normally to “contact organizations when our research detects specific vulnerabilities that can be mitigated, or threats detected by our threat intelligence platform.

“However in this case, our research was focused on the Capital One breach to gain an understanding of the threat actor’s tactics, techniques and procedures (TTP) and resulted in the potential identification of additional victims rather than the identification of any specific vulnerability or ongoing threat,” Hill wrote. “Our report offered general advice for those concerned about the TTP based on these findings.”

We contacted some of the organizations either directly named or implied via file name in Erratic’s Slack channel. The Ohio Department of Transportation did not respond to a request for comment. Ford confirmed an investigation is underway to determine if the company was the victim of a data breach.

A spokesperson for Michigan State University also confirmed an investigation is underway and the university is cooperating with law enforcement authorities, but at this point there is “no evidence to suggest MSU was compromised.”

Similarly, an Infoblox spokesperson said the company was “continuing to investigate the matter, however, at this time, there is no indication that Infoblox was in any way involved with the Capital One breach. Additionally, there is no indication of an intrusion or data breach causing Infoblox customer data to be exposed.”

A Vodafone spokesperson claimed the company takes security seriously, but added, “Vodafone is not aware of any information that relates to the Capital One security breach.”
