Tag Archives: where

Disable SMB1 before the next WannaCry strikes

One persistent source of cyberattacks is the outdated SMB protocol. The first step to disable SMB1 on the network is to find where it lives.

Server Message Block (SMB) is a network file sharing protocol used to discover resources and transfer files across the network. SMB1 dates back to the mid-1990s, and Microsoft regularly updates the SMB protocol to address evolving encryption and security needs. In 2016, Microsoft introduced SMB 3.1.1, which is the current version at the time of publication.

But SMB1 still lingers in data centers. Many administrators, as well as third-party storage and printer vendors, haven’t kept up with the new SMB versions — they either default to SMB1 or don’t support the updates to the protocol.

Meanwhile, attackers exploit the weaknesses in the SMB1 protocol to harm the enterprise. In January 2017, the U.S. Computer Emergency Readiness Team urged businesses to disable SMB1 and block all SMB traffic at network boundaries. Several ransomware attacks followed the warning. WannaCry and Petya both used the EternalBlue SMB1 exploit to encrypt data and torment systems administrators. In the fallout, Microsoft issued several SMB-related security updates and even released patches for unsupported client and server systems. With the fall 2017 Windows updates, Microsoft stopped installing SMB1 by default in Windows 10 version 1709 and Windows Server, version 1709.

Here are some ways to identify where SMB1 is active in your systems and how to disable it.

Use Microsoft Message Analyzer to detect SMB1

Microsoft Message Analyzer is a free download from Microsoft that detects SMB1-style communications. Message Analyzer traces inbound and outbound activity from different systems on the network.

[Figure: The free Microsoft Message Analyzer utility captures network traffic to discover where SMB1 communications appear across the network.]

The admin applies filters in Message Analyzer to sift through traffic; in this case, the filter is SMB. Message Analyzer checks for markers of SMB1 transactions and pinpoints the traffic’s source and destination. Here’s a sample of captured network traffic that indicates a device that uses SMB1:

ComNegotiate, Status: STATUS_SUCCESS, Selected Dialect: NT LM 0.12, Requested Dialects: [PC NETWORK PROGRAM 1.0, LANMAN1.0, Windows for Workgroups 3.1a, LM1.2X002, LANMAN2.1, NT LM 0.12]

A reference to outdated technologies, such as Windows for Workgroups and LAN Manager, indicates SMB1. Anything that communicates with a Windows network — such as copiers, multifunction printers, routers, switches, appliances and storage devices — could be the culprit still on SMB1.
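
On Windows file servers, the built-in SMB cmdlets can corroborate the Message Analyzer findings and list the clients that still connect with SMB1. The following is a minimal sketch, assuming Windows Server 2016 or later, where the SMB1 access auditing switch and the SMBServer audit log are available; run it in an elevated PowerShell session.

# Turn on auditing of SMB1 connections to this server
Set-SmbServerConfiguration -AuditSmb1Access $true -Force

# Later, list clients that connected with SMB1 (event ID 3000 in the SMBServer audit log)
Get-WinEvent -LogName 'Microsoft-Windows-SMBServer/Audit' |
    Where-Object Id -eq 3000 |
    Select-Object TimeCreated, Message

# Active sessions also expose the negotiated dialect; anything below 2.0 is SMB1
Get-SmbSession | Select-Object ClientComputerName, ClientUserName, Dialect

Leave auditing on for at least a full business cycle so that weekly and monthly jobs that rely on SMB1 have a chance to show up.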


There are three options to remove SMB1 from these devices: turn off SMB1 support, change the protocol or, in extreme cases, remove the equipment permanently from the network.

Use Message Analyzer to find references to later dialects in the requested dialects list, such as SMB 2.002 and SMB 2.???. These entries indicate systems and services that default to SMB1, most likely to provide maximum compatibility with other devices and systems on the network, but that can use later SMB versions if SMB1 is not available.
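
For Windows hosts that show up in this inventory, the first option, turning off SMB1 support, can be scripted with built-in cmdlets. A minimal sketch, assuming Windows Server 2012 R2 / Windows 8.1 or later and an elevated PowerShell session (a restart may be required):

# Stop the server service from speaking SMB1
Set-SmbServerConfiguration -EnableSMB1Protocol $false -Force

# Remove the SMB1 feature entirely on Windows Server 2012 R2 and later
Uninstall-WindowsFeature -Name FS-SMB1

# On Windows 8.1 and Windows 10 clients, remove the optional feature instead
Disable-WindowsOptionalFeature -Online -FeatureName SMB1Protocol -NoRestart

Test after each host, because anything that still negotiates only the SMB1 dialects shown above will lose access once the protocol is removed.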

Evaluate with DSC Environment Analyzer

Desired State Configuration Environment Analyzer (DSCEA) is a PowerShell tool that uses DSC to see if systems comply with the defined configuration. DSCEA requires PowerShell 5.0 or higher.

DSC works in positive statements: because the goal is to disable SMB1, we have to build a DSC statement that describes the desired state, a system with SMB1 already disabled. By process of elimination, DSCEA then generates a report of systems that failed to meet that requirement; these are the systems that still have SMB1 enabled.

Microsoft provides a more detailed guide to write a configuration file that uses DSCEA to find SMB1 systems.
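
As an illustration only (Microsoft’s guide is authoritative), such a configuration might assert the documented LanmanServer registry setting that disables SMB1 and then feed the compiled MOF to DSCEA. The DSCEA cmdlet names below come from the DSCEA module on the PowerShell Gallery; the computer names and the report item name are placeholders.

# State the positive condition: SMB1 is disabled on the server service.
# DSCEA reports any system that fails this test, i.e., still has SMB1 enabled.
Configuration SMB1Audit {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node localhost {
        Registry DisableSMB1 {
            Key       = 'HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters'
            ValueName = 'SMB1'
            ValueData = '0'
            ValueType = 'Dword'
            Ensure    = 'Present'
        }
    }
}

# Compile the configuration to a MOF file, then scan a list of systems with DSCEA
SMB1Audit -OutputPath .\SMB1Audit
Install-Module -Name DSCEA              # one-time install from the PowerShell Gallery
Start-DSCEAscan -MofFile .\SMB1Audit\localhost.mof -ComputerName server01, server02
Get-DSCEAreport -ItemName DisableSMB1   # hypothetical item name; see the DSCEA docs for report options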

Identify the perpetrators

To make this detective work less burdensome, Microsoft has a list of products that still require the SMB1 protocol. Some of the products are older and out of support, so don’t expect updates that add support for the latest version of SMB.

TD builds on its reputation for excellent customer service with digital banking services powered by the Microsoft Cloud

TD Bank Group (TD) is sharply focused on building the bank of the future. A future where digital is one of the core driving forces of its transformation journey; where data provides insights into the bank’s customer beliefs, needs and behaviors; and where technology will be the centerpiece of the bank’s delivery model.

In a short time, the bank has made tremendous progress. While TD continues to make the necessary investments in its digital transformation, it does so with the customer at the center. TD has always delivered spectacular in-person customer experiences – that’s how it became the sixth largest bank in North America.

Phrases like artificial intelligence, big data and cloud services didn’t exist in the industry several years ago, but now they’re part of everyday discussions across TD. The bank’s digital and data-driven transformation allows more meaningful and personal engagements with customers, fuels application development, and informs branch and store service delivery by gathering insights to better serve customers the way they want to be served, with precision and close attention to their specific needs.


TD generates close to 100 million digital records daily and has more than 12 million digitally active customers. With the Microsoft Cloud to help harness that data, TD can deliver on its promise of legendary service at every touchpoint.

“After all, we’re talking about people’s money,” says Imran Khan, vice president of Digital Customer Experience at TD. “No one gets up in the morning and says, ‘I want a mortgage or a new credit card.’ They say, ‘I want to own a home, invest in my children’s education, start a business, take a holiday with my family, plan for a happy and secure retirement.’”

TD knew early on that to innovate quickly it required a flexible platform that harnessed customer data and delivered actionable insights. With Microsoft Azure and data platform services providing the power and intelligent capabilities TD was in search of, the financial institution continues to live up to its rich reputation.

New advancements in Azure for IT digital transformation

I’m at Ignite this week, where more than 20,000 of us are talking about how we can drive our businesses forward in a climate of constant technology change. We are in a time where technology is one of the core ways companies can better serve customers and differentiate versus competitors. It is an awesome responsibility. The pace of change is fast, and constant – but with that comes great opportunity for innovation, and true business transformation.

Here at Microsoft, our mission is to empower every person and every organization on the planet to achieve more. I believe that mission has a special meaning for the IT audience, particularly in the era of cloud computing. Collectively, we are working with each of you to take advantage of new possibilities in this exciting time. That’s the reason we are building Azure – for all of you. The trusted scale and resiliency of our Azure infrastructure, the productivity of our Azure services for building and delivering modern applications, and our unmatched hybrid capabilities are the foundation that can help propel your business forward. With 42 regions announced around the world and an expansive network spanning more than 4,500 points of presence, we’re the backbone for your business.

Core Infrastructure

Cloud usage goes far beyond the development and test workloads people originally started with. Enterprises are driving a second wave of cloud adoption, including putting their most mission-critical, demanding systems in the cloud. We are the preferred cloud for the enterprise, with more than 90% of the Fortune 500 choosing the Microsoft cloud. Today at Ignite, we’re making several announcements about advancements in Azure infrastructure:

  • New VM sizes. We continue to expand our compute options at a rapid rate. In my general session, I will demonstrate SAP HANA running on both M-series and purpose-built infrastructure, the largest of their kind in the cloud. I will discuss the preview of the B-series VM for burstable workloads, and announce the upcoming Fv2-, NCv2- and ND-series, which offer the innovation of new processor types like Intel’s Xeon Scalable processors and NVIDIA’s Tesla P100 and P40 GPUs.
  • The preview of Azure File Sync, offering secure, centralized file share management in the cloud. This new service provides more redundancy and removes complexity when it comes to sharing files, eliminating the need for special configuration or code changes.
  • A new enterprise NFS service, powered by NetApp. Building on the partnership with NetApp announced in June, Microsoft will deliver a first-party, native NFS v3/v4 service based on NetApp’s proven ONTAP® and other hybrid cloud data services, with preview available in early 2018. This service will deliver enterprise-grade data storage, management, security, and protection for customers moving to Microsoft Azure. We will also enable this service to advance hybrid cloud scenarios, providing visibility and control across Azure, on-premises and hosted NFS workloads. 
  • The preview of a new Azure networking service called Azure DDoS Protection, which helps protect publicly accessible endpoints from distributed denial of service (DDoS) attacks. Azure DDoS Protection learns an application’s normal traffic patterns and automatically applies traffic scrubbing when attacks are detected to ensure only legitimate traffic reaches the service.
  • The introduction of two new cloud governance services – Azure Cost Management and Azure Policy – to help you monitor and optimize cloud spend and cloud compliance. We are making Azure Cost Management free for Azure customers, and you can sign up now for a preview of Azure Policy. 
  • Integration of the native security and management experience. New updates in the Azure portal simplify the process of backing up, monitoring, and configuring disaster recovery for virtual machines. We are also announcing update management will now be free for Azure customers.
  • A preview of the new Azure Migrate service, which helps discover and migrate virtual machines and servers. The new service captures all on-premises applications, workloads, and data, and helps map migration dependencies over to Azure, making IT’s jobs immensely easier. Azure Migrate also integrates with the Database Migration Services we released today.
  • A preview of the new Azure Data Box, which provides a secure way to transfer very large datasets to Azure. This integrates seamlessly with Azure services like Backup and Site Recovery as well as partner solutions from CommVault, NetApp, Veritas, Veeam, and others.

Building on the news from last week about the preview of Azure Availability Zones, later today I will also talk about the unique measures we are taking in Azure to help customers ensure business continuity. As the only cloud provider with single-VM SLAs and 21 announced region pairs for disaster recovery, Azure offers differentiated, rich high availability and disaster recovery capabilities. This means you have the best support, resiliency, and availability for your mission-critical workloads.

Modern Applications

Applications are central to every digital transformation strategy. One of the compelling and more recent technologies that is helping in the modernization of applications is containers. Having received more attention from developers to date, containers are now accelerating application deployment and streamlining the way IT operations and development teams collaborate to deliver applications. Today we are announcing even more exciting advancements in this space:

  • Windows Server containers were introduced with Windows Server 2016. The first Semi-Annual Channel release of Windows Server, version 1709, introduces further advances in container technology, including an optimized Nano Server Container image (80% smaller!), new support for Linux containers on Hyper-V, and the ability to run native Linux tools with the Windows Subsystem for Linux (aka Bash for Windows).
  • Azure supports containers broadly, offering many options to deploy, from simple infrastructure to richly managed. Azure Container Instances (ACI) provide the simplest way to create and deploy new containers in the cloud with just a few simple clicks, and today I’m announcing Azure Container Instances now support Windows Server in addition to Linux.
  • Azure Service Fabric offers a generalized hosting and container orchestration platform designed for highly scalable applications, and today we are announcing the general availability of Linux support.

Hybrid Cloud

Nearly 85 percent of organizations tell us they have a cloud strategy that is hybrid, and even more – 91 percent – say they believe that hybrid cloud will be a long-term approach. Hybrid cloud capabilities help you adopt the cloud faster. What is unique about Microsoft’s hybrid cloud approach is that we build consistency between on-premises and the cloud. Consistency helps take the complexity out of hybrid cloud because it means you don’t need two different systems for everything. We build that consistency across identity, data, development, and security and management. Today we’re advancing our hybrid cloud leadership even further via the following developments:

  • Azure Stack is now shipping from our partners Dell EMC, Hewlett Packard Enterprise (HPE) and Lenovo. You can see all of these integrated systems on the show floor at Ignite. As an extension of Azure, Azure Stack brings the agility and fast-paced innovation of cloud computing to on-premises environments. Only Azure Stack lets you deliver Azure services from your organization’s datacenter, while balancing the right amount of flexibility and control – for truly consistent hybrid cloud deployments.
  • Our fully managed Azure SQL Database service now has 100 percent SQL Server compatibility for no code changes via Managed Instance. And today, we are introducing a new Azure Database Migration Service that enables a near-zero downtime migration. Now customers can migrate all of their data to Azure without hassle or high cost.
  • Azure Security Center can now be used to secure workloads running on-premises and in other clouds. Today we’re also releasing new capabilities to better defend against threats and respond quickly, including Just in Time (JIT) access, dynamic app whitelisting, and the ability to drill down into an attack end to end with interactive investigation paths and mapping.

Beyond all of the product innovation above, one of the areas I’m proud of is the work we’re doing to save customers money. For example, the Azure Hybrid Benefit for Windows Server and the newly announced Azure Hybrid Benefit for SQL Server allow customers to use their existing licenses to get discounts in Azure, making Azure the most economical choice and path to the cloud for these customers. Together with the new Azure Reserved VM Instances we just announced, customers will be able to save up to 82 percent on Windows Server VMs. The free Azure Cost Management capabilities I mentioned above help customers save money by optimizing how they run things in Azure. And we are now offering the new Azure free account which introduces the free use of many popular services for 12 months, in addition to the $200 free credit we provide.

It’s an exciting time for IT, and we’re equally excited that you are our trusted partners in this era of digital transformation. I look forward to hearing your questions or feedback so that we can further your trust in us and empower each of you to achieve more.

For Trade – 2 x Apple Airport Extreme (AC) & 16GB Kingston 2666 WANTED: Ubiquiti AP

Would you take £35 for your UAP delivered?

I assume you’re nowhere near me for collection.

I got my Pro installed last night and I’m very impressed. I have coverage in my whole house now, but I could still make use of your spare unit on the garage wall, which is behind my lounge, so tablets can use that.

I know you don’t want to lose out, but you can send it by pigeon to save money. I’m in no rush.


DevOps tools training sparks IT productivity

Enterprises have a new weapon to combat the IT skills shortage where new hiring and training practices fall short.

Most IT pros agree the fastest path to IT burnout is what Amazon engineers have termed “undifferentiated heavy lifting”: repetitive, uninteresting work that has little potential for wider impact beyond keeping the lights on. DevOps tools training, which involves IT automation practices, can reduce or eliminate such mundane work and can compensate for staff shortages and employee attrition.

“Automation tools aren’t used to eliminate staff; they’re used to help existing staff perform at a higher level,” said Pete Wirfs, a programmer specialist at SAIF Corp., a not-for-profit workers’ compensation insurance company in Salem, Ore., that has used Automic Software’s Automation Engine to orchestrate scripts.

The company has used Automation Engine since 2013, but last year, it calculated that new application development would add hundreds of individual workflows to the IT operations workload. Instead, Wirfs said, he found a way to automate database queries and use the results to kick off scripts, so a single centralized workflow could meet all the project’s needs.
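
The article doesn’t show SAIF’s actual workflow, but the pattern it describes is straightforward: a central job polls a database table for pending work and kicks off the matching script for each row. Here is a generic sketch in PowerShell using the SqlServer module; the server, database, table and script names are all hypothetical placeholders, not SAIF’s implementation.

# Generic sketch of a database-driven workflow; all names below are placeholders.
Import-Module SqlServer

# Query the database for work that is ready to run
$pending = Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'OpsJobs' -Query @"
SELECT JobId, ScriptName FROM dbo.PendingJobs WHERE Status = 'Ready'
"@

foreach ($job in $pending) {
    # Kick off the script associated with this job, then mark the row complete
    & (Join-Path 'C:\OpsScripts' $job.ScriptName)

    Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'OpsJobs' `
        -Query "UPDATE dbo.PendingJobs SET Status = 'Done' WHERE JobId = $($job.JobId)"
}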

As a result, SAIF has expanded its IT environment exponentially over the last four years with no additional operations staff. The data center also can run lights-out for a few hours each night, with automation scripts set up to handle monitoring and health checks and to route alerts to the appropriate contacts when necessary. No IT ops employees work on Sundays at SAIF at all.

“There’s no end to what we can find to automate,” Wirfs said.

DevOps tools training standardizes IT processes

SAIF’s case illustrates an important facet of DevOps tools training: standardization of a company’s tools and workflows. A move from monoliths to microservices can make an overall system more complex, but individual components become similar, repeatable units that are easier to understand, maintain and troubleshoot.

“The monoliths of the early 2000s were very complicated, but now, people are a lot more pragmatic,” said Nuno Pereira, CTO of iJET International, a risk management company in Annapolis, Md. “DevOps has given us a way to keep component complexity in check.”

With modern monitoring systems, DevOps tools training can also curtail the notifications that bombard IT operations pros by centralizing them in tools such as Cisco’s AppDynamics and LogicMonitor. These are popular among DevOps shops because they boost the signal-to-noise ratio of highly instrumented and automated environments, and they establish a standardized common ground for collaborative troubleshooting.

“[With] LogicMonitor, [we can] capture data and make it easily viewable so that different disciplines of IT can speak the same language across skill sets,” said Andy Domeier, director of technology operations at SPS Commerce, a communications network for supply chain and logistics businesses based in Minneapolis.

Four or five years ago, problems in the production infrastructure weren’t positively identified for an average of about 30 minutes per incident, Domeier said. Now, within one to two minutes, DevOps personnel can determine there is a problem, with an average recovery time of 10 to 15 minutes, he estimated.

Standardization has been key to keeping up with ever-bigger web-scale infrastructure at DevOps bellwethers such as Google.

“If every group in a company has a different set of technologies, it is impossible to make organizationwide changes that lift all boats,” said Ben Sigelman, who built Dapper, a distributed tracing utility Google uses to monitor distributed systems. Google maintains one giant source-code repository, for example, which means any improvement immediately benefits the entire Google codebase.

“Lack of standardization is an impediment to DevOps, more than anything else,” Sigelman said.

Google has standardized on open source tools, which offer common platforms that can be used and developed by multiple companies, and this creates another force-multiplier for the industry. Sigelman, now CEO of a stealth startup called LightStep, said DevOps tools training has started to have a similar effect in the mainstream enterprise.

Will AI help?

DevOps tools training can go a long way to help small IT teams manage big workloads, but today’s efficiency improvements have their limits. Already, some tools, such as Splunk Insights, use adaptive machine-learning algorithms to give the human IT pro’s brain an artificial intelligence (AI) boost — a concept known as AIOps.

“The world is not going to get easier,” said Rick Fitz, senior vice president of IT markets for Splunk, based in San Francisco. “People are already overwhelmed with complexity and data. To get through the next five to 10 years, we have to automate the mundane so people can use their brains more effectively.”


Strong enthusiasm for AIOps has spread throughout the industry. Today’s analytics products, such as Splunk, use statistics to predict when a machine will fail or the broader impact of a change to an IT environment. However, AIOps systems may move beyond rules-based systems to improve on those rules or gain insights humans won’t come up with on their own, said Brad Shimmin, analyst with GlobalData PLC, headquartered in London. Groups of companies will share data the way they share open source software development today and enhance the insights AIOps can create, he predicted.

The implications for AIOps are enormous. Network intrusion detection is just one of the many IT disciplines experts predict will change with AIOps over the next decade. AIOps may be able to detect attack signatures or malicious behavior in users that humans and today’s systems cannot detect — for example, when someone hijacks and maliciously uses an end-user account, even if the end user’s identifier and credentials remain the same.

But while AIOps has promise, those who’ve seen its early experimental implementations are skeptical that AIOps can move beyond the need for human training and supervision.

“AI needs a human being to tell it what matters to the business,” LightStep’s Sigelman said, based on what he saw while working at Google. “AI is a fashionable term, but where it’s most successful is when it’s used to sift through a large stream of data with user-defined filtering.”

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

Cloud App Security new auto-remediation feature

Immediate session log off for suspicious users

Real-time remediation of security threats is a key challenge for companies, because attackers can move quickly to access critical data. The Cloud App Security team is excited to introduce a new threat protection feature built on integration with Azure Active Directory: when a suspicious activity is identified in the Cloud App Security portal, you can now initiate an auto-remediation action that logs off the suspicious users and requires them to sign in again to Office 365 as well as to all apps accessed through Azure Active Directory.

Let’s explore two key reaction capabilities of this feature:

Respond to anomalous behavior

External sharing of sensitive files, download of sensitive files from unrecognized locations, or any activity that’s considered abnormal can trigger alerts in the Cloud App Security portal. These alerts provide immediate notification of potential security incidents and assist admins with proactive investigation.

In the event of suspicious user behavior, the new auto-remediation feature allows the security admin to take immediate action, triggering a revocation of all of the user’s sessions and requiring the user to sign in again to all apps.

React to account takeover

When an attacker gains unauthorized access to an account, a common industry practice is to disable the account. But this is not enough! If the account is actively being used to exfiltrate data, gain elevated privileges in the organization, or do anything else that keeps the attacker’s session active, the attacker can still use the compromised account.

The new Cloud App Security capability allows an admin to revoke the compromised account’s sessions and fully mitigate the attack. Cloud App Security invalidates all the user’s refresh tokens issued to cloud apps.
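
Under the hood, this relies on Azure Active Directory’s token revocation. For admins who want to take the same action by hand, here is a minimal sketch using the AzureAD PowerShell module; the user principal name is a placeholder, and this shows the equivalent manual step rather than the Cloud App Security internals.

# Revoke all refresh tokens for a user so every app session must re-authenticate.
# The user principal name below is a placeholder.
Import-Module AzureAD
Connect-AzureAD

$user = Get-AzureADUser -ObjectId 'compromised.user@contoso.com'
Revoke-AzureADUserAllRefreshToken -ObjectId $user.ObjectId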

How to implement this feature

Requiring the user to sign in again can be set during the policy creation phase or initiated directly from an alert as part of the resolution options for a user. Initiating governance actions directly from the policy allows for automatic remediation. In this case, the admin needs only to select this option and it will be enforced.

[Image: Policy setting: require the user to sign in again]

Alternatively, an admin can choose to require another sign-in as part of the reactive investigation of an alert, as seen below. In either case, to ensure secure productivity, the user is protected and can continue working with minimal interruption.

[Image: Require the user to sign in again during the investigation of a specific alert]

Better together

Our goal is to provide a holistic and innovative security approach with Enterprise Mobility + Security. Together, Cloud App Security and Azure Active Directory offer unique value that helps you gain better control over your cloud by identifying suspicious activities that may indicate a breach and then responding immediately.

Learn more and give us feedback

We know how important visibility, control and threat protection are for you, especially when it comes to cloud apps. Our goal is to continuously innovate to provide a top-notch user experience, visibility, data control and threat protection for your cloud apps. If you would like to learn more about our solution, please visit our technical documentation page.

We’d also love to hear your feedback. If you have any questions, comments or feedback, please leave a comment or visit our Microsoft Cloud App Security Tech Community page.