
WD My Cloud Mk1 enclosure

My WD My Cloud has developed a fault where it falls off the network every few hours, needing a hard reboot. I’ve tested the drive and it’s in perfect condition, so it must be an overheating issue with the board.

Any road up, I need a new enclosure. If you have one you don’t require any longer, let me know. It must be a Mk1 version with the shiny silver enclosure and not the dull grey one.

Cheers.

Location: Belfast, N. Ireland…


NVMe flash storage doesn’t mean tape and disk are dying

Not long ago, a major hardware vendor invited me to participate in a group chat where we would explore the case for flash storage and software-defined storage. On the list of questions sent in advance was that burning issue: Has flash killed disk? Against my better judgment, I accepted the offer. Opinions being elbows, I figured I had a couple to contribute.

I joined a couple of notable commentators from the vendor’s staff and the analyst community, who I presumed would echo the talking points of their client like overzealous high school cheerleaders. I wasn’t wrong.

Shortly after it started, I found myself drifting from the nonvolatile memory express (NVMe) flash storage party line. I also noted that software-defined storage (SDS) futures weren’t high and to the right in the companies I was visiting, despite projections by one analyst of 30%-plus growth rates over the next couple years. Serious work remained to be done to improve the predictability, manageability and orchestration of software-defined and hyper-converged storage, I said, and the SDS stack itself needed to be rethought to determine whether the right services were being centralized.

Yesterday’s silicon tomorrow

I also took issue with the all-silicon advocates, stating my view that NVMe flash storage might just be “yesterday’s silicon storage technology tomorrow,” or at least a technology in search of a workload. I wondered aloud whether NVMe, the shiny new thing, might soon be usurped by capacitor-backed dynamic RAM (DRAM) that’s significantly less expensive and faster. DRAM also has much lower latency than NVMe flash storage because it’s directly connected to the memory channel rather than the PCIe bus or a SAS or SATA controller.

The vendor tried to steer me back into the fold, saying “Of course, you need the right tool for the right job.” Truer words were never spoken. I replied that silicon storage was part of a storage ecosystem that would be needed in its entirety if we were to store the zettabytes of data coming our way. The vendor liked this response since the company had a deep bench of storage offerings that included disk and tape.

I then took the opportunity to further press the notion that disk isn’t dead any more than tape is dead, despite increasing claims to the contrary. (I didn’t share a still developing story around a new type of disk with a new form factor and new data placement strategy that could buy even more runway for that technology. For now, I am sworn to secrecy, but once the developers give the nod, readers of this column will be the first to know.)

I did get some pushback from analysts about tape, which they saw as rendered completely obsolete in the next-generation, all-silicon data center. I could have pushed them over to Quantum Corp. for another view.

The back story

A few columns back, I wrote something about Quantum exiting the tape space based on erroneous information from a recently released employee. I had to issue a retraction, and I contacted Quantum and spoke with Eric Bassier, senior director of data center products and solutions, who set the record straight. Seems Quantum — like IBM and Spectra Logic — is excited about LTO-8 tape technology and how it can be wed to the company’s Scalar tape products and StorNext file system.

Bassier said Quantum was “one of only a few storage companies [in 2016] to demonstrate top-line growth and profitability,” and its commitment to tape was not only robust, it was also winning new customers seeking to scale out capacity. A dense enterprise tape library, the Scalar i6000 offers 11,000 or more slots, a dual robot and as many as 24 drives in a single 19-inch rack frame, all managed with web services using representational state transfer, or RESTful, API calls.


Quantum was also hitting the market with a 3U rack-mountable, scalable library capable of delivering 150 TB of uncompressed LTO-7 tape storage or 300 TB of uncompressed LTO-8 storage for backup, archive or additional secondary storage for less frequently used files and objects. Add compression and you more than double those capacity numbers. That, Bassier asserted, is more data than many small and medium-sized companies generate in a year.
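As a quick sanity check on that doubling claim: LTO-7 and LTO-8 capacities are typically quoted at a 2.5:1 compression ratio, so compressed capacity is just native capacity times the ratio. A minimal sketch (the function name is mine; the 2.5:1 figure assumes typically compressible data, and real results vary):

```python
# Illustrative capacity math: compressed capacity = native capacity * ratio.
# LTO-7/LTO-8 marketing figures assume a 2.5:1 compression ratio; actual
# results depend entirely on how compressible the data is.
def compressed_tb(native_tb, ratio=2.5):
    return native_tb * ratio

print(compressed_tb(150))  # LTO-7 configuration: 375.0 TB
print(compressed_tb(300))  # LTO-8 configuration: 750.0 TB
```

At 2.5:1, "more than double" holds for both configurations.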

Disk also has a role in Quantum’s world; its DXi product provides data deduplication that’s a significant improvement over the previous-generation model. It offers performance and density improvements through the application of SSDs and 8 TB HDDs, as well as a reduction in power consumption.

All the storage buckets

Quantum, like IBM and Spectra Logic, is articulating a product strategy that has fingers in all the popular buckets, including tape, disk and NVMe flash storage. After years of burying its story by supplying OEM products that other vendors branded as their own, Quantum now derives 90% of its revenue from its own brand.

Bottom line: We might eventually get to an all-silicon data center. In the same breath, I could say that we might eventually get that holographic storage the industry has promised since the Kennedy administration. For planning 2018, your time is better spent returning to basics. Instead of going for the shiny new thing, do the hard work of understanding your workload, then architect the right combination of storage and software to meet your needs. Try as you might to find it, a horizontal storage technology (one size fits most) with simple orchestration and administration remains elusive.

That’s my two elbows.

The top Exchange and Office 365 tutorials of 2017

Even in the era of Slack and Skype, email remains the communication linchpin for business. But where companies run email is changing.

In July 2017, Microsoft said, for the first time, its cloud-based Office 365 collaboration platform brought in more revenue than traditional Office licensing. In October 2017, Microsoft said it had 120 million commercial subscribers using its cloud service.

This trend toward the cloud is reflected by the heavy presence of Office 365 tutorials in this compilation of the most popular tips of 2017 on SearchExchange. More businesses are interested in moving from a legacy on-premises server system to the cloud — or at least a new version of Exchange.

The following top-rated Office 365 tutorials range from why a business would use an Office 365 hybrid setup to why a backup policy is essential in Office 365.

5. Don’t wait to make an Office 365 backup policy

Microsoft does not have a built-in backup offering for Office 365, so admins have to create a policy to make sure the business doesn’t lose its data.

Admins should work down a checklist to ensure email is protected if problems arise:

  • Create specific plans for retention and archives.
  • See if there are regulations for data retention.
  • Test backup procedures in Office 365 backup providers, such as Veeam and Backupify.
  • Add alerts for Office 365 backups.

4. What it takes to convert distribution groups into Office 365 Groups

Before the business moves from its on-premises email system to Office 365, admins must look at what’s involved to turn distribution groups into Office 365 Groups. The latter is a collaborative service that gives access to shared resources, such as a mailbox, calendar, document library, team site and planner.

Microsoft provides conversion scripts to ease the switch, but they might not work in every instance. Many of our Office 365 tutorials cover these types of migration issues. This tip explains some of the other obstacles administrators encounter with Office 365 Groups and ways around them.

3. Considerations before a switch to Office 365

While Office 365 has the perk of lifting some work off IT’s shoulders, it does have some downsides. A move to the cloud means the business will lose some control over the service. For example, if Office 365 goes down, there isn’t much an admin can do if it’s a problem on Microsoft’s end.

Businesses also need to keep a careful eye on what exactly they need from licensing, or they could end up paying far more than they should. And while it’s tempting to immediately adopt every new feature that rolls out of Redmond, Wash., the organization should plan ahead to determine training for both the end user and IT department to be sure the company gets the most out of the platform.

2. When a hybrid deployment is the right choice

A clean break from a legacy on-premises version of Exchange Server to the cloud sounds ideal, but it’s not always possible due to regulations and technical issues. In those instances, a hybrid deployment can offer some benefits of the cloud, while some mailboxes remain in the data center. Many of our Office 365 tutorials assist businesses that require a hybrid model to contend with certain requirements, such as the need to keep certain applications on premises.

1. A closer look at Exchange 2016 hardware

While Microsoft gives hardware requirements for Exchange Server 2016, its guidelines don’t always mesh with reality. For example, Microsoft says companies can install Exchange Server 2016 on a 30 GB system partition. But to support the OS and updates, businesses need at least 100 GB for the system partition.

A change from an older version of Exchange to Exchange 2016 might ease the burden on the storage system, but increase demands on the CPU. This tip explains some of the adjustments that might be required before an upgrade.

Disable SMB1 before the next WannaCry strikes

The outdated SMB1 protocol remains one of the most common vectors for cyberattacks. The first step to disable SMB1 on the network is to find where it lives.

Server Message Block (SMB) is a network protocol used to discover resources and transfer files across the network. SMB1 dates back to the mid-1990s, and Microsoft has regularly updated the SMB protocol since then to address evolving encryption and security needs. In 2016, Microsoft introduced SMB 3.1.1, the current version at the time of publication.

But SMB1 still lingers in data centers. Many administrators, as well as third-party storage and printer vendors, haven’t kept up with the new SMB versions — they either default to SMB1 or don’t support the updates to the protocol.

Meanwhile, attackers exploit the weaknesses in the SMB1 protocol to harm the enterprise. In January 2017, the U.S. Computer Emergency Readiness Team urged businesses to disable SMB1 and block all SMB traffic at network boundaries. Several ransomware attacks followed the warning. EternalBlue, WannaCry and Petya all used SMB1 exploits to encrypt data and torment systems administrators. In the fallout, Microsoft issued several SMB-related security updates and even issued patches for unsupported client and server systems. With the fall 2017 Windows updates, Microsoft disabled SMB1 by default in Windows 10 and Windows Server 2016.

Here are some ways to identify where SMB1 is active in your systems and how it can be disabled.

Use Microsoft Message Analyzer to detect SMB1

Microsoft Message Analyzer is a free download from Microsoft that detects SMB1-style communications. Message Analyzer traces inbound and outbound activity from different systems on the network.

[Figure: The free Microsoft Message Analyzer utility captures network traffic to discover where SMB1 communications appear across the network.]

The admin applies filters in Message Analyzer to sift through traffic; in this case, the admin filters on SMB. Message Analyzer checks for markers of SMB1 transactions and pinpoints the traffic’s source and destination. Here’s a sample of captured network traffic that indicates a device that uses SMB1:

ComNegotiate, Status: STATUS_SUCCESS, Selected Dialect: NT LM 0.12, Requested Dialects: [PC NETWORK PROGRAM 1.0, LANMAN1.0, Windows for Workgroups 3.1a, LM1.2X002, LANMAN2.1, NT LM 0.12]

A reference to outdated technologies, such as Windows for Workgroups and LAN Manager, indicates SMB1. Anything that communicates with a Windows network, such as copiers, multifunction printers, routers, switches, appliances and storage devices, could be a culprit still using SMB1.
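The legacy dialect names in that trace make this check easy to script. As a rough sketch of the logic (the dialect strings come from the capture above; the function name and sample data are illustrative, not part of Message Analyzer):

```python
# SMB1-era dialect names, as seen in the Requested Dialects field of the
# captured negotiate request shown above.
SMB1_DIALECTS = {
    "PC NETWORK PROGRAM 1.0",
    "LANMAN1.0",
    "Windows for Workgroups 3.1a",
    "LM1.2X002",
    "LANMAN2.1",
    "NT LM 0.12",
}

def smb1_only(requested_dialects):
    """True when every dialect a client offers is an SMB1-era dialect,
    meaning the device cannot negotiate SMB2 or later."""
    return all(d in SMB1_DIALECTS for d in requested_dialects)

legacy_capture = ["PC NETWORK PROGRAM 1.0", "LANMAN1.0",
                  "Windows for Workgroups 3.1a", "LM1.2X002",
                  "LANMAN2.1", "NT LM 0.12"]
print(smb1_only(legacy_capture))                  # True: SMB1-only device
print(smb1_only(legacy_capture + ["SMB 2.002"]))  # False: can fall back to SMB2
```

A device whose negotiate request offers only SMB1-era dialects must be fixed or removed; one that also offers SMB 2.x dialects merely defaults to SMB1 and will survive its removal.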


There are three options to remove SMB1 from these devices: turn off SMB1 support, change the protocol or, in extreme cases, remove the equipment permanently from the network.

Use Message Analyzer to find references in “requested dialects” to later versions, such as SMB 2.002 and SMB 2.???. These indicate systems and services that default to SMB1, most likely to provide maximum compatibility with other devices and systems on the network, but that can use later SMB versions if SMB1 is not available.

Evaluate with DSC Environment Analyzer

Desired State Configuration Environment Analyzer (DSCEA) is a PowerShell tool that uses DSC to see if systems comply with the defined configuration. DSCEA requires PowerShell 5.0 or higher.

DSC works in positive statements: because we want SMB1 disabled, we build a DSC statement that tests whether SMB1 is already disabled. By process of elimination, DSCEA then generates a report of the systems that failed to meet that requirement; those are the systems that still have SMB1 enabled.
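The elimination step itself is simple. A hypothetical sketch of what the DSCEA report boils down to (the host names, report structure and function name are made up for illustration; DSCEA’s actual output format differs):

```python
# Process of elimination: the DSC test asserts the positive condition
# "SMB1 is disabled." Hosts that FAIL that test are the ones that still
# have SMB1 enabled and need attention.
def smb1_enabled_hosts(compliance_report):
    """compliance_report maps hostname -> True if the host passed the
    'SMB1 disabled' test; return the hosts that failed it."""
    return sorted(host for host, compliant in compliance_report.items()
                  if not compliant)

report = {"fs01": True, "print02": False, "nas03": False, "web04": True}
print(smb1_enabled_hosts(report))  # ['nas03', 'print02']
```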

Microsoft provides a more detailed guide to write a configuration file that uses DSCEA to find SMB1 systems.

Identify the perpetrators

To make this detective work less burdensome, Microsoft has a list of products that still require the SMB1 protocol. Some of the products are older and out of support, so don’t expect updates that use the latest version of SMB.

TD builds on its reputation for excellent customer service with digital banking services powered by the Microsoft Cloud – Transform

TD Bank Group (TD) is sharply focused on building the bank of the future. A future where digital is one of the core driving forces of its transformation journey; where data provides insights into the bank’s customer beliefs, needs and behaviors; and where technology will be the centerpiece of the bank’s delivery model.

In a short time, the bank has made tremendous progress. While TD continues to make the necessary investments in its digital transformation, it does so with the customer at the center. TD has always delivered spectacular in-person customer experiences – that’s how it became the sixth largest bank in North America.

Phrases like artificial intelligence, big data and cloud services didn’t exist in the industry several years ago, but now they’re part of everyday discussions across TD. The bank’s digital and data-driven transformation allows more meaningful and personal engagements with customers, fuels application development, and informs branch and store service delivery by gathering insights to better serve customers the way they want to be served, with precision and close attention to their specific needs.


TD generates close to 100 million digital records daily and has more than 12 million digitally active customers. With the Microsoft Cloud to help harness that data, TD can deliver on its promise of legendary service at every touchpoint.

“After all, we’re talking about people’s money,” says Imran Khan, vice president of Digital Customer Experience at TD. “No one gets up in the morning and says, ‘I want a mortgage or a new credit card.’ They say, ‘I want to own a home, invest in my children’s education, start a business, take a holiday with my family, plan for a happy and secure retirement.’”

TD knew early on that to innovate quickly it required a flexible platform that harnessed customer data and delivered actionable insights. With Microsoft Azure and data platform services to help provide the power and intelligent capabilities TD was in search of, the financial institution continues to live up to its rich reputation.

New advancements in Azure for IT digital transformation

I’m at Ignite this week, where more than 20,000 of us are talking about how we can drive our businesses forward in a climate of constant technology change. We are in a time when technology is one of the core ways companies can better serve customers and differentiate themselves from competitors. It is an awesome responsibility. The pace of change is fast and constant, but with that comes great opportunity for innovation and true business transformation.

Here at Microsoft, our mission is to empower every person and every organization on the planet to achieve more. I believe that mission has a special meaning for the IT audience, particularly in the era of cloud computing. Collectively, we are working with each of you to take advantage of new possibilities in this exciting time. That’s the reason we are building Azure: for all of you. The trusted scale and resiliency of our Azure infrastructure, the productivity of our Azure services for building and delivering modern applications, and our unmatched hybrid capabilities are the foundation that can help propel your business forward. With 42 regions announced around the world and an expansive network spanning more than 4,500 points of presence, we’re the backbone for your business.

Core Infrastructure

Cloud usage goes far beyond the development and test workloads people originally started with. Enterprises are driving a second wave of cloud adoption, including putting their most mission-critical, demanding systems in the cloud. We are the preferred cloud for the enterprise, with more than 90% of the Fortune 500 choosing the Microsoft cloud. Today at Ignite, we’re making several announcements about advancements in Azure infrastructure:

  • New VM sizes. We continue to expand our compute options at a rapid rate. In my general session, I will demonstrate SAP HANA running on both M-series and purpose-built infrastructure, the largest of their kind in the cloud. I will discuss the preview of the B-series VM for burstable workloads, and announce the upcoming Fv2-, NCv2- and ND-series, which offer new processor types like Intel’s Xeon Scalable processors and NVIDIA’s Tesla P100 and P40 GPUs.
  • The preview of Azure File Sync, offering secure, centralized file share management in the cloud. This new service provides more redundancy and removes complexity when it comes to sharing files, eliminating the need for special configuration or code changes.
  • A new enterprise NFS service, powered by NetApp. Building on the partnership with NetApp announced in June, Microsoft will deliver a first-party, native NFS v3/v4 service based on NetApp’s proven ONTAP® and other hybrid cloud data services, with preview available in early 2018. This service will deliver enterprise-grade data storage, management, security, and protection for customers moving to Microsoft Azure. We will also enable this service to advance hybrid cloud scenarios, providing visibility and control across Azure, on-premises and hosted NFS workloads. 
  • The preview of a new Azure networking service called Azure DDoS Protection, which helps protect publicly accessible endpoints from distributed denial of service (DDoS) attacks. Azure DDoS Protection learns an application’s normal traffic patterns and automatically applies traffic scrubbing when attacks are detected to ensure only legitimate traffic reaches the service.
  • The introduction of two new cloud governance services – Azure Cost Management and Azure Policy – to help you monitor and optimize cloud spend and cloud compliance. We are making Azure Cost Management free for Azure customers, and you can sign up now for a preview of Azure Policy. 
  • Integration of the native security and management experience. New updates in the Azure portal simplify the process of backing up, monitoring and configuring disaster recovery for virtual machines. We are also announcing that update management will now be free for Azure customers.
  • A preview of the new Azure Migrate service, which helps discover and migrate virtual machines and servers. The new service captures all on-premises applications, workloads, and data, and helps map migration dependencies over to Azure, making IT’s jobs immensely easier. Azure Migrate also integrates with the Database Migration Services we released today.
  • A preview of the new Azure Data Box, which provides a secure way to transfer very large datasets to Azure. It integrates seamlessly with Azure services like Backup and Site Recovery, as well as partner solutions from Commvault, NetApp, Veritas, Veeam and others.

Building on last week’s news about the preview of Azure Availability Zones, later today I will also talk about the unique measures we are taking in Azure to help customers ensure business continuity. With single-VM SLAs, which no other cloud provider offers, and 21 announced region pairs for disaster recovery, Azure delivers differentiated, rich high availability and disaster recovery capabilities. This means you have the best support, resiliency and availability for your mission-critical workloads.

Modern Applications

Applications are central to every digital transformation strategy. One of the compelling and more recent technologies that is helping in the modernization of applications is containers. Having received more attention from developers to date, containers are now accelerating application deployment and streamlining the way IT operations and development teams collaborate to deliver applications. Today we are announcing even more exciting advancements in this space:

  • Windows Server containers were introduced with Windows Server 2016. The first Semi-Annual Channel release of Windows Server, version 1709, introduces further advances in container technology, including an optimized Nano Server Container image (80% smaller!), new support for Linux containers on Hyper-V, and the ability to run native Linux tools with the Windows Subsystem for Linux (aka Bash for Windows).
  • Azure supports containers broadly, offering many options to deploy, from simple infrastructure to richly managed. Azure Container Instances (ACI) provide the simplest way to create and deploy new containers in the cloud with just a few simple clicks, and today I’m announcing Azure Container Instances now support Windows Server in addition to Linux.
  • Azure Service Fabric offers a generalized hosting and container orchestration platform designed for highly scalable applications, and today we are announcing the general availability of Linux support.

Hybrid Cloud

Nearly 85 percent of organizations tell us they have a cloud strategy that is hybrid, and even more — 91 percent — say they believe that hybrid cloud will be a long-term approach. Hybrid cloud capabilities help you adopt the cloud faster. What is unique about Microsoft’s hybrid cloud approach is that we build consistency between on-premises and the cloud. Consistency takes the complexity out of hybrid cloud because it means you don’t need two different systems for everything. We build that consistency across identity, data, development, and security and management. Today we’re advancing our hybrid cloud leadership even further via the following developments:

  • Azure Stack is now shipping from our partners Dell EMC, Hewlett Packard Enterprise (HPE) and Lenovo. You can see all of these integrated systems on the show floor at Ignite. As an extension of Azure, Azure Stack brings the agility and fast-paced innovation of cloud computing to on-premises environments. Only Azure Stack lets you deliver Azure services from your organization’s datacenter, while balancing the right amount of flexibility and control, for truly consistent hybrid cloud deployments.
  • Our fully managed Azure SQL Database service now offers 100 percent SQL Server compatibility, with no code changes, via Managed Instance. And today, we are introducing a new Azure Database Migration Service that enables near-zero downtime migration. Now customers can migrate all of their data to Azure without hassle or high cost.
  • Azure Security Center can now be used to secure workloads running on-premises and in other clouds. We’re also releasing today new capabilities to better defend against threats and respond quickly, including Just in Time (JIT) access, dynamic app whitelisting, and being able to drill down into an attack end to end with interactive investigation paths and mapping.

Beyond all of the product innovation above, one of the areas I’m proud of is the work we’re doing to save customers money. For example, the Azure Hybrid Benefit for Windows Server and the newly announced Azure Hybrid Benefit for SQL Server allow customers to use their existing licenses to get discounts in Azure, making Azure the most economical choice and path to the cloud for these customers. Together with the new Azure Reserved VM Instances we just announced, customers will be able to save up to 82 percent on Windows Server VMs. The free Azure Cost Management capabilities I mentioned above help customers save money by optimizing how they run things in Azure. And we are now offering the new Azure free account which introduces the free use of many popular services for 12 months, in addition to the $200 free credit we provide.

It’s an exciting time for IT, and we’re equally excited that you are our trusted partners in this era of digital transformation. I look forward to hearing your questions or feedback so that we can further your trust in us and empower each of you to achieve more.

For Trade – 2 x Apple Airport Extreme (AC) & 16GB Kingston 2666 WANTED: Ubiquiti AP

Would you take £35 for your UAP delivered?

I assume you’re nowhere near me for collection.

I got my Pro installed last night and I’m very impressed. I have coverage in my whole house now, but I could still make use of your spare unit on the garage wall, which is behind my lounge, so tablets can use that.

I know you don’t want to lose out, but you can send it by pigeon to save money. I’m in no rush.
