
Data migration specialist Datadobi adds S3-to-S3 support

Data migration specialist Datadobi is bolstering its DobiMigrate software with support for Dell EMC’s PowerStore midrange lineup and various object stores that are compatible with Amazon’s S3 API.

Customers can use Datadobi software to move data from non-Dell EMC storage arrays to the new PowerStore systems that launched in May. The extension of support to the latest Dell EMC storage comes as no surprise. Four former EMC engineers founded Datadobi in 2010, and the Belgium-based company has worked closely with EMC, and subsequently, Dell EMC, on data migration from Centera and NAS systems.

The support for S3-to-S3 object data migration with the DobiMigrate 5.9 update, due on July 13, is more of a novel twist for Datadobi. The DobiMigrate software could enable customers to shift data from Amazon’s Simple Storage Service (S3) to on-premises S3-based object stores, and vice versa. Datadobi plans to support AWS S3, Cloudian HyperStore, Dell EMC ECS, IBM Cloud Object Storage, NetApp StorageGrid and Scality Zenko CloudServer when DobiMigrate 5.9 launches.

Datadobi CTO Carl D’Halluin said the company would test and ensure compatibility with other S3-based object stores based on customer requests. Possible options include object stores from Google, Backblaze and Wasabi to help customers migrate data between public clouds.

Datadobi’s DobiMigrate 5.9 update adds the ability to move data between object stores that support Amazon’s S3 API.

Data migration from AWS

But Michael Jack, Datadobi’s co-founder and VP of global sales, expects the most popular use case will be moving data from AWS to on-premises object storage to help customers avoid the egress fees that Amazon and other cloud providers charge to access data. A secondary source of demand could be customers refreshing their object storage for better functionality or higher performance.

“There was a big flurry of media a year and a half ago around repatriation — that repatriation is going to be huge because Amazon’s too expensive,” Jack said. But, rather than a “massive tsunami of opportunities,” Jack expects to see “incremental business” from customers that don’t want to be locked in to a particular storage vendor or storage.


“A lot of people get upset about the egress fees that they didn’t really understand with public cloud storage,” said Marc Staimer, president of Dragon Slayer Consulting. “They want to move, but there’s no easy way to migrate from S3 to S3 natively, so they need a third party.”

Datadobi specializes in migrating large quantities of data. The company initially focused on moving data out of EMC’s Centera content-addressable storage and then built DobiMigrate to tackle NAS migration when EMC’s Isilon team reached out. The DobiMigrate 5.9 update will add connectors to S3 on top of the NAS migration framework, incorporating important lessons that Datadobi’s engineering team learned through its NAS work about the incompatibilities and vagaries in different vendors’ systems.

S3 data migration is not trivial

“We thought S3 would be simple, because it’s a simple protocol, but the same is true here,” D’Halluin said. “Copying over S3 data is not trivial, and enterprises want some guarantees that the data is copied over properly.”

D’Halluin said Datadobi engineers had to “invent new ways” to scan the S3 content and enable fast copying and parallel scanning. DobiMigrate runs on premises behind the data center’s firewall, under the control of the data center administrator. The software can synchronize the objects and metadata from the source buckets to the target buckets.
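Datadobi has not published how DobiMigrate decides what to copy, but the core of an incremental synchronization pass can be sketched as a comparison of source and target bucket listings by object key and ETag. This is a hypothetical simplification: a real migration would also paginate listings, compare metadata and verify copies after transfer.

```python
def objects_to_sync(source, target):
    """Return the keys that still need copying from source to target.

    source, target: dicts mapping object key -> ETag (content hash),
    as obtained from a bucket listing. An object needs copying when it
    is missing from the target or its ETag differs from the source's.
    """
    return sorted(
        key for key, etag in source.items()
        if target.get(key) != etag
    )
```

Running this comparison again after the bulk copy yields the incremental work left to do, which is how a migration tool can converge on a consistent target without freezing the source.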

“One of the product’s values to the enterprise is the scale at which it can operate,” said Krista Macomber, a senior analyst at Evaluator Group. “This is important for enterprises looking to migrate petabytes of data. The scanning process is multithreaded, and initial full, as well as incremental, scans occur outside of the data path. This helps to accelerate the scanning and copying process and cut down on the maintenance window.”

Macomber thinks more Datadobi customers could move S3 data to the cloud, for disaster recovery or archival purposes or as part of a legacy application migration. But she said pulling data back on premises could also be important as business analytics becomes more popular.

DobiMigrate supports only NAS-to-NAS or S3-to-S3 migrations. The software does not offer gateway services to migrate file-based NAS data to S3-based object stores, or vice versa. Datadobi supports the migration of data from NAS or Centera systems to S3 object stores only on an individual project basis.

Datadobi roadmap

Datadobi roadmap items include optimizing migrations from Amazon’s Glacier cold storage, as well as support for advanced object features such as locking and tagging. D’Halluin said there is no incentive yet to support migrations between S3-based object stores and Microsoft’s Azure because product support for Azure’s Blob object storage API is minimal.

D’Halluin said Datadobi plans to phase in S3-to-S3 migration capabilities and help customers analyze which data to bring back from AWS, keep in the cloud or delete altogether. He expects customers initially will need help, so Datadobi will offer the new S3-to-S3 functionality as a service before making it available through the do-it-yourself software.

DobiMigrate has a 50 TB minimum for S3-to-S3 migrations. Jack estimated the service list price would start at about $45,000 for a 50 TB migration, depending on whether the customer’s storage vendor, a systems integrator or a value-added reseller provides it. Jack said the do-it-yourself software would list at about $30,000 for an S3-to-S3 migration of 50 TB of data.

Jack estimated that 25% of Datadobi’s customers use the software on their own, and 75% go through the company’s more than 100 partners. Datadobi sells only through the channel, but Jack said the company’s sales organization communicates directly with all customers. In addition to DobiMigrate, Datadobi sells DobiReplicate for NAS-to-NAS file-based replication and DobiSync for file-to-object synchronization.

Datadobi claims to have more than 800 customers, including service providers and enterprises in industries such as financial services, healthcare, oil and gas, and media and entertainment. They tend to have large amounts of data that can scale past 1 PB in complex multivendor environments spanning on-premises and cloud sites, according to Jack.

Datadobi’s competition includes free tools such as robocopy and rsync, as well as Data Dynamics and Komprise. Jack said Datadobi focuses on large one-time, “fire-hose” types of migrations in comparison to the “trickle” approach that Data Dynamics and Komprise take with their data management/migration products.

Staimer said StrongBox Data Solutions offers S3-to-S3 migration as a feature but finds that the quantities of moved object-based data tend to be small in comparison to the multi-petabyte projects the vendor does with file-based data.


WANdisco, Azure do data migration dance

WANdisco has integrated its big data migration platform with the Microsoft Azure cloud.

WANdisco LiveData Platform for Azure — in customer preview — is designed to make it easier to move petabytes of data to Azure. Customers can discover LiveData through the Azure Marketplace and access its services directly through the Azure portal and the Azure command-line interface (CLI). With LiveData, customers can perform large-scale migrations of Hadoop data to Azure and enable backup, disaster recovery (DR) and cloud bursting in the cloud. As a native service, LiveData Platform for Azure will show up on the customer’s regular Azure bill.

WANdisco also launched LiveData Migrator and LiveData Plane for the new Azure-based platform. These two work together to allow consistency between an on-premises Hadoop environment and Azure Data Lake Storage. LiveData Migrator performs a one-time scan of the on-premises data and feeds it to LiveData Plane, which captures any changes after that point.

LiveData can scan through petabyte-scale data and generate a copy in the cloud while ensuring both copies are the same. It is powered by WANdisco Fusion, a consensus engine that keeps data consistent and available across multiple environments. Because it is a single scan and data migration is continuous, nothing needs to be shut down. This integration with Azure makes it easier for Azure customers to discover and deploy LiveData.
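WANdisco’s Fusion consensus engine is proprietary, but the general shape of a live migration — one bulk copy of a point-in-time scan, plus continuous replay of changes captured after the scan began — can be sketched in a few lines. The event format here is hypothetical and ignores ordering conflicts that a real consensus engine must resolve.

```python
def migrate_live(source_snapshot, changes):
    """Sketch of a zero-downtime migration.

    source_snapshot: dict of path -> data captured by the one-time scan
    changes: ordered events of the form ("put", path, data) or
             ("delete", path), recorded while the bulk copy is in flight
    Returns the target store after the bulk copy and change replay.
    """
    target = dict(source_snapshot)   # one-time bulk copy of the scan
    for event in changes:            # continuous replication of edits
        if event[0] == "put":
            _, path, data = event
            target[path] = data
        else:                        # "delete"
            _, path = event
            target.pop(path, None)
    return target
```

Because edits made mid-migration are captured and replayed rather than lost, nothing needs to be taken offline while the copy runs, which is the property Adrian highlights below.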

LiveData’s ability to move petabytes of data without interrupting production and without risk of losing the data midflight is something no other vendor does, said Merv Adrian, Gartner research vice president of data and analytics. Moving data at this scale takes a long time, and traditionally involves a combination of physically shipping servers loaded with data to a cloud provider and/or transferring data to the cloud during non-peak hours. The data is inaccessible during migration using these methods. Adrian said as a result, enterprises tend not to move live, active data this way.

“Taking everything down until I’m finished isn’t an option,” Adrian said.

LiveData doesn’t technically “finish” the migration until later, but customers can access and make changes to all the data mid-migration. LiveData ensures those changes are reflected in all copies. Adrian said that’s an important differentiator from other migration tools.

WANdisco LiveData does not yet have similar integration with AWS or Google Cloud, but Adrian said that the Azure integration makes most sense. AWS has larger adoption, but Adrian pointed out that AWS and Google have no on-premises presence — those customers are already on the cloud. Microsoft customers are most likely hybrid, running Microsoft products in their data centers while also dipping into Azure for their cloud needs. They are the customers most likely looking to juggle petabytes of data between on-premises and cloud.

LiveData Platform for Azure can be discovered and deployed in the Azure Portal.

WANdisco CEO and founder David Richards said WANdisco focuses on serving the enterprise market. He said while AWS has higher general market adoption, it has similar adoption among enterprises as Azure. He also said Azure adoption is growing faster among the enterprise, partly because Microsoft’s office productivity and collaboration tools both on- and off-premises are widely popular.

Richards said cloud demand is spiking because of an increase in at-home workers as well as companies investing in AI and machine learning. Business has slowed across the board due to the COVID-19 pandemic, and companies are thinking of ways to modernize and transform their businesses in response. Investing in AI — specifically, the ability to make better decisions automatically — is a way for businesses to differentiate themselves.

“Businesses have to now reinvent themselves, but that has to come with severe IT mobilization,” Richards said. “The boldest move a company can make is looking at AI.”

Adrian brought up another point about the interplay between COVID-19 and cloud: many businesses are looking to cut costs, and CTOs are going to look at putting hardware on the chopping block. He said it depends on the workload, but in most cases, the total cost of ownership over three years for hosting on the cloud is cheaper than provisioning all the necessary hardware, floor space and cooling to host it on-premises.

Determining these costs and identifying which workloads are actually cheaper on the cloud is still a “black art,” Adrian said. It takes meticulous modeling to map out costs, and those models could still be wrong because the demands of the workloads and the cost of the cloud could grow or shrink unpredictably. However, Adrian said AI and machine learning are absolutely better done on the cloud because of the “bursty” nature of their compute demands.


Avoid common pain points when migrating to Exchange Online

A migration from on-premises Exchange to Office 365 is more than just a matter of putting mailboxes into Microsoft’s cloud. Several factors can slow this type of project, and some issues won’t surface until you think the project is done.

Quite a few organizations are still running an Exchange Server platform, but many of them are looking to migrate to Exchange Online and hand over some of the administrative burden to Microsoft. In my experience, there are four common problems that organizations can avoid. With a little preparation, you can steer clear of these stumbling blocks and make the experience a positive one for both IT and the end user.

Update on-premises software

Near the top of the list of common issues is not having the current versions of software running on premises.

Active Directory, on-premises Exchange, Outlook, Windows clients and servers all need to be up to date to give your organization the best possible migration experience. At one time, Microsoft’s posture was more forgiving and the company would support older software, but today it wants all software that touches Exchange to be on the latest version. Some of the older Office suites will still work, but only with basic functionality, and end users will miss out on newer features, such as Focused Inbox.

That many enterprises struggle with keeping their software current isn’t a surprise, because it’s difficult to patch and deploy updates in a timely fashion. In some cases, organizations depend on third-party software that is rarely updated and may have compatibility issues with a frequent update schedule. There is no easy solution for these problems. But as IT pros, we need to sort through the updates and find a way to get all that software on the latest release.

Understand mail flow scenarios

The next area that hinders a lot of organizations migrating to Exchange Online is not understanding the different ways to set up mail flow into and out of Microsoft’s hosted email platform.


Microsoft designed Office 365 and Exchange Online to be very flexible with regards to the support of different mail flow scenarios. Email can go to on-premises Exchange first, then into Exchange Online. Mail can also go to Exchange Online first, then flow to the on-premises Exchange servers.

During a hybrid migration, the most common scenario is to leave the mail flow configured to reach the on-premises Exchange Server first, then use the hybrid configuration to forward email to mailboxes in the Microsoft cloud via the hybrid routing address. This hybrid routing address, which looks something like user@contoso.mail.onmicrosoft.com, is an attribute of the on-premises Active Directory account.

When you set up an Exchange hybrid deployment and move mailboxes properly, that address is automatically added to the user’s account. This mail flow arrangement tends to work very well, but if that address is not added to the user’s account, mail flow won’t work for that user.

Another popular option is to route email through Office 365 first, then to your on-premises mailboxes. This option puts Exchange Online Protection as the gatekeeper in front of all your organization’s mailboxes.

Ultimately, your decision comes down to what other services your organization has in that mail flow path. Some organizations use third-party antivirus products, some use a vendor’s encryption services, while others depend on a particular discovery application. Any of those third-party services may be cloud-based or installed on premises. Some of the services need to be placed before your end-user mailboxes in the transport flow, while others need to be at the end of the transport flow. There is no one-size-fits-all configuration. Only when you fully understand all the pieces in your organization’s transport stack can you set up a mail flow that meets your needs.

Understand authentication

A move to the cloud means added complexity to your end-user authentication process. Microsoft provides a wide range of authentication options for Office 365 and Exchange Online, but that flexibility also means there are many choices to make during your migration.

Active Directory Federation Services, password hash sync and pass-through authentication are where the authentication options start, but any of those options can be deployed with multifactor authentication, conditional access and a whole load of Azure Information Protection options. Add in some encryption and the migration process gets complicated quickly.

All these choices and security add-ons help protect the business, but they make for a complex undertaking. It takes some effort not only to settle on a particular authentication method, but also to implement it properly and test it thoroughly to avoid an avalanche of help desk calls.

Understand accepted domains

Over time, many on-premises Exchange organizations tend to collect multiple accepted domains. Accepted domains are the SMTP domains for which the organization accepts email: the part of the email address after the @ symbol.

I see many customers have issues when they move mailboxes to the cloud because they forgot to verify all the accepted domains used on those mailboxes. This problem is simple to avoid: Review the accepted domains in your on-premises Exchange organization and make sure they are verified in your Office 365 tenant before migrating the mailboxes.
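The pre-migration check described above is essentially a set comparison between the on-premises accepted domains and the domains verified in the tenant. A minimal sketch, assuming you have exported both lists (for example, from the on-premises Exchange configuration and the Office 365 admin portal):

```python
def unverified_domains(accepted, verified):
    """Return accepted domains that are not yet verified in the tenant.

    accepted: iterable of domains configured in on-premises Exchange
    verified: iterable of domains verified in the Office 365 tenant
    Comparison is case-insensitive, as DNS names are.
    """
    verified_set = {d.lower() for d in verified}
    return sorted(d for d in accepted if d.lower() not in verified_set)
```

Any domain this check reports must be added and verified in the tenant before the mailboxes that use it are migrated.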


SAP S/4HANA migration should be business-driven

As SAP customers contemplate an SAP S/4HANA migration, they have to work through big questions like what infrastructure it will run on and how to handle business processes. One of the keys to a successful S/4HANA migration will be which part of the organization sets the project parameters, IT or business.

SAP expert Ekrem Hatip, senior solution architect at Syntax Systems, advises that because an S/4HANA migration is a fundamentally different project than a normal system upgrade, such as from SAP R/3 to SAP ECC, organizations must approach it differently. In this Q&A, Hatip discusses some of the issues that organizations need to consider as they embark on an S/4HANA journey.

Syntax Systems is based in Montreal and provides managed services for SAP systems, including hosting SAP systems in Syntax Systems data centers and running SAP systems on public cloud provider infrastructures.

How are Syntax customers approaching a possible S/4HANA migration? Is it on their minds?

Ekrem Hatip, Syntax senior solution architect

Ekrem Hatip: Over the last few years we have brought up the S/4HANA topic even if the customer doesn’t show immediate interest in doing that. We discuss with them what S/4HANA is, what are the advantages, and what are the innovations that S/4HANA introduces. We look at the customers’ existing landscape and discuss the possible migration paths from their system to an S/4HANA system. We talk about the business requirements, because an S/4HANA conversion is not a technical upgrade — it’s not a technical conversion from one database to another. It touches every aspect of their business processes, and we want to make sure that customers are aware that it is a sizable project.

Are customers eager to move or are they holding back now?

Hatip: Most customers are happy with what they have right now — with their SAP implementation. It satisfies their current needs and they don’t see an immediate reason to go to S/4HANA other than the fact that SAP has put the 2025 date in front of them [when SAP will end support for SAP ECC]. We can help our customers to understand what is ahead of them.

So educating them on what to expect is the first step of an S/4HANA migration?

Hatip: Absolutely. Most people don’t know much about SAP HANA let alone S/4HANA. Their expectation is, just like when they upgraded from R/3 to ECC, they will go ahead and just upgrade their system over one weekend. Then come Monday morning, they will continue running as they used to run on a shiny new system. We have to make sure that they understand this is not the case. Most of their business processes will be touched and most of the business processes might need to be modified or dropped. I also tell customers — especially if they’re going with a greenfield implementation — to keep their customizations at minimum. Everything seems to be going into cloud and S/4HANA Cloud is out there. So, I tell them if they can limit their customizations, they’ll be able to go to S/4HANA Cloud for the true SaaS experience.

Are customers considering any other systems as an alternative to an S/4HANA migration?

Hatip: For many customers SAP is the core of their business operations, and I haven’t yet seen any customers who are considering going to other solutions than SAP for their core business. So, it’s more likely they’re considering if they want to remain on ECC for as long as they can before moving to S/4HANA. With that said, I have seen that some customers are now considering alternatives to some of the peripheral systems offered by SAP. For example, one customer who was using BOB-J [SAP BusinessObjects BI] for its reporting is now considering using Microsoft Power BI on Azure. Do I know whether this is driven by the fact that they need to go to S/4HANA or not? I don’t know, but my observation is that some customers are considering alternatives for the systems surrounding their core SAP environment.

What are the most critical issues to consider as you make the S/4HANA decision?

Hatip: Unlike the previous conversions or upgrades, an S/4HANA conversion is not an IT-driven decision. It should definitely be a business-driven decision. It should be coming from the top and presented to the IT department, as opposed to the IT department going back and saying, this operating system is going out of support or that database is going out of support, so we have to upgrade. It should definitely be coming from the business side. Therefore, not only should the CIO be convinced to go to S/4HANA, at the same time CEOs and COOs should also be in the decision-making process. An S/4HANA conversion is a lengthy and fairly complex project, but at the same time it allows customers to clean up their existing legacy systems. Customers can use the opportunity to clean up data and review hardware or server requirements, or they can definitely leverage the public cloud offerings when they decide to go to S/4HANA. Finally, CIOs and the IT department should try to keep their customizations at a minimum in order to future-proof their environment. 


Are containers on Windows the right choice for you?

It’s nearly the end of the road for Windows Server 2008/2008 R2. Some of the obvious migration choices are a newer version of Windows Server or moving the workload into Azure. But does a move to containers on Windows make sense?

After Jan. 14, 2020, Microsoft ends extended support for the Windows Server 2008 and 2008 R2 OSes, which also means no more security updates unless one enrolls in the Extended Security Update program. While Microsoft prefers that its customers move to the Azure cloud platform, another choice is to use containers on Windows.

Understand the two different virtualization technologies

If you are thinking about containerizing Windows Server 2008/2008 R2 workloads, then you need to consider the ways containers differ from a VM. The most basic difference is a container is much lighter than a VM. Whereas each VM has its own OS, containers share a base OS image. The container generally includes application binaries and anything else necessary to run the containerized application.

Containers share a common kernel, which has advantages and disadvantages. One advantage is containers can be extremely small in size. It is quite common for a container to be less than 100 MB, which enables them to be brought online very quickly. The low overhead of containers makes it possible to run far more containers than VMs on the host server.

However, that shared kernel is also a weakness: if the kernel fails, all the containers that depend on it fail with it. Similarly, a poorly written application can destabilize the kernel, leading to problems with other containers on the system.

VMs vs. Docker

As a Windows administrator considering containerizing legacy Windows Server workloads, you need to consider the fundamental difference between VMs and containers. While containers do have their place, they are a poor choice for applications with high security requirements due to the shared kernel or for applications with a history of sporadic stability issues.

Another major consideration with containers is storage. Early on, containers were used almost exclusively for stateless workloads because they could not store data persistently. Unlike a VM, a container loses all the data inside it when the container is deleted.

Container technology has evolved to support persistent storage through the use of data volumes. Even so, it can be difficult to work with data volumes. Applications that have complex storage requirements usually aren’t a good fit for containerization. For example, database applications tend to be poor candidates for containerization due to their complex storage configuration.

If you are used to managing physical or virtual Windows Server machines, you might think of setting up persistent storage as a migration-specific task: you provide the containerized application with the persistent storage it needs once, as part of the application migration, and you are done. But it is important to remember that containers are designed to be completely portable. A containerized application can move from a development and test environment to a production server or to a cloud host without the need to repackage the application. Setting up complex storage dependencies can undermine that portability, so an organization needs to consider whether a newly containerized application will ever move to another location.

What applications are suited for containers?

As part of the decision-making process related to using containers on Windows, it is worth considering what types of applications are best suited for this type of deployment. Almost any application can be containerized, but the ideal candidate is a stateless application with varying scalability requirements. For example, a front-end web application is often an excellent choice for a containerized deployment for a few reasons. First, web applications tend to be stateless. Data is usually saved on a back-end database that is separate from the front-end application. Second, container platforms work well to meet an application’s scalability requirements. If a web application sees a usage spike, additional containers can instantly spin up to handle the demand. When the spike ebbs, it’s just a matter of deleting the containers.
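The scaling behavior described above comes down to a simple calculation that orchestration platforms perform continuously: divide current demand by per-container capacity and clamp the result to a configured range. A hedged sketch (the parameter names are illustrative, not any specific platform’s API):

```python
import math

def replicas_needed(requests_per_sec, capacity_per_container,
                    minimum=1, maximum=50):
    """Number of container replicas to run for the current load.

    Spins up more replicas when traffic spikes and scales back down
    when the spike ebbs, clamped to a sane operating range.
    """
    wanted = math.ceil(requests_per_sec / capacity_per_container)
    return max(minimum, min(maximum, wanted))
```

Because each replica starts in seconds rather than minutes, this feedback loop is practical for containers in a way it rarely is for full VMs.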

Before migrating any production workloads to containers, the IT staff needs to develop the necessary expertise to deploy and manage containers. While container management is not usually overly difficult, it is completely different from VM management. Windows Server 2019 supports the use of Hyper-V containers, but you cannot use Hyper-V Manager to create, delete and migrate containers in the same way that you would perform these actions on a VM.

Containers are a product of the open source world and are generally managed from the command line using Linux-style commands that may be completely foreign to many Windows administrators. Container orchestration platforms, such as Kubernetes, offer GUI dashboards, but even these tools require some time and effort to understand. As such, proper training is essential to a successful container deployment.

Despite their growing popularity, containers are not an ideal fit for every workload. While some Windows Server workloads are good candidates for containerization, other workloads are better suited as VMs. As a general rule, organizations should avoid containerizing workloads that have complex, persistent storage requirements or require strict kernel isolation.


Geek gifts 2019: Think Opex, not Capex with subscriptions

Geek gifts are undergoing a paradigm shift not unlike the enterprise IT migration to public cloud. Resource ownership and gift supply chain management are so passé — let any number of subscription services entertain and delight your geek for you!

Just as Amazon has eliminated the toil that used to be involved in holiday shopping, from schlepping to the mall to battling for a parking space to waiting in line at a register, subscription services can step in and take over even selecting and shipping gifts online. Name an item, interest or type of entertainment — there’s almost always a subscription service out there to bring it forth to your geek, on a regular basis, often for a full year. Bacon, for example. Craft beer. E-Books. Video games.

“An HBO Now subscription so you can watch The Watchmen on your flights to and from conferences, or Apple Music so you can drown out the screaming babies on the plane,” suggested one of our road warrior geek friends.

Our geek consultants this year rave about LootCrate for the geekily inclined. There are gift box subscriptions for every conceivable fandom, from Harry Potter to Marvel Comics to Hello Kitty.

“It’s all kinds of geeky goodness,” summed up one of our geek advisors.

Subscription services are also available for truly hardcore geeks who never stop talking shop long enough to watch movies or read comic books, such as a membership to document collaboration service Coda.io, or Zapier for home automation management.

The latest in gadgets, both practical and fun

As always, the cardinal rule of buying gifts for geeks is: know thy geek. Some geeks can’t get enough Funko Pops. Other geeks’ dearest wish is to get rid of the clutter they already have. Some geeks will be thrilled with beer delivered monthly to their doorstep — others would have much more fun making their own. For the DIY geeks, or traditionalists who want to wrap something themselves, gadgets and doodads still abound.

“I’m usually a ‘late early adopter’ of new tech toys, so I might not [want] anything crazy,” said one such geek we know. “[But] one of my favorite not-so-new things is the PicoBrew home brewing appliance.”

PicoBrew is pricey (the Pico C for craft beer homebrew enthusiasts is priced at $399.00), and not everyone likes beer. There’s also a $125 cocktail bitters kit one of our geeks had their eye on.

“It lets you be a bit of a mad scientist and drink the results,” he said.

For teetotaling or practically minded geeks, our geek advisors pointed out useful gadgets such as the iFixit Pro Tech Toolkit, reMarkable digital notepad or Rocketbook.

“It’s like a Kindle, but a writing tablet, perfect for those who like the feel of writing stuff but don’t want an extremely bright or distracting tablet,” said one geek expert of the reMarkable pad. “[Rocketbook] is like a notebook, but you can use it like a whiteboard, as the pages wipe clean with a damp cloth. You can also pair it with an app and upload your notes to the cloud.”

Geeks with a quirky sense of humor may delight in somewhat pointless gadgets, such as the Qwerkywriter or a Makey Makey kit that can turn anything into a keyboard, from bananas to water buckets.

“I don’t know if this is actually a good product,” said one of our geek advisors of Qwerkywriter. “But the idea of a mechanical keyboard for an iPad makes me laugh.”

Budget-friendly humorous geek gift suggestions include the time-honored coffee mug and some truly bizarre drink coasters.  

Perennial favorites: Star Wars mania strikes back

Star Wars led our geek gift recommendations back in 2015, and it could easily top the list again this year, with the launch of Disney+ and The Mandalorian series (Baby Yoda!), along with the hotly anticipated Dec. 20 release of Star Wars: The Rise of Skywalker. Even after all these years, it’s hard to go wrong with Star Wars swag for many geeks. Not every geek is part of this fandom, but one of our geek advisors reported there is a strong correlation between enterprise IT geeks and Star Wars fans: Amazon suggested to him that those who bought Gene Kim’s The Unicorn Project book also ordered Rebel Alliance stickers, he said.

Perennial favorites in the non-Star Wars category include all the other major fandoms with movies coming out, such as Wonder Woman and Ghostbusters (dinner and a movie, anyone?). Geeks also still covet Sonos smart speaker devices, drones and Raspberry Pis, including the latest Raspberry Pi 4.

“New hardware, including a gigabit NIC, finally!” one of our geek experts said of the Pi 4. “Being able to cluster these and run k3s is also super cool.”

Much as geeks love home automation with toolkits such as Arduino and Hue Play, consider the potential privacy implications of devices such as Ring before you gift them, our geek friends warn.

“I would avoid anything that is going to record you and later be used by law enforcement or in other ways you didn’t intend, such as Ring, Alexa or Google Home,” said one. (If only we’d had this insight back in 2016!)


Microsoft acquires Mover.io to ease OneDrive migrations

Microsoft plans to add cloud migration tools to aid SharePoint on-premises migrations to the OneDrive cloud with its Monday acquisition of Mover.io, an eight-year-old Canadian startup specializing in the self-service mass migration of enterprise files.

Mover.io, acquired for an undisclosed sum, also provides cross-cloud migrations from a dozen OneDrive file-sharing cloud competitors, including Box, Dropbox, Egnyte and Google Drive. Microsoft continues to support SharePoint on-premises, but the company has not said how long it will continue to do so, leaving room for speculation among users and experts.

Mover.io, acquired a month after Microsoft bought data-migration vendor Movere, will join several file-migration tools and services already on the Microsoft cloud platform, including FastTrack and the SharePoint Migration Tool. Users also have a choice of several other third-party tools to do the job, including ShareGate and Metalogix, which support file migrations to OneDrive.

Microsoft could, theoretically, poach customers from competing cloud file-management systems such as Box with the Mover.io migration tools. But the real OneDrive migration target customer for the Mover.io tools is Microsoft’s SharePoint on-premises base, said Deep Analysis founder Alan Pelz-Sharpe.

Enterprise-scale file migrations from on-premises servers to the cloud pose challenges of maintaining file directory structure as well as access and security policies, Pelz-Sharpe said. SharePoint migrations in particular can be even thornier, because SharePoint was designed to let front-line office workers set up ad hoc file-sharing sites with little IT assistance.

The fact that SharePoint’s been around for nearly two decades, pre-dating widespread cloud adoption, compounds the issue. Pelz-Sharpe described one of his clients, a utility company, whose SharePoint on-premises footprint has grown over the years to 12,000 SharePoint sites.

“They have no idea what is in them, and no idea what to do with them,” Pelz-Sharpe said. “These things can be complex. It’s a recognized problem, so the more experience, skills and tools Microsoft can bring to help, the better.”

Specifics about Mover.io features integrating with the Microsoft 365 platform will come next month, said Jeff Teper, Microsoft corporate VP for Office, SharePoint and OneDrive, in a blog post announcing the acquisition.


Are you ready for the Exchange 2010 end of life?

Exchange Server 2010 end of life is approaching — do you have your migration plan plotted out yet?

Exchange Server 2010 reached general availability on November 9, 2009, and has been the cornerstone of the collaboration strategy for many organizations over the last decade. Since then, Microsoft has produced three more releases of Exchange Server, with Exchange Server 2019 the most recent. Exchange Server 2010 continues to serve the needs of many organizations, but they must look to migrate from this platform when support ends on January 14, 2020.

What exactly does end of support mean for existing Exchange Server 2010 deployments? Your Exchange 2010 servers will continue to operate with full functionality after this date; however, Microsoft will no longer provide technical support for the product. In addition, bug fixes, security patches and time zone updates will no longer be provided after the end-of-support date. If you haven’t already started your migration from Exchange Server 2010, now is the time to start by seeing what your options are.

Exchange Online

For many, Exchange Online — part of Microsoft Office 365 — is the natural replacement for Exchange Server 2010. This is my preferred option.

A hybrid migration to Exchange Online is the quickest way to migrate to the latest version of Exchange that is managed by Microsoft. Smaller organizations may not need the complexity of this hybrid setup, so they may want to investigate simpler migration options. Not sure which migration option is best for you? Microsoft has some great guidance to help you decide on the best migration path.

The cloud isn’t for everyone, but in many instances the reasons organizations cite for not considering the cloud are based on perception or outdated information, not reality. I often hear the word “compliance” as a reason for not considering the cloud. If this is your situation, you should first study the compliance offerings on the Microsoft Trust Center. Microsoft Office 365 fulfills many industry standards and regulations, both regionally and globally.

If you decide to remain on premises with your email, you also have options. But the choice might not be as obvious as you think.

Staying with Exchange on premises

Exchange Server 2019 might seem like the clear choice for organizations that want to remain on premises, but there are a few reasons why this may not be the case.

Migrating from Exchange 2010 to Exchange 2016

First, there is no direct upgrade path from Exchange Server 2010 to Exchange Server 2019. For most organizations, this migration path involves a complex multi-hop migration. You first migrate all mailboxes and resources to Exchange Server 2016, then you decommission all remnants of Exchange Server 2010. You then perform another migration from Exchange Server 2016 to Exchange Server 2019 to finalize the process. This procedure involves significant resources, time and planning.
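The double hop lends itself to batch planning. The Python sketch below, with hypothetical mailbox names, only illustrates how batches might be sequenced so that every mailbox clears the 2010-to-2016 hop before the 2016-to-2019 hop begins; the actual moves would be driven by Exchange’s own migration tooling, not by this code.

```python
def plan_double_hop(mailboxes, batch_size):
    """Group mailboxes into fixed-size batches, then emit the two
    migration phases: 2010 -> 2016, followed by 2016 -> 2019."""
    batches = [mailboxes[i:i + batch_size]
               for i in range(0, len(mailboxes), batch_size)]
    plan = []
    for hop in ("Exchange 2010 -> Exchange 2016",
                "Exchange 2016 -> Exchange 2019"):
        for n, batch in enumerate(batches, start=1):
            plan.append((hop, n, batch))
    return plan

# Hypothetical mailbox names for illustration.
boxes = ["user%03d" % i for i in range(1, 8)]
plan = plan_double_hop(boxes, batch_size=3)
```

Sequencing the second hop strictly after the first mirrors the requirement that Exchange 2010 be fully decommissioned before Exchange 2019 enters the organization.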

Another consideration with Exchange Server 2019 is licensing. Exchange Server 2019 is only available to volume license customers via the Volume Licensing Service Center. This could be problematic for smaller organizations without this type of agreement.

Organizations that use the unified messaging feature in Exchange Server 2010 have an additional caveat to consider: Microsoft removed the feature from Exchange Server 2019 and recommends Skype for Business Cloud Voicemail instead.

For those looking to remain on premises, Exchange Server 2019 has some great new features, but it is important to weigh the benefits against the drawbacks, and the effort involved with the migration process.

Microsoft only supports Exchange Server 2019 on Windows Server 2019. For the first time, the company supports Server Core deployments, which is now the recommended deployment option. In addition, Microsoft made it easier to control external access to the Exchange admin center and the Exchange Management Shell with client access rules.

Microsoft made several key improvements in Exchange Server 2019. It rebuilt the search infrastructure to improve indexing of larger files and search performance. The company says the new search architecture will decrease database failover times. The MetaCacheDatabase feature increases the overall performance of the database engine and allows it to work with the latest storage hardware, including larger disks and SSDs.

There are some new features on the client side as well. Email address internationalization adds support for email addresses that contain non-English characters. Some clever calendar improvements include making “do not forward” work without the need for an information rights management deployment, and the option to cancel or decline meetings that occur while you’re out of the office.

What happens if the benefits of upgrading to Exchange Server 2019 don’t outweigh the drawbacks of the migration process? Exchange Server 2016 extended support runs through October 2025, making it a great option for those looking to migrate from Exchange Server 2010 and stay in support. The simpler migration process and support for unified messaging make Exchange Server 2016 an option worth considering.


End users will make or break an Office 365 migration

An Office 365 migration can improve an end user’s experience by making it easier to work in a mobile environment while also keeping Office 365 features up to date. But if the migration is done without the end users in mind, it can lead to headaches for IT admins.

At a Virtual Technology User Group (VTUG) event in Westbrook, Maine, about 30 attendees piled into a Westbrook Middle School classroom to hear tips on how to transition to Office 365 smoothly.

Office 365 is Microsoft’s subscription-based line of Office applications, such as Word, PowerPoint, Outlook, Teams and Excel. Rather than being downloaded onto a PC, Office 365 apps run in the cloud, enabling users to access their files wherever they are.

“As IT admins, we need to make the digital transformation technology seem easy,” said Jay Gilchrist, business development manager for Presidio Inc., a cloud, security and digital infrastructure vendor in New York and a managed service provider for Microsoft. Gilchrist and his Presidio colleague, enterprise software delivery architect Michael Cessna, led the session, outlining lessons they’ve learned from previous Office 365 migrations.

Importance of communication and training

Their first lessons covered communicating with end users, keeping a tight migration schedule and providing adequate training.

“You want to make it clear that you’re not just making a change for change’s sake,” Gilchrist said. “Communicate these changes as early as possible and identify users who may need a little more training.”

One practical tip he offered is to reserve the organization’s name in Office 365 early to ensure it’s available.

Conducting presentations, crafting targeted emails and working to keep the migration transparent can help IT admins keep end users up to date and enthused about the transition.

“End users are not information professionals,” Cessna said. “They don’t understand what we understand and these changes are a big deal to them.”

Cessna and Gilchrist said that if IT admins want end users to adopt apps in Office 365, they’ll need to provide the right level of training. IT admins can do that by providing internal training sessions, using external resources such as SharePoint Training Sites, as well as letting users work with the apps in a sandbox environment. Training will help end users get used to how the apps work and address questions end users may have in real time, thereby reducing helpdesk tickets once the Office 365 migration is completed. 

Governance and deployment

Before an Office 365 migration, IT admins need to have an application governance plan and a deployment plan in place.

“Governance built within Microsoft isn’t really there,” Cessna said. “You can have 2,000 users and still have 4,500 Team sessions and now you have to manage all that data. It’s good to take care of governance at the beginning.”

Deployment of Office 365 is another aspect that IT admins need to tackle at the start of an Office 365 migration. They need to determine what versions are compatible with the organization’s OS and how the organization will use the product.

“It’s important to assess the digital environment, the OSes, what versions of Office are out there and ensure the right number of licenses,” Cessna said.

Securing and backing up enterprise data

One existing concern for organizations migrating from on-premises to an Office 365 cloud environment is security.

Microsoft provides tools that can help detect threats and secure an organization’s data. Microsoft offers Office 365 Advanced Threat Protection (ATP), a cloud-based email filtering service that helps protect against malware, Windows Defender ATP, an enterprise-grade tool to detect and respond to security threats, and Azure ATP, which accesses the on-premises Active Directory to identify threats.

Microsoft has also added emerging security capabilities such as passwordless login, single sign-on and multifactor authentication to ensure data or files don’t get compromised or stolen during an Office 365 migration.

Regulated organizations such as financial institutions that need to retain data for up to seven years will need to back up Office 365 data, as Microsoft provides limited data storage capabilities, according to Cessna.

Microsoft backs up data within Office 365 for up to two years in some cases, and only for one month in other cases, leaving the majority of data backup to IT.
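The practical question for IT is how large the gap is between what the regulator demands and what the service retains. A trivial Python sketch makes the arithmetic explicit; the retention windows used here are illustrative assumptions, since the actual service-side windows vary by workload and plan.

```python
def backup_gap_days(required_days, service_days):
    """Days of retention the organization must cover with its own
    backup once the service-side retention window has elapsed."""
    return max(0, required_days - service_days)

# A seven-year regulatory mandate against an assumed 30-day
# service window leaves years of data for IT to protect itself.
seven_years = 7 * 365
gap = backup_gap_days(seven_years, 30)

# Against an assumed two-year window, a 30-day requirement
# needs no third-party backup at all.
no_gap = backup_gap_days(30, 2 * 365)
```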

“[Microsoft] doesn’t give a damn about your data,” he said. “Microsoft takes care of the service, but you own the data.”

Picking the right license

Once the organization is ready for the migration, it’s important to choose the right Office 365 license, according to Gilchrist.

There are several ways for an organization to license an Office 365 subscription. Gilchrist said choosing the right one depends on the size of the organization and the sophistication of the organization’s IT department.

The subscription choices for Office 365.
When deciding on which Office 365 subscription to license, it’s important to examine the size and scope of your organization and decide which offering works best for you.

Smaller businesses can choose licenses for 300 or fewer users, with options for add-ons such as a desktop version of Office and advanced security features. The cost for enterprise licenses differs depending on the scope and number of licenses needed, and educational and nonprofit discounts are offered as well.
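The seat-count split above can be reduced to a simple routing rule. In the Python sketch below, the 300-seat ceiling on small-business plans is the real constraint described in the text; the plan labels returned are simplified stand-ins, not actual SKU names.

```python
def pick_plan_family(seats, needs_desktop_office=False):
    """Route an organization to a plan family by seat count.
    Labels are illustrative, not real Microsoft SKUs."""
    if seats <= 300:
        if needs_desktop_office:
            return "Business (with desktop Office add-on)"
        return "Business"
    return "Enterprise"
```

In practice the decision also weighs security add-ons, bundling with Windows 10 and whether a Cloud Solution Provider handles the migration, as the article goes on to note.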

Other licensing options include Microsoft 365 bundles, which combine Office 365 with a Windows 10 deployment, or organizations could use Microsoft as a Cloud Solution Provider and have the company handle the heavy lifting of the Office 365 migration.

“There are different ways to do it. You just have to be aware of the best way to license for your business,” Gilchrist said.

Measuring success and adoption

Once the migration is completed, IT still has one more objective: proving the worth of the Office 365 migration.

“This is critical and these migrations aren’t cheap,” Cessna said. “You want to show back to the business the ROI and what this new world looks like.”

To do that, IT admins will have to circle back to their end users. They can use tools such as Microsoft’s Standard Office 365 Usage Reports, Power BI Adoption reports or other application measurement software to pin down end user adoption and usage rates. They can provide additional training, if necessary.

“Projects fail because the end users aren’t happy,” Cessna said. “We don’t take them into account enough. Our end users are our customers and we need to make sure they’re happy.”


VMware HCX makes hybrid, multi-cloud more attainable

LAS VEGAS — VMware HCX attempts to drive migration to hybrid and multi-cloud architectures, but IT administrators are still hesitant to make the switch due to concerns around cost and complexity.

Before doing product evaluations and determining if VMware Hybrid Cloud Extension (HCX) is a good option for workload migration, admins must figure out if the cloud meets their current and future business needs. What is the organization trying to accomplish with its existing deployments?

For example, consider a near-end-of-support vSphere 5.5 environment: Is the goal to seamlessly migrate those workloads from the current environment to the cloud without an on-premises upgrade? Or, is successfully migrating hundreds of VMs or large amounts of storage the objective?

Determining the ultimate goal and whether a private cloud, hybrid cloud, public cloud or multi-cloud makes the most sense is a decision that admins must make on a case-by-case basis.

Cloud cost and complexity concerns

The ongoing fee associated with using cloud services is just one of the cost concerns, experts said in a session here at VMworld 2018. During the migration, admins have to worry about whether they’ll need to change IPs, the potential of running into compatibility issues, and the responsibility of ensuring business continuity and disaster recovery.

“Even after we meet all their requirements, we’ve seen in any organization all kinds of inertia about getting going,” said Allwyn Sequeira, senior vice president and general manager of hybrid cloud services at VMware. “People think they need to go buy high-bandwidth pipes to connect from on-prem to the cloud. People think they need to do an assessment of applications to see if this is an app that should be moved to the cloud.”

App dependencies and mapping are certainly important issues to consider. With more VMs, the environment is more complex; it’s easier to break something during migration.
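That mapping exercise is, at bottom, a dependency graph walked in topological order: a VM moves only after the services it depends on are accounted for. A minimal Python sketch, using the standard library and hypothetical VM names:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency map: each VM lists the VMs it depends on,
# which should be migrated (or otherwise reachable) before it moves.
deps = {
    "web01": {"app01"},
    "app01": {"db01"},
    "app02": {"db01"},
    "db01": set(),
}

# static_order() yields a migration order that respects every edge,
# so the database moves before the app tiers, and those before the web tier.
order = list(TopologicalSorter(deps).static_order())
```

Real discovery tools build this graph from observed network traffic rather than hand-written maps, but the ordering problem they solve is the same.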

Even when a certain vendor or product addresses their concerns, admins need buy-in from networking, security, compliance and governance teams before moving forward with the cloud.

The introduction of VMware HCX is the vendor’s attempt to remove some of the roadblocks keeping organizations from adopting hybrid and multi-cloud environments.

What is VMware HCX, and what are its use cases?

VMware HCX, also known as NSX Hybrid Connect, is a platform that enables admins to migrate VMs and applications between vSphere infrastructures running version 5.0 or newer, and from on-premises environments to the cloud.

The top use cases of VMware HCX include consolidating and modernizing the data center, extending the data center to the cloud, and disaster recovery.

“HCX gives you freedom of choice,” said Nathan Thaler, director of cloud platforms at MIT in Cambridge, Mass. “You can move your workload into a cloud provider as long as it works for you, and then you can move it out without any lock-in. We’ve moved certain VMs between multiple states and without any network downtime.”

Thaler did caution organizations to keep virtual hardware versions no higher than the highest level the oldest cloud environment in the mix can support.

Disaster recovery to the cloud, while maybe not as front of mind as other popular use cases, is key in the event of a natural disaster.

“We wanted to be able to have resiliency whether it’s an East Coast event or a West Coast event,” said HCX customer Gary Goforth, senior systems engineer at ConnectWise Inc., a business management software provider based in Tampa, Fla.

VMware HCX-supported features include Encrypted vMotion, vSphere Replication and scheduled migrations. The functionality itself seems to be what admins are really looking for.

“We wanted a fairly simple, easy way to implement a cloud,” Goforth said. “We wanted to do it with minimal to no downtime and to handle a bulk migration of our virtual machines.”

In terms of the VMware HCX roadmap, the vendor is working on constructs to move workloads across different clouds, Sequeira said.

“It’s all about interconnecting data centers to each other,” he said. “Ultimately, at the end of the day, where you run is going to become less important than what services you need.”