How to tackle an email archive migration for Exchange Online

A move from on-premises Exchange to Office 365 also entails determining the best way to transfer legacy archives. This tutorial can help ease migration complications.


A move to Office 365 seems straightforward enough until project planners broach the topic of the email archive migration.

Not all organizations keep all their email inside their messaging platform. Many organizations that archive messages also keep a copy in a journal that is archived away from user reach for legal reasons.

The vast majority of legacy archive migrations to Office 365 require third-party tools and must follow a fairly standardized process to complete the job quickly and with minimal expense. Administrators should migrate mailboxes to Office 365 first and move the legacy archive afterward; this is the fastest way to gain the benefits of Office 365 before the archive reingestion completes.

An archive product typically scans mailboxes for older items and moves them to longer-term, cheaper storage that is indexed and deduplicated. The original item typically gets replaced with a small part of the message, known as a stub or shortcut. The user can still find the email in their inbox and, when they open the message, an add-in retrieves the full content from the archive.

Options for archived email migration to Office 365

The native tools to migrate mailboxes to Office 365 cannot handle an email archive migration. When admins transfer legacy archive data for mailboxes, they usually consider the following three approaches:

  1. Export the data to PST archives and import it into user mailboxes in Office 365.
  2. Reingest the archive data into the on-premises Exchange mailbox and then migrate the mailbox to Office 365.
  3. Migrate the Exchange mailbox to Office 365 first, then perform the email archive migration to put the data into the Office 365 mailbox.

Option 1 is not usually practical because it takes a lot of manual effort to export data to PST files. The stubs remain in the user’s mailbox and add clutter.

Option 2 is also labor-intensive and uses a lot of space on the Exchange Server infrastructure to support the reingestion.

That leaves the third option as the most practical approach, which we’ll explore in a little more detail.

Migrate the mailbox to Exchange Online

When you move a mailbox to Office 365, it migrates along with the stubs that relate to the data in the legacy archive. The legacy archive will no longer archive the mailbox, but users can access their archived items. Because the stubs usually contain a URL path to the legacy archive item, there is no dependency on Exchange to view the archived message.

Some legacy archive products add buttons to restore an individual message into the mailbox; these will not work because the archive product won’t know where Office 365 is without further configuration. Reconfiguring this is not usually necessary, because the next stage is to migrate that data into Office 365.

Transfer archived data

Legacy archive solutions usually have a variety of policies for what happens with the archived data. You might configure the system to keep the stubs for a year but make archive data accessible via a web portal for much longer.

There are instances when you might want to replace the stub with the real message, and there might be archived data that has no stub in the user’s mailbox but that users want to access on occasion.

We need tools that not only automate the data migration, but also understand these differences and can act accordingly. The legacy archive migration software should examine the data within the archive and then run batch jobs to replace stubs with the full messages. In this case, you can use the Exchange Online archive as a destination for archived data that no longer has a stub.

Email archive migration software connects to the legacy archive via the vendor’s API. The software assesses the items and then exports them into a common temporary format — such as an EML file — on a staging server before connecting to Office 365 over a protocol such as Exchange Web Services. The migration software then examines the mailbox and replaces each stub with the full message.
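
As a rough illustration of that final step only, here is a minimal Python sketch that pushes a staged EML file into an Exchange Online mailbox over EWS and removes the matching stub. It assumes the third-party exchangelib package, and the staging path, account names and stub-matching rule are hypothetical placeholders; real migration products wrap this in batching, throttling and chain-of-custody logging.

# Minimal sketch: ingest staged EML files into Exchange Online over EWS and
# remove the matching stubs. Assumes the third-party 'exchangelib' package;
# the staging layout and stub-matching rule are hypothetical placeholders.
from pathlib import Path

from exchangelib import Account, Credentials, Message

STAGING_DIR = Path("/staging/user@example.com")  # EML files exported from the legacy archive

creds = Credentials("svc-migration@example.com", "app-password")
account = Account("user@example.com", credentials=creds, autodiscover=True)

for eml_file in STAGING_DIR.glob("*.eml"):
    # Recreate the full message from its MIME content; a real tool would
    # usually target the online archive mailbox rather than the primary inbox.
    item = Message(account=account, folder=account.inbox,
                   mime_content=eml_file.read_bytes())
    item.save()

    # Hypothetical stub lookup: real products match on the archive ID stored inside the stub.
    for stub in account.inbox.filter(subject__contains=eml_file.stem):
        stub.delete()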

An example of a third-party product’s dashboard detailing the migration progress of a legacy archive into Office 365.

Migrate journal data

With journal data, the most accepted approach is to migrate the data into the hidden Recoverable Items folder of each mailbox related to the journaled item. The end result is similar to having used Office 365 from the day the journal began, and eDiscovery works as expected when following Microsoft guidance.

For this migration, the software scans the journal and creates a database of the journal messages. The application then maps each journal message to its mailbox. This process can be quite extensive; for example, an email sent to 1,000 people will map to 1,000 mailboxes.
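
To make the fan-out concrete, here is a small, purely illustrative sketch in plain Python (standard library only). The JournalRecord type and its fields are hypothetical stand-ins for the envelope data a real product would read from the journal reports; they are not part of any vendor's schema.

# Illustrative sketch of the journal fan-out: map each journaled message to
# every mailbox that should receive a copy in its Recoverable Items folder.
# JournalRecord and its fields are hypothetical; real envelopes come from the
# journal reports (expanded P1 recipients), not the message headers.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class JournalRecord:
    message_id: str
    sender: str
    recipients: list[str]  # expanded envelope recipients from the journal report

def build_fanout_map(journal: list[JournalRecord]) -> dict[str, list[str]]:
    """Return mailbox -> list of message IDs to copy into Recoverable Items."""
    fanout: dict[str, list[str]] = defaultdict(list)
    for record in journal:
        for mailbox in {record.sender, *record.recipients}:  # de-duplicate
            fanout[mailbox].append(record.message_id)
    return fanout

# A message sent to 1,000 people maps to roughly 1,000 mailboxes:
journal = [JournalRecord("<all-hands@contoso>", "ceo@contoso.com",
                         [f"user{i}@contoso.com" for i in range(1000)])]
print(len(build_fanout_map(journal)))  # 1001 (sender plus 1,000 recipients)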

After this stage, the software copies each message to the recoverable items folder of each mailbox. While this is a complicated procedure, it’s alleviated by software that automates the job.

Legacy archive migration offerings

There are many products tailored for an email archive migration, each with its own benefits and drawbacks. I won’t recommend a specific offering, but I will mention two that can migrate more than 1 TB a day, which is a good benchmark for large-scale migrations. They also support chain of custody, which audits the transfer of all data.

TransVault has the most connectors to legacy archive products. Almost all migration offerings support Enterprise Vault, but if you use a less common product, it is likely that TransVault can still move it. The TransVault product accesses source data either through an archive product’s APIs or directly from the stored data. TransVault’s service installs within Azure or on premises.

Quadrotech Archive Shuttle fits in alongside a number of other products suited to Office 365 migrations and management. Its workflow-based process automates the migration. Archive Shuttle handles fewer archive sources, but it does support Enterprise Vault. Archive Shuttle accesses source data via API and agent machines with control from either an on-premises Archive Shuttle instance or, as is more typical, the cloud version of the product.

Mobile Sharing & Companion Experiences for Microsoft Teams Meetings – Microsoft Garage

Research into Computer-Supported Cooperative Work has explored problems of disengagement in video meetings and device conflict since the 1990s, but good solutions that could work at scale have been elusive. Microsoft Research Cambridge UK had been working on these issues when the 2015 Hackathon arose as an opportunity to show the rest of the company that just a few simple, dynamic device combinations might give users the means to solve these issues themselves.

While we had explored some research prototypes in late 2014 and early 2015, for the Hackathon we decided to use a vision video with the goal of getting the attention of the Skype product group, because we knew that the idea would have the most impact as an infrastructural feature of an existing product rather than as a new stand-alone product. We called the video “Skype Unleashed” to connote breaking free of the traditional one person per endpoint model.

Turning the hackathon video into a working proof-of-concept

When we won the Business category, our prize was a meeting with the category’s sponsor, then-COO Kevin Turner. We scrambled to build a proof-of-concept prototype, which at first we jokingly referred to as “Skype Skwid”, a deliberate misspelling of “squid”, because it was like a body with lots of tentacles that could reach out to many other things. However, we realized that we needed an official project name, so we became “Project Wellington”. This was a related inside joke: the largest squid in the world is the colossal squid, and the largest specimen in the world is in the Museum of New Zealand Te Papa Tongarewa… in Wellington, New Zealand.

So as Project Wellington we went to meet Kevin Turner, who also invited Gurdeep Singh Pall, then-CVP for Skype, in November 2015. Both immediately saw the relevance of the concepts and Gurdeep connected us to Brian MacDonald’s incubation project that would become Microsoft Teams.

Brian also understood right away that Companion Experiences could be an innovative market differentiator for meetings and a mobile driver for Teams. He championed the integration of our small Cambridge group with his Modern Meetings group as a loose v-team. The Modern Meetings group was exceptionally welcoming, graciously showing us the ropes of productization and taking on the formidable challenge of helping us socialize the need for changes at all levels of the product, from the media stack to the middle tier and all the clients. We, in turn, learned a lot about the cadence of production, scoping, aligning with the needs of multiple roadmaps, and the multitude of issues involved in turning feature ideas into releasable code.

Through 2016 and 2017 we worked on design iterations, usability testing, and middle-tier and client code. We were thrilled when first glimpses of the roving camera and proximity joining were shown at Build 2017, and then announced as officially rolling out at Enterprise Connect 2018.

The combined research and product team

We are very excited to see these features released. We are also excited to close the research loop by evaluating our thesis that dynamic device combinations will improve hybrid collaboration in video meetings, and by doing research ‘in the wild’ at a scale unimaginable for most research projects. Microsoft is one of only a handful of institutions that can make possible research that improves the productivity of millions of people daily. So as well as releasing product features, we are exceptionally proud of the model of collaboration itself. And, indeed, we are continuing to collaborate with Microsoft Teams even after these features are released, as we now have a tremendous relationship with a product group that understands how we work and values our help.

To come full circle, then, it was Satya Nadella’s emphasis on the Hackathon as a valuable use of company time, and The Garage’s organization of the event itself, that allowed ideas well outside a product group to be catapulted to the attention of people who could see their value and then provide a path to making them happen.

If you would like to find out more about this project, connect with Sean Rintel on LinkedIn or follow @seanrintel on Twitter.

A data replication strategy for all your disaster recovery needs

Meeting an organization’s disaster recovery challenges requires addressing problems from several angles based on specific recovery point and recovery time objectives. Today’s tight RTO and RPO expectations mean losing almost no data and tolerating almost no downtime.

To meet those expectations, businesses must move beyond backup and consider a data replication strategy. Modern replication products offer more than just a rapid disaster recovery copy of data, though. They can help with cloud migration, using the cloud as a DR site and even solving copy data challenges.

Replication software comes in two forms. One is integrated into a storage system, and the other is bought separately. Both have their strengths and weaknesses.

An integrated data replication strategy

The integrated form of replication has a few advantages. It’s often bundled at no charge or is relatively inexpensive. Of course, nothing in life is really free: the customer pays extra for the storage hardware in order to get the “free” software. In addition, storage-based replication is relatively easy to manage at scale. Most storage system replication works at the volume level, so one job replicates the entire volume, even if there are a thousand virtual machines on it. And finally, storage system-based replication is often backup-controlled, meaning the replication job can be integrated with and managed by backup software.

There are, however, problems with a storage system-based data replication strategy. First, it’s specific to that storage system. Consequently, since most data centers use multiple storage systems from different vendors, they must also manage multiple replication products. Second, the advantage of replicating entire volumes can be a disadvantage, because some data centers may not want to replicate every application on a volume. Third, most storage system replication inadequately supports the cloud.

Stand-alone replication

IT typically installs stand-alone replication software on each host it’s protecting or deploys it into the cluster in a hypervisor environment. Flexibility is among software-based replication’s advantages: the same software can replicate from any hardware platform to any other hardware platform, letting IT mix and match source and target storage devices. The second advantage is that software-based replication can be more granular about what’s replicated and how frequently replication occurs. And the third advantage is that most software-based replication offers excellent cloud support.
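
The difference in granularity is easiest to see as configuration. The sketch below uses hypothetical job definitions in plain Python (not any vendor's actual schema) to contrast the single volume-level job typical of storage system replication with the per-VM jobs a stand-alone product allows.

# Hypothetical replication job definitions (not any vendor's real schema)
# contrasting the two approaches discussed above.

# Storage-system (integrated) replication: one job per volume, so every VM on
# the volume is replicated whether it needs to be or not.
array_job = {
    "type": "volume",
    "source": "array01:/vol/datastore01",      # every VM on the volume rides along
    "target": "array02:/vol/datastore01_dr",
    "schedule": "every 15 minutes",
}

# Stand-alone (software) replication: per-VM jobs with individual RPOs and
# cloud targets, at the cost of many more jobs to manage.
software_jobs = [
    {"type": "vm", "source": "sql-prod-01", "target": "aws:us-east-1", "rpo_minutes": 5},
    {"type": "vm", "source": "web-prod-02", "target": "azure:uksouth", "rpo_minutes": 60},
    # ...dozens or hundreds more, often grouped by application tier
]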

At a minimum, the cloud is used as a DR target for data, but it can also serve as an entire disaster recovery site, not just a place to keep a copy. This means virtual machines can be instantiated using cloud compute in addition to cloud storage. Some approaches go further with cloud support, allowing replication across multiple clouds or from the cloud back to the original data center.
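
As one concrete, purely illustrative example of that last point, the sketch below assumes the replication product has already landed a protected server's boot disk in AWS as an EBS snapshot, and uses boto3 to turn it into a running EC2 instance. The snapshot ID, image settings and instance type are placeholders, not anything a specific product exposes.

# Illustrative failover step, assuming replicated data already exists in AWS as
# an EBS snapshot. Uses boto3; all IDs and sizing values are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Turn the replicated boot-disk snapshot into a bootable image.
image = ec2.register_image(
    Name="dr-sql-prod-01",
    RootDeviceName="/dev/xvda",
    VirtualizationType="hvm",
    BlockDeviceMappings=[{
        "DeviceName": "/dev/xvda",
        "Ebs": {"SnapshotId": "snap-0123456789abcdef0"},
    }],
)

# Instantiate the VM with cloud compute, completing the DR failover.
ec2.run_instances(
    ImageId=image["ImageId"],
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
)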

The primary downside of a stand-alone data replication strategy is that it must be purchased, because it isn’t bundled with storage hardware. Its granularity also means dozens, if not hundreds, of jobs must be managed, although several stand-alone data replication products have added the ability to group jobs by type. Finally, there isn’t wide support from backup software vendors for these products, so any integration is a manual process requiring custom scripts.

Modern replication features

Modern replication software should support the cloud and support it well. This requirement draws a line of suspicion around storage systems with built-in replication, because cloud support is generally so weak. Replication software should have the ability to replicate data to any cloud and use that cloud to keep a DR copy of that data. It should also let IT start up application instances in the cloud, potentially completely replacing an organization’s DR site. Last, the software should support multi-cloud replication to ensure both on-premises and cloud-based applications are protected.

Another feature to look for in modern replication is integration into data protection software. This capability can take two forms: The software can manage the replication process on the storage system, or the data protection software could provide replication. Several leading data protection products can manage snapshots and replication functions on other vendors’ storage systems. Doing so eliminates some of the concern around running several different storage system replication products.

Data protection software that integrates replication can either be traditional backup software with an added replication function or traditional replication software with a file history capability, potentially eliminating the need for backup software. It’s important for IT to make sure the capabilities of any combined product meet all backup and replication needs.

How to make the replication decision

The increased expectation of rapid recovery with almost no data loss is something everyone in IT will have to address. While backup software has improved significantly, tight RPOs and RTOs mean most organizations will need replication as well. The pros and cons of both an integrated and stand-alone data replication strategy hinge on the environment in which they’re deployed.

Each IT shop must decide which type of replication best meets its current needs. At the same time, IT planners must figure out how that new data replication product will integrate with existing storage hardware and future initiatives like the cloud.

Wanted – Anyone selling an i5 processor and motherboard?

Hi guys,

My son is having problems with bottlenecking after I bought him a GTX 1060 GPU, so we want to replace his AMD A8-7600 with something a bit more suitable.

It doesn’t have to be current so please let me know what you have to offer.

Many thanks,

Paul

Location: Haydock, Merseyside

Price drop! Cooler Master 700W PSU & Lide 210 Scanner

Cooler Master 700W PSU for sale.
This PSU is a couple of years old. It has worked fine with no problems for me: no high-pitched whine or any other issues. It has a connection for pretty much everything. I’m upgrading my PC to something a little more powerful, so this good girl can go to a good home. Normally I don’t recommend second-hand PSUs, but she’s good and is just wasting space doing nothing (I already have a spare PSU for testing).

£35 including postage.

LiDE 210 Scanner (A4 size).
Love this…

For Sale – Alienware M17xR4 – i7 32GB DDR3 / Nvidia 8GB DDR5

This is a fantastic laptop in great condition & with no problems

Absolutely awesome machine for gaming & film editing

I use it for online gaming myself on PUBG / CS:GO / BF1 at amazing FPS and am selling so I can build my own desktop computer

I can post anywhere in the country via UPS

Alienware M17xR4

Intel Core i7-4910MQ

32GB RAM DDR3

1TB HDD
256GB SSD

Full HD Screen 17.3”

Nvidia 8GB DDR5 GTX880M

DVDRW

Bluetooth Ready

SD Slot

Windows 10

Built in Webcam

International Keyboard

Price and currency: £795
Delivery: Delivery cost is included within my country
Payment method: PayPal
Location: Cornwall
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference
