🏝️ Importing PST files to EXO - Network Share

If you need to import PST files into mailboxes and you are using Exchange Online, there is a unique way of doing this in the cloud, so let's get started.....

Get Mailbox size before Import

This is handy so you know if the import completed successfully, so for that we need this command:

Get-MailboxStatistics Bear@diepiggydiedie.com | fl DisplayName,TotalItemSize

That will return this:

DisplayName   : Bear
TotalItemSize : 1.567 GB (1,682,943,429 bytes)

Right good, so it's 1.567 GB before the import, nice.

Compliance is the Way!

First you need to go to the compliance portal to get your import job SAS URL from the import wizard, so first you need this URL:

https://compliance.microsoft.com

Once here you will need "Data Lifecycle Management" then "Office 365" as below:


Once here you will need "Import" then "New import job" as below:



Then you need a name for the job; there are strict requirements for this name, and you will know if it's wrong.....

Requirements: Job Name: 2-64 lowercase letters, numbers or hyphens, must start with a letter, no spaces
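That naming rule can be sketched as a regular expression, if you want to check a name before the wizard complains (a rough local check only - the example names are made up):

```python
import re

# Sketch of the stated rule: 2-64 characters in total, lowercase letters,
# numbers or hyphens only, and it must start with a lowercase letter.
JOB_NAME_RE = re.compile(r"^[a-z][a-z0-9-]{1,63}$")

def is_valid_job_name(name: str) -> bool:
    return bool(JOB_NAME_RE.match(name))

print(is_valid_job_name("pst-import-bear"))  # True
print(is_valid_job_name("1badname"))         # False: starts with a number
print(is_valid_job_name("Has Spaces"))       # False: uppercase and spaces
```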


Data Transfer Type

You then need to upload your data so select that as below:



SAS URL for PSTs

Then you need the SAS URL to upload the PST files to, plus AzCopy, which is installed on the computer where the PST files are located, and you need the mappings file.



First get the SAS URL by clicking "Show network upload SAS URL", which will then reveal the SAS URL as below. All you need here is the "copy to clipboard" button; click that and confirm your SAS URL is actually in your clipboard:


AzCopy

AzCopy is required to copy the files from your local computer to the SAS URL, so for this download AzCopy from this link here

Then you need to start a command prompt, not as an admin this time (for a change), and navigate to the folder where AzCopy is installed, like this. You do not need the DIR, but it confirms AzCopy.exe is there.....


Then you need this command to copy data to the Azure SAS URL:

azcopy.exe copy "<Source location of PST files>" "<SAS URL>"

This means if you would like to copy a folder full of PST files to the SAS URL then your syntax would look like this:

azcopy.exe copy "C:\PST\*" "https://sasurl.goes.here/23r3r2r32r32r3r3e1r/blah"

When this command is executed it will look something like this:



This will then crack on with the upload and report back at the end with an update.

Confirm with Storage Explorer

I like to confirm that the upload is complete with Azure Storage Explorer, so first you will need to get Storage Explorer from here

Once you have installed that application, start it up and click on the connector icon as below:



From here choose Blob Container:

Then SAS URL:


Then you need to put the SAS URL in the box below:


That will take you to the summary screen - and yes, the SAS URL has been removed from the screenshot:



Then you click connect, find that SAS account, and click on it; the files uploaded will be on the left, and the name of those files is behind that big red box:


Mapping File

This is a CSV file that contains a header row and then data rows, which look like this. Do not play with the header row; then fill in the values you need for your requirements:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl

Template:

Exchange,\<folderpath>,<pst_file_name>,<mailbox-forimport>,FALSE,/,,,,


Example:

Exchange,,\bear.pst,bearx1@whodoytouthinkyouare.mail.onmicrosoft.com,FALSE,/,,,,

Caution: When you remove values from the CSV file, you need to leave the commas intact; if a value is removed, leave the "," there, else it will fail

Additionally, remember that the mailbox used in the mapping file is where the data ends up; if you get the mapping in the file wrong, you will give the wrong user the wrong PST files. Just be careful when creating this file, as it can absolutely give you a headache if you get it wrong

<folderpath> defines the name of the folder you uploaded to the SAS URL; if there is no folder, leave it blank

<pst_file_name> defines the name of the PST you uploaded which is case sensitive

<mailbox-forimport> defines where the data will be imported to, aka an e-mail address
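As a sketch, a small pre-flight check like the one below can catch missing commas or empty required fields before the portal's validator does (the helper name and the example address are made up):

```python
import csv
import io

# Expected header columns for the mapping file, exactly as given above.
HEADER = ["Workload", "FilePath", "Name", "Mailbox", "IsArchive",
          "TargetRootFolder", "ContentCodePage", "SPFileContainer",
          "SPManifestContainer", "SPSiteUrl"]

def check_mapping(csv_text):
    """Return a list of problems found in the mapping CSV (empty = looks OK)."""
    errors = []
    rows = list(csv.reader(io.StringIO(csv_text)))
    if rows[0] != HEADER:
        errors.append("header does not match the expected columns")
    for n, row in enumerate(rows[1:], start=2):
        if len(row) != len(HEADER):
            # Removing a value but keeping the comma leaves an empty field,
            # which is fine; removing the comma drops a column, which is not.
            errors.append(f"line {n}: expected {len(HEADER)} fields, got {len(row)}")
        elif row[0] != "Exchange" or not row[2] or not row[3]:
            errors.append(f"line {n}: Workload, Name and Mailbox are required")
    return errors

sample = ",".join(HEADER) + "\nExchange,,bear.pst,bear@example.onmicrosoft.com,FALSE,/,,,,\n"
print(check_mapping(sample))  # [] means no problems found
```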

Ready the Mapping File

Head back to the import panel; you need to tick those boxes at the bottom to continue:



Then you need to supply your CSV file, as you can see here:


Once you have clicked Upload File and navigated to and selected your file, you need to validate it with the "Validate" button:

If you click the validate button and get no errors, then you are good to carry on with the import process:

Caution: If you get an error here, validation will fail and you will need to download a CSV file to see the error you got - like a card from the Community Chest in Monopoly.

Mapping Errors

If you get an error, they are not always helpful; this is what I got:

Line, Error
2,"The PST (bear.pst) could not be found in the storage account"

This was caused, in my case, because the file was named "Bear.pst" and not "bear.pst" - the lookup is case sensitive.
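A quick local check can catch this before you upload anything; this sketch (the helper is hypothetical) compares the PST names in your mapping file against the actual files on disk:

```python
from pathlib import Path

def find_case_mismatches(mapped_names, local_dir):
    """Return (mapping_name, actual_name) pairs that differ only by case,
    since the blob lookup treats "Bear.pst" and "bear.pst" as different files."""
    actual = {p.name for p in Path(local_dir).iterdir()
              if p.suffix.lower() == ".pst"}
    by_lower = {a.lower(): a for a in actual}
    return [(m, by_lower[m.lower()]) for m in mapped_names
            if m not in actual and m.lower() in by_lower]
```

Run it over the Name column of your mapping file and fix any pairs it reports before kicking off the job.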

Import Analysis

Once that is accepted it will look like this; however, remember this is the upload and analysis of the file, not the import into the mailbox:


Import to Office 365

If you now want to import to Office 365 you need to click on the tick box next to the job then you will see "Import to Office 365" as below, give that a click:



Now you get the option to filter the data before you import; this is not required for this exercise, so I have said no:



When you continue you will get a review screen with the total data size, click submit to get the party started:


The job will then change to "Import in progress" and you are now importing the data:


After some waiting around for the cloud to get on with the job, you will notice - or, if you click on the import job, you will see - that data is being sent to the destination mailbox:


With a click on that job you get this:
Obviously, the contents of the PST file will be sent to the mailbox defined in your mapping file; alternatively, if that was not obvious, you probably shouldn’t be doing data imports.

Bulk Imports

I am well aware this article covered a single mailbox with a single PST file; however, if you have a requirement for multiple mailboxes, this process supports up to 500 in a single request.

If you have multiple PST files for the person concerned, then add more than one line for the same mailbox; obviously, the PST file names need to be different, so make sure that is reflected in your mappings file.
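As a sketch, a bulk mapping file along those lines could be generated like this (the file names and addresses are made up; one row per PST, repeating the mailbox when a user has several files):

```python
import csv

# Header columns, exactly as in the template earlier in this article.
HEADER = ("Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,"
          "ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl")

# (pst_file_name, target_mailbox) pairs - example data only.
imports = [
    ("bear1.pst", "bear@example.onmicrosoft.com"),
    ("bear2.pst", "bear@example.onmicrosoft.com"),  # same mailbox, second PST
    ("fox.pst",   "fox@example.onmicrosoft.com"),
]

with open("bulk-mapping.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(HEADER.split(","))
    for pst, mailbox in imports:
        # Empty strings keep every comma in place, per the caution above.
        w.writerow(["Exchange", "", pst, mailbox, "FALSE", "/", "", "", "", ""])
```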

Created by value

You will notice on the import job there is a field at the bottom called "Created by". This will obviously have the name of the person that created the import job in the compliance portal; however, this has no connection with where the data is going to end up.


Just in case people start an import job and see their name in that job: relax, it’s just reporting that you created the job.

Confirm the mailbox has the data

This is the last bit in confirming all went well with the import, so let's recall the mailbox size before the import, which you should have grabbed at the start of this guide - but did you forget?

Get-MailboxStatistics Bear@diepiggydiedie.com | fl DisplayName,TotalItemSize

That will return this:

DisplayName   : Bear
TotalItemSize : 1.567 GB (1,682,943,429 bytes)

However, after the import you would expect more data in the mailbox bear@diepiggydiedie.com, so let's find out with the same command, which is:

Get-MailboxStatistics Bear@diepiggydiedie.com | fl DisplayName,TotalItemSize

and that should return more data this time, as the PST has been replayed to it:

DisplayName   : Bear
TotalItemSize : 13.44 GB (14,427,994,664 bytes)
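Quick arithmetic on the two byte counts from Get-MailboxStatistics shows how much the import actually added:

```python
# Byte counts taken from the before/after TotalItemSize output above.
before = 1_682_943_429    # 1.567 GB before the import
after = 14_427_994_664    # 13.44 GB after the import

added = after - before
print(f"{added:,} bytes added ({added / 1024**3:.2f} GiB)")
# 12,745,051,235 bytes added (11.87 GiB)
```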

Amazing, that means we are all good and the import is complete - all hail megatron.