Upload on-premises content to SharePoint Online using PowerShell cmdlets

Posted: February 11, 2019 in SharePoint Online

This is a step by step guide on how to use the SharePoint Online Migration PowerShell cmdlets to migrate content from an on-premises file share to Office 365.

The SharePoint Online Migration PowerShell cmdlets are designed to move on-premises content from file shares. They require minimal CSOM calls and use temporary Azure Blob storage to scale to large content migrations.

Here are the steps for using the SPO Migration PowerShell cmdlets to upload your on-premises data to SharePoint Online:

Step 1: Install the SharePoint Online Management Shell

Step 2: Setup your working directory

Step 3: Determine your locations and credentials

Step 4: Create a new content package from an on-premises file share

Step 5: Convert the content package for your target site

Step 6: Submit content to import

(Optional) Step 7: Processing and Monitoring your SPO Migration

Prerequisites

  • Supported Operating Systems: Windows 7 Service Pack 1, Windows 8, Windows Server 2008 R2 SP1, Windows Server 2008 Service Pack 2, Windows Server 2012, Windows Server 2012 R2
  • Windows PowerShell 4.0

 Note

Permissions: You must be a site collection administrator on the site you are targeting.


Step 1: Install the SharePoint Online Management Shell

The first step is to install the SharePoint Online Management Shell.

  1. Uninstall all previous versions of the SharePoint Online Management Shell.
  2. Install from here: SharePoint Online Management Shell.
  3. Open SharePoint Online Management Shell and select Run as Administrator.
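If you prefer, the steps above can also be scripted. As a sketch (assuming PowerShell 5.0+ with PowerShellGet, from an elevated prompt), the module can be installed from the PowerShell Gallery instead of the MSI installer:

```powershell
# Install the SharePoint Online Management Shell module from the PowerShell Gallery
# (alternative to the MSI download; run from an elevated prompt)
Install-Module -Name Microsoft.Online.SharePoint.PowerShell -Scope AllUsers

# Verify the module is now available
Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ListAvailable
```

Either way, remove older versions first so the migration cmdlets resolve to the latest release.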


Step 2: Setup your working directory

Before you start the migration process, set up your working directory by creating two empty folders. These folders do not require much disk space, because they will contain only XML files.

  1. Create a Temporary package folder.
  2. Create a Final package folder.
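The two folders can be created from the same shell session. A minimal sketch (the paths are examples only, matching the values used in Step 3):

```powershell
# Create the Temporary and Final package folders used by the migration cmdlets.
# -Force suppresses the error if a folder already exists.
New-Item -ItemType Directory -Path 'C:\migration\CharlesDocumentsPackage_source' -Force
New-Item -ItemType Directory -Path 'C:\migration\CharlesDocumentsPackage_target' -Force
```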


Step 3: Determine your locations and credentials

In this step, identify your locations and credentials: the location of your source files, your package folders, and the target web and document library.

On your local computer, open the SharePoint Online Management Shell and run the following commands, substituting your own values.

$cred = (Get-Credential admin@contoso.com)

$sourceFiles = '\\fileshare\users\charles'

$sourcePackage = 'C:\migration\CharlesDocumentsPackage_source'

$targetPackage = 'C:\migration\CharlesDocumentsPackage_target'

$targetWeb = 'https://contoso-my.sharepoint.com/personal/charles_contoso_com'

$targetDocLib = 'Documents'


Step 4: Create a new content package from an on-premises file share

In this step, you will create a new migration package from a file share. The New-SPOMigrationPackage command reads the list of content at the source path and generates the XML needed to perform the migration.

The following parameters are required unless marked optional:

  • SourceFilesPath: points to the content you want to migrate
  • OutputPackagePath: points to your Temporary package folder
  • TargetWebUrl: points to your destination web
  • TargetDocumentLibraryPath: points to the document library inside the web
  • IgnoreHidden: skips hidden files (optional)
  • ReplaceInvalidCharacters: fixes invalid characters where possible (optional)

Example:

This example shows how to create a new package from a file share, ignoring hidden files and replacing unsupported characters in a file/folder name.

New-SPOMigrationPackage -SourceFilesPath $sourceFiles -OutputPackagePath $sourcePackage -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib -IgnoreHidden -ReplaceInvalidCharacters


Step 5: Convert the content package for your target site

After you have created the content package, use the ConvertTo-SPOMigrationTargetedPackage command to convert the XML generated in your Temporary folder. It saves a new set of targeted migration package metadata files to the target directory. This is the final package.

 Note

Your target site collection administrator credentials are used to gather data to connect to the target site collection.

There are six required parameters to enter (others, such as ParallelImport, are optional):

  • ParallelImport: tells the tool to optimize performance by using parallel threads (optional)
  • SourceFilesPath: points to the directory where the package's source content files exist
  • SourcePackagePath: points to your Temporary package folder
  • OutputPackagePath: points to your Final package folder
  • Credentials: an SPO credential that has administrator rights to the destination site
  • TargetWebUrl: points to your destination web
  • TargetDocumentLibraryPath: the path to your destination library

Example:

This example shows how to convert a package to a targeted one by looking up data in the target site collection. It uses the -ParallelImport parameter to boost file share migration performance.

$finalPackages = ConvertTo-SPOMigrationTargetedPackage -ParallelImport -SourceFilesPath $sourceFiles -SourcePackagePath $sourcePackage -OutputPackagePath $targetPackage -Credentials $cred -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib


Step 6: Submit content to import

In this step, the Invoke-SPOMigrationEncryptUploadSubmit command creates a new migration job in the target site collection and returns a GUID representing the JobID. For each job, the command uploads the encrypted source files and manifests into temporary Azure Blob storage.

There are four required parameters to enter (others are optional):

  • TargetWebUrl: points to the destination web
  • SourceFilesPath: points to the files to import
  • SourcePackagePath: points to the final manifest of the files to import
  • Credentials: the SharePoint Online credentials that have site collection administrator rights on the destination site

Example 1:

This example shows how to submit package data to create a new migration job.

$job = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFiles -SourcePackagePath $targetPackage -Credentials $cred -TargetWebUrl $targetWeb

Example 2:

This example shows how to submit package data to create new migration jobs for parallel import.

$jobs = $finalPackages | % {Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $_.FilesDirectory.FullName -SourcePackagePath $_.PackageDirectory.FullName -Credentials $cred -TargetWebUrl $targetWeb}

For each submitted job, the Invoke cmdlet returns these properties as part of a job:

  • JobId: the ID of the job in SPO
  • ReportingQueueUri: the SPO Azure queue that stores the real-time progress messages of the migration.
  • Encryption: the encryption key and method used when uploading the content to Azure. These are required to decrypt the queue messages and import logs.

If you’re using your own Azure storage account, then use Set-SPOMigrationPackageAzureSource and Submit-SPOMigrationJob to upload content into your storage.
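As a hedged sketch of that alternative path (assuming a storage account whose name and key you hold in the placeholder variables $accountName and $accountKey, and reusing the Step 3/Step 5 variables):

```powershell
# Placeholder values for your own Azure storage account (assumptions, not from the walkthrough)
$accountName = 'mystorageaccount'
$accountKey  = '<storage account key>'

# Upload the final package and source files into your own storage account,
# and capture the resulting Azure locations object
$azureLocations = Set-SPOMigrationPackageAzureSource -SourceFilesPath $sourceFiles -SourcePackagePath $targetPackage -AccountName $accountName -AccountKey $accountKey

# Submit the migration job against those Azure locations
Submit-SPOMigrationJob -TargetWebUrl $targetWeb -MigrationPackageAzureLocations $azureLocations -Credentials $cred
```

Verify the exact parameter sets against the cmdlet help in your installed module version before running this against production content.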


(Optional) Step 7: Processing and Monitoring your SPO Migration

After the job is submitted, only Azure and SPO interact to fetch and migrate the content into the destination. This process is timer-job based, which means jobs are queued on a first come, first served basis. This does not prevent the same person from queuing up additional jobs.

Even if no other jobs are running, there may be a delay of up to a minute before a job starts.

Checking job status

You can check the status of your job by viewing the real-time updates posted to the Azure storage account queue, using the Encryption.EncryptionKey returned in step 6 to decrypt the messages.
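A sketch of polling that queue with Get-SPOMigrationJobProgress, a cmdlet in the same SPO Migration set (the parameter names below are my best understanding and should be verified against your module version's help):

```powershell
# Poll the reporting queue for this job's progress messages.
# $job, $cred, and $targetWeb come from the earlier steps;
# the Encryption object is needed to decrypt the queue messages.
Get-SPOMigrationJobProgress -AzureQueueUri $job.ReportingQueueUri.AbsoluteUri -Credentials $cred -TargetWebUrl $targetWeb -JobIds $job.JobId -EncryptionParameters $job.Encryption
```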

Viewing logs

If you’re using your own Azure storage account, you can look in the manifest container in Azure Storage for logs of everything that happened. At this stage, it is safe to delete those containers if you don’t want to keep them as a backup in Azure.

If there were errors or warnings, .err and .wrn files will be created in the manifest container.

If you’re using the temporary Azure storage created by Invoke-SPOMigrationEncryptUploadSubmit in step 6, the import log SAS URL can be obtained by decrypting the Azure queue message with the “Event” value “JobLogFileCreate”. With the import log SAS URL, you can download the log file and decrypt it with the same encryption key as returned in Step 6.


Scripting Scenarios for Reuse

The following is a sample script you can use that includes the complete steps from determining your locations and credentials to submitting your package data to create a new migration job.

$userName = "admin@contoso.onmicrosoft.com"
$sourceFiles = "d:\data\documents"
$packagePath = "d:\data\documentPackage"
$spoPackagePath = "d:\data\documentPackageForSPO"
$targetWebUrl = "https://contoso.sharepoint.com/sites/finance"
$targetLibrary = "Documents"

$cred = Get-Credential $userName

New-SPOMigrationPackage -SourceFilesPath $sourceFiles -OutputPackagePath $packagePath -TargetWebUrl $targetWebUrl -TargetDocumentLibraryPath $targetLibrary -IgnoreHidden -ReplaceInvalidCharacters


# Convert the package to a targeted one by looking up data in the target site collection

$finalPackages = ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $sourceFiles -SourcePackagePath $packagePath -OutputPackagePath $spoPackagePath -TargetWebUrl $targetWebUrl -TargetDocumentLibraryPath $targetLibrary -Credentials $cred

# Submit package data to create a new migration job

$job = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFiles -SourcePackagePath $spoPackagePath -Credentials $cred -TargetWebUrl $targetWebUrl

This sample shows how to get the returned information of a job, which comes in the form of a GUID.

$job = $jobs[0]
$job.JobId
Guid
----
779c4b3b-ec24-4705-bb58-c38f4329418c

This sample shows how to get $job.ReportingQueueUri.AbsoluteUri.

# To obtain the $job.ReportingQueueUri.AbsoluteUri
https://spodm1bn1m013pr.queue.core.windows.net/953pq20161005-f84b9e51038b4139a179f973e95a6d6f?sv=2014-02-14&sig=TgoUcrMk1Pz8VzkswQa7owD1n8TvLmCQFZGzyV7WV8M%3D&st=2016-10-04T07%3A00%3A00Z&se=2016-10-26T07%3A00%3A00Z&sp=rap

This sample demonstrates how to obtain the encryption key and the sample return.

$job.Encryption

EncryptionKey                  EncryptionMethod
-------------                  ----------------
{34, 228, 244, 194...}         AES256CBC

 Important

All messages are encrypted in the queue. If you want to read from the ReportingQueue, you must have the EncryptionKey.

Best Practices and Limitations

  • Package size: 10-20 GB. Use the -ParallelImport switch for file share migration, which will automatically split a big package into smaller ones.
  • File size: 2 GB.
  • Target site: the target site should remain inaccessible to users until the migration is complete.
  • SharePoint Online limits: see "SharePoint Online and OneDrive for Business: software boundaries and limits".

Azure Limits

  • TB per storage account: 500 TB
  • Max size of a single blob container, table, or queue: 500 TB
  • Max number of blob containers, blobs, file shares, tables, queues, entities, or messages per storage account: limited only by the 500 TB storage account capacity
  • Target throughput for a single blob: up to 60 MB per second, or up to 500 requests per second

 
