SharePoint - Uploading a large file to Office 365 via CSOM PowerShell

If you are following Office Dev PnP, you will know they have released some brilliant approaches for uploading files of various sizes. Their code is in C#; here we convert it into PowerShell.

I had exactly the same scenario: I needed to upload files larger than 500 MB or 1 GB and then update their metadata properties. I used PowerShell CSOM for this.

Office Dev PnP describes four methods; I will cover two of them here, and encourage you to read the details in Large file upload with CSOM on GitHub and the Upload large files sample app for SharePoint on MSDN.

  1. SaveBinaryDirect: There are no file size limitations, but there is a security time-out after 30 minutes. Only use this method if you’re using a user-only authentication policy. The user-only authentication policy is not available in an app for SharePoint, but it can be used in native device apps, Windows PowerShell, and Windows console applications.

    $Context.RequestTimeout = [System.Threading.Timeout]::Infinite
    [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($Context, $fileUrl, $LocalFile.OpenRead(), $true)
    

    Note: Initially I used this method without setting the request timeout to infinite. That worked for files of around 100 MB but threw a time-out error for files of around 500 MB, so I make sure the RequestTimeout is set to Infinite, as in the sketch below.
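
    For completeness, here is a minimal end-to-end sketch of this approach. The CSOM assembly path, site URL, target library, and local file path are assumptions; adjust them for your environment.

    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

    $credentials = Get-Credential
    $SiteURL = "https://yoursite.sharepoint.com"
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($credentials.UserName, $credentials.Password)
    $Context.RequestTimeout = [System.Threading.Timeout]::Infinite

    # Server-relative URL of the target file; "/Shared Documents" is a placeholder library.
    $LocalFile = Get-Item "C:\LargeFiles\FileWithLargeSize.docx"
    $fileUrl = "/Shared Documents/" + $LocalFile.Name

    $FileStream = $LocalFile.OpenRead()
    Try {
        [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($Context, $fileUrl, $FileStream, $true)
    }
    Finally {
        $FileStream.Dispose()
    }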

  2. Upload file in chunks: Upload a single file as a set of chunks using the StartUpload, ContinueUpload, and FinishUpload methods on the File class. There is no file size limit, but a time-out occurs after 30 minutes, so each chunk of the file must be uploaded within 30 minutes of the completion of the previous chunk. This approach is recommended for SharePoint Online when the file is larger than 10 MB.

    Office Dev PnP provides this function in C#; I have simply converted it into PowerShell:

    Function UploadFileInSlice ($ctx, $libraryName, $fileName, $fileChunkSizeInMB = 9) {
        # Chunk size in MB; defaults to 9 MB when not supplied.
    
        # Each sliced upload requires a unique ID.
        $UploadId = [GUID]::NewGuid()
    
        # Get the name of the file.
        $UniqueFileName = [System.IO.Path]::GetFileName($fileName)
    
        # Get the folder to upload into. 
        $Docs = $ctx.Web.Lists.GetByTitle($libraryName)
        $ctx.Load($Docs)
        $ctx.Load($Docs.RootFolder)
        $ctx.ExecuteQuery()
    
        # Get the information about the folder that will hold the file.
        $ServerRelativeUrlOfRootFolder = $Docs.RootFolder.ServerRelativeUrl
    
        # File object.
        $Upload = $null
    
        # Calculate block size in bytes.
        $BlockSize = $fileChunkSizeInMB * 1024 * 1024
    
        # Get the size of the file.
        $FileSize = (Get-Item $fileName).length
        if ($FileSize -le $BlockSize)
        {
            # Use regular approach.
            $FileStream = New-Object IO.FileStream($fileName,[System.IO.FileMode]::Open)
            $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
            $FileCreationInfo.Overwrite = $true
            $FileCreationInfo.ContentStream = $FileStream
            $FileCreationInfo.URL = $UniqueFileName
            $Upload = $Docs.RootFolder.Files.Add($FileCreationInfo)
            $ctx.Load($Upload)
            $ctx.ExecuteQuery()
            $FileStream.Close()
            return $Upload
        }
        else
        {
            # Use large file upload approach.
            $BytesUploaded = $null
            $Fs = $null
            Try {
                $Fs = [System.IO.File]::Open($fileName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read, [System.IO.FileShare]::ReadWrite)
                $br = New-Object System.IO.BinaryReader($Fs)
                $buffer = New-Object System.Byte[]($BlockSize)
                $lastBuffer = $null
                $fileoffset = 0
                $totalBytesRead = 0
                $bytesRead = 0
                $first = $true
                $last = $false
    
                # Read data from file system in blocks. 
                while(($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0) {
                    $totalBytesRead = $totalBytesRead + $bytesRead
    
                    # You've reached the end of the file.
                    if($totalBytesRead -eq $FileSize) {
                        $last = $true
                        # Copy to a new buffer that has the correct size.
                        $lastBuffer = New-Object System.Byte[]($bytesRead)
                        [array]::Copy($buffer, 0, $lastBuffer, 0, $bytesRead)
                    }
    
                    If($first)
                    {
                        $ContentStream = New-Object System.IO.MemoryStream
                        # Add an empty file.
                        $fileInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
                        $fileInfo.ContentStream = $ContentStream
                        $fileInfo.Url = $UniqueFileName
                        $fileInfo.Overwrite = $true
                        $Upload = $Docs.RootFolder.Files.Add($fileInfo)
                        $ctx.Load($Upload)
    
                        # Start upload by uploading the first slice.
                        $s = [System.IO.MemoryStream]::new($buffer) 
    
                        # Call the start upload method on the first slice.
                        $BytesUploaded = $Upload.StartUpload($UploadId, $s)
                        $ctx.ExecuteQuery()
    
                        # fileoffset is the pointer where the next slice will be added.
                        $fileoffset = $BytesUploaded.Value
    
                        # You can only start the upload once.
                        $first = $false
                    }
                    Else
                    {
                        # Get a reference to your file.
                        $Upload = $ctx.Web.GetFileByServerRelativeUrl($Docs.RootFolder.ServerRelativeUrl + [System.IO.Path]::AltDirectorySeparatorChar + $UniqueFileName)
                        If($last) {
                            # Is this the last slice of data?
                            $s = [System.IO.MemoryStream]::new($lastBuffer)
    
                            # End sliced upload by calling FinishUpload.
                            $Upload = $Upload.FinishUpload($UploadId, $fileoffset, $s)
                            $ctx.ExecuteQuery()
    
                            Write-Host "File upload complete"
                            # Return the file object for the uploaded file.
                            return $Upload
                        }
                        else {
                            $s = [System.IO.MemoryStream]::new($buffer)
    
                            # Continue sliced upload.
                            $BytesUploaded = $Upload.ContinueUpload($UploadId, $fileoffset, $s)
                            $ctx.ExecuteQuery()
    
                            # Update fileoffset for the next slice.
                            $fileoffset = $BytesUploaded.Value
                        }
                    }
    
                } # end of the while loop reading the file in blocks
            }
            Catch {
                Write-Host $_.Exception.Message -ForegroundColor Red
            }
            Finally {
                if ($Fs -ne $null)
                {
                    $Fs.Dispose()
                }
            }
        }
        return $null
    }
    

    Usage:

    # Load the SharePoint Online CSOM assemblies first. This path assumes the SharePoint Online Client Components SDK; adjust it for your machine.
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

    $credentials = Get-Credential
    $SiteURL = "https://yoursite.sharepoint.com"
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL) 
    $Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($credentials.UserName, $credentials.Password)
    
    $UpFile = UploadFileInSlice -ctx $Context -libraryName "YourLibName" -fileName "C:\LargeFiles\FileWithLargeSize.docx"
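
    The function returns the uploaded file object on success and $null on failure, so you can load the returned object straight away to confirm the upload. A small sketch, assuming the variables from the usage above:

    If ($UpFile -ne $null) {
        $Context.Load($UpFile)
        $Context.ExecuteQuery()
        Write-Host ("Uploaded to " + $UpFile.ServerRelativeUrl)
    }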
    


So once you finish uploading the file using either of the above methods, you can load the file object and change its metadata properties:

    $File = $Context.Web.GetFileByServerRelativeUrl($fileUrl)
    $ListItem = $File.ListItemAllFields
    $ListItem["Field"] = "Value"
    $ListItem.Update()
    $Context.Load($ListItem)
    $Context.ExecuteQuery()

Finally, keep in mind that the size of a regular SharePoint CSOM request is very limited: it cannot exceed the 2 MB limit, and you cannot change this setting in an Office 365 environment. So another way to upload bigger files is to use the REST API; here is the MSDN reference: https://msdn.microsoft.com/en-us/library/office/dn292553.aspx
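
For reference, here is a hedged sketch of a simple REST upload from PowerShell. The assembly path, site URL, library path, and file name are assumptions; for very large files the REST API also offers a chunked pattern (StartUpload/ContinueUpload/FinishUpload), which is covered in the MSDN reference above.

    # The CSOM assembly is only needed here for the SharePointOnlineCredentials class.
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"

    $credentials = Get-Credential
    $SiteURL = "https://yoursite.sharepoint.com"
    $SPOCreds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($credentials.UserName, $credentials.Password)

    # Get a form digest, which SharePoint requires for REST POST calls.
    $digestRequest = [System.Net.WebRequest]::Create("$SiteURL/_api/contextinfo")
    $digestRequest.Credentials = $SPOCreds
    $digestRequest.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")
    $digestRequest.Method = "POST"
    $digestRequest.ContentLength = 0
    $digestRequest.Accept = "application/json;odata=verbose"
    $digestResponse = $digestRequest.GetResponse()
    $reader = New-Object System.IO.StreamReader($digestResponse.GetResponseStream())
    $digest = (ConvertFrom-Json $reader.ReadToEnd()).d.GetContextWebInformation.FormDigestValue
    $reader.Close()
    $digestResponse.Close()

    # Upload the file to the target library via the Files/add endpoint (placeholder library and file name).
    $fileBytes = [System.IO.File]::ReadAllBytes("C:\LargeFiles\FileWithLargeSize.docx")
    $restUrl = "$SiteURL/_api/web/GetFolderByServerRelativeUrl('/Shared Documents')/Files/add(url='FileWithLargeSize.docx',overwrite=true)"
    $uploadRequest = [System.Net.WebRequest]::Create($restUrl)
    $uploadRequest.Credentials = $SPOCreds
    $uploadRequest.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")
    $uploadRequest.Headers.Add("X-RequestDigest", $digest)
    $uploadRequest.Method = "POST"
    $uploadRequest.Accept = "application/json;odata=verbose"
    $uploadRequest.ContentLength = $fileBytes.Length
    $requestStream = $uploadRequest.GetRequestStream()
    $requestStream.Write($fileBytes, 0, $fileBytes.Length)
    $requestStream.Close()
    $uploadRequest.GetResponse().Close()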