A reader of this blog reached out to me to ask if I knew a way to merge duplicate folders in SharePoint Online.

In case you are wondering what happened: a user deleted files locally without stopping the syncing first. Because of the large number of files that were deleted (over 750,000), they asked Microsoft to restore them, which they did. Due to a communication error, however, the restore job was done in two parts, resulting in duplicate files. Microsoft was unable to assist with the cleanup, so I stepped in.

Users had already worked in the duplicate folders, so we not only needed to merge the folders but also preserve the last modified file. All duplicate files and folders had the same pattern, (1), in their name, which made it easy to find them and merge them with the original location.

This is one of those situations where a third-party backup solution would really be a lifesaver. I have written about this before in "Do you need a backup solution for Microsoft 365". As you can see, in this case the files could be restored, but having a backup solution would have made restoring so much easier.

Merging the duplicate folders and files takes a couple of steps. The duplicate folders are not only at the root level of the document library but can also sit three levels deep in a subfolder, so we need to work recursively through all the folders, looking for items with (1) in the name.

I have broken the script down into a couple of steps, each translated into its own function:

- Finding the duplicate files and folders
- Creating the target path (the original location)
- Comparing the file dates and moving the files

Before we can do anything we need to connect to the SharePoint site. We can do this simply with PnP Online; I am using the web login switch so we can use our normal login with MFA. At the end of the article I will show the complete script.

```powershell
# SharePoint url
Connect-PnPOnline -Url $url -UseWebLogin
```

Finding the duplicate files

Because we need to process the subfolders as well, the trick here is to go through the folders recursively. We use the Get-PnPFolderItem function, but in a recursive way. The function is based on a script from Josh Einstein that you can find here.

```powershell
# Recursively calls Get-PnPFolderItem for a given Document Library
function Get-PnpFolderItemRecursively($FolderSiteRelativeUrl)
```

Comparing file dates in SharePoint Online and moving the files

So we now come to the part it is all about: moving the actual files. Before we can move a file, we need to check whether it already exists in the original location. If it does, we need to compare the file dates, because we want to keep the last modified file.

```powershell
$moveFile = Compare-FileDates -sourceFilePath $siteRelativeUrl -targetFilePath $targetPath
```

If the file should be moved, we move it and report the progress:

```powershell
Write-Host ' - Move file from' $siteRelativeUrl -BackgroundColor DarkCyan
Write-Host ' to' $targetPath -BackgroundColor DarkCyan
Move-PnPFile -SiteRelativeUrl $siteRelativeUrl -TargetUrl $targetPath -OverwriteIfAlreadyExists -Force:$force
Write-Host ' - Move item to' $targetPath -BackgroundColor DarkYellow
```

I added a counter here that counts the number of files that are moved.

Comparing two directories locally

This loops through all the names in the first directory and, for each, creates the corresponding name of a file expected to exist in the second directory. If that file does not exist, its name is printed.

The loop, written out more verbosely (and using basename rather than a parameter substitution to delete the directory name from the pathname of the files in the first directory):

```shell
for f1 in dir1/*
do
```

If the files in the two directories not only have the same names but also the same contents, you may use diff (note: BSD diff used here, GNU diff may possibly say something else):

```shell
$ diff dir1 dir2
```

If the contents of files with identical names differ, this would obviously output quite a lot of additional data that may not be of interest; diff -q may quiet it down a bit in that case.

For comparing deeper hierarchies, you may want to use rsync:

```shell
$ rsync -r --ignore-existing -i -n dir1/ dir2
```

The above will output a line for each file anywhere under dir1 that does not have a corresponding file under dir2. The -n option (--dry-run) makes sure that no file is actually transferred to dir2, the -r option (--recursive) makes the operation recursive, and -i (--itemize-changes) selects the particular output format (the >f and the pluses indicate that the file is a new file on the receiving end).
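The Get-PnpFolderItemRecursively function shown earlier is truncated to its signature. As a rough sketch, assuming an active Connect-PnPOnline session, such a wrapper could look like this; the function name comes from the article, but the body below is my assumption, not the author's exact code.

```powershell
# Sketch only: one possible body for the truncated helper above.
function Get-PnpFolderItemRecursively($FolderSiteRelativeUrl) {
    # Everything (files and folders) directly under the given folder
    $items = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderSiteRelativeUrl

    foreach ($item in $items) {
        $itemUrl = "$FolderSiteRelativeUrl/$($item.Name)"

        # Emit the item itself so the caller can filter on the "(1)" pattern
        $item

        # Recurse into subfolders so duplicates several levels deep are found too
        if ($item.GetType().Name -eq 'Folder') {
            Get-PnpFolderItemRecursively -FolderSiteRelativeUrl $itemUrl
        }
    }
}

# Possible usage (assumption): list all duplicate items in the library
# Get-PnpFolderItemRecursively 'Documents' | Where-Object Name -like '*(1)*'
```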
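Compare-FileDates is the author's own helper and its body is not shown in the article. A minimal sketch of how such a comparison could work, assuming the files are loaded as list items to read their Modified timestamps:

```powershell
# Sketch only: Compare-FileDates is a custom helper whose body the article
# does not show; this is one plausible implementation, not the author's code.
# Returns $true when the duplicate (source) file should replace the target.
function Compare-FileDates($sourceFilePath, $targetFilePath) {
    $sourceItem = Get-PnPFile -Url $sourceFilePath -AsListItem
    $targetItem = Get-PnPFile -Url $targetFilePath -AsListItem -ErrorAction SilentlyContinue

    # No file in the original location yet: moving is always safe
    if ($null -eq $targetItem) { return $true }

    # Keep the last modified file: only move when the duplicate is newer
    return ([datetime]$sourceItem['Modified'] -gt [datetime]$targetItem['Modified'])
}
```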
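The article mentions a counter for the number of moved files. A sketch of how the move step and that counter could fit together; $duplicates, the (1)-stripping replace, and the counter variable are my assumptions, while the Write-Host, Compare-FileDates and Move-PnPFile lines come from the article.

```powershell
# Sketch only: glue code combining the article's snippets with a counter.
$movedCount = 0
foreach ($siteRelativeUrl in $duplicates) {
    # Original location: the same path without the " (1)" marker (assumption)
    $targetPath = $siteRelativeUrl -replace ' \(1\)', ''

    # Only move when the duplicate is the last modified version
    $moveFile = Compare-FileDates -sourceFilePath $siteRelativeUrl -targetFilePath $targetPath
    if ($moveFile) {
        Write-Host ' - Move file from' $siteRelativeUrl -BackgroundColor DarkCyan
        Write-Host ' to' $targetPath -BackgroundColor DarkCyan
        Move-PnPFile -SiteRelativeUrl $siteRelativeUrl -TargetUrl $targetPath -OverwriteIfAlreadyExists -Force:$force
        $movedCount++
    }
}
Write-Host "Moved $movedCount files"
```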
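The compact form of the directory-comparison loop described above could look like the following sketch. The directories and file names here are made-up sample data for illustration; only the loop pattern itself comes from the text.

```shell
# Sample data (assumption, for demonstration only)
mkdir -p dir1 dir2
touch dir1/common.txt dir1/only-in-dir1.txt dir2/common.txt

# For every name in dir1, build the corresponding name under dir2
# and print the dir1 file if no such counterpart exists.
for f1 in dir1/*; do
    f2="dir2/${f1##*/}"          # same basename, expected under dir2
    if [ ! -e "$f2" ]; then
        printf '%s\n' "$f1"      # missing from dir2, so report it
    fi
done
```

Running this prints `dir1/only-in-dir1.txt`, since that is the only name without a counterpart in dir2. Note this compares names only, not contents; for content comparison the diff and rsync variants above are the better fit.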