PowerShell commands to copy files: Basic to advanced methods

Copying files between folders, drives and machines is a common administrative task that PowerShell can simplify. Administrators who understand the parameters of the Copy-Item cmdlet and how they work together will get the most from PowerShell commands to copy files.

PowerShell has providers — .NET programs that expose the data in a data store for viewing and manipulation — and a set of common cmdlets that work across providers. These include the *-Item, *-ItemProperty, *-Content, *-Path and *-Location cmdlets. Therefore, you can use the Copy-Item cmdlet to copy files, Registry keys and variables.

For example, the following command copies the variable $a to a new variable, $aa:

Copy-Item -Path variable:a -Destination variable:aa
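Copy-Item works the same way with the Registry provider. A minimal sketch follows; the key names HKCU:\TestKey and HKCU:\TestBackup are illustrative assumptions, not part of the original example:

# Create a sample key with one value, plus a destination key (names are illustrative)
New-Item -Path HKCU:\TestKey | Out-Null
New-ItemProperty -Path HKCU:\TestKey -Name Setting -Value 1 | Out-Null
New-Item -Path HKCU:\TestBackup | Out-Null

# Copy the key, including its values, under the destination key
Copy-Item -Path HKCU:\TestKey -Destination HKCU:\TestBackup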

When working with databases, administrators commonly use transactions — one or more commands treated as a unit — so the commands either all work or they all roll back. PowerShell transactions are only supported by the Registry provider, so the UseTransaction parameter on Copy-Item doesn’t do anything. The UseTransaction parameter is part of Windows PowerShell v2 through v5.1, but not in the open source PowerShell Core.

PowerShell has a number of aliases for its major cmdlets. Copy-Item uses three aliases.

Get-Alias -Definition copy-item

CommandType     Name                 Version    Source
-----------     ----                 -------    ------
Alias           copy -> Copy-Item
Alias           cp -> Copy-Item
Alias           cpi -> Copy-Item

These aliases only exist in Windows PowerShell; PowerShell Core drops them to prevent conflicts with native Linux commands such as cp.

Ways to use PowerShell commands to copy files

To show how the various Copy-Item parameters work, create a test file with the following command:

Get-Process | Out-File -FilePath c:\test\p1.txt

Use this command to copy a file:

Copy-Item -Path C:\test\p1.txt -Destination C:\test2

The issue with this command is that there is no indication whether the operation succeeded or failed.

When working interactively, you can use the alias and positional parameters to reduce typing.

Copy C:\test\p1.txt C:\test2

While this works in scripts, it makes the code harder to understand and maintain.

To get feedback on the copy, we use the PassThru parameter:

Copy-Item -Path C:\test\p1.txt -Destination C:\test2 -PassThru

    Directory: C:\test2

Mode                LastWriteTime        Length Name
----                -------------        ------ ----
-a----       13/08/2018     11:01         40670 p1.txt

Or we can use the Verbose parameter:

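A minimal example, with the verbose output abridged (the exact wording can vary between PowerShell versions):

Copy-Item -Path C:\test\p1.txt -Destination C:\test2 -Verbose

VERBOSE: Performing the operation "Copy File" on target "Item: C:\test\p1.txt Destination: C:\test2\p1.txt".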

The Verbose parameter gives you information as the command executes, while PassThru shows you the result.

By default, PowerShell overwrites the file if a file with the same name exists in the target folder. If the file in the target directory is set to read-only, you’ll get an error.

Copy-Item -Path C:\test\p1.txt -Destination C:\test2

Copy-Item : Access to the path 'C:\test2\p1.txt' is denied.
At line:1 char:1
+ Copy-Item -Path C:\test\p1.txt -Destination C:\test2
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : PermissionDenied: (C:\test\p1.txt:FileInfo) [Copy-Item], UnauthorizedAccessException
    + FullyQualifiedErrorId : CopyFileInfoItemUnauthorizedAccessError,Microsoft.PowerShell.Commands.CopyItemCommand

You need to be a PowerShell Jedi to overcome this. Use the Force parameter:

Copy-Item -Path C:\test\p1.txt -Destination C:\test2 -Force

As part of the copy process, you can rename the file. You must include the new file name as part of the destination. For example, this code creates nine copies of the p1.txt file called p2.txt through p10.txt.

2..10 | foreach {
  $newname = "p$_.txt"
  Copy-Item -Path C:\test\p1.txt -Destination C:\test\$newname
}
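The wildcard examples later in this article also reference test files such as x5.txt and x6.txt. A similar loop, added here only as a convenience and not part of the original walkthrough, creates them:

1..10 | foreach {
  $newname = "x$_.txt"
  Copy-Item -Path C:\test\p1.txt -Destination C:\test\$newname
}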

PowerShell commands to copy multiple files

There are a few techniques to copy multiple files when using PowerShell.

Copy-Item -Path C:\test\*.txt -Destination C:\test2

Copy-Item -Path C:\test\* -Filter *.txt -Destination C:\test2

Copy-Item -Path C:\test\* -Include *.txt -Destination C:\test2

These commands copy all the .txt files from the test folder to the test2 folder, but you can also be more selective and only copy files with, for instance, a 6 in the name.

Copy-Item -Path C:\test\* -Include *6*.txt -Destination C:\test2 -PassThru

    Directory: C:\test2

Mode                LastWriteTime        Length Name
----                -------------        ------ ----
-a----       13/08/2018     11:01         40670 p6.txt
-a----       13/08/2018     11:01         40670 x6.txt

You can also exclude certain files from the copy operation. This command copies all the text files that start with the letter p unless there is a 7 in the name:

Copy-Item -Path C:\test\* -Filter p*.txt -Exclude *7*.txt -Destination C:\test2

You can combine the Path, Filter, Include or Exclude parameters to define exactly what to copy. If you use Include and Exclude in the same call, PowerShell ignores Exclude. You can also supply an array of file names. The path is simplified if your working folder is the source folder for the copy.

Copy-Item -Path p1.txt,p3.txt,x5.txt -Destination C:\test2

The Path parameter accepts pipeline input.

Get-ChildItem -Path C:\test\p*.txt |
  where {(($_.BaseName).Substring(1,1) % 2) -eq 0} |
  Copy-Item -Destination C:\test2

PowerShell checks the p*.txt files in the c:\test folder to see if the second character is divisible by 2. If so, PowerShell copies the file to the C:\test2 folder.

If you end up with a folder or file name that contains wildcard characters, use the LiteralPath parameter instead of the Path parameter. LiteralPath treats every character as a literal and does not interpret wildcards.
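For example, square brackets are wildcard characters in PowerShell, so a file with a bracketed name (the file name below is hypothetical) should be copied with LiteralPath:

Copy-Item -LiteralPath 'C:\test\report[1].txt' -Destination C:\test2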

To copy a folder and all its contents, use the Recurse parameter.

Copy-Item -Path c:\test -Destination c:\test2 -Recurse

The recursive copy will work its way through all the subfolders below the c:\test folder. PowerShell will then create a folder named test in the destination folder and copy the contents of c:\test into it.
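If you want only the contents of c:\test, without an extra test folder appearing under the destination, one approach (an assumption, not shown in the original) is to wildcard the source path:

Copy-Item -Path C:\test\* -Destination C:\test2 -Recurse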

When copying between machines, you can use UNC paths so that neither the source nor the destination needs to be on the local machine.

Copy-Item -Path \\server1\fs1\test\p1.txt -Destination \\server2\arctest

Another option is to use PowerShell commands to copy files over a remoting session.

$cred = Get-Credential -Credential W16ND01\Administrator

$s = New-PSSession -VMName W16ND01 -Credential $cred

In this case, we use PowerShell Direct to connect to the remote machine. You’ll need the Hyper-V module loaded to create the remoting session over the VMBus. Next, use PowerShell commands to copy files to the remote machine.

Copy-Item -Path c:\test -Destination c:\ -Recurse -ToSession $s

You can also copy from the remote machine.

Copy-Item -Path c:\test\p*.txt -Destination c:\test3 -FromSession $s

The ToSession and FromSession parameters control the direction of the copy and whether the source and destination are on the local machine or a remote one. You can’t use ToSession and FromSession in the same command.

Copy-Item doesn’t have any error checking or restart capabilities. For those features, you’ll need to write the code. Here is a starting point:

function Copy-FileSafer {
  [CmdletBinding()]
  param (
    [string]$path,
    [string]$destinationfolder
  )
  if (-not (Test-Path -Path $path)) {
    throw "File not found: $path"
  }
  $sourcefile = Split-Path -Path $path -Leaf
  $destinationfile = Join-Path -Path $destinationfolder -ChildPath $sourcefile
  # Hash the source before copying so the copy can be verified afterwards
  $b4hash = Get-FileHash -Path $path
  try {
    Copy-Item -Path $path -Destination $destinationfolder -ErrorAction Stop
  }
  catch {
    throw "File copy failed"
  }
  # Compare source and destination hashes to confirm the copy is intact
  $afhash = Get-FileHash -Path $destinationfile
  if ($afhash.Hash -ne $b4hash.Hash) {
    throw "File corrupted during copy"
  }
  else {
    Write-Information -MessageData "File copied successfully" -InformationAction Continue
  }
}

In this script, the file path for the source is tested and a hash of the file is calculated. The file copy occurs within a try-catch block to catch and report errors.

With additional coding, the script can retry the copy a certain number of times. After each copy attempt, the script can calculate the hash of the file and compare it to the original. If they match, all is well. If not, an error is reported.
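Here is a minimal sketch of that retry logic built around the Copy-FileSafer function above; the attempt count and delay are arbitrary assumptions:

$maxAttempts = 3

for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
  try {
    # Copy-FileSafer throws if the copy fails or the hashes don't match
    Copy-FileSafer -path C:\test\p1.txt -destinationfolder C:\test2
    break   # success, so stop retrying
  }
  catch {
    if ($attempt -eq $maxAttempts) {
      throw "Copy failed after $maxAttempts attempts: $_"
    }
    Start-Sleep -Seconds 2   # short pause before the next attempt
  }
}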

What are some considerations for a public folders migration?


A public folders migration from one version of Exchange to another can tax the skills of an experienced administrator, but there’s another level of complexity when cloud enters the mix.

A session at last week’s Virtualization Technology Users Group event in Foxborough, Mass., detailed the nuances of Office 365 subscription offerings and the migration challenges administrators face. Microsoft offers a la carte choices for companies that wish to sign up for a single cloud service, such as Exchange Online, and move the messaging platform into the cloud, said Michael Shaw, a solution architect for Office 365 at Whalley Computer Associates in Southwick, Mass., in his presentation.

Microsoft offers newer collaboration services in Office 365, but some IT departments cling to one holdover that the company cannot extinguish — public folders. This popular feature, introduced in 1996 with Exchange 4.0, gives users a shared location to store documents, contacts and calendars.

For companies on Exchange 2013/2016, Microsoft did not offer a way to move “modern” public folders — called “public folder mailboxes” after an architecture change in Exchange 2013 — to Office 365 until March 2017. Prior to that, many organizations either developed their own public folders migration process, used a third-party tool or brought in experts to help with the transition.

Organizations that want to use existing public folders after a switch from on-premises Exchange to Office 365 should be aware of the proper sequence to avoid issues with a public folders migration, Shaw said.

Most importantly, public folders should migrate over last. That’s because mailboxes in Office 365 can access a public folder that is on premises, but a mailbox that is on premises cannot access public folders in the cloud, Shaw said.

“New can always access old, but old can’t access new,” he said.

IT admins should keep in mind, however, that Microsoft dissuades customers from using public folders for document use due to potential issues when multiple people try to work on the same file. Instead, the company steers Office 365 shops to SharePoint Online for document collaboration, and the Groups service for shared calendars and mobile device access.

In another attempt to prevent public folders migration to Office 365, Microsoft caps public folder mailboxes in Exchange Online at 1,000. They also come with a limit of 50 GB per mailbox in the lower subscription levels and a 100 GB quota in the higher E3 and E5 tiers. Public folder storage cannot exceed 50 TB.

Still, support for public folders has no foreseeable end despite Microsoft’s efforts to eradicate the feature. Microsoft did not include public folders in Exchange Server 2007, but reintroduced them in a service pack after significant outcry from customers, Shaw said. Similarly, there was no support for public folders when Microsoft introduced Office 365 in 2011, but it later buckled to customer demand.