I have a PowerShell script that recursively writes every file and its attributes to SQL Server, starting from a given directory. It works, but some directories contain as many as 1,000,000 files, so I want to batch the inserts at 1,000 per transaction. Here is the original script:
$server = ""
$Database = ""
$Path = "C:\Test"

$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()

$Command = New-Object System.Data.SqlClient.SqlCommand
$Command.Connection = $Connection

foreach ($file in Get-ChildItem -Verbose -Recurse -Path $Path |
         Select-Object Name, Length, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime) {
    $fileName           = $file.Name
    $fileSize           = ([int]$file.Length)
    $fileMode           = $file.Mode
    $fileDirectory      = $file.Directory
    $fileCreationTime   = [datetime]$file.CreationTime
    $fileLastAccessTime = [datetime]$file.LastAccessTime
    $fileLastWriteTime  = [datetime]$file.LastWriteTime

    $sql = "
begin
    insert TestPowerShell
    select '$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime'
end
"
    $Command.CommandText = $sql
    echo $sql
    $Command.ExecuteNonQuery()
}

$Connection.Close()
My thought is to keep a counter and append INSERT statements until it reaches 1,000, then execute that batch as one transaction. What I cannot figure out with the current setup is how to execute at 1,000 and then pick back up where the Get-ChildItem loop left off.
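Roughly what I have in mind is the following (untested sketch; it reuses the $Path, $Connection, and $Command setup from the script above, and $batchSize, $counter, and $sb are names I've introduced). The INSERT text is accumulated in a StringBuilder, flushed as one transaction every $batchSize files, and flushed once more after the loop for the leftover rows:

$batchSize = 1000
$counter   = 0
$sb        = New-Object System.Text.StringBuilder

foreach ($file in Get-ChildItem -Verbose -Recurse -Path $Path |
         Select-Object Name, Length, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime) {

    # Append one INSERT per file; the values are built the same way as in the original script.
    [void]$sb.AppendLine("insert TestPowerShell select '$($file.Name)', '$([int]$file.Length)', '$($file.Mode)', '$($file.Directory)', '$($file.CreationTime)', '$($file.LastAccessTime)', '$($file.LastWriteTime)';")
    $counter++

    # Every $batchSize rows, wrap the accumulated INSERTs in one transaction and execute.
    if ($counter -ge $batchSize) {
        $Command.CommandText = "begin transaction;`n" + $sb.ToString() + "commit transaction;"
        $Command.ExecuteNonQuery() | Out-Null
        [void]$sb.Clear()
        $counter = 0
    }
}

# Flush the final partial batch (fewer than $batchSize rows).
if ($counter -gt 0) {
    $Command.CommandText = "begin transaction;`n" + $sb.ToString() + "commit transaction;"
    $Command.ExecuteNonQuery() | Out-Null
}

Note that the values are still interpolated into the SQL string exactly as in the original, so any quoting issues there carry over. Is this counter approach a reasonable way to batch at 1,000, or is there a better pattern for this volume of files?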