Pipe file content into PowerShell command without loading the entire file to memory

Solution 1:

PowerShell's Get-Content is not very efficient with huge files because, by default, it emits one object per line. The -ReadCount parameter tells it how many lines to send down the pipeline at a time; with -ReadCount 500, as below, it pipes the file through in 500-line batches instead of loading everything at once.

Read the Microsoft PowerShell article below too.

  • https://technet.microsoft.com/en-us/library/hh849787.aspx

PowerShell v 5.0 Example

  • Get-Content -ReadCount 500 backup.sql | & psql --username=... db_name

PowerShell Legacy Version Example

  • Get-Content -Read 500 backup.sql | & psql --username=... db_name
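To see what -ReadCount actually changes, here is a minimal sketch (demo.txt is a throwaway file made up for this demonstration):

```powershell
# Sketch: -ReadCount controls how many lines travel the pipeline per object.
# demo.txt is a throwaway file created just for this demo.
1..1000 | Set-Content demo.txt

# Default: one string object per line -> 1000 objects
(Get-Content demo.txt | Measure-Object).Count                  # 1000

# -ReadCount 500: one array object per 500 lines -> 2 objects
(Get-Content demo.txt -ReadCount 500 | Measure-Object).Count   # 2

Remove-Item demo.txt
```

When the downstream target is a native program such as psql, each array is unrolled back into individual lines on its standard input, so the batching only reduces pipeline overhead; the data psql receives is unchanged.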

Since PowerShell Get-Content isn't efficient with huge files, have a look at Start-Process, which can redirect a file straight into a process's standard input without PowerShell reading it first.

Read the Microsoft PowerShell article below too.

  • https://technet.microsoft.com/en-us/library/hh849848.aspx

PowerShell v 5.0 Example (using -RedirectStandardInput switch)

Start-Process "C:\Program Files\PostgreSQL\<version>\bin\psql.exe" '--username=... db_name' -RedirectStandardInput backup.sql -NoNewWindow -Wait
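The same -RedirectStandardInput pattern works with any console program. A minimal sketch, using sort as a harmless stand-in for psql.exe (in.txt and out.txt are made-up names for this demo):

```powershell
# Sketch: -RedirectStandardInput streams a file into a process's stdin
# without PowerShell loading it into memory first.
# 'sort' stands in for psql.exe; in.txt/out.txt are made-up demo files.
"b", "a", "c" | Set-Content in.txt

Start-Process sort `
    -RedirectStandardInput in.txt `
    -RedirectStandardOutput out.txt `
    -NoNewWindow -Wait

Get-Content out.txt    # a, b, c
```

Note that -Wait is important here: without it, Start-Process returns immediately and out.txt may not be fully written when you read it.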

Solution 2:

How can I simulate the behavior of < in PowerShell?

Check my answer to this question: PowerShell's pipe adds linefeed

Here is my Invoke-RawPipeline function (get latest version from this Gist).

Use it to pipe binary data between processes' standard output and standard input streams. It can read the input stream from a file or from the pipeline, and it can save the resulting output stream to a file.

It requires the PsAsync module to launch the processes and pipe data between them.

Solution 3:

To solve this issue, I launched a cmd.exe process from PowerShell and passed psql.exe and the redirection as its arguments:

$cmdArgs = @('/c', 'psql.exe', '--username=...', .., 'dbname', '<', 'backup.sql')
& 'cmd.exe' $cmdArgs

This worked perfectly for what I was trying to do.
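Why this works: cmd.exe implements the < input-redirection operator that PowerShell lacks, so the redirection happens inside cmd.exe and the file is streamed to the child process rather than being read by PowerShell. A minimal, Windows-only sketch with sort standing in for psql.exe (demo.txt is a made-up file for this demo):

```powershell
# Windows-only sketch: cmd.exe performs the `<` redirection PowerShell lacks.
# 'sort' stands in for psql.exe; demo.txt is a made-up demo file.
"b", "a", "c" | Set-Content demo.txt
$cmdArgs = @('/c', 'sort', '<', 'demo.txt')
& cmd.exe $cmdArgs     # a, b, c
```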