Linux: How to use a file as input and output at the same time?

I've just run the following in bash:

uniq .bash_history > .bash_history

and my history file ended up completely empty.

I guess I need a way to read the whole file before writing to it. How is that done?

PS: I obviously thought of using a temporary file, but I'm looking for a more elegant solution.


I recommend using sponge from moreutils. From the manpage:

DESCRIPTION
  sponge  reads  standard  input  and writes it out to the specified file. Unlike
  a shell redirect, sponge soaks up all its input before opening the output file.
  This allows for constructing pipelines that read from and write to the same file.

To apply this to your problem, try:

uniq .bash_history | sponge .bash_history
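
If sponge isn't installed, it comes from the moreutils package; on Debian or Ubuntu (an assumption about your distribution), installing it looks something like:

sudo apt install moreutils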

echo "$(uniq .bash_history)" > .bash_history

should have the desired result. The command substitution (which runs in a subshell) is evaluated before .bash_history is opened for writing. As explained in Phil P's answer, by the time .bash_history is read in the original command, it has already been truncated by the > operator.
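
One caveat: command substitution strips trailing newlines (echo adds a single one back), so a file that ends in blank lines won't round-trip exactly. You can verify the ordering safely on a throwaway file first (a sketch; /tmp/demo is just an illustrative path):

printf 'a\na\nb\n' > /tmp/demo
echo "$(uniq /tmp/demo)" > /tmp/demo
cat /tmp/demo    # prints "a" then "b": uniq read the file before > truncated it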


The problem is that your shell sets up the redirection before running the command. It's not a matter of "input and output"; the file's content is already gone before uniq even runs. It goes something like this:

  1. The shell opens the file named after > for writing, truncating it
  2. The shell arranges for file descriptor 1 (stdout) to refer to that file
  3. The shell executes uniq, perhaps something like execlp("uniq", "uniq", ".bash_history", NULL)
  4. uniq runs, opens .bash_history and finds nothing there
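
You can reproduce the effect harmlessly with a scratch file (a sketch; /tmp/demo is just an illustrative path):

printf 'one\none\ntwo\n' > /tmp/demo
uniq /tmp/demo > /tmp/demo    # step 1 truncates /tmp/demo before uniq ever runs
cat /tmp/demo                 # empty, just like your .bash_history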

There are various solutions, including the in-place editing and temporary-file approaches that others mention, but the key is to understand the problem: what is actually going wrong, and why.
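
For reference, the temporary-file approach mentioned above usually looks something like this (a sketch using mktemp for the scratch file):

tmp=$(mktemp) &&
uniq .bash_history > "$tmp" &&
mv "$tmp" .bash_history

Because the output goes to a separate file, nothing touches .bash_history until uniq has finished reading it; the mv then replaces the original in one step.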