How to read a file into a variable in shell?
I want to read a file and save its contents in a variable, but I need to keep the variable and not just print the file out. How can I do this? I have written this script, but it isn't quite what I need:
#!/bin/sh
while read LINE
do
echo $LINE
done <$1
echo 11111-----------
echo $LINE
In my script, I can give the file name as a parameter, so if the file contains "aaaa", for example, it prints this:
aaaa
11111-----------
But this just prints the file out to the screen, and I want to save it into a variable! Is there an easy way to do this?
Solution 1:
In cross-platform, lowest-common-denominator sh, use:
#!/bin/sh
value=`cat config.txt`
echo "$value"
In bash or zsh, to read a whole file into a variable without invoking cat:
#!/bin/bash
value=$(<config.txt)
echo "$value"
Invoking cat in bash or zsh to slurp a file would be considered a Useless Use of Cat.
Note that it is not necessary to quote the command substitution in the assignment to preserve newlines.
See: Bash Hacker's Wiki - Command substitution - Specialities.
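The quoting that does matter is on later expansions of the variable, not on the assignment itself. A minimal illustration of the difference, assuming a multi-line config.txt exists:
#!/bin/bash
value=$(<config.txt)   # no quotes needed: assignments are not word-split
echo "$value"          # quoted: newlines and spacing are preserved
echo $value            # unquoted: word splitting collapses the newlines into spaces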
Solution 2:
If you want to read the whole file into a variable:
#!/bin/bash
value=`cat sources.xml`
echo "$value"
If you want to read it line-by-line:
while IFS= read -r line; do
echo "$line"
done < file.txt
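If the goal is still to end up with the whole file in one variable, the loop can append each line as it goes. A rough sketch (it assumes the file ends with a newline, since one is re-appended after every line):
#!/bin/bash
value=
while IFS= read -r line; do
    value="$value$line"$'\n'   # re-add the newline that read strips
done < file.txt
printf '%s' "$value"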
Solution 3:
Two important pitfalls, which the other answers have ignored so far:
- Trailing newline removal from command expansion
- NUL character removal
Trailing newline removal from command expansion
This is a problem for value="$(cat config.txt)" type solutions, but not for read-based solutions.
Command expansion removes trailing newlines:
S="$(printf "a\n")"
printf "$S" | od -tx1
Outputs:
0000000 61
0000001
This breaks the naive method of reading from files: both trailing newlines get stripped, so the output is again just the single byte 61:
FILE="$(mktemp)"
printf "a\n\n" > "$FILE"
S="$(<"$FILE")"
printf "$S" | od -tx1
rm "$FILE"
POSIX workaround: append an extra char to the command expansion and remove it later:
S="$(cat $FILE; printf a)"
S="${S%a}"
printf "$S" | od -tx1
Outputs:
0000000 61 0a 0a
0000003
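The same trick can be wrapped in a small POSIX sh helper. This is only a sketch: the name slurp is made up here, and the result is handed back in REPLY because capturing the function's output with another command substitution would strip the newlines again:
slurp() {
    # append a sentinel character so the command substitution cannot
    # strip the file's trailing newlines, then remove the sentinel
    REPLY="$(cat "$1"; printf a)"
    REPLY="${REPLY%a}"
}
FILE="$(mktemp)"
printf "a\n\n" > "$FILE"
slurp "$FILE"
printf '%s' "$REPLY" | od -tx1
rm "$FILE"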
Almost POSIX workaround: ASCII encode. See below.
NUL character removal
There is no sane Bash way to store NUL characters in variables.
This affects both expansion-based and read-based solutions, and I don't know any good workaround for it.
Example:
printf "a\0b" | od -tx1
S="$(printf "a\0b")"
printf "$S" | od -tx1
Outputs:
0000000 61 00 62
0000003
0000000 61 62
0000002
Ha, our NUL is gone!
Workarounds:
- ASCII encode. See below.
- Use the bash $'' literal extension:
S=$'a\0b'
printf "$S" | od -tx1
Only works for literals, so not useful for reading from files.
Workaround for the pitfalls
Store a base64-encoded (uuencode -m) version of the file in the variable, and decode it before every use:
FILE="$(mktemp)"
printf "a\0\n" > "$FILE"
S="$(uuencode -m "$FILE" /dev/stdout)"
uudecode -o /dev/stdout <(printf "$S") | od -tx1
rm "$FILE"
Output:
0000000 61 00 0a
0000003
uuencode and uudecode are POSIX 7, but they are not installed on Ubuntu 12.04 by default (they come in the sharutils package)... I don't see a POSIX 7 alternative to the bash process substitution extension <() except writing to another file...
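If GNU coreutils is available (also not POSIX), base64 and base64 -d give the same encode-then-decode idea without sharutils, and a plain pipe is enough for decoding, so the <() extension is not needed either. A sketch:
FILE="$(mktemp)"
printf "a\0\n" > "$FILE"
S="$(base64 "$FILE")"                     # encoded text survives in the variable
printf '%s\n' "$S" | base64 -d | od -tx1  # decodes back to 61 00 0a
rm "$FILE"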
Of course, this is slow and inconvenient, so I guess the real answer is: don't use Bash if the input file may contain NUL characters.
Solution 4:
This works for me:
v=$(cat <file_path>)
echo "$v"
Solution 5:
With bash, you may use read like this:
#!/usr/bin/env bash
{ IFS= read -rd '' value <config.txt;} 2>/dev/null
printf '%s' "$value"
Notice that:
- The last newline is preserved.
- stderr is silenced to /dev/null by redirecting the whole command block, but the return status of the read command is preserved, in case one needs to handle read error conditions.
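Note that read -d '' reports a non-zero status whenever it stops at end-of-file instead of at a NUL byte, which is the normal outcome for a text file, so a non-zero status alone does not mean the read failed. A sketch of one way to treat real errors separately (assuming the file is config.txt):
#!/usr/bin/env bash
file=config.txt
if [ ! -r "$file" ]; then
    printf 'cannot read %s\n' "$file" >&2
    exit 1
fi
# EOF still gives read a non-zero status, but $value holds the whole file.
{ IFS= read -rd '' value <"$file"; } 2>/dev/null
printf '%s' "$value"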