How to loop through a file to copy into an S3 bucket
I have a file (file.txt) containing absolute paths:
cat file.txt
#There are white space characters in the file name
/home/user/aws/doc/my doc 1.pdf
/home/user/aws/doc/my doc 2.pdf
IFS=''
for FILE in $(cat file.txt); do
echo "${FILE}"
/usr/local/bin/aws s3 cp ${FILE} "${S3Path_Bucket}PDF_FOLDER/"
done
exit 0
When I run the script I get the error below:
The user-provided path
/home/user/aws/doc/my doc 1.pdf
/home/user/aws/doc/my doc 2.pdf does not exist.
I tried placing escape characters in file.txt:
/home/user/aws/doc/my\ doc\ 1.pdf
/home/user/aws/doc/my\ doc\ 2.pdf
but I get the same error:
The user-provided path
/home/user/aws/doc/my\ doc\ 1.pdf
/home/user/aws/doc/my\ doc\ 2.pdf does not exist.
My file.txt is pretty big, and I am trying to figure out how to copy these files in a loop while correctly handling the white space in their names.
Thank you!
Solution 1:
You don't escape the white space; you quote your variables correctly. See https://mywiki.wooledge.org/Quotes for more information. Also paste your code into http://shellcheck.net (as the bash tag you used already suggests) and fix the issues it reports, and see https://mywiki.wooledge.org/DontReadLinesWithFor and correct-bash-and-shell-script-variable-capitalization.
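To see why quoting matters here, this is a minimal sketch (the path below is hypothetical, chosen to resemble the question's file names) showing how an unquoted expansion is split on white space while a quoted one stays whole:

```shell
#!/bin/sh
# Hypothetical path containing spaces, for illustration only
file="/home/user/aws/doc/my doc 1.pdf"

# Unquoted: the shell splits the value on white space into three words,
# so printf receives three separate arguments
printf '%s\n' $file

# Quoted: the value is passed as a single word
printf '%s\n' "$file"
```

The unquoted expansion prints three lines; the quoted one prints the path intact on a single line. This is exactly why `aws s3 cp ${FILE}` receives several path fragments instead of one path.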
Try this:
< file.txt xargs -I {} /usr/local/bin/aws s3 cp -- "{}" "${S3Path_Bucket}PDF_FOLDER/"
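With `-I {}`, xargs treats each input line as a single item, so embedded spaces survive without any escaping. A quick way to convince yourself, using `printf` as a stand-in for the `aws s3 cp` call and a throwaway input file (the paths are made up for the demo):

```shell
#!/bin/sh
# Stand-in for file.txt, with hypothetical space-containing paths
printf '%s\n' '/tmp/my doc 1.pdf' '/tmp/my doc 2.pdf' > demo.txt

# -I {} makes xargs pass each whole line as one argument;
# printf stands in here for the real aws s3 cp command
< demo.txt xargs -I {} printf 'copy [%s]\n' {}
```

Each path prints inside one pair of brackets, confirming it reached the command as a single argument.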
or if you really want to use a loop then:
while IFS= read -r file; do
/usr/local/bin/aws s3 cp -- "$file" "${S3Path_Bucket}PDF_FOLDER/"
done < file.txt
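The same check works for the loop version: `IFS=` prevents leading/trailing whitespace from being trimmed and `-r` keeps backslashes literal, so each line arrives in `$file` exactly as written. A runnable sketch with made-up paths, again substituting `printf` for the `aws s3 cp` call:

```shell
#!/bin/sh
# Stand-in for file.txt, with hypothetical space-containing paths
printf '%s\n' '/tmp/my doc 1.pdf' '/tmp/my doc 2.pdf' > demo.txt

# IFS= keeps surrounding whitespace; -r keeps backslashes literal.
# Quoting "$file" passes each path as one argument.
while IFS= read -r file; do
    printf 'would copy: %s\n' "$file"
done < demo.txt
```

Note that the variable must still be quoted inside the loop; `read` alone does not protect you if you later expand `$file` unquoted.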