Can't run a for loop inside the script command over an SSH connection

I'm trying to run a for loop after using the command

script

to copy the terminal session to a .txt file (for later checking). This is all being done over an SSH connection in Solar-PuTTY.

This is my code:

filename=$(ls /home/*.txt  | xargs -n1 -I{} basename "{}" | head -3) 
echo "$filename"
script /home/test.txt
for f in $filename; do
        echo $f; done
exit

This does not run the for loop. script simply records the commands above, and I can't execute the loop.

When I run:

for f in $filename; do
        echo $f; done

Everything works fine...

I'm running all of this inside a tmux session under sudo su (because I'm afraid of losing my terminal over SSH, and I need sudo su).


If I understand what you're doing, the problem is that script is starting a new shell (as a subprocess), and it doesn't have the old (parent process) shell's variables. Can you define the variable after starting script, so it's defined in the right shell?

Another possible solution is to export the variable, which converts it from a shell variable to an environment variable, and subprocesses will inherit a copy of it. Note that, depending on which shell you're using, you may need to double-quote the value being assigned to avoid problems with word-splitting:

export filename="$(ls /home/*.txt  | xargs -n1 -I{} basename "{}" | head -3)"
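You can see the effect with a plain child shell (which is essentially what script starts): a regular shell variable is invisible in the child, while an exported one is inherited. This is a minimal sketch with made-up filenames:

```shell
#!/usr/bin/env bash
filename="a.txt b.txt c.txt"               # plain shell variable
bash -c 'echo "child sees: [$filename]"'   # child shell prints: child sees: []

export filename                            # now an environment variable
bash -c 'echo "child sees: [$filename]"'   # child shell prints: child sees: [a.txt b.txt c.txt]
```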

BTW, this way of handling lists of filenames will run into trouble with names that have spaces or some other shell metacharacters. The right way to handle lists of filenames is to store them as arrays, but unfortunately it's not possible to export arrays.

[EDIT:] The problem with filenames with spaces and/or other weird characters is that 1) the way ls outputs filenames is ambiguous and inconsistent, and 2) shell "word splitting" on unquoted variables can parse lists of filenames in ... unfortunate ... ways. For an extreme example, suppose you had a file named /home/this * that.txt -- if that's in a variable, and you use the variable without double-quotes around it, it'll treat /home/this and that.txt as totally separate things, and it'll also expand the * into a list of filenames in the current directory. See this question from yesterday for just one of many examples of this sort of thing happening for real.
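Here is a quick, self-contained demonstration of that failure mode (run in a scratch directory with a made-up filename, so nothing in /home is touched):

```shell
#!/usr/bin/env bash
demo=$(mktemp -d) && cd "$demo"
touch "this * that.txt" other.txt

var="this * that.txt"

echo "unquoted:"
for f in $var; do            # word-split into: this / * / that.txt, then * glob-expands
    echo "  got: $f"
done

echo "quoted:"
for f in "$var"; do          # stays one intact filename
    echo "  got: $f"
done
```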

To safely handle filenames with weird characters, the basic rules are that to get lists of files you use raw shell wildcards (not ls!) or find with -exec or -print0, always store lists of filenames in arrays (not plain variables), and double-quote all variable (/array) references. See BashFAQ #20: "How can I find and safely handle file names containing newlines, spaces or both?"
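For example, to build an array safely from find (a sketch assuming bash 4.4+ for mapfile -d '', using a scratch directory rather than /home):

```shell
#!/usr/bin/env bash
demo=$(mktemp -d)
touch "$demo/plain.txt" "$demo/with space.txt"

# -print0 separates names with NUL bytes, the only byte a filename can't contain
mapfile -d '' -t files < <(find "$demo" -maxdepth 1 -name '*.txt' -print0)

printf 'found: %s\n' "${files[@]}"
```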

In this case, you just need a wildcard expression to build an array of paths, then the shell's built-in parameter expansion to remove the path prefix:

filepaths=( /home/*.txt )    # Create array of matching files
filenames=( "${filepaths[@]##*/}" )    # Remove path prefixes

You can then use "${filenames[@]:0:3}" to get the first three names from the array. You can either create a new array with just the first three files, or use that directly in the loop:

first3files=( "${filenames[@]:0:3}" )    # ...or...
for f in "${filenames[@]:0:3}"; do
    echo "$f"    # Always double-quote variable references!
done

Note that bash doesn't allow stacking most array/variable modifiers, so getting the array of paths, stripping the prefixes, and selecting just the first few, must be done as three separate steps.
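Putting the three steps together (sketched in a throwaway directory so it's self-contained; substitute /home/*.txt in the real case):

```shell
#!/usr/bin/env bash
demo=$(mktemp -d)
touch "$demo"/{a,b,c,d}.txt

filepaths=( "$demo"/*.txt )             # 1. wildcard into an array of paths
filenames=( "${filepaths[@]##*/}" )     # 2. strip the directory prefix from each
first3=( "${filenames[@]:0:3}" )        # 3. take the first three names

printf '%s\n' "${first3[@]}"            # prints a.txt, b.txt and c.txt, one per line
```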