AWS CLI, using `--cli-input-json` in a pipeline
Solution 1:
Found a fairly clean workaround for the time being using `xargs`:
cat ./mytask.json \
| xargs -0 aws ecs register-task-definition --cli-input-json
It only adds `xargs -0` and requires `--cli-input-json` to be the last argument.
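The same pattern works when the JSON comes from another command rather than straight from a file. For example, assuming `jq` is installed and `mytask.json` has a `containerDefinitions` array (both assumptions on my part), you could patch the image tag on the fly:
jq '.containerDefinitions[0].image = "myrepo/myimage:latest"' ./mytask.json \
| xargs -0 aws ecs register-task-definition --cli-input-json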
Solution 2:
I went digging... It looks like `aws` reads the indicated file twice and uses the second dataset for its operation. Of course, in a pipeline, the second `read()` gets nothing.
I've added a `pipe://` prefix/scheme (commit) for use in this situation, which caches the value... I've also made a pull request.
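A minimal, non-AWS sketch of why that second read comes up empty: both commands below share the pipeline's stdin, and the first one drains it.
printf '{"family": "mytask"}\n' | {
  cat    # first read consumes everything in the pipe
  cat    # second read hits EOF immediately and prints nothing
}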
Solution 3:
I was able to pass the JSON as a value to `--cli-input-json` and inject bash variables too.
So your example should be:
aws ecs register-task-definition --cli-input-json "$(cat < ./mytask.json)"
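As a side note, in bash the `cat` isn't strictly needed: the `$(< file)` form expands to the file's contents directly.
aws ecs register-task-definition --cli-input-json "$(< ./mytask.json)"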
In my scenario at hand I have two options:
- Either pass the JSON as you would like:
aws autoscaling start-instance-refresh --cli-input-json "$(cat < ./options.json )"
- Or use a heredoc in order to pass other variables too (not requested, but may be useful):
asg_nominal_name=AutoScalingGroupName
aws autoscaling start-instance-refresh --cli-input-json "$(cat <<JSON
{
  "AutoScalingGroupName": "${asg_nominal_name}",
  "Preferences": {
    "InstanceWarmup": 300,
    "MinHealthyPercentage": 100
  }
}
JSON
)"
I am not familiar enough with bash to explain the why; feel free to add an explanation.
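Roughly, the "why": the command substitution `"$( ... )"` captures cat's output and passes it as a single argument to `--cli-input-json`, and because the heredoc delimiter `JSON` is unquoted, bash expands `${asg_nominal_name}` inside the document before cat ever sees it. Quoting the delimiter would disable that expansion, as this small sketch (with an illustrative `name` variable) shows:
name=my-asg
# Unquoted delimiter: ${name} is expanded, prints {"AutoScalingGroupName": "my-asg"}
cat <<JSON
{"AutoScalingGroupName": "${name}"}
JSON
# Quoted delimiter: no expansion, prints {"AutoScalingGroupName": "${name}"} literally
cat <<'JSON'
{"AutoScalingGroupName": "${name}"}
JSON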