Permission denied on AWS Transfer for SFTP server
I can log into my server with Cyberduck or FileZilla, but I cannot read my home directory. The S3 bucket "mybucket" exists. In Cyberduck I see
"Cannot readdir on root. Please contact your web hosting service provider for assistance." and in FileZilla "Error: Reading directory .: permission denied",
even though I can connect to the server.
Am I missing some user permission in the policies below?
These are my permissions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::MYBUCKET"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::MYBUCKET/*"
    },
    {
      "Sid": "VisualEditor2",
      "Effect": "Allow",
      "Action": "transfer:*",
      "Resource": "*"
    }
  ]
}
These are my trust relationships:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
Solution 1:
The policy on the user's role should be:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListingOfUserFolder",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME"
      ]
    },
    {
      "Sid": "HomeDirObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObjectVersion",
        "s3:DeleteObject",
        "s3:GetObjectVersion"
      ],
      "Resource": "arn:aws:s3:::BUCKET_NAME/*"
    }
  ]
}
The trust relationship of the user's role should be:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
Home directory for your user should be /BUCKET_NAME
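If you manage the server with Terraform (as in Solution 2 below), a minimal sketch of a user pointed at that home directory might look like this; the server, role, and user names here are hypothetical:

resource "aws_transfer_user" "example" {
  server_id = aws_transfer_server.example.id
  user_name = "exampleuser"
  role      = aws_iam_role.example_sftp_role.arn

  # PATH-style home directory (the default type): the first path
  # component is the bucket name, matching the policy above.
  home_directory = "/BUCKET_NAME"
}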
Solution 2:
I had issues with this until I specifically added the s3:GetObject permission to the aws_transfer_user policy. I expected s3:ListBucket to be enough, but it was not: sftp> ls would fail until I added GetObject. Here's the Terraform for it:
resource "aws_transfer_user" "example-ftp-user" {
count = length(var.uploader_users)
user_name = var.uploader_users[count.index].username
server_id = aws_transfer_server.example-transfer.id
role = aws_iam_role.sftp_content_incoming.arn
home_directory_type = "LOGICAL"
home_directory_mappings {
entry = "/"
target = "/my-bucket/$${Transfer:UserName}"
}
policy = <<POLICY
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowSftpUserAccessToS3",
"Effect": "Allow",
"Action": [
"s3:ListBucket",
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObjectVersion",
"s3:DeleteObject",
"s3:GetObjectVersion",
"s3:GetBucketLocation"
],
"Resource": [
"${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}",
"${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}/*"
]
}
]
}
POLICY
}
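The role referenced above (aws_iam_role.sftp_content_incoming) isn't shown in the original answer. Here's a hedged sketch of how it might be defined, assuming the same trust relationship as in Solution 1: the role trusts transfer.amazonaws.com and carries its own S3 permissions, since the per-user policy above only scopes the role's permissions down rather than granting any on its own.

resource "aws_iam_role" "sftp_content_incoming" {
  name = "sftp-content-incoming"

  # Only AWS Transfer Family may assume this role.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect    = "Allow"
        Principal = { Service = "transfer.amazonaws.com" }
        Action    = "sts:AssumeRole"
      }
    ]
  })
}

resource "aws_iam_role_policy" "sftp_content_incoming_s3" {
  name = "sftp-content-incoming-s3"
  role = aws_iam_role.sftp_content_incoming.id

  # Bucket-wide grants; the aws_transfer_user policy narrows these
  # to each user's own prefix at session time.
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["s3:ListBucket", "s3:GetBucketLocation"]
        Resource = aws_s3_bucket.bucket.arn
      },
      {
        Effect = "Allow"
        Action = [
          "s3:PutObject",
          "s3:GetObject",
          "s3:DeleteObject",
          "s3:DeleteObjectVersion",
          "s3:GetObjectVersion"
        ]
        Resource = "${aws_s3_bucket.bucket.arn}/*"
      }
    ]
  })
}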
And I define users in a .tfvars file, e.g.:
uploader_users = [
  {
    username   = "firstuser"
    public_key = "ssh-rsa ...."
  },
  {
    username   = "seconduser"
    public_key = "ssh-rsa ..."
  },
  {
    username   = "thirduser"
    public_key = "ssh-rsa ..."
  }
]
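The public_key values aren't consumed by the aws_transfer_user resource itself. Here's a sketch of how they might be wired up, using the aws_transfer_ssh_key resource plus a matching variable declaration; both are assumptions, since the original answer doesn't show this part:

variable "uploader_users" {
  type = list(object({
    username   = string
    public_key = string
  }))
}

# Register each user's SSH public key with the Transfer Family server.
resource "aws_transfer_ssh_key" "example-ftp-user-key" {
  count     = length(var.uploader_users)
  server_id = aws_transfer_server.example-transfer.id
  user_name = aws_transfer_user.example-ftp-user[count.index].user_name
  body      = var.uploader_users[count.index].public_key
}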
I hope this helps someone. It took me a lot of tinkering before I finally got this working, and I'm not 100% sure what interactions with other policies might ultimately be in play. But after applying this, I could finally connect and list the bucket contents without getting "Permission denied".