Check if file exists in s3 using ls and wildcard

aws s3 ls does not support globs, but aws s3 sync does, and it has a dry-run mode. So if you run this from an empty directory you should get the results you want:

aws s3 sync s3://my-bucket . --exclude "*" --include "folder/*myfile*" --dryrun

It will produce lines like this for matching files:

(dryrun) download s3://my-bucket/folder/myfile.txt to folder/myfile.txt
(dryrun) download s3://my-bucket/folder/_myfile-foo.xml to folder/_myfile-foo.xml
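
If you only need a yes/no answer rather than the full listing, one option (not from the original answer, reusing the same placeholder bucket and pattern) is to test whether the dry run printed anything at all:

# Run from an empty directory, as noted above. grep -q exits with status 0
# only if at least one "(dryrun)" line appears, so the if-test reports existence.
if aws s3 sync s3://my-bucket . --exclude "*" --include "folder/*myfile*" --dryrun | grep -q dryrun; then
    echo "a matching file exists"
else
    echo "no matching file"
fi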

(re-drafted from comment as it appears this answered the question)

I tried and failed to use wildcards in the aws-cli, and according to the docs this is not currently supported. The simplest (though least efficient) solution is to use grep:

aws s3 ls s3://my-bucket/folder/ | grep myfile
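
If you need an actual existence check in a script rather than eyeballing the output, a minimal sketch along these lines works (the bucket, folder and filename are the placeholders from above):

# Capture any matching lines; the variable is empty when nothing matches.
matches=$(aws s3 ls s3://my-bucket/folder/ | grep myfile)
if [ -n "$matches" ]; then
    echo "found: $matches"
else
    echo "no object matching 'myfile'"
fi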

Alternatively, you could write a short Python (or other) script to do this more efficiently, though not in a single command.


S3 doesn't support wildcard listing. You need to list all the files and grep the output.

aws s3 ls s3://mybucket/folder --recursive 

The above command lists the files under your folder, including files inside subfolders (because of --recursive). Just grep for your file name:

aws s3 ls s3://mybucket/folder --recursive | grep filename

If you want to find multiple files, build a regular expression that matches all of them and grep with that.
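
For example (file1, file2 and file3 are placeholder names), grep -E accepts an alternation pattern, so several names can be checked in a single pass:

aws s3 ls s3://mybucket/folder --recursive | grep -E 'file1|file2|file3'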