Upload to Amazon S3 using Boto3 and return public URL

The best solution I found is still to use generate_presigned_url, but with the client's Config.signature_version set to botocore.UNSIGNED.

The following returns the public link without any signing parameters.

import boto3
import botocore
from botocore.client import Config

url = boto3.client('s3', config=Config(signature_version=botocore.UNSIGNED)).generate_presigned_url(
    'get_object', ExpiresIn=0, Params={'Bucket': bucket, 'Key': key})
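
With a publicly readable object this yields a plain URL with no signature query parameters. As a rough illustration (the exact host style, path-style vs. virtual-hosted, depends on your botocore version and client configuration):

print(url)
# e.g. https://<bucket>.s3.<region>.amazonaws.com/<key>  (no X-Amz-* parameters)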

The relevant discussions on the boto3 repository are:

  • https://github.com/boto/boto3/issues/110
  • https://github.com/boto/boto3/issues/169
  • https://github.com/boto/boto3/issues/1415

For anybody who wants to build a direct URL to a publicly accessible object and avoid generate_presigned_url for some reason:

Build the URL with urllib.parse.quote_plus so that whitespace and special characters are handled correctly.

  • My object key: 2018-11-26 16:34:48.351890+09:00.jpg (note the whitespace and the ':' characters)
  • S3 public link in aws console: https://s3.my_region.amazonaws.com/my_bucket_name/2018-11-26+16%3A34%3A48.351890%2B09%3A00.jpg

The code below worked for me:

import boto3
from urllib.parse import quote_plus

s3_client = boto3.client('s3')
bucket_location = s3_client.get_bucket_location(Bucket='my_bucket_name')
url = "https://s3.{0}.amazonaws.com/{1}/{2}".format(
    bucket_location['LocationConstraint'],
    'my_bucket_name',
    quote_plus('2018-11-26 16:34:48.351890+09:00.jpg'))
print(url)
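
As a quick sanity check (standard library only), quoting just the example key reproduces the encoded path in the console link above:

from urllib.parse import quote_plus

print(quote_plus('2018-11-26 16:34:48.351890+09:00.jpg'))
# 2018-11-26+16%3A34%3A48.351890%2B09%3A00.jpg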

I'm in the same situation. I was not able to find anything in the Boto3 docs beyond generate_presigned_url, which is not what I need in my case since I have publicly readable S3 objects.

The best I came up with is:

import boto3

bucket_location = boto3.client('s3').get_bucket_location(Bucket=s3_bucket_name)
object_url = "https://s3-{0}.amazonaws.com/{1}/{2}".format(
    bucket_location['LocationConstraint'],
    s3_bucket_name,
    key_name)
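
Here is the same idea as a small helper sketch (get_object_url is just an illustrative name, and I use the dotted regional endpoint form from the answer above); note that get_bucket_location reports a LocationConstraint of None for buckets in us-east-1, so that case needs a fallback:

import boto3

def get_object_url(s3_bucket_name, key_name):
    # Sketch: build a path-style URL for a publicly readable object.
    location = boto3.client('s3').get_bucket_location(Bucket=s3_bucket_name)['LocationConstraint']
    region = location or 'us-east-1'  # LocationConstraint comes back as None for us-east-1 buckets
    return "https://s3.{0}.amazonaws.com/{1}/{2}".format(region, s3_bucket_name, key_name)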

You might try posting on the boto3 GitHub issues list for a better solution.


I had the same issue. Assuming you know the bucket name where you want to store your data, you can then use the following:

import boto3
from boto3.s3.transfer import S3Transfer

credentials = {
    'aws_access_key_id': aws_access_key_id,
    'aws_secret_access_key': aws_secret_access_key
}

client = boto3.client('s3', 'us-west-2', **credentials)
transfer = S3Transfer(client)

# Upload with a public-read ACL so the object can be fetched without signing.
transfer.upload_file('/tmp/myfile', bucket, key,
                     extra_args={'ACL': 'public-read'})

# Endpoint URL + bucket + key gives a path-style URL to the public object.
file_url = '%s/%s/%s' % (client.meta.endpoint_url, bucket, key)
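
For reference, printing the result gives a path-style URL; the exact endpoint host depends on the boto3/botocore version, and keys containing spaces or special characters would still need URL quoting as in the earlier answer:

print(file_url)
# e.g. https://s3.us-west-2.amazonaws.com/<bucket>/<key>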